What does it mean to you to be woke and why is it bad?

My understanding of being “woke” is simply accepting and acknowledging uncomfortable truths. Conservatives claim to be all about the truth and not sugarcoating anything, so when I constantly see them complaining about “wokeness,” I’m at a loss as to why they think it’s bad.

I see so many conservatives complaining about how “woke” Disney has become just because it’s including characters and stories about people who aren’t straight and white. Based on my understanding of what being “woke” is, I don’t understand (a) why this is a bad thing, or (b) how it is “woke” at all.

So please enlighten me, because of all the things that confuse me about this ongoing culture war, the whole “woke” thing and conservative opposition to it makes the least sense to me.

submitted by /u/thedeadsigh