
No one will ever be woke enough, so don't even bother trying


2022 Jun 2, 6:19pm   1,486 views  51 comments

by Patrick

https://notthebee.com/article/two-more-thrown-from-the-wokeness-gerbil-wheel

In case you're wondering, no, you will never be woke enough. It seems like I write this column about every couple weeks. ...

The latest to learn the hard way? Some individual named Kim Siever, whose house received a fresh paint job before the dawning of pride month:




Statement made for all the neighborhood to see, no? Unfortunately, it was also seen by the gatekeepers of wokeness over at some outfit called Project DROC. Identifying themselves as a group of "committed Black + racialized educators," they decided Kim needed to better understand his assignment:




There is such a better way to live than this never-ending, yet ultimately futile, gerbil wheel of striving to attain sufficient wokeness. Yet so many keep spinning themselves silly.

On an episode of his podcast last fall, comedian Ricky Gervais – the same guy who has been generating a lot of attention for his recently released, allegedly "transphobic" Netflix special – told neuroscientist Sam Harris:

I want to live long enough to see the younger generation not be woke enough for the next generation. It's going to happen. Don't they realize that, it's like, they're next. That's what's funny.

« First    « Previous    Comments 27 - 51 of 51   

28   richwicks   2022 Jul 21, 6:54pm  

HunterTits says


richwicks says


We did, but have the millennials even read a Dr. Seuss book or seen one of the cartoons?


Was wondering if someone would bring this up. :)

My take: fuck no



My friend has two young children. I'll make certain they are exposed to them. I loved Dr. Seuss as a kid, although my friend (Asian) said he was weirded out by what I considered the whimsical drawings.

In any case... I'm an "archivist" - I collect stuff not in order to pirate it, but rather to preserve it. Several books of Dr. Seuss are now out of print, but I have them. I have the complete Looney Tunes archive, including the "problematic ones". They can burn books all they fucking like, but I have a 6 TB HD, and I'm just a nothing archivist. I know people with 60 TB. 20 years ago, 60 TB would have been the entire internet.

I have a huge video game archive, but it's almost pointless since there are so many of us, and I now hate video games. It's interesting to see the evolution, though - it's like seeing the evolution of film.



That still makes me laugh. Back in the day, Harold Lloyd was bigger than Charlie Chaplin, but he never licensed his films to television, so he's largely forgotten because his films weren't played in the 1950's on television. All his works are in the public domain now. Chaplin's too...
29   stereotomy   2022 Jul 21, 7:32pm  

richwicks says


In any case... I'm an "archivist" - I collect stuff not in order to pirate it, but rather to preserve it. Several books of Dr. Seuss are now out of print, but I have them. I have the complete Looney Tunes archive, including the "problematic ones". ...

Cohen Media Group has been releasing 4K scans of Buster Keaton films. Volume 1 has a great scan of "The General." The stunts Buster does are even more phenomenal in the high-def scan. This movie was my son's favorite from when he was 3 until about 6 years old.
30   richwicks   2022 Jul 21, 8:07pm  

stereotomy says


Cohen Media Group has been releasing 4K scans of Buster Keaton films. Volume 1 has a great scan of "The General." The stunts Buster does are even more phenomenal in the high-def scan. This movie was my son's favorite from when he was 3 until about 6 years old.


I should look up Buster Keaton, I'm not really familiar with him.

However:

<rant>
4K is fine for archival, but I worked at RCA, and the reason RCA decided on 1080p is that it was basically the resolution the eye can resolve. You CAN differentiate between 4K and 1080p, but it's really difficult even for people with perfect or better-than-perfect vision. 4K is the outer limit of what you can see, and if I played a 1080p (lossless) video on a 4K display and a 4K (lossless) video on the same display, I'd bet $10,000 you couldn't reliably differentiate them. Side by side, you could eke out the difference, but when 4K came out, they used a trick - they increased the saturation (color) of the 4K sets to convince you to buy the 4K television. You wouldn't be able to see the higher resolution without it; it was a scam.
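That claim is easy to sanity-check with a little trigonometry. A common rule of thumb puts the acuity limit near 60 cycles per degree, i.e. roughly 120 pixels per degree; the screen width and viewing distance below are assumptions for illustration, not figures from the post:

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    """Horizontal pixels packed into one degree of visual angle
    for a viewer centered at the given distance."""
    # visual angle subtended by the whole screen width
    angle_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / angle_deg

# Hypothetical setup: a 55" 16:9 TV (~1.21 m wide) viewed from 2.5 m.
width, dist = 1.21, 2.5
ppd_1080 = pixels_per_degree(1920, width, dist)   # ~71 px/deg
ppd_4k = pixels_per_degree(3840, width, dist)     # ~141 px/deg
# Against the ~120 px/deg rule of thumb, 4K sits right at the
# edge of what the eye can resolve at this distance.
print(f"1080p: {ppd_1080:.0f} px/deg, 4K: {ppd_4k:.0f} px/deg")
```

Move the couch closer and the numbers drop, which is why the "can you see it?" answer always depends on viewing distance.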

A great deal of research was done on this - what the minimum bandwidth for "perfect" audio and visual quality was. RCA invented MP3 too, although (frankly) it looks quite obvious in hindsight. It's basically a Fourier transform with the high frequencies discarded. If I played back what was discarded from a Fourier transform of a song, you'd hear a hiss - it would sound like soft, quiet, high-pitched static.
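The "transform, drop the high bins, invert" idea can be sketched in a few lines. This is a toy illustration only - real MP3 uses an MDCT with psychoacoustic masking, not a plain DFT - but it shows why the discarded part sounds like a high-pitched hiss:

```python
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

# Toy signal: a strong low tone (1 cycle) plus a faint high tone (3 cycles).
N = 8
x = [math.sin(2 * math.pi * 1 * n / N) + 0.2 * math.sin(2 * math.pi * 3 * n / N)
     for n in range(N)]

X = dft(x)
# "Discard the highs": zero bins 3..5 (bin 3 and its mirror image, bin 5).
kept = [Xk if k in (0, 1, 2, 6, 7) else 0 for k, Xk in enumerate(X)]
lossy = idft(kept)                            # what a lossy codec would keep
residual = [a - b for a, b in zip(x, lossy)]  # the discarded part: pure high tone
```

The residual is exactly the faint 3-cycle tone - played back on its own, it's the "soft quiet static" described above.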

8K is ludicrous. That's more pixels than we have receptors in our retina. 4K will survive, not because it's better, but because really, nobody can see beyond that. The "retina display" of the iPhone is just a marketing gimmick. You can't see beyond 1080p on a PHONE, it's too small. The result of this "improvement" is shorter battery life, since more calculations need to be made to produce the pixels - it's actually an overall detriment. NOBODY needs better than 1080p (and that's a stretch) on a phone. 720p is MORE than enough, even if you're a mutant.

But as an engineer, I realize that marketing triumphs over science. Since it has more pixels, it's "better", even if you can't possibly see it. Perhaps an eagle can, but you can't. Most people can't differentiate between 720p and 1080p - I can, if I try, with my glasses on and if I know to look for it.
</rant>

To convince people of this, you need to put them in a double-blind test. People naturally want "the best" and want to consider themselves "exceptional" - this is a normal human bias. They are embarrassed when they are shown to be "average". This is Apple's marketing angle: they DO have higher resolution, but you cannot see it. It reduces battery life and consumes more power - it's a demonstrably and provably inferior system.

I use a 2K monitor, and have owned a 4K one. I never thought I'd say this, but I prefer 2K - faster updates than 4K, no noticeable difference in visibility. I've reached my limit on resolution. This screen fills almost my entire vision, at least the useful parts. Peripheral vision only notifies you of danger; it's not informative. You cannot read 30 degrees outside of your direct vision.

I hate and respect Apple. Sure, they have "superior" things, but my phone goes 3 days without charging because it's lower resolution, and nobody will notice the difference. I respect their marketing, but from an engineering standpoint, they piss me off. It doesn't convey more information; you cannot see it, you don't experience it. It's not better from your viewpoint, it's only TECHNICALLY better. A MACHINE can see the difference, a human can't.

In any case, I archive only up to 1080p - beyond that, it's just drastically increased storage for no real reason. I often "transcode" - that is, reduce the size with a (minor) reduction in quality. I am able to compare the original with the transcoded version and see only the differences. When I do this, I see outlines (barely). Comparing a 40GB file against a 2GB file, you can see the difference in the outlines, but that's it, and it's just barely visible. Mostly what is removed is film grain. Very high quality encodes have the film grain visible; I just remove it, since it's only showing error in the technology, not actual information. Some people like film grain, but to me, it's just a distraction. It's a limitation of the technology of the time, and in my opinion, it's like recording tape hiss in audio. It's only there because they didn't have a better technology.
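That original-vs-transcode comparison boils down to a per-pixel absolute difference; a minimal sketch on a made-up grayscale patch (the frame values are invented for illustration):

```python
def frame_diff(original, transcoded):
    """Per-pixel absolute difference between two equally sized
    grayscale frames (lists of rows of 0-255 ints)."""
    return [[abs(a - b) for a, b in zip(row_o, row_t)]
            for row_o, row_t in zip(original, transcoded)]

def mean_abs_error(diff):
    """Average per-pixel error across the whole difference frame."""
    flat = [p for row in diff for p in row]
    return sum(flat) / len(flat)

# Tiny 3x4 frames: the "transcode" softens the edge between dark and bright.
orig = [[10, 10, 200, 200],
        [10, 10, 200, 200],
        [10, 10, 200, 200]]
enc  = [[10, 12, 196, 200],
        [10, 12, 196, 200],
        [10, 12, 196, 200]]
d = frame_diff(orig, enc)
# The nonzero differences cluster along the edge column - exactly the
# "you see outlines (barely)" effect described above.
```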
31   stereotomy   2022 Jul 21, 8:18pm  

richwicks says

I should look up Buster Keaton, I'm not really familiar with him. However: 4K is fine for archival, but I worked at RCA, and the reason RCA decided on 1080p is that it was basically the resolution that the eye could see. ...

I should clarify that although the Buster Keaton release was from a 4K scan, the disc is 1080p. Generally, 4K is sufficient to extract every bit of information from a 35mm negative emulsion. For large format (65mm), 8K is necessary for maximum image harvest.

I agree that unless you are viewing things from about 3 feet, you can't notice a difference in resolution. What you will notice is enhanced color - 4:4:4 instead of 4:2:0 chroma subsampling. HDR (high dynamic range) is mostly overrated at this point because most displays can't really reproduce HDR, and even if they can, HDR encoding is hit or miss; i.e., it's mostly crap, and there's a really big learning curve on the encoding end that few studios have even come close to mastering.
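The storage cost of those subsampling schemes is simple arithmetic - a small helper for uncompressed 8-bit frames (names and the helper itself are illustrative, not from any particular tool):

```python
def frame_bytes(width, height, subsampling, bits=8):
    """Uncompressed bytes per frame for common chroma subsampling schemes."""
    luma = width * height
    chroma_per_plane = {
        "4:4:4": luma,        # Cb and Cr at full resolution
        "4:2:2": luma // 2,   # chroma halved horizontally
        "4:2:0": luma // 4,   # chroma halved both horizontally and vertically
    }[subsampling]
    samples = luma + 2 * chroma_per_plane
    return samples * bits // 8

full = frame_bytes(1920, 1080, "4:4:4")   # 6,220,800 bytes per frame
sub  = frame_bytes(1920, 1080, "4:2:0")   # 3,110,400 bytes per frame
```

So 4:2:0 halves the raw frame size relative to 4:4:4, which is why it is the default for consumer delivery even when the luma resolution is identical.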
32   richwicks   2022 Jul 21, 8:32pm  

stereotomy says

I agree that unless you are viewing things from about 3 feet, you can't notice a difference in resolution. What you will notice is any enhanced color (4:4:4) instead of 4:2:0 color subsampling. HDR (high dynamic range) is mostly overrated at this point because most displays can't really reproduce HDR, and even if they can, HDR encoding is hit or miss; i.e., it's mostly crap, and there's a really big learning curve on the encoding end that few studios have even come close to achieving.


My opinion is record the work, don't record the relatively primitive technology.

There are options in ffmpeg to introduce film grain, but it's entirely artificial - in other words, you tell it to mimic film grain, but it's just producing noise to simulate it. It's not actually reproducing the true film grain of the original.

We've reached the end of fidelity of video and audio encoding, even in lossy versions of it. In 2000 years, if we can preserve the data, people living then will be able to see our society exactly as it was, now. That's like us being able to see the Romans, in 0 AD.

If we can preserve it. There will be no real advancements of video and audio archival from this point forward. If you can think of a permanent way to store digital data for 1,000 years, seriously, I'll make you a billionaire.
33   Patrick   2022 Aug 2, 10:52am  

https://notthebee.com/article/woke-mob-calls-for-queen-beyoncs-head-over-ableist-slur


Oh my goodness! What could make this woman – who had been called a "national treasure" for years – suddenly be unpopular with the mob that supposedly loves her?

Nope, it's not the overtly sexual language in the song. Nor is it any other raunchy, filthy, obscene word.

She used the word "spaz."
34   HeadSet   2022 Aug 2, 2:40pm  

richwicks says

If you can think of a permanent way to store digital data for 1,000 years, seriously, I'll make you a billionaire.


35   richwicks   2022 Aug 2, 2:44pm  

HeadSet says

richwicks says

If you can think of a permanent way to store digital data for 1,000 years, seriously, I'll make you a billionaire.


@HeadSet - I am not kidding.

I've looked into OTPs (One Time Programmable memories), but I need at least 1GB of storage. It's been suggested I store it online, and I COULD do that, but it invites tampering. I could store just the SHA-256 sum (and maybe the MD5 sum, and a few other cryptographic hashes). You might be able to forge a file matching one hash, but if you had half a dozen of them, matching them all simultaneously becomes a vastly harder problem, which I expect to remain impossible forever.
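The multiple-hash idea is a one-liner with Python's standard hashlib (the sample payload is a placeholder):

```python
import hashlib

def fingerprints(data: bytes) -> dict:
    """Several independent digests of the same archive; a forger
    would have to collide all of them simultaneously."""
    algos = ("md5", "sha1", "sha256", "sha512")
    return {name: hashlib.new(name, data).hexdigest() for name in algos}

fp = fingerprints(b"archive contents here")
# Publish all four hex digests; verifying a copy later means
# recomputing them and comparing every one.
```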
36   just_passing_through   2022 Aug 3, 10:19pm  

richwicks says


@HeadSet - I am not kidding.

I've looked into OTPs (One Time Programmable memories) but I need at least 1GB of storage. It's been suggested I store it online and I COULD do that, but it invites tampering. ...


Pffft... How about several thousand years? How about 100 thousand?

This is old, dude, c'mon:

https://www.science.org/content/article/dna-could-store-all-worlds-data-one-room

This is from today:

https://finance.yahoo.com/news/twist-bioscience-present-flash-memory-120000492.html
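The core encoding trick behind DNA storage is just two bits per base. A toy round-trip sketch (real systems add error correction and avoid long runs of the same base; this mapping is a simplification):

```python
# Map each 2-bit pair to a nucleotide: 00->A, 01->C, 10->G, 11->T.
B2BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE2B = {v: k for k, v in B2BASE.items()}

def bytes_to_dna(data: bytes) -> str:
    """Encode arbitrary bytes as a DNA base sequence, 4 bases per byte."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(B2BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(seq: str) -> bytes:
    """Decode a base sequence produced by bytes_to_dna."""
    bits = "".join(BASE2B[b] for b in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = bytes_to_dna(b"Hi")   # 2 bytes -> 8 bases: "CAGACGGC"
```

At 2 bits per base, the theoretical density is what makes the "all the world's data in one room" headline plausible; durability is the separate (and harder) question.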

Let me know if you'd like to see a white paper. In the past you never read the white papers I linked you to, a la bitcorn. Seriously, you acted like what I said never happened.

That's somewhat old already as well. It's already moving along at Moore's Law speed.

Get with the program mister.

You really DO work in tech, no?
37   Patrick   2022 Aug 4, 9:42am  

https://spectatorworld.com/book-and-art/taylor-swift-faces-woke-mob-private-jet-ableism/


Taylor Swift finally faces the woke mob
Her response to reports about her private jet sparks ire

Alas, recent reports of Swift’s private jet usage finally suggest she’s not the woke hero she makes herself out to be. The digital marketing agency Yard discovered that Swift’s planes emit more CO2 in seven months than 1,184 average people would in an entire year. She apparently has two jets (how luxurious!) that took 170 trips in just 200 days this year, and the average flight time is just 80 minutes.

Yet rather than say she was sorry and promise to “educate herself” and “do better,” as the PR-crafted celebrity apologies usually go, a rep for Swift tried to pass the buck.

“Taylor’s jet is loaned out regularly to other individuals,” her spokesperson said. “To attribute most or all of these trips to her is blatantly incorrect.”

Ah, well, in that case everything is totally fine! The carbon emissions are offset, you see, by Swift’s generous lending of her private plane to her less wealthy friends. Isn’t charity great? ...

Several commenters then turned their sights on Swift, noting that she once changed the lyrics to her song “Picture to Burn” because they were deemed homophobic. Swift originally sang in the tune that if her ex-lover told his friends she was crazy, she would tell hers that he is gay. She later amended the line to say “that’s fine, you won’t mind if I say.”

If activists are now focused on removing ableism from songs, then Swift has plenty of objectionable material. In “Blank Space,” she says ex-lovers would describe her as “insane.” In “Me!” she says she “went psycho” during a phone call with a lover and duet partner Brendon Urie sings there’s a lot of “lame guys” out there. Back in 2019, a disabled fan zeroed in on the latter two examples, saying she was “disappointed” and had a “problem” with the “ableist” lyrics.

“While it’s trendy to use words referring to mental health terms like ‘psycho’ or ‘crazy,’ these words are often used negatively,” the fan wrote. “Using these words adds to the stigma that surrounds mental illness. This stigma is oppressive and makes people with mental illness feel like there is something wrong with them, which of course is not true.”


Uh, no. People with mental illness do have something wrong with them.

And to call a man gay is a serious insult. Taylor is right about that and used the insult correctly.
39   Ceffer   2022 Aug 24, 9:56pm  

Instead of asking how many angels can dance on the head of a pin, the dazed and confused LibbyFucks hold a contest for how many disparate, oxymoronic, illogical, counterintuitive, self-contradictory concepts they can juggle in their fevered brains at the same time in order to be card-carrying woke.
40   richwicks   2022 Aug 24, 10:57pm  

just_passing_through says


Pffft... How about several thousand. How about 100 thousand?

This is old dude, c'mon:

https://www.science.org/content/article/dna-could-store-all-worlds-data-one-room


First, this is complete bullshit. I once downloaded the entire genome of a human being. It easily fit on my hard disk. My entire genome would fit on an SD Card today - no problem.

just_passing_through says


This is from today:

https://finance.yahoo.com/news/twist-bioscience-present-flash-memory-120000492.html

Let me know if you'd like to see a white paper. In the past you never read white papers I linked you to al la bitcorn. Seriously, acted like what I said never happened.


DNA doesn't last for millennia, it deteriorates and transforms.

I am looking for a permanent storage system that will survive from -50C to +60C for thousands of years.

And I'm not kidding about the possibilities of it. I could fake it, of course - I'll easily be dead within 40 years - but I want to make a mark. If I faked it, I may as well be dead.

Incidentally, I may have found it, but it's still in development. I have to build a patent around it.
41   SunnyvaleCA   2022 Aug 24, 11:16pm  

richwicks says


the reason RCA decided on 1080p is that it was basically the resolution that the eye could see

Right on Richwicks!

I'll add a little bit more, though...
For a computer monitor showing static data, 4k is enormously better than 1080p. Once a frame is frozen on the screen we can focus on a specific part, and the higher resolution becomes important. In fact, a pair of 4k monitors is an even better setup. 2x 4k is great because we can have the entire set of static data laid out before us and then, with the bare minimum of subconscious effort, focus in on one part or another; and when we focus in on a part - 5% of that enormous 2x screen - we get that part in the high resolution we need. When we focus on a specific part, all the other parts could be at 480p and we wouldn't even notice.

So, taking all that about static images and moving to video, I still think there's a benefit to 4k video in the right situation. If you have a large display and the video is a wide-angle view of something, you want that wide-angle view to fill your entire visual field. But that means your area of ocular concentration is a small patch. If that small patch is to be at high enough resolution, the entire screen has to follow at that resolution, so you need 4k so that the small patch you are actually concentrating on is a megapixel.

Smartphone resolution, which is definitely way beyond excessive for optical acuity, has another purpose: Resolution Independence. If I tell the phone to draw a line 1 pixel wide, but then run on a screen that is 1.5x as pixel-dense, is that line 1 pixel wide, 2 pixels wide, or maybe 1 pixel plus a grey (fuzzy) pixel wide? Ugh. So, make the actual screen 3x the drawing resolution and it can be 3 pixels wide; if, in the future, there's a different screen with even higher resolution, it can be drawn 4 or 5 pixels wide. With a sufficiently high resolution, the phone can draw any sort of thing accurately. So, Apple and others are now packing a 4k screen into a 5-inch diagonal! The video acceleration hardware these days is plenty good to handle those pixels without much notice. And the text looks fantastic!
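That fuzzy-hairline problem can be shown with a trivial points-to-pixels mapping (the function and scale factors here are illustrative, not any real platform's API):

```python
def to_device_pixels(points, scale):
    """Map a length in logical points to physical pixels; a fractional
    result means the line can't land cleanly on the pixel grid."""
    px = points * scale
    return px, px == int(px)

# A 1-point hairline at three hypothetical scale factors:
results = {scale: to_device_pixels(1, scale) for scale in (1.0, 1.5, 3.0)}
# 1.0x -> 1.0 px (clean), 1.5x -> 1.5 px (fuzzy), 3.0x -> 3.0 px (clean)
```

Integer scale factors always land cleanly, which is one reason platforms prefer 2x and 3x over the "funky" fractional factors.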
42   richwicks   2022 Aug 25, 1:05am  

SunnyvaleCA says


richwicks says


the reason RCA decided on 1080p is that it was basically the resolution that the eye could see

Right on Richwicks!

I'll add a little bit more, though...
For a computer monitor showing static data, 4k is enormously better than 1080p. Once a frame is frozen on the screen we can focus on a specific part and the higher resolution becomes important. ...



OK, there is a difference between a static image and a non-static image. 99.9% of people cannot differentiate between a NON-static 1080p image and 4K.

Having said that, I've had a 4K monitor, and the damned thing is so fucking huge that I wouldn't even see I'd opened a web browser on the right of my screen while I was looking at my left, so I would open another instance. 2K is where I'm comfy - even 2 2K screens. 4K is too much.

SunnyvaleCA says


Smartphone resolution, which is definitely way beyond excessive for optical accuity, has another purpose: Resolution Independence. If I tell the phone to draw a line 1 pixel wide,


Who does this? Nobody.

1080p is what people 2000 years will be using for viewer screens now.

You might need higher resolution for ARTWORK, but that is already kind of dead. Sorry, technology has killed even art.

When I was a kid, and The Last Starfighter came out, that was essentially 1000x1000, which was around 1080p. That was adequate for a FEATURE film.
43   SunnyvaleCA   2022 Aug 25, 1:36pm  

richwicks says

[... Resolution Independence ...]
Who does this? Nobody.

1080p is what people 2000 years will be using for viewer screens now.

I don't think you've programmed a smartphone in the last decade. There's a nominal coordinate system (all floating-point values) and a scale factor. The original iPhone (for example) used a scale factor of 1.0x, but newer iPhones have used 2x and 3x "natively", as well as some funky scale factors to help out people who would like a magnified view. In addition to the pixel density dictating a 2x or 3x scale factor, the screens have gotten physically bigger, so that the original 480x320 iPhone is dwarfed by the 2778x1284-pixel iPhone 13 Pro Max. Progress marches on: the 13 Pro Max displays things much faster on its enormous pixel surface than the original iPhone did on its tiny one, and it also has 2x the battery life, even while playing full-resolution video games.
44   SunnyvaleCA   2022 Aug 25, 1:41pm  

richwicks says

Having said that, I've had a 4K monitor, and the damned thing is so fucking huge, I wouldn't even see I opened a web browser on the right of my screen as I was looking at my left, so I would open another instance. 2K is where I'm comfy with, even 2 2K screens, 4K is too much.

That seems to be a "you" problem. Well, that and your software. The rest of the world moved to using scaling factors on commodity-level computers a decade ago. Heck, the original NeXT machines of the late 1980s used "display postscript," which had a floating-point coordinate system and theoretical resolution independence—the software that drove the screen image also generated the bitonal raster image for the 400 DPI laser printer.
45   SunnyvaleCA   2022 Aug 25, 1:50pm  

richwicks says


1080p is what people 2000 years will be using for viewer screens now.

I'm interpreting that as: 1080p is what people 2000 years from now will be using for viewer screens.
My response is...
Absolutely not. It's already difficult to find 1080p screens for TVs larger than 32 inches. Few consumers will pay more for a low-sales-volume 1080p screen when they could get a 2160p screen that is still capable of handling 1080 input by scaling. 1080p is usually fine for motion video of the real world, but why limit what you can do with the screen and also pay more?
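The simplest form of that 1080-in, 2160-out scaling is nearest-neighbor pixel doubling (real TVs use fancier filters); a minimal sketch:

```python
def upscale_2x(frame):
    """Nearest-neighbor 2x upscale: each source pixel becomes a 2x2 block,
    the most basic way a 2160p panel can show a 1080p signal."""
    out = []
    for row in frame:
        doubled = [p for p in row for _ in (0, 1)]  # repeat each pixel twice
        out.append(doubled)
        out.append(list(doubled))                   # repeat the whole row twice
    return out

small = [[1, 2],
         [3, 4]]
big = upscale_2x(small)   # a 4x4 frame of 2x2 blocks
```

Because 2160 is exactly 2x1080, this mapping is lossless in the sense that no source pixel is blended away - one reason a 4K panel handles 1080p input gracefully.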
46   SunnyvaleCA   2022 Aug 25, 2:00pm  

Ha! Well that ^^^ was way off topic of wokeness!

Anyway, I've been saying for years that there is seemingly no limit to wokeness. For the pushers of wokeness, there's no goal other than to push wokeness, so there's also no end. After one woke goal is achieved, just make up some more. This phenomenon is manifested in the ever-increasing jumble of letters from the LGBTQXIASDF!@# pushers.
48   richwicks   2022 Sep 28, 6:27pm  

SunnyvaleCA says


Absolutely not. It's already difficult to find 1080p screens for TVs larger than 32 inches.


Yeah, it's stupid.

I am not surprised there's higher resolutions, but for codecs, 1080p is "good enough". You have a limited number of receptors in your eyes.

People who claim they can tell the difference between a 1080p video and a 4K video, played at the same frame rate, where it's a recording of an actual scene (not computer-generated), are like the same dumb assholes who claim they can differentiate between a 16-bit audio file and a 32-bit audio file.

I worked for RCA. There's a REASON they picked these numbers. There's 1080 lines, why not 1024? That's a power of 2, seems more logical, doesn't it?

You can display a 1080p film IN A THEATER and the audience won't notice it.

Let me impress upon you the difference between NTSC and modern high def. This is an NTSC picture:



That was the resolution of an NTSC television until around 2005. Well, approximately - from left to right, that's the encoding that was used by DVDs, but there are really no "pixels" in NTSC. The picture is actually a little big, because it ignores the vertical and horizontal blanking intervals.

For a computer, you want sharp images for text, but for a photograph or a film? 1080p is basically the maximum resolution a normal human being can discern. Now, maybe you're a mutant, but that's unlikely.

4K is ridiculous, but I expect it to become the standard over time. 8K is pure stupidity, but I know it exists. 8K has no technical reason to exist - except perhaps for VR, and that's only so you can rotate your head. That would be the resolution of the ENTIRE picture, both what you can see and what's behind you, in a complete immersion.
50   richwicks   2022 Sep 28, 6:34pm  

SunnyvaleCA says


I don't think you've programmed a smartphone for the last decade. There's a nominal coordinate system (all floating-point values) and a scale factor. The original iPhone (for example) used a scale factor of 1.0x, but newer iPhones have used 2x and 3x "natively" as well as some funky scale factors to help out people who would like a magnified view. In addition to the pixel density dictating a 2x or 3x scale factor, the screens have gotten physically bigger, so that the original 480x320 iPhone is dwarfed by the 2778x1284 pixel iPhone 13 Pro Max.


I know about the stupidity of the retinal display. It would be useful for people with perfect eyesight that also use magnifying glasses with their phone.



Fortunately we don't live in that world.

Apple makes a lot of things that have "better specs" that just don't matter. It's a sort of cult. Sure, it's better, but a person can't experience its superiority. The screen has more pixels than the eye has cones and rods, it is more costly, it requires more computational power, and it drains the battery - all this stupidity just so they have a MARKETING point. Then, to add insult to injury, they LITERALLY glue the battery in, make the phone impossible for third parties to repair, and slow down the operating system to encourage upgrades. They're just an asshole company.

I have a piece of shit "Motorola 5G one", literally one of the cheapest phones I could purchase. It has a TWO-DAY battery. I'd rather it be lighter and smaller than have this ridiculous battery; it's just extra weight. It's probably the same size as the latest iPhone, but it doesn't have stupid useless specs - it's designed for functionality. I'm SURE the processor is slower and there are fewer cores, but it's more than adequate. It's just a phone.
51   Hircus   2022 Sep 28, 6:50pm  

Patrick says


https://slaynews.com/news/woke-hollywood-celeb-comes-to-grim-realization-i-wont-be-woke-enough-for-the-left-one-day/



“Sometimes I wonder — did I and others cutting them off make them dig their heels in deeper, fuel their ignorance with a nitro-boost of resentment and spite?


Their ignorance?

It never crosses his mind that the woke can be wrong sometimes. It's always the others that are wrong and need to "change" or "evolve" or "learn".

Even as he tried to virtue-signal by being self-reflective and demonstrating tolerance toward the mean, bad, outdated conservative, it was only as a vessel to assert intolerance.

