medialiteracy

Washington Post
"The study “helps add to the growing body of evidence that, despite a variety of mitigation efforts, misinformation has found a comfortable home — and an engaged audience — on Facebook,” said Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University, who reviewed the study’s findings."
Huh, people love to hear things that confirm something they want to believe. That’s extremely profitable!
The Atlantic
"I asked about the possibility—floated by many critics of the account—that by sharing extremist rhetoric to a broad audience with little other information, PatriotTakes is effectively re-platforming people who have been removed from the public square for a reason. The account’s owner was uninterested in discussing it."
If an account reposts racist and fascist garbage does the intention (pointing and laughing or "monitoring") really matter? The account is amplifying and spreading racist and fascist garbage.
The Why Axis
"The “both sides” model of journalism is being exploited by bad actors intent on spreading misinformation and conspiracy theories. “If the weight of the evidence allows you to make a judgment, but instead you go with ‘he said, she said,’ you're behaving recklessly even as you tell yourself you're doing the cautious thing,” as press critic Jay Rosen notes."
Hedging must feel like the safe path for journalists—especially since they wouldn't want to anger Big Dowsing. This is a good example of how baked-in both-sides thinking is. See also.
The Atlantic
"The Republican operatives, who dismiss the expositions of critical race theorists and anti-racists in order to define critical race theory and anti-racism, and then attack those definitions, are effectively debating themselves. They have conjured an imagined monster to scare the American people and project themselves as the nation’s defenders from that fictional monster."
Ibram X. Kendi with the best definition of the CRT "debate" happening in the media.
The Atlantic
"Twitter is a parasite that burrows deep into your brain, training you to respond to the constant social feedback of likes and retweets. That takes only a week or two. Human psychology is pathetically simple to manipulate. Once you’re hooked, the parasite becomes your master, and it changes the way you think."
Sure Twitter took away their love of reading, but it gave them outrage and a sense of righteousness in return. Fair trade?
Business Insider
"Over 70% of the videos flagged by respondents came through YouTube's suggestion algorithm — an effect that's impossible to study because the algorithm is a closely-guarded secret at Google. That means YouTube users aren't primarily finding misinformation through search, but through YouTube feeding users those videos."
Hosting disinformation is bad enough, but actively recruiting people—at scale—is awful.
Esquire
"Elevated Stupidity stems from the idea that being good at arguing is the same thing as being correct. That rhetorical skill—or at least a degree of big debate-club energy sufficient to wear out one’s opponent—is the equivalent of intelligence."
This article makes me tired, but yes this big debate-club energy is powering The Discourse.
twitter.com
"One thing you see a lot on here is people pointing out the contradictions in the putative views of Trump’s GOP. COVID is a Chinese plot but also a hoax. The insurrection was antifa but also a tour of patriots."
Twitter thread explaining why the contradictory beliefs of Trumpism are a feature not a bug.
Galaxy Brain
"The social internet promised us deep human connections — the sort that requires nuance and patience for messiness — but instead, it’s just turned us all into brands."
Insightful analysis of The Discourse.
npr.org
"Many of the 12, he said, have been spreading scientifically disproven medical claims and conspiracies for years. Which provokes the question: Why have social media platforms only recently begun cracking down on their falsehoods?"
It’s much easier to poison the information well than we realize because social media platforms don’t have an incentive to fix it. Poisoned water might even bring people to the well more often.
buzzsprout.com
"We're talking about a guy who received one complaint from a student who came to his office to talk to him, and then he himself voluntarily canceled the course. He took his ball and went home. And yet we're supposed to be like, “All of these kids today, they're so over-sensitive”."
Fantastic conversation that connects 90s political correctness discourse with cancel culture discourse. They show how flimsy moral panic stories were fabricated, used as evidence of liberal overreach, and repeated ad nauseam.
conradakunga.com
"So today I set out to actually see what it is one agrees to when they accept all."
Peeling back the layers on those cookie agreement dialogs helps us learn about how web advertising works (and how massive the industry is).