Author Topic: Resisting Brainwashing Propaganda  (Read 20510 times)


AGelbert

  • Administrator
  • Hero Member
  • *****
  • Posts: 36274
  • Location: Colchester, Vermont
    • Renewable Revolution
Re: Resisting Brainwashing Propaganda
« Reply #60 on: December 20, 2017, 10:02:00 pm »





December 19, 2017

How Did We Wind Up in a Post-Truth World? And What Can Be Done About It?

From coal’s astroturfing online to an artificial intelligence’s both-sides equivocation, when we talked about denial in the age of AI last week, things didn’t look promising. Fortunately, the December issue of the Journal of Applied Research in Memory and Cognition is here! It’s a special edition, focused on a lengthy article and featuring nine responses, all offering some help navigating misinformation in the post-truth age, with an eye towards technology.

Given the importance of the matter and depth of the research, and that all but the initial paper are behind a paywall, this will be the first in a rare three-part roundup. We felt it fitting to end this post-truth year with a rumination on our post-truth past.

Our journey through the truthiness landscape starts with a target article by Stephan Lewandowsky, Ullrich K. H. Ecker, and John Cook, summarizing the state of scholarship on how the public deals with misinformation and offering some suggestions for addressing the problem. The authors argue that the key to surviving in this new “post-truth” landscape is “technocognition,” or the combination of psychological principles and technological solutions.

For those who want a crash course in the tenets Lewandowsky, Ecker and Cook’s piece is based on, see The Debunking Handbook. But the three authors move past the summary to a much more interesting analysis:  they argue that American society must look at the socio-political context of fake news to fully understand its impacts and solutions, expanding the current focus from online interactions to the full IRL experience.

“The post-truth problem is not a blemish on the mirror,” they write. Instead, “the problem is that the mirror is a window into an alternative reality.” In this reality, elites and their evidence, like the multiple independent lines of research proving climate change is caused by human activities, are cast aside in favor of socially shared alt-news. The election of Donald Trump shows just how these misinformation ecosystems have moved from the fringe corners of the internet into the mainstream.

But the creation of these new realities is not a bipartisan problem: rather, it’s a curiously conservative phenomenon. Whether it’s a NASA-run child slave colony on Mars or the decades-old conspiracy around the UN’s plans for a global government or climate change being a Chinese hoax, the authors advise on the need to consider misinformation through “the lens of political drivers that have created an alternative epistemology that does not conform to conventional standards of evidentiary support.”

This is a fancy way of saying that sometimes conservative leaders just make bullshit up and people believe them. While this reckoning may seem new, the authors demonstrate that it’s been a long time coming (Karl Rove’s admission that Bush administration actors “create[d] our own reality” is a particularly poignant example). The authors’ observation that Republicans “have moved towards the right in a manner unprecedented since the 1880s” pairs with the fact that the right appears to be more susceptible to the pseudo-profound bullshit we’ve talked about before.

One important effect of creating alternate realities on social media, the authors explain, is the invention of intense, imaginary conflict. Did scientists really discuss manipulating data in hacked emails? Of course not, but arguing about it makes for good TV! Fanning the hot flames of these conflicts, in turn, pushes politicians towards extremism. While nominees have traditionally hewed to the center for the largest possible share of votes, modern politicians now focus on their echo chamber to rile up the base. In this new post-truth world, “lying is not only accepted, it is rewarded,” Lewandowsky, Ecker and Cook write. “Falsifying reality is no longer about changing people's beliefs, it is about asserting power.”

These concepts make it crystal clear that climate denial is not an attempt to build a base of knowledge contrary to the consensus. Rather, the authors write, climate denial is “a political operation aimed at generating uncertainty in the public's mind in order to preserve the status quo and to delay climate-change mitigation.”

So how do we get people (conservatives) to care about truth and reality again? Technocognition might just have some answers.

But, uh… what is that? Mind melds with a Mac? Uploading our consciousness into the Matrix? Studying climate change while listening to the latest techno jams? Tune in tomorrow to find out!
 





December 20, 2017

Technocognition: Countering Fake News and Denial with Science and Technology

Yesterday we charted the course that led us to Post-Truth Land. How might we find our way out? Hard to say, but fortunately, the second portion of Lewandowsky, Ecker and Cook’s piece offers some suggestions. They also coined a fun new phrase to embody the changes that need to be made: technocognition.

As the authors explain, technocognition is the idea that we should use what we know about psychology to design technology in a way that minimizes the impact of misinformation. By improving how people communicate, they hope, we can improve the quality of the information shared.

Fundamentally, the authors argue for the need to educate the public about trolls and fake news, and to improve journalism so it can better fight misinformation. In addition to common-sense steps like disclosing pundits’ and writers’ conflicts of interest and encouraging broader participation to collectively reshape the norm into one where facts matter, media outlets should hire myth-busting fake-news reporters and consider forming a common “Disinformation Charter” setting out acceptable behavior and standards of accuracy.

But the authors recognize that we can’t expect everyone to start playing by the rules, which is why there is a need for independent watchdogs to act as fake-news referees, calling out errors and identifying when stories stray from the truth. The climate world, which had formed important defenses against deniers even before one was elected president, already has a couple of key actors in this space, including Climate Feedback. More broadly, there’s the UK’s Independent Press Standards Organisation, which recently forced a correction of a Daily Mail climate conspiracy.

Then there’s the techno-side of the equation. These are the Silicon Valley fixes, like algorithms that can automatically fact check content to prevent fake news from showing up in searches or feeds, or mechanisms to flag fake news on social media.

Website moderators, the authors argue, need to do a better job of containing trolls in the first place. From screening phrases that are primarily used as fake-news framing to eliminating comment sections altogether, there are many potential ways to curate comments so they aren’t such a cesspool of hate and lies. But more important than the comments is the content, which is why the authors suggest that an app giving reporters quick and easy determinations of what’s real and what’s an alternative fact would be useful, like the Skeptical Science app, for example.
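To make the phrase-screening idea concrete, here is a minimal sketch of how a moderation queue might flag comments containing known fake-news framing phrases. The phrase list and the `needs_review` rule are illustrative assumptions for this post, not anything the authors specify:

```python
# Hypothetical sketch: flag comments containing phrases commonly used
# as fake-news framing, so a human moderator can review them.
# The phrase list below is an assumption for illustration only.
FRAMING_PHRASES = [
    "climategate",          # the hacked-emails pseudo-scandal
    "global warming hoax",
    "just a theory",
]

def flag_comment(text: str, phrases=FRAMING_PHRASES) -> list:
    """Return the framing phrases found in a comment (case-insensitive)."""
    lowered = text.lower()
    return [p for p in phrases if p in lowered]

def needs_review(text: str) -> bool:
    """A comment matching any phrase is queued for human moderation."""
    return bool(flag_comment(text))
```

A real deployment would need far more nuance (context, quoting, sarcasm), which is exactly why the authors pair the technological filter with human judgment rather than automatic deletion.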

And finally, while this is hardly an ask coming solely from these authors, tech companies should find ways to show people content from beyond their bubble. For example, while Facebook and Twitter primarily show users content based on their subscriptions, Reddit’s /all and /popular pages show a mix of what everyone’s looking at, regardless of personal preference. This gives users a sense of the world outside their immediate awareness, forcing at least a subconscious recognition of a wider world they may not want to acknowledge.

Reading through this list of recommendations, one gets the idea that with some simple tweaks from Silicon Valley, our post-truth problems could be solved. But is it enough?

For now, we hope that technocognition gets some techno-recognition. As unlikely as it may be, we find ourselves wishing for a way to make this anti-fake-news scholarship achieve a fraction of the viral shares that fake news regularly does.


He that loveth father or mother more than me is not worthy of me: and he that loveth son or daughter more than me is not worthy of me. Matt 10:37

 
