“Deplatforming” Parler

I had not heard of Parler until news this past week that Amazon was terminating Parler’s use of Amazon cloud services. Parler was, apparently, a social media platform for extremists.

In the aftermath of the insurrection and attempted coup this past week, Twitter, Facebook, YouTube, Reddit, and other social media outlets suspended Trump’s accounts and removed related groups and posts.

Amazon went further and terminated hosting for an entire social media platform, Parler.

This is viewed by some as censorship of “conservative” thought, but others say it was not conservative thought – it was just hostile extremist perspectives, often advocating violence and hate towards others.

In regard to Parler, one article found posts advocating blowing up Amazon data centers. Parler refused to delete such posts; consequently, Amazon cut off Parler’s hosting.

Parler is said to have had many posts advocating violence or hate.

Posts advocating violence and hate are often obvious, easily identified, and removed by other platforms. (And sometimes they misfire and remove legitimate content too.)

A more challenging issue is dealing with “misinformation” and “lies”. For example, in mid-2020, people who shared information that had been the WHO’s official position in January were accused of “misinformation”, even though the WHO’s original statements remained on its own social media accounts. Is such a post “misinformation”?

Is a view skeptical of lockdowns itself “misinformation”? Some say yes, yet the WHO itself says lockdowns only seem to work for a short period of time, and only if implemented early in the pandemic.

In some cases, there was plenty of room for interpretation, and the determination of “truth” or “lie” was not clear. In late 2021, statements by reputable scientists and doctors questioning the “narrative” on topics such as Covid-19 treatments or public health mitigations were judged “misinformation”. Actual experts weighing in on subjects that were not clearly “true” or “false” were censored by social media.

A group of medical school professors runs a YouTube channel covering a wide variety of well-researched medical topics; their intended audience is med students and doctors. When they posted videos discussing legitimate issues around Covid-19 and its treatments, YouTube deleted their videos!

The result is that social media companies appear to have become the arbiters, and often incorrect judges, of truth.

Disclosure: when Gab was first created, I gave it a try. But I quickly found that those who interacted with my tame posts came across as ideologically driven extremists lacking critical thought. They were kind of scary. After a few weeks, I left the platform and deleted my account.
