With (Online) Power Comes Responsibility

by Michelle Solomon

An observation—and major complication—that I have not heard elsewhere involves the multijurisdictional contexts of media causes and effects: the media problems most often cited by critics and experts are international, yet the potential responses are regional or national. This disconnect has created a compelling site of struggle that may seem insurmountable, but isn’t. It has resulted from legacy attitudes, contexts and institutions that will hopefully change as laws and agreements catch up to behaviours.

Google and Facebook—often cited for fake news distribution—are U.S. companies subject to U.S. laws. Yet their scopes are international, a factor that exports the effects of U.S. law worldwide. As a case in point, Facebook’s membership policy states, “You will not use Facebook if you are under 13.” This restriction is often assumed to be a move to protect pre-teens from invasions of privacy or intrusive marketing, but Facebook’s policy is actually a business decision: U.S. law forbids collecting data for marketing purposes on people under 13, so it’s a waste of Facebook’s resources to have them on the membership list. The total U.S. Facebook audience amounts to 214 million users out of more than 1.8 billion monthly active global users (Google). Doing the math, that means roughly 88% of users have their Facebook membership age and other behaviours dictated by a foreign government and company. If any of those billion and a half people have a problem, they have to take Facebook to court in California. Many predicaments are, at root, jurisdictional and contextual.

A woman in Belleville, Ontario wrote a fake news story for Planet Free Will, a Pennsylvania-based website, ‘alleging that aides to Democratic presidential candidate Hillary Clinton were involved in a child prostitution ring hidden in mysterious tunnels beneath’ Comet, a Washington D.C. pizza parlour. (Belleville Woman Helped Cook Up Pizzagate)

[Curiously, there are multiple reports and commentaries, mainstream and otherwise, about the article, but the article itself is no longer posted at Planet Free Will, nor, it seems, anywhere else.]

As a result of the Planet Free Will article and others proclaiming similar misinformation, many restaurants on the same D.C. street—completely unconnected to the pizza parlour—received threats. Then a man entered the pizza parlour carrying a Colt AR-15 assault rifle, a .38-calibre Colt revolver and a folding knife. He fired his rifle while searching the premises for the non-existent child prisoners. He was arrested before anyone was shot. ‘The merchants approached Facebook and Twitter and asked that disparaging, fictitious comments about them be removed. The shopkeepers said the replies they got advised them to block individual users who were harassing them.’ But blocking would have no effect on the posts’ influence on current or future customers. If thine eye doth offend… Asked how she felt about the life-changing experiences of the arrested man and the innocent restaurant owners, the woman said, “I really have no regrets and it’s honestly really grown our audience.”

I suspect that you know of additional events where irresponsible online misinformation has caused serious problems yet has gone unsanctioned because no mechanisms exist to facilitate effective responses. The Macedonian teenagers who blogged fake news during the U.S. presidential election, for example.

The stage, actors, actions and repercussions are international, but the legal jurisdictions are national, creating a serious disconnect. Can the pizza parlour owner or Clinton aides sue the Belleville woman for libel? Can the alleged vigilante seek redress? Can the D.C. authorities extradite her for mischief or inciting violence? Can the restaurateurs sue a website, Facebook or Twitter for distributing information that caused such damage?

With power comes responsibility. What kinds of catch-up might help users be mindful or afford victims recourse? Media literacy education can play a major role by helping people understand and appreciate the powers and responsibilities that users have acquired through networked participation. Case studies based on the D.C. pizza store events and others can help students understand problems and imagine solutions. Media companies like Facebook can use human rather than algorithmic filters to identify misinformation and fake news more effectively, increasing the chances of warning and protecting their users. International laws and protocols need to be established to give internet citizens the same kinds of responsibilities and recourse that national citizens enjoy. If a media organization can be legally charged, tried and punished for publishing toxic misinformation within national boundaries, similar mechanisms are needed internationally. People and businesses that are subject to the rules and vagaries of private companies that will not act to protect them must be provided legal recourse.

The United Nations might be one organization that could facilitate useful structures, but they could also be built into international trade agreements. The European Union, by virtue of being an international organization with an established history of international legal interfaces and cooperation, might be a good model to trial the interactions and protocols.

However change occurs, it needs to start before more harm is done and it needs to be a multinational legal and educational effort. As a start—international legal cooperation being historically awkward and slow—media literacy education can be supported and distributed widely.
