By Gabrielle E. Kirsch
Texas Christian University

 

The term “fake news” was not coined after the advent of the internet. It had long been defined as “propaganda in which the mass media had been a vehicle for propaganda that was exploited by both state and non-state actors to push messages that distort the opinions and emotions of people largely for the promotion of certain political agenda or ideology.” The term is also used by parties of opposing views to express their disbelief in a point or topic as a whole.

Besides the global pandemic, we are all caught up in a global “infodemic.” This “infodemic” includes not only widespread disinformation about theories surrounding the virus, but also election propaganda and targeted ad campaigns. This weaponized information does not all originate in the U.S.; much of it comes from foreign interference. Who are the designated gatekeepers keeping the public safe from fake news? In the United States, that would be Big Tech. In nearly the entire world, technology companies, not governments, control how the public receives news. Should the government be playing a role in protecting people from information warfare?

U.S. policymakers are correct in their stance of not allowing the government to interfere with the free flow of information across internet platforms: interfering with creative content, limiting expression, and restricting online commerce would violate our constitutional rights to freedom of speech and freedom of the press. However, this leaves censorship in the hands of technology companies instead, giving them the power to propagate their own political stances to all Americans. Freedom House warns that at least 17 countries have “approved or proposed laws that would restrict online media in the name of fighting ‘fake news’ and online manipulation,” turning governments into arbiters of acceptable speech. Saudi Arabia, for example, created policies fighting fake news that laid the groundwork for shutting down dissident speech: this is not what the U.S. is after, but we could end up in the same position.

These Big Tech companies, i.e., Facebook, Amazon, Google, Twitter, Microsoft, and YouTube, are now supported by the giant advertising revenues that once paid for local news and newspapers as well as television news media. “Americans are now spending an average of 24 hours a week online, and 40 percent of them believe the Internet plays an integral role in U.S. politics.” Because these companies are so heavily relied on as the public’s main source of news, especially political news, they have implemented policies to prevent the spread of false news. The problem is that “the rules and procedures differ within and among platforms, creating loopholes for cross-platform disinformation campaigns.” Despite calls for a regulatory framework across all platforms, there has been little government response.

After the 2016 election, a bipartisan group of U.S. senators introduced the Honest Ads Act. The act would require technology companies to disclose “advertising spending, targeting strategies, buyers, and funding. It would also require online political campaigns to adhere to stringent disclosure conditions for advertising on traditional media.” The act is geared toward greater transparency between technology companies and consumers, favoring “light patterns” over “dark patterns.” Light patterns are intuitive, transparent user designs; dark patterns, by contrast, trick users into believing content from unverified accounts or obscured audio and video. Enforcing the use of light patterns could look like an identity verification check before creating an account on any platform, or support mechanisms for independent public-interest journalists, voting information, and fact-checkers. This recommendation by the Digital Innovation and Democracy Initiative creates fact-checking work for human beings, limiting how much of the verification of political information is left to algorithms.

Still, in November 2019, several United States government agencies warned that “Russia, China, Iran, and other foreign malicious actors all will seek to interfere in the voting process or influence voter perceptions.” These predictions for the 2020 election were based on findings about targeted advertising on social media before the 2016 election. The BBC exposed a fake Russian Facebook page in October 2017 that had reached 126 million users, reporting that “Russian operatives did actually upload socially and politically divisive social media content to influence the outcome of the 2016 US Presidential election…”

These outsiders interfere using all the aforementioned ways of reaching people digitally, along with astroturf campaign methods. Astroturf campaigning uses a carefully constructed narrative to make you feel like an outlier from a mainstream agenda, covering up the group’s actual intentions. Those using this malicious method tend to be dismissive of anyone who exposes the wrongdoing. “Fake news becomes a national security issue when it undermines the foundations (e.g., social cohesion, public institutions, peace, and order) of the nation-state.”

The division in this country is hard enough to navigate on its own; foreign interference makes it even harder to determine the truth. The Council on Foreign Relations recommends creating a “global privacy framework” that localizes data within its country of origin. In the European Union, this took form in 2016 as the GDPR, or General Data Protection Regulation. The GDPR addresses the transfer of data outside of the EU as well as the privacy of individuals within the EU. This may seem like a wedge between the European Union and the United States, but it is better understood as a means of creating a unified global standard of privacy. It has also inspired other countries to create digital privacy laws, including Brazil, India, Japan, South Korea, and Thailand.

News in the modern era spreads with speed and wide reach, and it will be difficult to define fake news precisely enough to legislate against it. Legal measures designed to target fake news may also take unexpected turns. One possible outcome of deleting fake content is the “Streisand Effect,” a social phenomenon in which an attempt to hide or block information draws further attention to it, causing the information to be copied and published elsewhere. In other words, trying to suppress information only amplifies it. In China, for example, the strict censorship of information that is out of line with the government’s narrative has led some netizens to believe that it is the truth being censored, not the other way around.

Moves worldwide to combat fake news are now coming to fruition. People with money and power (who aren’t part of the elected government) know the easiest way to sway the public is through influencers and social media. Any legislation or guidelines prohibiting the spread of certain digital information could weigh on our democracy, threatening its resilience. A strategy for regulating content will involve a variety of organizations, including technology companies and government agencies. These guidelines will be evaluated and commented on by users of platforms like YouTube and Facebook; let’s hope that our digital democracy is regulated like anything else in our nation, through discussions by real people who hold real opposing views. This way, the cyberworld can remain human and ethical, with freedoms that mimic our freedoms in the physical world.

 

By Gabrielle Kirsch, a student at Texas Christian University.

Author’s Note:

Throughout 2020, with the pandemic flooding news and social media feeds with conflicting information, this has been a constant topic of conversation. I wanted to write this essay to explore how these platforms are responding to such an influx of journalists and news sources fighting for the top of the algorithm, whether through advertising or other methods, and what guidelines these technology companies must follow. It turns out there are no government regulations governing what news the public consumes (there is, however, an angry Donald Trump yelling “fake news!”, which is reason enough to investigate this).

While searching for journals on the topic, all I found were credible articles on fake news and policy, the digital trade of data, and fighting online manipulation. These sources included some useful examples of the effects of legislating the digital press in other countries, which I included in my essay. I think this is an important topic because it warns us that what we believe in may be but a narrative in a much greater story. This is a topic I would only want to explore in an academic database such as JSTOR; I’m afraid I would have too many opportunities to stumble upon biased articles and clickbait if I were to Google search “fake news” or another topic that arose in my essay.

 

References:

Cole, Jeffrey I., et al. “Surveying the Digital Future: The 15th Annual Study on the Impact of Digital Technology on Americans.” Center for the Digital Future at USC Annenberg, 2017.

Kornbluh, Karen, et al. Safeguarding Digital Democracy: Digital Innovation and Democracy Initiative Roadmap. German Marshall Fund of the United States, 2020, www.jstor.org/stable/resrep24545.

Knake, Robert K. “Toward a New Approach to Internet Governance.” Weaponizing Digital Trade: Creating a Digital Trade Zone to Promote Online Freedom and Cybersecurity, Council on Foreign Relations, 2020, pp. 6–12, www.jstor.org/stable/resrep26189.7.

Haciyakupoglu, Gulizar, et al. “Legislating Fake News: Global Case Studies.” Countering Fake News: A Survey of Recent Global Initiatives, S. Rajaratnam School of International Studies, 2018, pp. 5–13, www.jstor.org/stable/resrep17646.5.

“Policy Recommendations: Internet Freedom.” Freedom House, https://freedomhouse.org/policy-recommendations-internet-freedom. Accessed 19 Nov. 2020.

Federal Bureau of Investigation. “Joint Statement from DOJ, DOD, DHS, DNI, FBI, NSA, and CISA on Ensuring Security of 2020 Elections.” 5 Nov. 2019.

Bentzen, Naja. “Understanding Disinformation and Fake News.” European Parliament Think Tank, http://www.europarl.europa.eu/RegData/etudes/ATAG/2017/599408/EPRS_ATA(2017)599408_EN.pdf. Accessed 7 Nov. 2017.