By Raymond Malewitz, Associate Professor, School of Writing, Literature and Film
My research on the relationship between digital technologies and politics often takes me to future dystopian worlds. Gary Shteyngart’s Super Sad True Love Story, for example, is a novel in which citizens of a social-media-saturated world are incapable of managing complex social and political problems. While I have grown accustomed to such fictional narratives, I am nonetheless taken aback by the cautionary tales they offer for our world.
Last October, representatives of Google, Facebook and Twitter testified before Congress that Russian agents had targeted millions of Americans in a coordinated digital propaganda campaign. In spite of such revelations, the companies insisted that they were not responsible for the distribution of fake news. Echoing Mark Zuckerberg’s defense of Facebook as a “tech company, not a media company” a year earlier, a Google representative insisted, “We are not a newspaper. We are a platform that shares information.”
Both comments indirectly refer to a 1996 statute that legally separated “interactive computer service[s]” from the third-party content they display. The larger history of this legislation presents us with an occasion for rethinking our own evolving relationships to media technologies and the information we believe we control.
In 1996, the internet was a very different cyberspace: its dial-up users devoted less than an hour a month to web browsing, Amazon was still a digital bookstore and the top two most visited sites on the internet were AOL and Webcrawler, a search engine that encouraged users to “Search before you surf!” through its novel full-text search options. In February of that year, Congress established a set of laws that would govern this strange new medium. Packaged in the Telecommunications Act of 1996 was the statute invoked by Facebook and Google in recent months: Section 230, codified at 47 U.S. Code § 230. The statute made sense at the time. If, Congress insisted, the internet “offer[s] a forum for a true diversity of political discourse,” then such websites were instrumental to preserving this diversity. Indeed, they reasoned, “these services offer users a great degree of control over the information that they receive” and promised “even greater control in the future as technology develops.”
Has this prediction — as bold as any in science fiction — come true? Websites can now confidently predict what stories we will like, what products we will buy and what answers we want to receive from our questions. We therefore have, in a sense, a great deal of control over the information we receive. However, this increased agency is clearly not ours alone: We now share it with technologies that not only present but also increasingly curate the content we consume. Moreover, because this curation depends upon our previous web behaviors, social media tend to reinforce rather than test our existing opinions and prejudices. This tendency can arrest our development and, as the exaggerated title of Shteyngart’s novel suggests, limit our ability to critically evaluate our world and its many challenges. To avoid its super sad conclusion, we should take steps now to change how we encounter and engage with the information technologies that surround — and increasingly create — us.