Knight First Amendment Institute, Reimagine the Internet
Great panel today; more to come over the rest of the week, and they will post the video shortly.
Francesca Tripodi (UNC) shared her amazing research about how
conservatives use close-reading techniques to interpret information for
themselves and to reject journalistic interventions. They nonetheless use and
trust Google’s top results, believing that those results reflect reality, which
seems a bit contradictory to me. The problem is that our keywords are
ideological, so Google searches confirm one’s worldview: searching for
“illegal aliens” returns right-wing sites that confirm that worldview, while
“undocumented workers” produces very different results. And it’s not just
Google: DuckDuckGo is better for your privacy but returns the same kind of
results for the same ideological keywords. Google suggestions create the possibility of parallel
internets that are invisible to outsiders. “Data void”: limited/no content is
available, so it’s easy to coordinate around keywords to guarantee that future
searches are directed to content that includes these terms—this is what
happened to “crisis actor.” Search engines are not designed to guide us through
existential crises or challenge our beliefs—the notion of relevance is
subjective and idiosyncratic as well as unstable and exploitable. Knowing/understanding
audience concerns and amplifying key phrases allows conservative media to drive
users to searches where their beliefs will be reinforced, as when the Council
of Conservative Citizens reached Dylann Roof through his searches for “black
on white crime.” Conservative media encourage viewers to “do the research”
while highlighting phrases that lead to the preferred sources. So Google
started autocompleting “Russian collusion” with “delusion,”
a phrase promoted by Roger Stone. In impeachment proceedings, Rep. Nunes used
his opening remarks to repeat a few names/phrases and tell us that we should be
paying attention to those; searched in Google, those names and phrases linked
to Fox, Daily Caller, and sources even further to the right. He urged
constituents to do their own research. Nelly Ohr is a perfect data void/litmus
test: she used to work for Fusion GPS and figures in a conspiracy theory about
the Russia investigation. The search exists in a vacuum and was curated by
conservatives as a dog whistle about election fraud.
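[To make the keyword point concrete: here is a minimal sketch, mine rather than anything Tripodi proposed, of measuring how little the top results for two phrasings of the “same” query overlap. It uses Google’s Custom Search JSON API, so the API key and engine ID are placeholders you would supply, and Custom Search results only approximate consumer Google results.]

from urllib.parse import urlparse
import requests

API_KEY = "YOUR_API_KEY"    # placeholder; get one from Google Cloud
ENGINE_ID = "YOUR_CX_ID"    # placeholder; a Programmable Search Engine ID

def top_domains(query, n=10):
    # Fetch the top n results for a query and keep just the domains.
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": ENGINE_ID, "q": query, "num": n},
    )
    items = resp.json().get("items", [])
    return {urlparse(item["link"]).netloc for item in items}

a = top_domains("illegal aliens")
b = top_domains("undocumented workers")
overlap = len(a & b) / len(a | b) if (a | b) else 0.0
print(f"Jaccard overlap of result domains: {overlap:.2f}")
print("Only for phrasing A:", sorted(a - b))
print("Only for phrasing B:", sorted(b - a))

[The same response also carries a searchInformation.totalResults field, which could in principle flag the thin “data void” territory around a phrase.]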
What can we do? How can Google fix this? It’s important to stop
thinking about a fix and to stop focusing on Google. Misinformation is not a
bug in the code but a sociological issue. The only way to circumvent
misinformation traps is to know the kinds of questions people seek answers to,
how people interpret information, and how political actors exploit those
things. [Easy-peasy!]
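[One concrete way to watch that exploitation is to audit what Google autocompletes for a loaded seed phrase, per the “Russian collusion delusion” example above. A minimal sketch using Google’s unofficial suggest endpoint (undocumented and liable to change); this is my illustration, not something the panel proposed:]

import requests

def google_suggestions(seed):
    # Query Google's unofficial autocomplete endpoint; the
    # client=firefox variant returns plain JSON of the form
    # [seed, [suggestion1, suggestion2, ...]].
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
    )
    return resp.json()[1]

# Watch what gets appended to a politically loaded seed phrase.
for s in google_suggestions("russian collusion"):
    print(s)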
Barbara Fister, Gustavus Adolphus College: In practice, students
are treated as information consumers who need to be educated to examine claims.
At universities, they are often treated as needing help finding information in
the walled garden of the library, focusing on information that will help them
satisfy professors. Libraries have felt compelled to emulate Google and create
single-search boxes. But those boxes don’t help you navigate the results, so it’s
no wonder that students come up with workarounds. Students have trouble getting
themselves situated. They adopt a strategy and stick to it; look for “safe”
sources; often don’t really care about the topic because it’s been assigned.
They follow the news but don’t trust it; they don’t think college does much to
prepare them to ask questions of their own. They feel both indignation and
resignation about algorithmic systems invading their privacy. Students feel that they’re in a
very different place than professors; they’re used to different sources. “We
grew up with untrustworthy sources and it’s drilled into us you need to do the
research because it can’t be trusted.” Students are already being taught “media
literacy” but more of the same won’t necessarily help, because people who
believe misinformation are actually quite “media literate” in that they
understand how these systems work and are good at manipulating them. QAnon adherents understand
how media/info systems work; they interpret media messages critically; they
feel passion for discovery and enjoy the research because they feel like they’re
saving the world. Alternate authority structure: trust yourself and
trust Trump/“the Plan.”
What is to be done? Deep-seated epistemological differences: if we
can’t agree on how we know what’s true, it’s hard to see common ground. So what’s
next? Recognize the importance of learning to trust, not just to be skeptical;
get at why to trust rather than what to trust—saying “peer-reviewed research”
doesn’t help; explore underlying values of knowledge systems, institutions, and
practices such as journalism’s values; frame learning about info systems as
education for democracy: you have a role to play; you should have an ethics of
what it is that you will share. Peer-to-peer learning: students are learning
from each other how to protect privacy etc. Students are concerned about their
grandparents and about their younger siblings—interested in helping other age
groups understand information.
Ethan Zuckerman, moderator.
Fister: Further reading: Information Literacy in the Age of Algorithms, on
what students are interested in that doesn’t come up in class: knowing that
Google works by using the words we use, rather than acting as a neutral
broker, would be very important! Also Alison J. Head (January 5, 2016),
Staying Smart: How Today’s Graduates Continue to Learn Once They Complete
College, Project Information Literacy Research Institute; and Alison J. Head,
John Wihbey, P. Takis Metaxas, Margy MacMillan, and Dan Cohen (October 16,
2018), How Students Engage with News: Five Takeaways for Educators,
Journalists, and Librarians, Project Information Literacy Research Institute.
Tripodi: People would say “I don’t trust the news” and she’d ask
where they got candidate info; they’d say “Google,” without acknowledging that
Google is an aggregator of news and takes content directly from Wikipedia. We’re
not in a new time of epistemological fissures or polarization—we have always
been in a place of big differences in how we seek truth, what our sources of
knowledge are, and how we validate knowledge. What’s changed: we can connect from
further away and we have an immediate ability to determine what we think is
right. The focus on keywords is something that work on filter bubbles hasn’t
yet considered: it’s not the tech that keeps us in filter bubbles; we are the
starting point for that closure.
Zuckerman: the people who find hate speech on YouTube are the
people with hateful racial attitudes—so the polarization argument may not work
the way we thought.
Fister: the power to amplify messages and segment markets is way
more pronounced now. But the fissure was deliberate, created with the rise of
Fox News and talk radio, and it is amplified by platforms that like this
content because controversy drives attention. Far-right white supremacists
have always been good at tech: they used film early, and they used radio. The
platforms themselves are persuasion machines designed to sell stuff, and the
supremacists are earning money while using them, which has changed the
velocity and amplitude of the most hateful speech.
Tripodi: There may be ways to figure out the keywords that
resonate with people’s deep stories, to find the data voids, by doing more ethnographic
work. The narrative that conservatism is being silenced tries to reshape
objectivity as “equal balance,” rebranding “objective” to mean “both sides.”
If your search results don’t show equal weight, they’re somehow
flawed/biased/manipulated [at least if your side isn’t dominant]. That’s
leveraged in the right-wing media ecosystem to say “don’t use these platforms,
use these curated platforms that won’t ‘suppress’ you.” That complicates
notions of media literacy, which sometimes uses “look for both sides” as a
test for bias. Propaganda campaigns are also exploiting the idea of “lateral
reading” (opening a new window to search for relevant phrases around the
target of interest); these systems are being deliberately exploited. Thinking
about keyword curation may help: you could seed “Nelly Ohr” throughout
mainstream coverage of the impeachment, filling the data void. Old-fashioned
SEO manipulation in a new light.
Fister: discussion of the tautology underneath this: you trust the
sources you trust because you trust them. People create self-reinforcing webs of
trust by consulting multiple sources of the same bent. Students are also
interested in talking about how algorithms work, including algorithms for
sentencing people to prison; that can be tied to traditional values and
understandings of how we make knowledge.
Tripodi, in response to a comment on a similar dynamic in doctor/patient
relations: when people search “autism treatment,” they are more likely to see
non-evidence-based treatments, because doctors with evidence-based treatments
are not using YouTube. She has a student who is trying to create a lexicon for
doctors to use in telling people “research these treatments”: you can’t tell
patients not to search, but you can give them phrases that will return
good-quality content. It’s also important to make good-quality content about
evidence-based treatments. People are looking on YouTube; evidence-based
sources have to be there.
Zuckerman: that requires auditing the platform; YouTube is not that
hard to audit, but Facebook is hard to audit when it is the one directing you
to content.
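[A minimal sketch of the kind of YouTube audit that implies, using the YouTube Data API v3 via google-api-python-client; the API key and query are placeholders, and quota limits, pagination, and personalization are all ignored, so this is a starting point rather than a real audit methodology:]

from collections import Counter
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # placeholder; get one from Google Cloud

def audit_search(query, n=25):
    # Pull the top search results for a query and tally which
    # channels surface: a crude proxy for who owns that keyword.
    youtube = build("youtube", "v3", developerKey=API_KEY)
    resp = youtube.search().list(
        part="snippet", q=query, type="video", maxResults=min(n, 50)
    ).execute()
    channels = Counter(
        item["snippet"]["channelTitle"] for item in resp["items"]
    )
    for channel, count in channels.most_common():
        print(f"{count:2d}  {channel}")

audit_search("autism treatment")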