why doing your own research just doesn't cut it in the era of search engines
New study shows that far from dispelling misinformation, googling can actually make it sound more plausible and realistic. Why? For money, of course.
If you’ve spent any time whatsoever on the internet, you’ve almost certainly seen or used the Ralph Wiggum “I’m a Resurcher” meme as a response in an argument. It’s all too easy to deploy because we all know the internet is now full of very official- or professional-looking sites that can support whatever stance you want, and that people suffering from a clinical case of confirmation bias will skip all the “fake news” until the screen in front of their faces says whatever they demand it to say.
“What’s the big deal? Yeah, people believe whatever they want to believe. All they need to do is run an actual search and find out the truth if they’re open to it. If not, well, they were never reachable in the first place.”
Well, about that. If I went by my knee-jerk reaction, I would agree that, sadly, some people really do not want to be reached or learn any real facts; they just want to stay in their little universe. But when I read the new study in Nature on the subject and remembered some of the worst aspects of how the web works, it became pretty apparent that not only does this view write off a lot of people, it also fails to take into account that search engines have made it harder to find accurate information.
Basically, the root of the problem is that search engines — by which I mean Google, as it handles 91% of all searches on the web and has since 2008 — are supposed to treat websites differently. Search rank depends on over 200 constantly adjusted factors designed to tell a unique, high-quality website from a spam trap. Top-tier news sites to which a lot of other sites link and which post constant updates? Right to the top of the results. Steve’s House of Conspiracies, last updated in 2003 with an Illuminati gif? Not so much. Maybe page 637 at best. At the bottom.
However, while those 200+ factors for high ranking success are as closely guarded as the Colonel’s chicken recipe and the United States’ nuclear codes, an entire industry has sprung up to game them through millions of automated attempts every day. Just like the Colonel’s chicken was either leaked online or reverse-engineered, and we now know that the nuclear launch code for all LGM-30 Minuteman ICBMs was 00000000 despite vociferous denials, many an SEO expert and marketer has managed to at least temporarily game Google rankings and get hoaxes and lies to the top of the pile.
Even worse, if you take the aforementioned Steve’s House of Conspiracies and spend about $100 on a decent web host, domain, and a clean, responsive WordPress theme created by a professional designer, then rebrand it as the Stevington Institute with an AI logo or a quick rush job from Fiverr, suddenly, Google is a lot more lenient. Now get your friends with similar strategies to link to you and post screenshots of your articles, and a pitiful joke of a conspiracy site looks very legitimate.
Google can’t understand the actual words, nor does it fact-check sites. As long as the code is easy to crawl and conforms to its standards, the site is updated regularly, and lots of people link to it for one reason or another, it’s treated the same way as sources run by journalists employing teams of fact-checkers and professional editors.
how research can turn into resurch
And this is where the team from the University of Central Florida picked up the thread, asking thousands of test subjects to read some articles, then “do their own research” and decide whether they were true. The catch? Some groups of test subjects were presented with search results featuring less than stellar sources, the kind that are usually rife with misinformation, and for those test groups, the low-quality sources made up as much as a tenth of all the search results they could study before rendering a verdict.
You probably already know where this is going, and yes, subjects who went down the rabbit holes of disinfo sites were 19% to 22% more likely to believe that a false story was true, even when they had access to results rightfully calling bullshit on it.
The study also notes another major problem. Participants may go out and research claims on their own with the best intentions, but the majority of false narratives aren’t fact-checked at all. Not only are they not going to find that no, the cartels did not get into business with the Reptoids to ship in Chinese-funded illegal pedophile cannibals to indoctrinate kids into turning trans in elementary schools for voter fraud, they’re not even going to find any mentions of this story outside the Stevington Institute.
As Jonathan Swift wrote all the way back in 1710, falsehood flies, and the truth comes limping after it. If that was the case back when news traveled at the blazing top speed of 11 kilometers per hour, today it travels at the speed of light and is often distributed by social media and search results, which is the equivalent of relying on rumors, zines, and pamphlets on the streets.
In other words, there has never been more news and it has never moved faster, but it also moves through more unreliable channels than ever, while actual fact-checkers and reputable journalists are overwhelmed. Add deeply flawed search results, a lack of follow-up on the majority of stories, malicious disinformation, and political biases that have been elevated to a sacred right to believe in one’s favorite reality, and you have a recipe for disaster.
All of this was on full display in the second of the experiments detailed in the study, in which trying to verify whether a story was true with online searches prompted 17.6% of respondents who correctly said that a story was false or misleading to change their answers and mark the stories as true, while a little under 6% corrected their answers after a thorough spelunking through the search results.
Even more disheartening, data from the fifth experiment showed that the test subjects were no more likely than anyone else to believe any particular news source over others or to fall for conspiracies, hoaxes, or lies. So it’s not that there were highly motivated and obstinate partisans looking for what they wanted to be true. The test subjects were in fact just regular people being genuinely misled by a torrent of unchecked lies.
why the “marketplace of ideas” doesn’t work
Social media and search engines don’t fact-check the information they present for three main reasons: a) they don’t want to, because it’s expensive and thankless, meaning there is no extra money in it; b) they can’t keep up with the flood of links and posts; and c) they’re told that doing so would undermine the “marketplace of ideas” in which the user is the God Emperor of veracity.
This so-called “marketplace of ideas” was sold to us as the obvious next stage of our scientific and cultural enlightenment, a safe space for demagogues where every idea gets aired so the public at large can decide on its ethical and factual merits. From a purely abstract, the-world-is-my-debate-club-where-nothing-really-matters standpoint, it seems like a perfectly fine way to build a worldview. But if you think that facts matter and not all ideas are created equal, this is just more hardcore libertarian spherical-chicken-in-a-vacuum thinking.
Simply put, there are people who just really, really don’t like being told that they’re wrong, being called out on their lies, or being made to feel inadequate and angry upon learning that they fell for a scam or a hoax. By dismissing any sort of fact-checker as a pawn of “the censorship agenda of the elites who want to tell you what to think,” liars and the partisan zealots who hang on their every word can pretend that anyone who can prove them wrong is just peeing in their cereal out of jealousy that these supposed luminaries of free thought “broke free from old paradigms.”
If Google started verifying certain sources and purging the flood of Stevington Institutes polluting popular search results, the marketplace-of-ideas brigade would demand the heads of Googlers on pikes. In fact, they’ve already done so when their favorite lies didn’t score highly enough in their queries, or when the results also featured links to fact checks and criticism.
“Yes, yes, I get it, people want soothing lies and hate being told no. But so what? It’s how people have always been, isn’t it? What’s the worst that can happen?”
Well, the worst that could happen is the complete collapse of trust and shared reality, both of which are critical to governing a free, democratic, modern country. This is why so many modern authoritarians don’t outright censor anything unless they feel it’s absolutely necessary; instead, they flood the zone with a tsunami of bullshit and propaganda so it becomes more and more difficult to figure out what’s actually true, in the hope that their citizens simply give up and silently comply, exhausted and confused.
This is how what’s now generally called modern conservatism has deteriorated into an endless stream of outrage bait, with so many Republicans spending 96% of their time infuriated to the point of clinical concern about things that never really happened, but which they saw in memes on some Facebook page most likely called some variation of Patriot Boomers 4 MAGA 1776, shared to Apartheid Boy’s House of Nazi Bluechecks… err, I mean X, from where it’s picked up by Fox, OAN, and Newsmax.
And this is the fundamental question we need to solve sooner rather than later. If it’s more and more difficult to tell truth from fiction because our main tools for doing so care much more about appeasing people than telling them the truth, and angry liars and their fan clubs can force a nearly $2 trillion business to kowtow to them, so much so that trying to do one’s own research is now an invitation to be lied to and scammed, why are we still using these tools and simply rolling over when those who benefit from keeping them broken demand they stay that way?
See: Aslett, K. et al. (2024) Online searches to evaluate misinformation can increase its perceived veracity. Nature 625, 548–556, DOI: 10.1038/s41586-023-06883-y