Source: Library Journal
By Leslie Stebbins, December 10, 2015
This all started when my teenage son reported that Adam Sandler had Ebola. He saw it trending on Facebook. I sighed inwardly and asked if he had looked at the source of the information. Being the son of a librarian, he quickly said: “Yes! CNN.com.”
Of course, Adam Sandler did not really have Ebola, and CNN wasn’t reporting that he did. The site had been hacked. But a lot of people fell for it.
The Adam Sandler Ebola scare threw me for a loop. I decided to take a deep dive into the research on how we should be thinking about evaluating and searching for information at a time when people get most of their information from Google, Facebook, and Twitter.
I devised some interesting everyday-life research questions, then set about trying to answer them: I investigated whether red wine has health advantages, whether dogs exhibit some rudimentary form of empathy, and whether user review sites for restaurants, travel, books, and other products can be trusted. (By investigated, I don’t mean I spent a few hours: I spent months getting the backstory on the information I was turning up.)
There were many surprises. After 20 years of teaching information literacy I found I didn’t know that much about finding reliable everyday-life information. My teaching focus had been on discipline-specific research skills, but many undergraduates could benefit as much from, and relate better to, learning more sophisticated search and evaluation strategies for work, for health, even for buying a car.
In research done in 2012 by Project Information Literacy on the information competencies of recent college graduates in the workplace, employers indicated that their new hires were adept at quickly finding an answer but lacked the skills needed to find the best answers to workplace problems. They lacked persistence and patience, relying on information from the first screen of search results rather than using more sophisticated strategies.
As I traversed the information landscape I was glad to be supported by ACRL’s new Framework for Information Literacy. Research can be creative, reflective, iterative, and most of all, messy! One of the messiest places I explored was the notion of expertise and how it could vary in different contexts. I also looked into why user-generated data is wonderful in some cases and unreliable in others.
SEARCH PSYCHOLOGY
As I devised strategies to answer my research questions I uncovered a new (to me) area that I call the “psychology of search,” which encompasses information heuristics and other phenomena that can greatly impact our ability to locate reliable information.
Information heuristics are shortcuts we all use when we search for information. For example, the Bandwagon Heuristic leads people to assume that if many others think something is correct, it must be correct. Other common information heuristics are the Reputation Heuristic, the Consistency Heuristic, the Persuasive Intent Heuristic, Motivated Cognition, and “Satisficing.” Research by media studies professor Miriam Metzger and her colleagues, reported in the Journal of Communication, found that these heuristics are a double-edged sword: on the one hand, they reduce the cognitive effort of information seeking; on the other, they can lead to systematic biases and errors in judgment.
Daniel Kahneman’s research on fast and slow thinking is a useful antidote to relying on shortcuts that can shortchange our ability to locate reliable information. Slow thinking includes the ability “to doubt, to hesitate, to qualify” and can help us question our tendency toward source amnesia, the common experience of remembering the information we find, and thinking of it as “true,” while forgetting where it came from. Slow thinking can also keep us from our penchant for “false certainty,” drawing instant conclusions from the first piece of information we find without even recognizing the possibility of uncertainty.
I also made new connections between evaluating information and Dominique Brossard and Dietram Scheufele’s research (reported in Science in 2013) on self-reinforcing information spirals. This spiraling takes place when the way people search for a topic influences how a search engine like Google weights and retrieves content. Brossard focuses on how science findings are communicated to the public. She questions whether we are headed into a world in which the information we view is heavily shaped by the links search engines pull up, in effect narrowing our options. As more people get their news through social media sites, this phenomenon will likely accelerate. It may be that the friend who keeps “liking” cat videos is also choosing the science news you read.
Eli Pariser’s ideas about echo chambers also bear on the information we view. An echo chamber occurs when information is amplified by repetition inside an enclosed system, such as a Facebook or Twitter feed, where different or competing views are not presented. In related work, Andrew Revkin criticizes the journalistic tendency toward “single study syndrome”: the habit of seizing on the latest scientific finding and presenting it as “the truth.” This greatly oversimplifies the way science is communicated, implying that a finding is an isolated event rather than part of a scholarly conversation that needs to be presented in context.
A MAJOR MESS
What really struck me hard, as I carried out my investigations, is what a mess we are in! As we do our Google searches we are too often hit with clever marketing drivel masquerading as information. For certain kinds of searches, such as quick factual lookups, Google is often terrific. But Google cannot stay ahead of the search engine optimizers, and Google itself manipulates search results for commercial gain. Many factors contribute to what comes up when we search, including both white hat (guideline-compliant) and black hat (deceptive) search engine optimization tactics. As a result, appearing in the top five results sometimes has little correlation with being a trustworthy source, though research by Eszter Hargittai and colleagues in the International Journal of Communication in 2010 found that many people equate link order with reliability.
I came up for air three years later with a new book of information stories that illustrate best (and worst!) practices for finding and evaluating information online. Hint: librarians play a humongous role. These practices include helping our users “start at the source” when searching for information rather than just throwing a few words into Google. In other words, helping people articulate and locate trustworthy sources of information as part of the search process. By helping people better understand their options, for example knowing that Amazon is not the best place to go for honest book reviews, we also boost the visibility of more valuable information sources, improving their odds of survival.
We also need to educate people about when the “wisdom of the crowd” effect is valid. For example, sites like Weather Underground, which crowdsource verified, objective data, can be relied on, whereas aggregated review sites like Yelp and TripAdvisor must be used with caution: they contain many fake reviews, suffer from a positive skew, and do not meet the criteria required for the “wisdom of the crowd” effect to function. We need to point people toward experts when expertise is called for, such as when investigating health treatment options, and we need to help people understand motivation and context in a disintermediated information environment.
MARKING NEW TERRITORY
But developing strategies to find reliable information is only one small step in the much larger challenge of how we decide to structure the web in the years to come. We need to have better markers of quality and reliability. We need to know when we are on a pharmaceutical company–funded site that is giving us biased health advice (such as WebMD) and when we are on an independent health information site. We need many new tools to help filter and curate the web. Crowdsourcing “likes” is not a sufficient strategy.
If we think of the web as the Wild West, where brothels and outhouses were a little too front and center and more reputable businesses took time to establish themselves, we can start trying to create some zoning and street signs online to help people find their way. Librarians have a long history of curating and filtering information, and we need to continue to transition this important work onto the web.
There is a tremendous need in our profession to communicate and develop new strategies for locating reliable information, such as the ones mentioned above. I see the road ahead as drawing on what librarians are great at: providing filters, curating information, designing markers of quality, and helping people figure out when they are in a shoe store and when they are in a library.
Leslie Stebbins is a consultant and the author of the new book Finding Reliable Information Online: Adventures of an Information Sleuth (Rowman & Littlefield, 2015). For more on the author, her book, or a free ALA Booklist webinar on the book, see LeslieStebbins.com.