Librarians tend to view information literacy in light of the ACRL Information Literacy Competency Standards. Information literacy is a set of competencies, a set of things we should be able to do. If you’re information literate, you should be able to, among other things:
- Determine the extent of information needed
- Access the needed information effectively and efficiently
- Evaluate information and its sources critically
- Incorporate selected information into one’s knowledge base
These all sound like good, sensible outcomes for a sounder higher education. However, one of the many problems with becoming information literate in any robust sense is that it’s completely unnatural. The entire enterprise goes against the way the human mind tends to gather and use information. Human beings are animals perhaps capable of information literacy but apparently designed to work in other ways.

You probably didn’t balk at that last sentence at all. Human beings are designed to work in other ways. You might think human beings are “designed.” But did you come to that belief through a neutral and critical evaluation of the available evidence or through some other route? If you approached and evaluated the evidence for such design critically, you might be much less sure of that belief and conclude, along with the scientific consensus, that humans are the product of evolution and not design. However, regardless of whether it is “designed” or not, “the human mind is highly prone to detecting agency,” according to an article in the recent book The Philosophy of Pseudoscience, “and it often does so even in the absence of agents.” We might think a person is nearby even if it’s only the wind. From an evolutionary perspective, in our history “it is far less costly to assume that there is an agent when there is none than the other way around.” After all, if it’s the wind and we’re temporarily wary that a person is behind us, that’s better than if we ignore the sound and get attacked. We think we’re designed because we want to attribute everything to a designing agent. It’s just natural.

Even if you don’t think we’re designed, you’re probably comfortable with that kind of language, and language shapes our thought. In this case, the language is in line with what Robert McCauley calls “maturational naturalness,” the sort of thinking that comes naturally just because we grow up. We’re comfortable with attributing agency even where it doesn’t exist, because that’s just the way we work. Some skills, like information literacy, can be the result of practiced naturalness, if in fact people practice them. The literature on pseudoscience is rife with examples of flawed reasoning, but the flawed reasoning is a result of our natural thought patterns. Scientific reasoning and critical thinking, the motors behind information literacy and the academic enterprise, are learned and rare, which is why most of us reason poorly much of the time and all of us reason poorly some of the time.

Aristotle wrote that humans by nature desire to know, but that doesn’t seem to be true. Psychologists studying how the mind functions might say instead that humans by nature desire to interpret the world in a way that makes sense to them and makes them feel good about themselves, regardless of the facts.
It’s called motivated thinking, and one review of the literature claims that “individuals' preferences for certain outcomes are believed to often shape their thinking so as to all but guarantee that they find a way to believe, decide, and justify whatever they like.” Consider that in relation to the task of teaching information literacy.

Such thinking is especially prevalent around issues of great importance to us, such as politics or religion. In every area of belief, we want to be right, or at least considered right, but when it comes to beliefs core to our definitions of ourselves, we are highly resistant to alternative beliefs. People are more likely to evaluate positively information that confirms their beliefs and spend more time criticizing information that challenges them. Not only that, but some studies show that when confronted with strong evidence that their beliefs are mistaken, people tend to hold those beliefs even more strongly. It doesn’t matter how rigorous or scientific the information is. What matters most is their previous beliefs and how the new information makes them feel about themselves.

There are various names for these flaws in critical thinking. A study on “motivated skepticism in the evaluation of political beliefs” focused on the following: motivated skepticism, confirmation bias, disconfirmation bias, prior attitude effect, selective exposure, attitude polarization, and cognitive dissonance. We naturally do everything we can to avoid changing our minds and everything we can to make ourselves look better.

However, these behaviors don’t just apply to politics or religion. A recent study from Finland on “core knowledge confusions among university students” found that even university-educated students had trouble “in fully differentiating the core ontology of physical, biological, and mental phenomena.” For example, “children may construe almost anything as animate,” as if the moon were a living being because it “moves” across the sky. Adults aren’t necessarily that much better, even educated ones. Students were given 30 statements such as “plants know the seasons” or “furniture wants a home.” “Half of the participants considered at least four, and one quarter of the participants considered eight to 30 statements to be literally true.” That’s literally literally, not figuratively literally, as in such common statements as “that movie literally scared me to death.”

In addition, consider Daniel Kahneman’s and others’ work on fast thinking and slow thinking, showing that quick intuitive thinking comes very naturally to us and is often inferior when considering anything that requires more complex thought. However, the slower, more complex thought is difficult and indeed physically draining. The natural working of the human mind explains why so many people believe in astrology, crystal healing, or homeopathy, despite the lack of evidence that they work.
Even trained scientists and academics aren’t immune to these problems and are often guilty of confirmation bias or the use of selective evidence. Information literacy in a strong sense is deeply unnatural, and yet we task ourselves with teaching it. Sometimes we might feel bad for not accomplishing more, but given the workings of the human mind, when it comes to teaching information literacy, it’s amazing if we accomplish anything at all.
Jordan Hunt
Though I have some questions. Is information literacy merely unnatural, or truly impossible as defined by the ACRL? If that definition is impossible, is there another approach or conception that is preferable?
Posted: Apr 05, 2014 08:57
Jordan Hunt
Great article. I often feel frustrated when librarians and library advocates talk about information literacy as if it's a science. It's not as if we'll finally settle the nation's political divisions through information literacy, as some writers of articles appearing in Library Journal and elsewhere seem to imply. Evaluating sources critically is more subjective than most people care to admit.
Posted: Apr 05, 2014 08:25
Nathan
Wayne, Hey thanks for the engagement. I'll just stop now before I get to be too obnoxious... : ) +Nathan
Posted: Feb 11, 2014 07:14
Nathan
Wayne, I understand you think I'm reaching. To my knowledge the points I'm making have been made by non-theologians as well - I actually recall hearing about these points - or points like them - being made in a philosophy journal by a non-theist. If we are not designed to know Truth, why should we assume that we can know it? We were "designed" to survive, and here being deceived by our senses (not only as regards our views of ourselves) may be just as useful as being able to get a totally accurate map. It seems that what works is really what matters. +Nathan
Posted: Feb 11, 2014 03:15
Nathan
Wayne, As always, I appreciate your thoughtfulness. What you say here makes some sense - to a point, I think. It seems we can readily agree with most any other human being that knowing some basic facts or “truths” (little t) on the ground might have some obvious, immediate survival value, for instance, when we both immediately respond to the sight of the hungry tiger and run away. But here is the real question, I think: why would our evolved (and evolving) reason and sense “equipment” be useful for anything more complicated and abstract than this – and if it seems to be, why should we trust it? Why, for instance, would paleontologists who postulate that a bone with fresh dino blood and vessels is 65 million years old based on their understanding of radiometric dating methods, the geological column, taxonomy, and sequences of “index” fossils be more readily favored and selected by the evolutionary process over the practical geologists who learn to efficiently mine and refine iron, making weapons of war? (let’s assume these geologists aren’t barbarians and also have great social skills - perhaps because of a little bit of that tendency you mention to believe false things about themselves, giving them more confidence - which no doubt, are as valuable if not more valuable than tool making). Is there any way to definitively prove that we, in our scientific explanations, are capable of producing complicated theories and models that even begin to be accurate representations of reality – precisely since many of these explanations do not have any obvious, immediate survival value? It seems to me that evolutionary thinking can't help but undercut the value of the concept of truth - and the branch that itself sits on. 
Ironically, it would seem to only be a theistic view of the creation (which includes God’s endowing us with reliable powers of sense and reason, or our “epistemic equipment”) that would give us reason for having confidence in our theories or models as “maps” that help us get closer to the Truth (i.e. the big picture) “out there”. Or not? +Nathan
Posted: Feb 11, 2014 02:19
Nathan
Wayne, "Aristotle wrote that humans by nature desire to know, but that doesn’t seem to be true. Psychologists studying how the mind functions might say instead that humans by nature desire to interpret the world in a way that makes sense to them and makes them feel good about themselves, regardless of the facts. It’s called motivated thinking, and one review of the literature claims that “individuals’ preferences for certain outcomes are believed to often shape their thinking so as to all but guarantee that they find a way to believe, decide, and justify whatever they like.”" A very interesting post here. Jumping off that paragraph above, I am wondering how that might fit in with the current scientific consensus re: evolution that you mention. Do you know if there has been much theorizing about why persons seem wired to believe, decide, and justify whatever they like when it seems like this should have been selected against? Doesn't it make sense to think that knowing about reality might be more amenable to survival in our environments? Off the topic a bit, I know, but it's hard not to leap onto these bunny trails when your article synthesizes so many big ideas. +Nathan
Posted: Feb 11, 2014 01:01
Rebecca
Thank you for being truthful about this IL nonsense. I read Stanley Wilder's 2005 article "Information Literacy Makes All the Wrong Assumptions" this week, and I cannot believe librarians are spinning their wheels with one-shot instruction and even throwing in "assessment" of these so-called instruction sessions. Even the term "information literacy" is offensive. As an academic librarian for seven years, I am ashamed that the profession is so desperately hanging on to information literacy as a form of survival, when in fact the entire premise of information literacy is destroying our roles in academia.
Posted: Nov 21, 2013 08:48
Elizabeth
As an instruction librarian, I'm oddly comforted by all this. I've had fairly good luck getting my students to understand and apply what I'm teaching them, but it is ABSOLUTELY an unnatural skill, and doesn't come easily to most people, especially 18-year-old college freshmen who have very little incentive to care deeply about web evaluation or the origin of scholarly sources.
Posted: Nov 05, 2013 10:51
Wayne BT
For the curious, some of the sources referenced in the column:
- Molden, Daniel C., and E. Tory Higgins. “Motivated Thinking.” In The Cambridge Handbook of Thinking and Reasoning, edited by Keith J. Holyoak and Robert G. Morrison, 295–317. New York: Cambridge University Press, 2005.
- Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
- Lindeman, Marjaana, et al. “Core Knowledge Confusions Among University Students.” Science & Education 20, no. 5–6 (May 2011): 439–451.
- McCauley, Robert N. Why Religion Is Natural and Science Is Not. New York: Oxford University Press, 2011.
- Pigliucci, Massimo, and Maarten Boudry, eds. Philosophy of Pseudoscience: Reconsidering the Demarcation Problem. Chicago: The University of Chicago Press, 2013.
- Taber, Charles S., and Milton Lodge. “Motivated Skepticism in the Evaluation of Political Beliefs.” American Journal of Political Science 50, no. 3 (July 2006): 755.
Posted: Oct 31, 2013 10:00