Friday, January 6, 2017

Teaching Research and Writing in the Age of "Fake News"

It's almost hard for me to imagine now, but growing up, I never watched the news. It was never on in my house. I remember taking "current event" quizzes in my history class and feeling completely disconnected from the world. The only exposure I had to those topics came in the few minutes we spent before each class naming whatever was going on at the moment.

This is still the state of things in the house where I grew up. My mom doesn't watch, read, or listen to the news. She gets her knowledge of current events from snippets of conversation with the people she works with (and me and my little brother, who frequently go on political rants). When I ask her why, she says that the news stresses her out, feels overwhelming, and doesn't really add value to her life.

I've been hanging out in some "minimalist" groups online, and--while definitional battles abound--one of the core principles is to remove things that don't add value to your life and to simplify what remains. Perhaps, then, it shouldn't have caught me so off guard today when someone in one of these groups asked if people had cut news media out of their lives as part of their minimalist principles. Many, many people chimed in to say they had done exactly that.

It kind of startled me.

Several commenters immediately connected their rejection of news to the "fake news" epidemic. Unfortunately, I have now seen that label applied haphazardly to any news that the assessor doesn't like, not just to content that is in fact completely fabricated by its author for clicks and revenue. This label has given people like Rush Limbaugh a new tool in a right-wing attempt to dismantle and disrupt news media.

I got into a heated Facebook debate with an uncle who chastised my little brother for sharing "fake news" when he posted about a recent ruling that gave police broad latitude to shoot someone's pet. Admittedly, my brother had shared this information from a click-baity site with a headline designed to be inflammatory and an obvious agenda against expanding police authoritarianism. However, the presence of bias and even a lapse in journalistic standards does not make something "fake." When I linked my uncle to more legitimate news sources reporting on the case and eventually to the actual court ruling itself in an attempt to demonstrate it was not "fake," he responded by saying that it just didn't feel real to him. That was the end of the discussion. "Feelings" replaced facts, journalistic integrity, and multiple sources (including a primary document).

I'm a community college English instructor, and I taught research writing for the first time in a long time last semester. Many of my students (online students, a number of them enrolled at other colleges and universities, who had theoretically been educated in media literacy) reported having no idea how to determine whether a source was valid. Seeing how uncomfortable my students are with critically analyzing media and applying media literacy principles to their own thinking and writing made me realize that these issues go far beyond the Facebook rants of my elder relatives. The landscape of information is universally overwhelming, and people like me (as a professor of rhetoric) are responsible for helping make it less so.

I think that there are a lot of factors in the mix right now that could make turning away from news media attractive: the political climate is toxic; we're deeply polarized, and human nature makes us want to consume news that supports our own views; the sheer number of sources available at our fingertips is overwhelming; and the pay structure for journalism has become so dysfunctional that even some otherwise ethical institutions have turned to rather questionable practices.

This turn away from news is coupled with a turn away from experts and expertise. We're reacting to information overload by trying to simplify and condense what there is to process. I think that's an understandable reaction, but we're doing it (as we do all things) within the confines of our own social conditioning and prejudices. The result is that we pick and choose what makes us feel comfortable and filter out the rest, leaving us with a very narrow worldview and the option to reject anything at any time if it happens to conflict with whatever general framework we've aligned ourselves with at the moment.

In other words, our desire for stability and simplicity as individuals has led to chaos and confusion as a collective.

National Geographic has an interesting post about this anti-science phenomenon that asks why it seems that "doubters have declared war on the consensus of experts." 

Author Joel Achenbach makes a lot of smart observations, including this one about the risk to our identity that we take when aligning ourselves with particular beliefs:
Americans fall into two basic camps, Kahan says. Those with a more “egalitarian” and “communitarian” mind-set are generally suspicious of industry and apt to think it’s up to something dangerous that calls for government regulation; they’re likely to see the risks of climate change. In contrast, people with a “hierarchical” and “individualistic” mind-set respect leaders of industry and don’t like government interfering in their affairs; they’re apt to reject warnings about climate change, because they know what accepting them could lead to—some kind of tax or regulation to limit emissions. 
In the U.S., climate change somehow has become a litmus test that identifies you as belonging to one or the other of these two antagonistic tribes. When we argue about it, Kahan says, we’re actually arguing about who we are, what our crowd is. We’re thinking, People like us believe this. People like that do not believe this. For a hierarchical individualist, Kahan says, it’s not irrational to reject established climate science: Accepting it wouldn’t change the world, but it might get him thrown out of his tribe.
He also notes some of the more obvious reasons that people are finding it easier and easier to establish their own parameters around "truth," namely that the internet has removed the gatekeeping institutions that used to determine which information was widely disseminated.

I agree with his claims and conclusions, but I don't think they're quite complete. He briefly mentions distrust of corporations when he discusses people's reluctance to accept GMOs as safe, but I think that corporate influence deserves a much closer look. We've been socially trained to "follow the money" and figure out how economic interests might influence someone's perspective. When we do that in a world where corporations are funding the research about their own fields, our skepticism is more than understandable--it's responsible.

Furthermore, I know several people with left-leaning political ideals who have a dogmatic adherence to the scientific method in a way that operates as a blind spot. Yes, the scientific method is a wonderful tool to help us ascertain truths that are free from human bias, but the reporting of those findings and even the decision to research them in the first place are just as tangled in bias as everything else. We have awfully short memories if we want to pretend that science has not been used as a tool to further political agendas. Just consider how phrenology was used to justify racism or how homosexuality was classified as a mental disorder in the DSM. And such misuses are far from old and dead.

Also, many of the "anti-science" people I know are not actually anti-science when new scientific findings fit their existing beliefs. I've met several people who reject any scientific evidence that vaccines do not cause autism or that climate change is man-made but who are quick to adopt new scientific evidence about the early development of a fetus as part of anti-abortion rhetoric. This suggests to me that defending science for science's sake isn't going to be the answer to this riddle.

There's a part of me (the part that has researched rhetoric across centuries) that feels a kind of optimism about everything. That part thinks that we're in the midst of a rather predictable upheaval surrounding notions of truth in the face of massive technological advancements and changes to human communication. That part of me has read about the pendulum swings that have followed every technological revolution in communication (the advent of writing, the printing press, open-access education). Things will settle. They always do.

But that's kind of the coward's answer, isn't it? After all, things don't just "settle" on their own. They settle with the work of the people of those times (the thinkers, the writers, the scientists, the artists, the philosophers) to rein the pendulum back in. People work from either side to pull it back until the swing is manageable and less disruptive, until the next advancement throws it back into wild motion and it's someone else's turn to do that work.

And it's my turn now, right? I mean, it is literally my job to teach people how to tell which sources they can trust and how to use them in conjunction with their own perspectives to make ethical, informed decisions and to share information responsibly.

It's hard, though. I suspect that it has always been hard, but I think that those of us who are trying to teach research and writing skills in today's climate have a particularly and uniquely challenging task.

In a very interesting and important post, Danah Boyd at NYU asks whether media literacy has backfired:

Understanding what sources to trust is a basic tenet of media literacy education. When educators encourage students to focus on sourcing quality information, they encourage them to critically ask who is publishing the content. Is the venue a respected outlet? What biases might the author have? The underlying assumption in all of this is that there’s universal agreement that major news outlets like the New York Times, scientific journal publications, and experts with advanced degrees are all highly trustworthy. 
Think about how this might play out in communities where the “liberal media” is viewed with disdain as an untrustworthy source of information…or in those where science is seen as contradicting the knowledge of religious people…or where degrees are viewed as a weapon of the elite to justify oppression of working people. Needless to say, not everyone agrees on what makes a trusted source. 
Students are also encouraged to reflect on economic and political incentives that might bias reporting. Follow the money, they are told. Now watch what happens when they are given a list of names of major power players in the East Coast news media whose names are all clearly Jewish. Welcome to an opening for anti-Semitic ideology.
What her post makes clear to me (and I hope you will read the whole thing and the supplemental links at the bottom) is that media literacy cannot be introduced as a concept in a class with basic principles and then built upon slowly over time. We do not have the luxury of a stepping-stone approach to teaching media literacy because teaching only the basic principles is a liability. Students have too many influences on their attention and ideas for educators to pretend that what they learn from us will stay pristine and safe until we meet again next class period, next semester, next year. If we want media literacy to work as an educational tool, we have to go ahead and pull back the curtain and show its messy, chaotic reality from the very beginning. We need to admit that it's not easy. The rules aren't always clear-cut, and we need to create assignments that allow for that mess.

I was trying to identify the core ideas I want my students to take away from my class, ideas they can apply in a variety of frameworks and ideologies to make their own thoughts (and hopefully the world around them) clearer, better informed, and more solidly supported. Here's what I came up with:


  • Biased Does Not Mean Untrue- This might seem like a bad precedent to set. A lot of the composition textbooks I've seen teach identifying bias as a way to figure out whether a source is credible. Often, it is simplified (intentionally or not) as "biased=unusable." I understand how this could be a good starting point for a discussion about credibility, but the reality is that every single source of information available is "biased" in one way or another. Even a completely fact-based report reflects an editorial choice: it was chosen as a headline over some other fact-based incident. If we tell our students that they can't use a biased source while they're simultaneously hearing that all of the sources around them are biased (from our class and elsewhere), they're left with very few options. Instead, we need to teach that bias must be identified and accounted for, but it doesn't make a source unusable. Information found in an obviously biased source should be verified more carefully. It should be included in a paper with some sense of balance and a clear identification of its bias, but it can be read, it can be used, and most importantly, it can be true.


  • Stop Hating on Wikipedia- I don't let my students use Wikipedia as a source in their papers, but I do tell them to use Wikipedia as a working bibliography that links to other sources. I also tell them that it's fine to use as a starting place to generate ideas or to find out some basic information about a topic before determining whether they want to research it further. Many of them are absolutely shocked when I say this because they've been told over and over again that Wikipedia and "real research" never overlap. But we all know that plenty of the information on Wikipedia is valid and informative. If we teach students to reject all information on Wikipedia outright and they then go see good information on Wikipedia, we're helping to create a culture where good information is rejected. Yes, Wikipedia and many other crowd-sourced sites on the web are problematic and tricky to use well, but let's just go ahead and embrace that conflict and tension from the beginning.


  • Including the "Other Side" Is More than a Cursory Paragraph- One habit that students are picking up as a direct result of media literacy instruction is to include a paragraph (sometimes two) showing the "other side." This often leads to choppy, disconnected information included in a perfunctory way without any real reflection on what it means for the author's own argument. It's a troubling habit that is reflected in legitimate news sources all the time as they attempt to appear "fair and balanced" in an increasingly politicized world. But balance isn't a given. Some things don't have an equally valid counterpoint. Teaching students to hunt for one and then drop it into their papers (whether that's what we meant to teach or not) isn't helping anything. This balance fallacy has to be addressed, and addressing it means that when students investigate the "other side," they will necessarily have to investigate the validity and weight of those viewpoints as well.

I think that these goals in my classroom all reflect one underlying principle: be straightforward and upfront about the chaos and messiness of research and writing. It's only natural to want to present things concisely for students, especially when we're required to evaluate what they produce. We want set standards and hard boundaries. When we teach at the entry level, we reason that students will get more advanced media literacy instruction elsewhere, in higher-level classes. We do these things in an attempt to make a very difficult topic more manageable for our students, but I think we do them a disservice. If we really want to provide them the tools they need to navigate a messy world, we need to admit its messiness from the very beginning. It might be harder, it might be more frustrating, but it will ultimately serve our students much better in the long run--and that serves us all.

Images: Gary Thompson, Robin Malik, Sharyn Morrow
