Positive, Negative, or Somewhere In Between
What’s your immediate reaction? Those having a positive reaction may appreciate the value in using technology to help students succeed. That might be particularly important to academic librarians at community colleges or universities with large segments of low-income and first-generation students where retention rates are low. To help more of those students persist to graduation, we might be willing to use any technology resources at our disposal—especially as the latest data indicates that the number of students who fail to persist after the first year is on the rise. Those who had a negative reaction or were feeling some level of discomfort no doubt worry about how such technologies invade and compromise personal privacy. There is also likely some middle ground where we can see the possibility for both good and bad outcomes. We want to give our students every advantage, but at what cost—and are we willing to pay the price?
Already There?
However you reacted, the reality is that high-tech performance monitoring systems will become commonplace in the not too distant future. From the cradle to the grave, parents, educators, and employers will leverage monitoring and assessment technology to help us perform at our best, guide us to make the right decisions, and protect us from the unseen and unknown dangers that could derail us from the tracks of security and success. The growing popularity of wearable fitness devices points to a willingness to subject ourselves to performance monitoring if we believe it will produce a desired outcome. When the data from a day of activity is uploaded to the cloud, do we know exactly where it’s going, who can access it, or the degree to which its confidentiality is protected? Surveillance technology is also being applied in the workplace, mostly in the service and retail industries, to monitor and improve employee performance as well as to detect theft or unethical behavior. A study of 392 restaurants that installed these systems found that losses from theft declined moderately, but, more significantly, sales per server increased dramatically. Knowing they were being monitored, servers worked more diligently to sell extra appetizers, drinks, and desserts. Owners received more profits, servers received bigger tips, and customers received better service. Who else was a winner? The system’s owners, who also collected vast amounts of data to use in their restaurant consulting services. As the products and services we use all become more “intelligent,” data will be collected about and from us in ways we can hardly imagine today. By 2025, every automobile manufacturer will produce “connected cars” that collect data about our driving habits, destinations, and system performance. Would you be opposed to getting a text message from your car pestering you to stop procrastinating in getting the oil changed? Helpful, maybe, but at what cost to your privacy?
Asking the Right Questions
Why, now, do humans need to be monitored to help them become better students or workers? What’s wrong with allowing students to experience college without a safety net there to save them whenever they lose their balance? John Warner smartly tackles these questions in his essay “The Costs of Big Data.” Reacting to a piece by Anya Kamenetz about Course Signals, Purdue’s analytics-driven early warning system for tracking student performance, Warner asks what it is we really want for our students when it comes to success. Just graduating? He writes, “What if we worry that their adult lives will not come with Course Signal warnings? And mostly, what if we worry that this institutional focus on capturing and employing data distracts us from what is most meaningful about the college experience…maybe tells students that they are a data point. Or maybe Course Signals becomes a crutch, substituting tips and tricks for in-depth human interaction, the kind we know alters lives.” We need critics to question the value and necessity of performance tracking and analytics systems, not only because of data mishandling and privacy intrusions but because of the unknown consequences they may have for our students. What if these systems help students survive college but not the world of work they will face afterward, where no one is helping them avoid failure—or will that be a world in which constant monitoring, data collection, and analysis is simply a way of life?
Calming the Fears
Knowing that there are ways in which collecting and using student data could prove beneficial, perhaps we need to refrain from immediately writing off monitoring and preemptive warning systems as dangerous technologies. Indeed, how student data is and will be used is a growing area of debate at all levels of the American education system. Repeated large-scale data mishandling and privacy intrusion incidents should rightfully have us questioning why feeding student data into these systems is a practice worth even considering. In his article “Reframing the Data Debate,” Steve Rappaport acknowledges this when he states, “Fears about misuses of student data feed into larger narratives about dangers to privacy and the security of data fueled by revelations about the NSA, Target, etc., and their fervor makes it impossible to dismiss them as ill-informed rants.” He goes on to remind us that progress in education at all levels has always depended on the collection of student data. Though he represents the interests of educational technology firms that produce the learning products consumed by students, Rappaport writes that those firms must clean up their acts and demonstrate that they can calm the fears by making sure student data is secure and that privacy rights are respected. That sounds good, but can we trust the EdTech industry to do the right thing?
Setting Limits and Sensible Choices
Perhaps digital tools for tracking, monitoring, and performance assessment, all intended to facilitate predictive analytics, are neither good nor bad. They are tools at our disposal that could help us accomplish something worthwhile but could also have unintended consequences leading to harmful results. It’s up to us to determine the level at which we implement and apply the tools and to understand fully the context for their use. While I think it’s interesting and builds on a growing trend toward personalized services in academic librarianship, I’m personally uncertain about a “Library That Learns You” service. Some students would no doubt find it valuable, and it could shift the odds of success in their favor, but it hardly seems like our preferred mode of interaction. Just because you could put a robot at your reference desk, would you do it? It may sound awful now, but in 25 to 30 years when it’s technologically possible, perhaps it will be just one more user expectation, not unlike expecting to find a café in today’s library.
Learning From the Past
I don’t have the answers. What I do believe is that, over the next 20 or 30 years, our profession will be greatly challenged by this whole environment of student data collection and use. Some of the pressure to participate in these systems will come from our own academic administrations as they seek to improve student performance, lower student debt, and achieve the metrics required by emerging government standards. Taylorism was once a respected method for improving the workplace and its outcomes. Looking back, we now know that imposing scientific management achieved great efficiencies but did so at the cost of destroying worker morale. We will need to be careful not to repeat the mistakes of the past: deploying technology with the good intention of helping our students achieve short-term results when it is not clear how, in the long run, it will truly affect them.
Christina
I would like to have a student profile pop up while doing online reference. Having their course schedule can help with learning what the student may need, and if there is a specific professor, we can recommend materials that can help them with their assignments because we know the faculty member and what they want from students. Kind of like when a junior tells a sophomore, "Oh, in Dr. So-and-so's class, make sure you include acd in your paper, he likes that"; it helps them get a better grade. It would be neat to be able to link to syllabi and other stuff to help the student out. We can still give that human touch without delving into way too personal info. It could also be voluntary: when a student signs into the library system, they can check or uncheck a box that would deliver the personal info, so if they wish to be anonymous, then so be it. Sometimes it's a help, and sometimes a patron just wants what they want and wishes to move on quickly, and that's understandable. Reminds me of Star Trek and The Time Machine. Both have adaptable computers. The hitch is that the human touch is really only present in The Time Machine, whereas in Star Trek the computer is static and the characters get frustrated sometimes and have to find other ways to accommodate the limitations of The Computer. Won't bring up Star Wars and that old snobby librarian in the Archives. Bring on the adaptive library. I like it.
Posted : Sep 18, 2014 12:32
Lisa Hinchliffe
A robot at the reference desk isn't a 25-30 year out possibility. Stella (a bot) has been online since 2004, I believe ... though you have to talk to her in German. I used Google Translate to change "when is the library open" into German and was helpfully provided with this: http://www.sub.uni-hamburg.de/service/wann-wo/oeffnungszeiten.html?aufruf I understand there are other library bots around the world...
Posted : Jul 30, 2014 01:54
Rudy
Not one pixel in this article addressed the impact of such a library surveillance suite on the Library Bill of Rights, or the potential impacts on intellectual freedom. I just can't take any conversation that avoids the topic seriously. We protect intellectual freedom. The argument against library surveillance (again, your phrasing, Steven) is that the use of such tools endangers intellectual freedom and abrogates our responsibilities as libraries and librarians. This is not an issue of personal wants and wishes and possibilities; it's an actual professional obligation that is being dismissed. If that issue can be resolved, then a conversation can happen. Until then, it's an irresponsible professional conversation that fails to take up the essential issues.
Posted : Jul 30, 2014 12:51
Barbara Fister
I'm okay with a librarian who knows me. I don't want a library to do so, because that means data is collected by third parties, and algorithms rather than human beings are making recommendations; I think humans are much better at it. I am also concerned about some of the underlying assumptions and goals of proponents of big data analysis. I really liked the way Kelly Jensen at Book Riot contrasted algorithms with people, and reaching out instead of reaching in, in her post "Libraries are Not a 'Netflix for Books.'" http://bookriot.com/2014/07/15/libraries-netflix-books/ Thanks for being thoughtful about the issues, Steven.
Posted : Jul 30, 2014 12:47
steven bell
Wouldn't we like to find that killer app for academic libraries? The library that learns you is a personalization concept. It's not something we are currently working on, and I would say that the technologies needed to make it possible are emerging but are not yet where they need to be. So something like this could be a few years off. There are things we can do now to develop more personalized services, but I don't think we can put them in the form of an app just yet. You, Kate, seem like you would be fine with some of the data issues and concerns I discuss in the column. Your idea about an app makes this type of personalized service - one that depends on gathering data about students - an opt-in service. Students can choose to utilize it if they think it will help them - or pass on it if they have concerns about how their data might be used. Thanks for your comment, and perhaps we'll see something like the app you describe in the not too distant future.
Posted : Jul 25, 2014 04:57
Kate
Yes! The Library That Learns You is very close to "My Ideal Library App" that I've been hoping for: highly personalized, productive, and social. (see: http://ganski.wordpress.com/2012/11/29/my-ideal-library-app/) Is Temple really working on this? Kate
Posted : Jul 25, 2014 02:10