1. The Changing Role of Libraries:
Ethnographic Approaches to Understanding Students
Andrew Asher, Assessment Librarian, Indiana University Bloomington
Lynda Duke, Academic Outreach Librarian, Illinois Wesleyan University
3. How do students find and use information
for their academic assignments?
[Diagram: Assignment → ? → Completed Paper]
4. What is the social context of
these assignments?
[Diagram: Students, Librarians, and Teaching Faculty]
5. The ERIAL Project
9 Data Collection Methods
719 Research Contacts (over 600 unique participants)
6. The ERIAL Project
280 Semi-structured Ethnographic Interviews
49 Librarians
75 Faculty Members
156 Students
60 Research Process Interviews
7. Research Process Interviews
Detail of students’ actual practices
Effects of information literacy problems
Revealed issues hidden by quantitative data
11. Closing the Loop:
Illinois Wesleyan University
Discovery Tool
January – December 2011: 56,027 sessions; 43,796 full-text downloads
January – December 2012: 66,065 sessions; 53,063 full-text downloads
12. Closing the Loop:
Illinois Wesleyan University
Response from teaching faculty
Reference vs. research
Service points
Research sessions
Mellon Grant
Admissions
Future research
14. How do students use these tools?
86 students participating:
41 at IWU
46 at Bucknell
Qualitative and quantitative measures of search practices
15. Methods
5 Test Groups
Summon
EDS
Google Scholar
“Conventional” Library Catalog
No tool
4 Research Tasks
Find 2 sources per task
Evaluated by instructional librarians using a 0–3 scoring rubric
Debriefing Interview
Open-ended questions on search practices and evaluation processes
18. (Almost) Every search is a Google search:
Overall, simple search was used 82% of the time.
“I basically throw whatever I want into the search box and hope it comes up. . . . But it’s like Google and I use it like Google. I don’t know how to use it any other way.”
--Junior in Nursing
20. Constructing a Search
Simple search yields either “too much information” or “not enough information”
Students iterate searches rather than refine them
“Magic” search terms
Poorer quality search terms
21. 92% of the resources utilized were
found on the first page of search results.
22. Search Evaluation
Cursory evaluation of sources
Eclectic, and sometimes inaccurate, methods of evaluation
Assumption that if information is not easily found, it must not exist
“Apparently you don’t have much on Rock and Roll”
--First Year in French
23. What a tool searches determines
what students use:
24. Usage of selected newspaper databases at Bucknell, 2009-2011.

Database                                     2009 Click-throughs   2010 Click-throughs   Increase vs. 2009   2011 Click-throughs   Increase vs. 2009
ProQuest National Newspapers Premier                 131                 1,475                1026%                  918                 601%
Ethnic NewsWatch                                      60                   562                 837%                  481                 702%
ABI/INFORM Trade & Industry                           28                   220                 686%                  107                 282%
America's Historical Newspapers, 1690-1922            15                   101                 573%                   24                  60%
LexisNexis Academic                                1,280                 6,977                 445%                5,233                 309%
Total, All Databases                              49,886                90,854                  82%               89,116                  79%
25. Search Epistemology
De facto outsourcing of evaluation to the search algorithm itself
Brand Bias, Default Bias, Trust Bias
27. An Ethnographic Project
[Cycle diagram: Research Question → Data Collection → Data Processing → Data Analysis → Evaluation/Recommendation → Assessment → new Research Question]
28. For more information:
E-mail:
andrew.d.asher@gmail.com
lduke@iwu.edu
Twitter: @aasher
Website: www.erialproject.org
Toolkit: www.erialproject.org/publications/toolkit/
Discovery Paper: http://crl.acrl.org/content/early/2012/05/07/crl-374.full.pdf+html
Book: College Libraries and Student Culture (ALA Editions, 2012)
Editor’s Notes
We are presently working in an environment where one of the predominant narratives about universities and higher education is one of crisis, arguing that our institutions are failing to deliver in their educational mission and our students are not gaining the skills they need. Whether or not we agree with this narrative—and there is much to critique in these studies—universities are being asked more often, and with greater intensity, to justify themselves by measuring their educational outcomes.
ERIAL Project Information. . . .
The ERIAL project was motivated by the question: What do students actually do when they are assigned research projects for class? We know about the assignment, and we know about the results, but what happens in between is a “black box.”
We asked members of each group about expectations that they had for one another and about experiences that they had working with each other.
But, for my purposes here I’d like to focus on just one part of this larger project: an evaluation of first-year information literacy at Illinois Wesleyan University that was designed to evaluate the effectiveness of library instruction in first-year seminars [READ SLIDE]. We were hoping to show that library instruction improved information literacy, but unfortunately it didn’t.
ERIAL was built on a unique ethnographic methodology, which employed close observation of students’ research habits. In order to construct a holistic portrait of students’ research practices, the ERIAL project used a mixed-methods approach that integrated nine qualitative research techniques designed to generate verbal, textual, and visual data. In total, the project involved more than 700 research contacts with over 600 participants, making it the largest ethnographic study of libraries to date. This presentation will focus on two of these methods: semi-structured ethnographic interviews and research process interviews.
The starting point for this research was that two liberal arts universities where I had worked, Illinois Wesleyan University and Bucknell University, had recently implemented different discovery tools. I was not involved in the decision to purchase the discovery tool at either university, but as an anthropologist I was very interested both in how students search for information generally and in how they use discovery tools that are meant to break down the silos of information that are often typical in academic libraries. IWU and BU have demographically similar student populations, as well as similar collections and electronic subscriptions, so this provided an opportunity to compare these tools “in the field.”
We wanted to understand not only how students used discovery tools, but also how this usage compared to other “discovery”-type tools like Google Scholar, as well as to the conventional library catalog and subject databases. And we wanted to collect both quantitative measures of effectiveness and qualitative measures of students’ experience. We recruited 86 students to participate. [Sirsi at Bucknell and the Voyager VuFind interface at IWU.]
We divided them into 5 test groups and gave each student a set of four research tasks, similar to questions they might be given for an academic assignment [Civil Rights Act, Women’s Baseball, Wealth and Happiness, and Climate Change]. They were asked to find 2 resources they would use for this assignment (recorded on a web form). These were then scored for quality against a 0–3 rubric. We recorded the students using screen-capture software, which we played back to the students in a debriefing interview. This interview focused on the thought processes the students went through to construct their searches and to evaluate the resources they found.
When scored with the rubric, students using Google Scholar received the lowest scores, while those using EDS received the highest; Summon was in the middle at 1.92. This difference between Summon and EDS is usually the finding most interesting to librarians, and explaining why it occurred is one of the most interesting outcomes of this research. EDS’s scores were significantly higher than those of all other groups (ANOVA, p < .05).
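The group comparison here rests on a one-way ANOVA. As a minimal sketch of that calculation, using hypothetical 0–3 rubric scores rather than the study’s actual data, the F-statistic can be computed in pure Python (judging significance at the .05 level additionally requires the F-distribution, e.g. via scipy.stats.f_oneway):

```python
from statistics import mean

def one_way_f(groups):
    """F-statistic for a one-way ANOVA: between-group variance
    divided by within-group variance."""
    all_scores = [x for g in groups for x in g]
    grand_mean = mean(all_scores)
    k, n = len(groups), len(all_scores)
    # Between-group sum of squares (each group mean vs. the grand mean)
    ssb = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (each score vs. its own group mean)
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical rubric scores for three of the five test groups
eds = [3, 2, 3, 2]
summon = [2, 2, 2, 2]
gscholar = [1, 2, 1, 2]
print(round(one_way_f([eds, summon, gscholar]), 2))  # 4.5
```

A larger F means more of the score variation lies between the tool groups rather than within them, which is what licenses the claim that the tool itself made a difference.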
As we watched the ways students searched for information, we found definite patterns:
[Explain] Every search is Google-like.
Nearly every resource was found on the first page of results.
In particular, students were having tremendous difficulties evaluating their materials. The score on the evaluation section of the test was relatively high, but it appeared that students knew the right answer without being able to actually perform the task. [read slide]
Back to evaluation: taken together, these behaviors mean that what a tool searches determines what students use. [Explain slide] This explained a good share of the variation in scores, especially between EDS and Summon.
This is backed up by Bucknell’s usage statistics.
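The usage-increase figures in the Bucknell table (slide 24) are simple percentage changes against the 2009 baseline. A quick sketch of that arithmetic (the function name is mine):

```python
def pct_increase(baseline, current):
    """Percentage increase over a baseline, rounded to the nearest percent."""
    return round((current - baseline) / baseline * 100)

# Click-through counts from the slide 24 table
print(pct_increase(131, 1475))     # 1026 (ProQuest National Newspapers Premier, 2010 vs. 2009)
print(pct_increase(1280, 5233))    # 309  (LexisNexis Academic, 2011 vs. 2009)
print(pct_increase(49886, 90854))  # 82   (Total, All Databases, 2010 vs. 2009)
```

Each value reproduces the corresponding cell in the table, confirming the percentages are computed relative to 2009 rather than year over year.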
Returning to the issue of evaluation, these processes result in the de facto outsourcing of the evaluation step to the algorithms of the search system. This potentially introduces several biases into students’ search practices. [read slide & explain]
What, in general terms, does an ethnographic project look like? I see ethnography as an iterative process that begins and ends with research question development. It then moves through a data collection phase, a period of data processing, data analysis, developing and evaluating recommendations, and finally creating new research questions and assessment strategies. I’ve developed an ethnographic toolkit for use in libraries that describes in detail the steps of conducting an ethnographic project from beginning to end. I will briefly talk about each of these steps: