8. University Mandates
• All Level 4 (1st year UG) lists to be completed by September 2014
• 80% of all modules to be completed by September 2015
• 90% by September 2016
12. Project Review Group
• Meetings 4 times per year
• Includes stakeholders
• Reports to University Learning & Teaching Assessment Sub-Committee
SAMPLE REPORT
14. Keeping Statistics
[Bar chart: % of all modules completed by department (% Uploaded vs % Missing), one bar per department across ARU, from Anglia Law School to the Dept. of Marketing, Enterprise & Tourism; axis 0–100%. Doughnut chart: overall modules completed.]
16. Analysis of Sample
Question | Yes | No
Has importance been set for all items? | 14 | 28
Are there any notes on items for students? | 14 | 28
Are there other resources, such as BoB, YouTube? | 1 | 41
18. Future
• Looking at quality of lists further
• Updating our reading resource strategy
• Compulsory for course validation
Good afternoon everyone. My name is Christina Harbour and I’m going to talk to you a little bit today about how we implemented Talis Aspire at Anglia Ruskin University. In my usual day job I am a Subject Librarian, but I am currently on secondment to project manage what we have called ReadingLists@Anglia. Very original title, isn’t it!
I'm going to give a brief introduction to our implementation and more on how we have evaluated, analysed and kept statistics throughout the project.
So first, a brief overview of ARU… we have 3 campuses in 3 cities:
Chelmsford, Cambridge & Peterborough, so spread far apart…
This in itself can be problematic… but we also run as one library service.
This includes converged services: information literacy sessions, telephone enquiries and sharing stock between campuses.
This also means that we implemented Talis Aspire across all campuses at the same time.
Not easy with one project manager.
So we chose Talis back in March 2013. From March to July the focus was on the technical implementation, carried out by our in-house library IT team.
By the time I came to the project in July, we were running a pilot phase, which lasted until December and included early adopters.
We then moved into Phase 2, which ran from January to August 2014. This mainly involved adding first year modules to the system and initial marketing.
Right now we are in Phase 3, running from September 2014 until May 2015, which I’ll talk a bit more about later.
What we hope will be the final phase will be focusing on marketing further to students and looking at quality of lists.
Hopefully it will then just become part of our library daily routine.
My secondment actually finishes on 31st July 2015, not sure if this will be extended yet.
During these early phases we had a top-down approach, with buy-in from the University senior management team.
First and foremost ReadingLists@Anglia is a University project that the library is managing.
The Library has done a lot of liaison work to get the project off the ground, in particular subject librarians working with academic staff to help them complete their list(s). This meant lecturers completed lists themselves, and we had no lists carried over to the new system.
This all meant that the project moved along quite quickly…
…so we needed a way to plan how the phases of the project would work.
We used the One Page Project Plan (OPPP) to identify how to run the project and advocate the system to the University as a whole.
This is a really good way of doing easy project plans. We have one for each phase and also for the project overall.
…as a result of all of that we needed a marketing plan as we needed to promote the new system to lecturers very quickly.
We had an outside designer design a logo and a web banner shown at the top.
These have then been used for promotional items such as mugs, screen wipes and post-it notes.
We used the mugs as an incentive: once a lecturer completed a reading list, we gave them a mug.
This then marketed the system to other staff as they often kept them in offices etc.
As we moved into Phase 2, senior management made the system mandatory. In order to add lists at a steady pace we decided to split this between the phases. We mandated that all Level 4 (first year undergraduate) modules be added to the new system by September 2014. As of now we have 98% (this equates to 5 modules missing).
There was some apprehension about this, but generally Faculty Deans and Course Leaders have been supportive. There is some rivalry between Faculties, and some have even mandated that all module lists be added straight away.
In January of this year we landed the big one.
During our pilot phase we had a target of 50 reading lists to add to the new system.
We did not meet this target in December 2013… but as a result of our marketing and the mandate, we celebrated adding our 1,000th list in January 2015, so a big improvement in a year. This is me with our Pro-Vice Chancellor and the lecturer who added the 1,000th list. He was awarded a voucher at our Chancellor’s conference at our Cambridge campus.
We also had an article in our University magazine Bulletin covering the story.
In fact we have had many articles over the months in Bulletin, generally covering percentages of lists. This is what has caused a lot of the rivalry between Faculties.
After all that work we wanted to do more on evaluating what we had achieved so far in the project.
The stakeholders and I provide a project overview and feedback at our project review group. More on that in the next slide.
We use quantitative and qualitative methods such as questionnaires, statistics, focus groups. In the future we want to film student and academic users.
Analytics are provided by Talis Aspire reports and Google. Acting on feedback, we look at functionality and usability issues with Talis Aspire.
We show staff and students how we have acted on feedback by creating marketing and promotional material on feedback received. This has included “you said, we did” promotion.
Our project review group was started in our pilot phase. It consists of myself, library directors and library staff, plus other stakeholders such as academic staff, learning technologists, the Students’ Union, IT staff and other support staff. In the future we hope to have some students on the group.
We meet every 3 months, and a Faculty Dean chairs the meeting as an impartial member. We report to the University’s Learning & Teaching Assessment Sub-Committee, which is a sub-committee of our Quality, Enhancement & Standards Committee. Every meeting I send a report that provides an update on the project and data on the number of lists that have been added.
So… how did we gather this data for reports? In lots of different ways, with lots of spreadsheets. We received a list of modules running from our academic office, and I keep separate spreadsheets for each Faculty’s and Department’s progress.
They each have a dashboard showing percentages and charts.
All Faculties receive a yearly report on how they are progressing based on this data, and I keep in regular contact with heads of departments. I keep data on percentages by level, by Faculty and by department. The table shows the % of all modules completed by department, and the doughnut shows how many modules have reading lists overall.
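As an aside, the per-department percentages reported in those dashboards can be tallied with a short script rather than by hand. The sketch below is only illustrative: the module records are made up, not our actual spreadsheet data.

```python
# Sketch: compute the % of modules with a completed reading list per
# department. The module records below are illustrative, not real data.
from collections import defaultdict

modules = [
    ("Anglia Law School", True),
    ("Anglia Law School", False),
    ("Dept. of Psychology", True),
    ("Dept. of Psychology", True),
]

counts = defaultdict(lambda: [0, 0])  # dept -> [modules with a list, total]
for dept, has_list in modules:
    counts[dept][1] += 1
    if has_list:
        counts[dept][0] += 1

for dept, (done, total) in sorted(counts.items()):
    pct = done / total
    print(f"{dept}: {pct:.0%} uploaded, {1 - pct:.0%} missing")
```

The same tallies could just as easily feed the bar and doughnut charts shown in the sample report.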
So once we had a lot of lists, we wanted to look at list quality, so we took a sample of the lists added so far.
In November 2014 a report of all reading lists was downloaded via Talis Aspire reports.
At the time there were 856 lists and we sampled 5%. This meant we looked at every 20th entry, which totalled 42 lists to examine in more detail. Too much maths for me!
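That 5% systematic sample (every 20th entry from the report of 856 lists) can be sketched in a few lines of Python. The list identifiers here are hypothetical stand-ins; the real report came out of Talis Aspire reports.

```python
# Sketch: 5% systematic sample - take every 20th entry (the 20th, 40th,
# and so on) from the full report of reading lists.
def systematic_sample(items, step=20):
    """Return every step-th item, 1-indexed (the step-th, 2*step-th, ...)."""
    return items[step - 1::step]

all_lists = [f"list-{i}" for i in range(1, 857)]  # stand-in for the 856 lists
sample = systematic_sample(all_lists)
print(len(sample))  # 42 lists to review in detail
```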
A template was created so I knew what to look for in each list; for example, we looked at whether links worked, the quality of resources, notes and referencing.
Once I had looked at the 42 sample lists I did some further analysis.
I created a report that looked into every section and question in further detail.
It has highlighted weaknesses in certain areas, in particular the bookmarking of e-books, low usage of non-book items, and low use of the importance and notes features.
From the evidence, most reading lists appear to include either the key text or the main recommended books, but they fail to go any further and include additional resources such as articles and web resources where applicable.
This is what has led us to want to look more at the quality of lists in the next phase.
It was a struggle at times but we now have over 70% of all modules in the system.
This means we are now in a position to promote the system further to students, in conjunction with the Students’ Union.
We are looking at ways of incorporating user experience ethnographic techniques.
We are also planning further marketing and promotional items, such as a reading list checklist postcard for staff and USB bracelets for students.
Now, what next? As I mentioned, we feel we now need to look at the quality of lists.
We will update our reading resource strategy, as RL@A is now our sole reading list system.
We are also in the process of making RL@A part of the course validation process, including for new courses.