1. Documenting Student Connectivity and Use of Digital
Annotation Devices in Virginia Commonwealth University
Connected Courses: An Assessment Toolkit for Digital
Pedagogies in Higher Education
Laura Gogia, MD - @googleguacamole
Virginia Commonwealth University
February 12, 2016
#gogodoc
2. This is a design project:
“An enterprise that combines CREATIVITY
and INSIGHT to TRANSLATE theory and
fact in ways that make things better for
people.”
-- The Design Council
http://www.designcouncil.org.uk
3. LEARNING THAT MATTERS
Education at the
intersection of open
education & connected
learning
Digital Fluency +
Integrative Thinking
• Most, if not all, course materials are
housed on the open web
• Students are engaged in public blogging
• Students are engaged in some sort of
public discourse
Connected Courses
6. The identified need:
If we are to design, implement,
and evaluate the kind of learning
called for by the VCU QEP, we need
a “toolkit” that offers strategies for
assessing student learning in
connected courses.
10. SOCIAL LEARNING THEORIES
Bandura, 1977; Bruner, 1960; Harel & Papert, 1991;
Lave & Wenger, 1991; Mezirow, 1991; Vygotsky, 1980
KNOWLEDGE TRANSFER
Bransford, Brown, & Cocking, 2010; Anderson,
Krathwohl, & Bloom, 2001
SCHEMA, THRESHOLD & CONCEPT MAPPING
Ausubel,1968; Bruner, 1966; Downes, 2006; Meyer &
Land 2003; Novak & Canas, 2008; Piaget, 1983;
Vygotsky, 1980
People
Concepts
Space and Time
Connectivity draws from:
11. Design Point #2: Connected Course Design
Details available at http://bit.ly/1S5nErP
12. Design Point #3: Learning Objectives
LO1. Forming, reflecting, &
acting on connections
LO2. Engaging in networked
participation
LO3. Developing digital
workflows
GOALS
Integrative Thinking
Connectivity
Digital Fluency
PEDAGOGICAL STRATEGIES
Connected Courses
LEARNING
OBJECTIVES
Details available at http://bit.ly/1NWccXN
13. How do we document or assess
student connectivity in connected
courses?
Important for
Student assessment • Learning innovation •
Faculty development • Course evaluation
15. The project trajectory
1. Explore: How are digital annotations used now? What are some best, common, and missing uses?
2. Apply: How do we document student progress towards stated learning objectives?
3. Reflect: Consistent with 21st century assessment strategies?
17. “Smith (2010) suggests that verbal and
nonverbal communication can impact…”
Sample Hyperlinks
“Last week, I proposed a list of questions meant
to inspire my research for this course...”
“For more information, check out [hyperlink]”
[#1]
[#2]
[#3]
[#4]
[#5]
19. Areas of Alignment
Citations/References
Additional Resources
Definitions & Description
Personal Context
Connecting to other concepts
Building a personal narrative
Aesthetics, Illustration, Extension
Multimodal expression
23. Layers of Contribution [scale: common → rare]
Article contribution: "Interesting article about student loans [hyperlink] [course hashtag]"
Connection to course material: "Reminds me of what we talked about last week in [course hashtag] [hyperlink]"
Targeted contribution: "@Mention might find this on financial aid interesting [hyperlink] [course hashtag]"
24. Layers of Self-Promotion [scale: common → rare]
Blog post promotion: "My latest blog post [hyperlink] [course hashtag]"
Audience: "Check out my latest blog post [hyperlink] [course hashtag]"
Hook, hashtag, & hyperlink: "I blogged on my financial aid difficulties at [#VCU] [hyperlink] [course hashtag]"
25. Missed Opportunities
• Lack of use (but an interesting space for study!)
• Failure to promote other students' work
• Lack of targeted contribution
• Students failed to combine annotations
• Lack of diversity in communicative impact
• Lack of workflow
26. LO1. Forming, documenting, & reflecting on connections
LO2. Engaging in networked participation
LO3. Developing digital workflows
LEARNING GOALS
Integrative Thinking
Connectivity
Digital Fluency
LEARNING
OBJECTIVES
[Diagram: hyperlinks, mentions, and hashtags mapped onto the areas of alignment: Connecting to Concepts, Connecting People to Concepts, Connecting with People, Networked Participation, and Developing Digital Workflows]
Areas of Alignment
28. Data Visualizations
[Network diagram: student → hyperlinked source URL]
Top material sources | Students who linked/embedded | Links/embeds
Twitter.com | 16 | 19
Embedded Images | 6 | 13
YouTube.com | 7 | 13
CNN.com | 5 | 11
Student Posts | 5 | 6
HuffingtonPost.com | 5 | 5
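The deck does not say how the counts in this table were produced. As a sketch of how such a tally could be automated, assuming student posts have already been scraped into (student, url) pairs (the function name and sample data are mine, not from the project):

```python
from collections import defaultdict
from urllib.parse import urlparse

def tally_sources(links):
    """Given (student_id, url) pairs harvested from posts, return
    {domain: (distinct_students, total_links)}, mirroring the
    'top material sources' table."""
    students = defaultdict(set)   # domain -> set of students who linked it
    counts = defaultdict(int)     # domain -> total links/embeds
    for student, url in links:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]   # treat www.twitter.com and twitter.com alike
        students[domain].add(student)
        counts[domain] += 1
    return {d: (len(students[d]), counts[d]) for d in counts}

sample = [
    ("s1", "https://www.twitter.com/a/status/1"),
    ("s2", "https://twitter.com/b/status/2"),
    ("s2", "https://www.youtube.com/watch?v=abc"),
]
print(tally_sources(sample))
# {'twitter.com': (2, 2), 'youtube.com': (1, 1)}
```

The same aggregation could feed the network diagram directly, with students and domains as nodes and each pair as an edge.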
30. QUANTITY
• 600 words
• 8 hyperlinks and 1 embedded image
DIGITAL MECHANICS
• All hyperlinks work
• Image is properly embedded & auto-citing
COMMUNICATIVE IMPACT
• Citations (3)
• Definition/Description (4)
• Example (1)
[Callout: connections to other ideas or things within & outside Jon's discipline]
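The "quantity" row above implies counting words, hyperlinks, and embeds per post. The rubric itself prescribes no tooling; a minimal sketch using only the standard library's html.parser (the class name and sample HTML are mine):

```python
from html.parser import HTMLParser

class PostStats(HTMLParser):
    """Collects the 'quantity' rubric row from a blog post's HTML:
    word count, hyperlink count, and embedded-image count."""
    def __init__(self):
        super().__init__()
        self.words = 0
        self.links = 0
        self.images = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(k == "href" for k, _ in attrs):
            self.links += 1
        elif tag == "img":
            self.images += 1

    def handle_data(self, data):
        # Rough word count: whitespace-separated tokens in text nodes.
        self.words += len(data.split())

p = PostStats()
p.feed('<p>Loans are <a href="http://cnn.com/x">rising</a>.</p>'
       '<img src="chart.png">')
print(p.words, p.links, p.images)
# 4 1 1
```

Checking the "all hyperlinks work" line would be a further step (issuing an HTTP request per collected href), which is omitted here.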
32. How do these assessment strategies
conform to published recommendations for
21st century digital assessments?
• Integration
• Sustainability
• Scalability
3. Reflect
33. Next Steps & Limitations
• Get these into integrated course designs
• Streamline data collection & analysis
• More piloting
First of all, thank you so much for coming today and welcome. Thank you to Jon Becker and the rest of my committee for providing me with guidance and feedback along the way. I want to extend a special welcome and thank you to Dr. Lee Skallerup Bessette. Lee drove down from the University of Mary Washington to be here today, not only to offer moral support but also to live tweet my defense. I asked her to do this because many of the people who inspire me and with whom I collaborate live all over the world and we come together through Twitter. Lee is here to help bring those people into this room, and I can't think of a better, more skilled live tweeter to do it. Feel free to join her if you see something you would like to share or comment on – our hashtag is #gogodoc, which I did not make up – it was suggested by one of my other Twitter followers earlier this week and I thought it was kind of catchy.
What we are going to discuss today is not a straightforward research project; it's a design project, an enterprise that combines creativity and insight to translate theory and fact in ways that make things better for people.
Design projects emphasize application. They allow for creative problem solving. They have a goal of fulfilling a real need in the real world.
The need I identified and built a project around is related to the revised quality enhancement plan at Virginia Commonwealth University.
As is consistent with our strategic plan, the revised QEP takes a very distinctive approach to generalizable education. It suggests that the dispositions and skills that students need for the 21st century might be best developed in educational environments that exist at the intersection of open education and connected learning.
Since the publication of the revised QEP, VCU has established a large public publishing platform called RamPages, meant to support the development of student, organization, and formal academic spaces in the digital world.
As more instructors begin to use RamPages to support formal academic courses, a certain type of course design has emerged – within the university these are called "connected courses" – and although they all look very different and cover a range of disciplines and topics, they have certain identifiable, common threads around open, digital course materials, student blogging, and public discourse.
It has been suggested that these courses – these connected courses – are one promising approach to enacting the strategy outlined in the QEP.
However – and here is the identified need –
If we are to call the connected course the bona fide enactment of the QEP (and if we are to say that the QEP is working), we need course evaluation and assessment tools.
More specifically, we need a toolkit that assesses the types of learning discussed in the QEP for connected courses.
So how do we get to novel and practical assessment strategies when all other definitions involved in the equation are still unfixed, course designs are still emerging, and the theoretical foundations are still being described?
In the first page of your packet, you’ll find my design strategy
I’ve outlined the path of logic and the design decisions I’ve made, starting with what we know – which is the QEP and current connected course practice – and ending with my research and development work on assessment strategies. Several critical design points emerged along the way, and they are highlighted in yellow. These are the places where I had to synthesize theory, university documents, and existing pedagogical practice in order to create operational definitions – just so I could continue to move forward towards the project’s desired outcome.
The 1st design point took place at the level of established learning goals.
What is it exactly that we want to measure?
I narrowed the scope of my interest to what I consider to be the intersection of the two learning goals mentioned in the QEP, digital fluency and integrative thinking. I identified the intersection as connectivity. As you can see from this model, connectivity is a form of experiential learning that closely aligns with Kolb’s cycle.
It draws from social and cognitive learning theories and speaks to knowledge transfer research to define exactly what learners are connecting; namely their current thinking or experiences with the feedback of people, and connections to other concepts and their own experiences over space and time.
The second design point took place at the definition of learning environments. What makes the “connected course” representative of the intersection of open education and connected learning?
So I created a framework for the VCU connected course that allows instructors to map out their course design along four agents of connectivity – openness, digital creative expression, participation, and agency. This is in your packet and online.
I moved to learning objectives – which are design point #3.
Now obviously courses have more than one type of learning objective – some will relate to disciplinary content or professional skills or similar. However, I argue that if a set of courses is to be held up as some sort of model or test case for evaluating the impact of the QEP, then it should also have learning objectives that align with connectivity.
These are summaries of the learning objectives I’ve proposed; more detailed versions can be found in your packet or online.
And so we finally get to the main thrust of this project: how might we assess student connectivity?
I focused on digital annotations in blogging and tweeting environments as a potential site of documentation for connective behavior.
The project consisted of three strands of inquiry, written to support the design process –
First, I wanted to explore how digital annotations are being used by students in current iterations of VCU connected courses,
Then I needed to apply that understanding so that I might design assessment tools around student annotation use.
And finally, I needed to engage in a quick self-evaluation – there are published recommendations regarding educational assessments for the digital age; are the assessments I am creating consistent (or potentially consistent) with these recommendations?
Because of time constraints I might not get to a discussion of #3, but it’s in the document and we can discuss it through questions after the presentation as well.
So, step one. How do students use digital annotations in their blogging in current iterations of connected courses?
So to protect the privacy of our students, these aren’t direct quotations, but they are modeled closely after student work.
Students used hyperlinks in a variety of ways.
In the first hyperlink, the link goes back to an old blog post. This is an example of personal context. You see it particularly when blog posts build on each other and a student is able to go down a list – first I did this, then this, then this. The addition of hyperlinks literally allows students to experience the order of their learning and, in some cases, reorder it to make sense to them.
The second hyperlink is course context – the link goes back to the course website. You saw the grad students do this sometimes, and it suggests that they were thinking of an audience beyond their classmates and professor – someone who would need a broader understanding of why the student was writing the post.
The third hyperlink is very straightforward and not really well connected to a specific part of the post. You would find this with the undergrads in a course that had very little instructor feedback, modeling, or oversight on their writing – it would be thrown out at the end of the post.
And I feel like #4 and #5 are where you would like the student in #3 to be moving. #4 is an extension of a standard in-text citation: if you click on it, it goes exactly where you'd think it would. But let's look at #5. It led to a SlideShare presentation about different forms of nonverbal communication. In this case, the hyperlink allowed the student to provide additional description without cluttering up the primary narrative – because the definition of nonverbal communication is not the primary point of the sentence, this link acts as a sort of value added to the narrative that you would not have received in a paper-based essay.
So I would argue that based on learning objectives, #1, #2, #4, and #5 are equally important types of hyperlinks – in fact you want to see students move between all of them. #3 is something that you see in a beginner and you would want to try to encourage them more towards #4 and #5.
So, let’s look at embedded materials
Many were identified as aesthetic – the pretty picture at the top of the blog post; the clip art at the section header.
But others were illustrative. These were things that directly spoke to items written about in the post.
Finally, students used embedded materials to extend the narrative. In some cases, an infographic or chart might be essential to helping the student tell the story. In some cases, students made their own "how to" videos about something they were writing about in the post. Another student inserted a picture from his last vacation and then made connections between what was in the picture and the content of the post.
I would argue that in terms of assessment, I'd like to see less aesthetic use and more illustration and extension. There is a special place in my heart for images and videos that are created by the student and inserted into the blog post, because students always managed to tell really rich stories around their self-generated materials. They made very explicit connections between the art and the text. I'll be honest with you – if I were teaching a class, I'd require students to use all of their own art.
There were striking alignments with the learning objectives.
There were also quite a few missed opportunities.
In terms of hyperlinks, students failed to engage with the ideas expressed by their peers. It’s not that they couldn’t have. Clearly hyperlinks can lead to conversations about other student blogs as much as they can to other forms of web content, but students rarely mentioned their classmates, let alone link to their blog posts.
Second, there was a lack of documentation and a seeming lack of intentionality around image use. Students seemed to love them – every student who contributed blog posts embedded at least one picture along the way, but they failed to credit them. They frequently failed to caption them. They often failed to explain why they were there. In light of some of the incredible things some students did with the ability to use images, it appears the majority of students could use more information on how to do it well and why it’s important.
Student engagement in Twitter was a really rich area of exploration. I do not have time to do it justice today.
However, what I will say is that how students engaged with the platform and performed in terms of making connections seemed to be strongly related to course design – and by course design I mean how instructors structured Twitter activities and whether tweeting was required, incentivized, or optional for students.
In general, students used hyperlinks in tweets to contribute resources to the group and to promote their own work.
But even these had extremely rich layers to how students used them.
The top tweet is very basic, and shows some effort to contribute a resource to the group as a whole.
The second example, however, shows a little more thought – it’s still a contribution, but it also shows a conscious link to a specific aspect of the course materials. A solid connection was made.
The third example is still a contribution, but it links to a specific person in the class – it engages a classmate in a specific, targeted sort of way that is much more likely to lead to discussion around the contribution.
Self-promotion was equally layered.
The first is so basic it doesn't even really suggest interest in an audience, even though it's self-promoting.
The second at least suggests care that someone might read it.
But let’s unpack this last one. This didn’t happen very often with students. However, this student provided a hook and added an affinity group hashtag in an attempt to get a wider reading audience than just those in the course.
The average student was not using effective strategies for networked communication and workflow – most did not come to courses with pre-existing strategies for how to make contributions count or how to access audiences for their work on digital platforms. Furthermore, instructors did not seem to be talking about it. And students were still failing to discuss each other's work.
However, again, some students were doing great things, making all sorts of connections between people – social connections, conversational connections, contributory connections. I believe this suggests that it is possible to capture connectivity learning objectives through hyperlinks, mentions, and hashtags – if students are taught about networked communication, if there are structured activities around it, and if students receive feedback about it.
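One way to see how hyperlinks, mentions, and hashtags could be captured automatically from a tweet's text: a sketch using deliberately simplified regular expressions (these patterns are mine, not Twitter's own tokenization rules, and the sample tweet is invented):

```python
import re

def annotations(tweet):
    """Pull the three annotation types the learning objectives map onto:
    hyperlinks (connecting to concepts), @mentions (connecting with
    people), and #hashtags (networked participation)."""
    return {
        "links": re.findall(r"https?://\S+", tweet),
        "mentions": re.findall(r"@(\w+)", tweet),
        "hashtags": re.findall(r"#(\w+)", tweet),
    }

t = "@jon might like this http://cnn.com/aid #vcufinances"
print(annotations(t))
# {'links': ['http://cnn.com/aid'], 'mentions': ['jon'], 'hashtags': ['vcufinances']}
```

Tallying these per student over a semester would give the raw material for the rubrics and dashboards discussed next.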
I applied what I learned to explore graphic visualizations and develop rubrics and dashboards for assessing student annotation use. Due to time constraints, we are not going to get through the entire protocol I’ve been developing.
To touch briefly on data visualizations, I didn’t find them as useful in terms of student assessment per se - but I believe they make wonderful tools for discussing connectivity and networked communication with students.
And I would like to spend the remainder of our time actually assessing a blog post. The protocols I have designed are really meant to look at a body of a student's work over time rather than to grade one specific blog post, but I feel like the blogging rubric, which you will find in your packet, can be adapted fairly easily to look at one blog post at a time. So we are going to use it to check out how Jon is doing. You also have a paper copy of this in your packet – it's the last two pages.
Obviously I’m running low on time, but I’d like to briefly say that much of what I have created is consistent with recommendations for 21st century assessments –
They lend themselves to self and peer assessment
Much of what they run on is or can be automated in terms of scalability
And integration is essential – the number of learning activities that I could develop that would integrate these assessments within the activity is mind-boggling.
So much of this was highly experimental – it needs to be piloted for real, working with dedicated faculty members to redesign courses and get these sorts of learning objectives, learning activities, and assessments on board. Student voice needs to be involved in this work – we need to know what students are thinking as they use these annotations: what do these connections mean to them, and what do they mean when we actually teach them what they could mean?