Seeking Evidence of Impact: Answering "How Do We Know?"
Veronica Diaz, PhD, Associate Director, EDUCAUSE Learning Initiative, EDUCAUSE ::: League for Innovation Learning College Summit, Phoenix, AZ
Today’s Talk
Review what it means to seek evidence of impact of teaching and learning innovations
Consider some strategies for using evaluation tools effectively
Determine ways to use evidence to influence teaching practices
Review ways to report results
Who are we?
Academic instruction
Faculty development
Instructional technology
Instructional design
Library
Information technology
Senior administration
Other
Why are we here?
I am working on evaluating T&L innovations now.
Evaluation of T&L is part of my formal job description.
My campus unit's director or VP mandates our gathering evidence of impact.
A senior member of the administration (dean, president, senior vice president) mandates gathering evidence of impact in T&L.
I am working as part of a team to gather evidence.
Accreditation processes are prompting us to begin measurement work.
Why evidence of impact?
What the community said
Download the survey: https://docs.google.com/document/d/1Yj37DINUdCyk5DXPelr0FJ1Rewfx8xON-cLo-NJ5e4U/edit?hl=en_US#
Technologies to Measure
Web conferencing
LMS and individual features
Lecture capture
Mobile learning tools (laptops, ebooks, tablets)
Clickers
Collaborative tools
Student-generated content
Web 2.0 and social networking technologies
Learning spaces
OER
Personal learning environments
Online learning: hyflex course design, blended learning programs, synchronous/asynchronous delivery modes, fully online programs
Eportfolios
Multimedia projects and tools: pod/vodcasts
Simulations
Early alert systems
Cross-curricular information literacy programs
Large course redesigns
Technologies and their connection/relationship to…
Student engagement
Learning-related interactions
Shrink the large class
Improve student-to-faculty interaction
Student retention and success
Specific learning outcomes
3 most important indicators you use to measure the evidence of impact of technology-based innovations in T&L
What is “evidence?”
Grades (frequently mentioned)
Learning outcomes (frequently mentioned)
Satisfaction
Skills
Improved course evaluations
Measures of engagement and participation
Retention/enrollment rates
Graduation rate
Direct measures of student performance (at the course level and cumulative)
Interview data
Institutional data
Faculty/student technology utilization rates
Data on student/faculty facility and satisfaction with using technology
Successfully implementing technology
Job placement
Student artifacts
Better faculty reviews by students
Course redesign to integrate changes; impact on the ability to implement best pedagogical practice
Rates of admission to graduate schools
Success in more advanced courses
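Several of these indicators (retention/enrollment rates, grades, technology utilization) can be computed directly from institutional data once it is in tabular form. The following is a minimal, hypothetical sketch in Python/pandas; the column names and the tiny dataset are invented for illustration and do not reflect any particular institutional schema.

```python
import pandas as pd

# Hypothetical student-records extract; all column names and values are
# illustrative only, not a real institutional schema.
records = pd.DataFrame({
    "student_id":           [1, 2, 3, 4, 5, 6],
    "course_grade":         [3.7, 2.3, 4.0, 1.7, 3.0, 2.7],  # 4.0 scale
    "used_clickers":        [True, False, True, False, True, False],
    "reenrolled_next_term": [True, True, True, False, True, False],
})

# Term-to-term retention rate (share of students who re-enrolled)
retention_rate = records["reenrolled_next_term"].mean()

# Mean course grade split by technology use -- a descriptive starting point
# for triangulation, not proof of impact on its own
grade_by_use = records.groupby("used_clickers")["course_grade"].mean()

print(f"Retention rate: {retention_rate:.0%}")
print(grade_by_use)
```

Even a small, consistently structured extract like this makes it easier to report the same indicator the same way term after term.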
Methods/techniques you ROUTINELY USE for gathering  evidence of impact of technology-based innovations in T&L
Most difficult tasks associated with measurement were ranked as follows:
1. Knowing where to begin to measure the impact of technology-based innovations in T&L
2. Knowing which measurement and evaluation techniques are most appropriate
3. Conducting the process of gathering evidence
4. Knowing the most effective way to analyze our evidence
5. Communicating to stakeholders the results of our analysis
I have worked with evaluative data: Yes / No
I have worked with evaluative data at the:
Course level (in my own course)
Course level (across several course sections)
Program level (math, English)
Degree level
Across the institution or several programs
Other
Using evaluation tools effectively
Remember: technologies and their connection/relationship to
Student engagement
Learning-related interactions
Shrink the large class
Improve student-to-faculty interaction
Student retention and success
Specific learning outcomes
Triangulate to tell the full story. The impact of a curricular innovation should be “visible” from a variety of perspectives and measurement techniques.
The three most commonly used evaluation tools:
questionnaires (paper or online),
interviews (individual or focus group), and
observations (classroom or online).
5 Steps
1. Establish the goals of the evaluation: What do you want to learn?
2. Determine your sample: Whom will you ask?
3. Choose methodology: How will you ask?
4. Create your instrument: What will you ask?
5. Pre-test the instrument: Are you getting what you need? (PILOT YOUR TOOLS/STRATEGIES — see the sketch below)
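For step 5, one common way to pre-test a questionnaire is to run the pilot responses through a quick reliability check before full deployment. The sketch below computes Cronbach's alpha with NumPy on a small, invented pilot dataset; the data and the 0.7 rule of thumb are illustrative assumptions, not a prescribed threshold.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented pilot data: 8 respondents x 4 Likert items (1-5), for illustration only
pilot = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
    [5, 4, 4, 5],
    [2, 2, 3, 2],
])
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")
# Values near or above ~0.7 are commonly read as acceptable internal consistency,
# but treat that as a rule of thumb, not a hard cutoff.
```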
What is a good question?
Significance: It addresses a question or issue that is seen as important and relevant to the community
Specificity: The question focuses on specific objectives
Answerability: The question can be answered by data collection and analysis
Connectedness: It’s linked to relevant research/theory
Coherency: It provides coherent explanations that rule out counter-interpretations
Objectivity: The question is free of bias
Whom does your evidence need to persuade?
Quantitative. This approach starts with a hypothesis (or theory or strong idea) and seeks to confirm it.
Qualitative. These studies start with data and look to discover the strong idea or hypothesis through data analysis.
Mixed. This approach combines the above methods, pairing the confirmation of a hypothesis with exploratory data analysis, and provides multiple perspectives on complex topics. Example: starting with a qualitative study to gather data and identify the hypothesis, then following with a quantitative study to confirm the hypothesis (a minimal quantitative sketch follows).
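As a deliberately simplified illustration of the quantitative path, the sketch below tests a single hypothesis: that a redesigned course section scored higher on a common final exam than a traditionally taught section. The scores are invented for the example; in practice a result like this would be paired with qualitative evidence (interviews, observations), as the mixed approach suggests.

```python
from scipy import stats

# Hypothetical final-exam scores (percent) from one redesigned section
# and one traditionally taught section of the same course.
redesigned = [78, 85, 92, 74, 88, 81, 90, 79, 86, 83]
traditional = [70, 76, 84, 68, 80, 73, 82, 75, 77, 71]

# Welch's t-test: does the redesigned section score higher on average?
t_stat, p_value = stats.ttest_ind(redesigned, traditional, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value is only one piece of evidence; triangulate with other
# measures and perspectives before drawing conclusions about impact.
```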
The Double Loop
Methods? Support in data collection? Double loop?
Using evidence to influence teaching practices
“higher education institutions seem to have a good understanding of the assessment process through the use of rubrics, e-portfolios, and other mechanisms, but the difficulty seems to be in improving the yield of the assessment processes, which is more of a political or institutional culture issue”
Why We Measure
Inward (course level, inform teaching, evaluate technology use, reflective)
Outward:
Share results with students
Share results with potential students
Share results with other faculty (in/out of discipline)
Share results at the institutional or departmental level (info literacy, writing, cross-course projects)
Results can be a strategic advantage
Lessons from Wabash National Study
A 3-year research and assessment project that provides participating institutions extensive evidence about the teaching practices, student experiences, and institutional conditions that promote student growth across multiple outcomes.
Inputs – the attitudes and values that students bring into college
Experiences – the experiences that impact students once they are in college
Outcomes – the impact that college has on student ability and knowledge
http://www.liberalarts.wabash.edu/wabash-study-2010-overview/
Measuring student learning and experience is the easiest step in the assessment process. The real challenge begins once faculty, staff, administrators, and students at institutions try to use the evidence to improve student learning. www.learningoutcomesassessment.org/documents/Wabash_000.pdf
Lessons from Wabash National Study
Lack of high-quality data is the primary obstacle to using assessment evidence to promote improvements
Providing detailed reports of findings is the key mechanism for kicking off a sequence of events culminating in evidence-based improvements
The intellectual approach that faculty and staff use in their scholarship facilitates assessment projects
Download the rubrics: http://www.aacu.org/value/rubrics/
Research Grants: 2010 & 2011. http://www.qmprogram.org/research
Organizational-Level Data: Learner Satisfaction, Student Learning
Student satisfaction was higher in QM-reviewed courses and non-reviewed courses than in courses at non-QM institutions. (Aman dissertation, Oregon State, 2009)
Course evaluation data showed student satisfaction increased in redesigned courses. (Prince George’s Community College, MD, 2005)
Currently conducting a mixed-methods study of student and faculty perceptions of QM-reviewed courses. (University of the Rockies)
Grades improved with improvements in learner-content interaction (a result of review). (Community College of Southern Maryland, 2005)
Differences approaching significance on outcome measures. (Swan, Matthews, Bogle, Boles, & Day, University of Illinois/Springfield, 2010+)
QM Rubric implementation had a positive effect on students’ higher-order cognitive presence and discussion forum grades via higher teaching presence. (Hall, Delgado Community College, LA, 2010)
Organizational-Level Data: Teacher Learning, Organizational Learning
Use of QM design standards led to “development of a quality product, as defined by faculty, course designers, administrators, and students, primarily through faculty professional development and exposure to instructional design principles” (p. 214). (Greenberg dissertation, Ohio State, 2010)
Currently utilizing the TPACK framework to explain the process by which new online teachers use the QM rubric and process when designing an online course. (University of Akron)
There may be a carryover effect to non-reviewed courses when an institution commits to the QM standards. (Aman dissertation, Oregon State, 2009)
Faculty/design teams respond differently when QM is presented as a rule rather than a guideline. (Greenberg dissertation, Ohio State, 2010)
Extended positive impact on faculty developers and on members of review teams. (Preliminary analysis 2009; comprehensive summer 2011)
Alignment in the curriculum between course objectives, goals, and assessments
Faculty members identify which assignments they have aligned with learning objectives
Design rubrics or instructions to prompt them at various data-collection points
Departments or colleges are asked to report data online at the end of each term, with prompts for comparison and reflection
Doing so makes the data ready for larger-scale assessment efforts
Session Recording and Resources: http://net.educause.edu/Program/1027812?PRODUCT_CODE=ELI113/GS12
What organizational mechanisms do you have in place to measure outcomes?
Reporting results
Match your research design to the type of information in which your anticipated consumers are interested or to which they will best respond.  Match your data-collection method to the type of data to which your information consumer will respond or is likely to respect.
Keep it simple, to the point, and brief. Know who is consuming your data or research report, who the decision makers are, and how your data is being used to make which decisions, if any.  Although time-consuming, it might be worthwhile to tailor your reports or analysis to the audience so as to emphasize certain findings or provide a deeper analysis on certain sections of interest.
Good research: Tips and tricks
Be careful of collecting too much data; be aware of reaching the point at which you are no longer learning anything from the data
Write up and analyze your data as soon as possible
Record interviews and focus groups, and capture your own observations or impressions immediately following the interaction
Besides all the usual good reasons for not reinventing the wheel, using others’ tested surveys, tools, or methods gives you a point of comparison for your own data: http://www.educause.edu/Resources/ECARStudyofUndergraduateStuden/217333
When collecting data, talk to the right people
Don’t overschedule: space out interviews, focus sessions, observations, and other tactics so that you can get the most from your interactions
Guiding Questions or Next Steps
Who are the key stakeholders for the innovative teaching and learning projects in which I am involved?
How can I help faculty members communicate the results of their instructional innovations to a) students, b) administrators, and c) their professional communities?
What “evidence” indicators do my key stakeholders value most (e.g., grades, satisfaction, retention, others)?
Which research professionals or institutional research collection units can assist me in my data collection, analysis, and reporting efforts?
Collecting Cases
Project Overview: project goals, context, and design; data collection methods; data analysis methods; findings; communication of results; influence on campus practices
Reflection on Design, Methodology, and Effectiveness: project setup and design; project data collection and analysis; effectiveness and influence on campus practices
Project Contacts
Supporting Materials
Online Spring Focus Session, April 2011: http://net.educause.edu/eli113
Read about the initiative: http://www.educause.edu/ELI/SEI
Get involved: http://www.educause.edu/ELI/SeekingEvidenceofImpact/OpportunitiesforEngagement/206626
Join the ELI Evidence of Impact Constituent Group http://www.educause.edu/cg/EVIDENCE-IMPACT
SEI Focus Session Content
These items for the 2011 Online Spring Focus Session on seeking evidence of impact can be found at http://net.educause.edu/eli113.
ELI Seeking Evidence of Impact Resource List (websites, reports, articles, and research): http://net.educause.edu/section_params/conf/ELI113/SFSResourceListAllFINAL.pdf
ELI Seeking Evidence of Impact Discussion Questions: http://net.educause.edu/section_params/conf/ELI113/discussion_prompts_team-indiv2011.doc
ELI Seeking Evidence of Impact Activity Workbook, Days 1 and 2: http://net.educause.edu/section_params/conf/ELI113/activity_prompts_team-indiv2011.doc
ELI Seeking Evidence of Impact Reflection Worksheet: http://net.educause.edu/section_params/conf/eli103/reflection_worksheet.doc
Presentation slides and resources for all sessions: http://net.educause.edu/eli113/2011ELIOnlineSpringFocusSessionRecordings/1028384

Other Related Resources
Focus Session Learning Commons: http://elifocus.ning.com/
Full focus session online program: http://net.educause.edu/Program/1027810
ELI Seeking Evidence of Impact initiative site: http://www.educause.edu/ELI/SEI
Resource site: http://www.educause.edu/ELI/SeekingEvidenceofImpact/Resources/206625
Suggest an additional resource: http://tinyurl.com/resourceidea
Get involved: http://www.educause.edu/ELI/SeekingEvidenceofImpact/OpportunitiesforEngagement/206626
Contribute: http://tinyurl.com/elisei
Contact Information
Veronica M. Diaz, PhD, Associate Director, EDUCAUSE Learning Initiative
vdiaz@educause.edu
Copyright Veronica Diaz, 2011. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.