Next Practices for OER Quality Evaluation | Lisa Petrides


Keynote at the Learning Analytics and Knowledge conference (LAK 2013), Leuven, Belgium - for the Learning Object Analytics for Collections, Repositories & Federations workshop, by Lisa Petrides, entitled "Next Practices for OER Quality Evaluation: Using Analytics to Support Continuous Improvement"


CC Attribution-ShareAlike License


    Presentation Transcript

    • Next Practices for OER Quality Evaluation: Using Analytics to Support Continuous Improvement
      LAK 2013 - Learning Object Analytics for Collections, Repositories & Federations
      April 9, 2013
      Lisa Petrides, Ph.D., ISKME
    • ISKME (Institute for the Study of Knowledge Management in Education)
      Research, tools and services to advance teaching and learning
      • Study (research)
      • Open (open knowledge networks)
      • Build (training and design)
    • OER Commons.org
    • Open Author
    • Key drivers and next practices
      Key drivers:
      • Uncertainty around implementation of new learning standards
      • Decreases in education funding
      • New learning standards
      • Increased demand for analytics related to the use of online resources
      Next practices:
      • OER evaluation tools
      • Custom analytics
    • Analytics – For What Purpose
      From resource discovery to improved teaching and learning
      Key questions:
      • Resource discovery: How can we support resource discovery through shared metadata and paradata standards? (Learning Registry; NSDL schema for paradata exchange)
      • Technology and resource improvements: What resource usage patterns can and should be tracked and shared? How can paradata support resource and technology improvements?
      • Which resources are learners spending time on? How do usage patterns map to assessment outcomes? (site-specific initiatives*)
      • Curriculum improvements: Are resources meeting learning standards? If yes, how? If no, why not? What factors make resources reusable by teachers/learners? What makes an exemplary resource exemplary? (ISKME – OER Commons)
      Target outcomes: enhanced teaching and learning practices; curriculum improvements
      *Examples include: Open High School Utah, Carnegie Mellon OLI, edX and others
    • OER Quality Evaluation
      EQuIP tool for evaluating resources on alignment to state standards
      Rubric dimensions:
      1. Alignment to the depth of the CCSS (Common Core State Standards)
      2. Key shifts in the CCSS
      3. Instructional supports
      4. Assessment
      5. Overall rating for the lesson/unit
    • OER Quality Evaluation
      Achieve tool for evaluating resources on quality dimensions
      Rubric dimensions:
      1. Quality of explanation of the subject matter
      2. Utility of the materials designed to support teaching
      3. Quality of assessments
      4. Quality of technological interactivity
      5. Quality of instructional and practice exercises
      6. Opportunities for deeper learning
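A rubric like the one above lends itself to a simple aggregation step: score each dimension, then roll the scores up into an overall rating. The sketch below is a hypothetical illustration only; the dimension keys mirror the Achieve rubric named above, but the 0-3 scale and the averaging rule are assumptions, not ISKME's actual scoring method.

```python
# Hypothetical sketch: rolling per-dimension rubric scores into an overall
# rating. Dimension names follow the Achieve rubric above; the 0-3 scale
# and simple average are assumptions for illustration.
DIMENSIONS = [
    "explanation_quality",          # 1. explanation of the subject matter
    "teaching_utility",             # 2. utility of materials for teaching
    "assessment_quality",           # 3. quality of assessments
    "technological_interactivity",  # 4. technological interactivity
    "exercise_quality",             # 5. instructional and practice exercises
    "deeper_learning",              # 6. opportunities for deeper learning
]

def overall_rating(scores: dict) -> float:
    """Average the per-dimension scores into one overall rating."""
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"missing dimension scores: {missing}")
    return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)

ratings = {d: 2 for d in DIMENSIONS}
ratings["exercise_quality"] = 3
print(round(overall_rating(ratings), 2))  # 2.17
```

A weighted average (e.g. emphasizing dimension 5, which the closing slide singles out) would be a natural variation on the same structure.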
    • Analytics Use Case
      Supporting teacher professional development around finding, creating, evaluating and aligning resources
      Users: project leaders, district administrators, and state curriculum developers working with teachers to identify quality resources that are aligned to learning standards
      Key questions the analytics help to answer:
      • Are my teachers finding the resources they need?
      • Are they reaching our district's goals for identifying and evaluating resources?
      • What activities do teachers need more support in?
      • Where should I focus my professional development efforts with my teachers?
      • Are teachers able to see what dimensions of a resource need to be improved for it to be considered exemplary?
    • Analytics Use Case
      Supporting improvements on learning resources
      Key questions the analytics help to answer (for ISKME):
      • What distinguishes resources with high ratings from those with low ratings?
      • What is it that makes a resource exemplary?
      • What factors contribute to the use and reuse of resources by teachers?
      • How can we encourage the creation of high quality resources through our tools and supports?
    • Example Dashboard View
      Resources by evaluation scores, e.g. quality of explanation of subject matter and quality of technological interactivity
    • Example Dashboard View
      Evaluation activities by user (Michael Sander, Jessalyn Katona, Marta Levy, William Donovan, Avery Mitchell, Sam Olsson, Chris Senges), indicating whether goals for evaluating (or tagging) resources have been met
    • Example Report
      User comments on evaluated resources
      • All qualitative comments can be exported to a CSV file for content analysis
      • Comments can provide insight into needed improvements to the resource, what is good about the resource, and ways the resource can be used in the classroom
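The CSV export described above can be sketched in a few lines. This is a hypothetical illustration of the kind of export the slide refers to; the field names, filename, and in-memory record structure are assumptions, not ISKME's actual format.

```python
# Hypothetical sketch: exporting evaluator comments on resources to a CSV
# file for later content analysis. Field names and sample records are
# assumptions for illustration.
import csv

comments = [
    {"resource_id": "r-101", "evaluator": "evaluator-a",
     "comment": "Good scaffolding, but the practice set needs answer keys."},
    {"resource_id": "r-102", "evaluator": "evaluator-b",
     "comment": "Worked well as a warm-up activity in my classroom."},
]

with open("evaluation_comments.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["resource_id", "evaluator", "comment"])
    writer.writeheader()
    writer.writerows(comments)
```

Once the comments are in CSV form, they can be loaded into any qualitative coding or text-analysis tool for the content analysis the slide mentions.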
    • Next Phase Custom Analytics
      Examples of additional data we are collecting through Open Author, and what each is an indicator of:
      • # of subheadings by resource: whether resources can be broken into smaller parts (how "modular" is the resource collection?)
      • # of external URLs by resource: whether resources are being combined with other resources (how "remixable" are the resources?)
      • # of versions of a resource by the original author, and # of versions by others: how many derivatives are being made of the resources, and by whom (are resources in the collection adaptable?)
      • Reasons provided by users for changing an existing resource: why and how resources are changed (what makes a resource adaptable?)
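The first two indicators in the list above (subheading and external-URL counts) can be computed directly from a resource's markup. The sketch below is an assumption-laden illustration: it supposes resources are stored as HTML and uses simple regular expressions, which is not necessarily how Open Author extracts these counts.

```python
# Hypothetical sketch: computing two of the Open Author indicators listed
# above from a resource's HTML body. Assumes HTML-stored resources;
# the regex-based extraction is an illustration, not Open Author's method.
import re

def indicators(html: str) -> dict:
    # Subheadings (h2-h6) suggest the resource can be split into parts
    # ("modularity"); external links suggest remixing of other resources.
    subheadings = len(re.findall(r"<h[2-6][\s>]", html, flags=re.I))
    external_urls = len(re.findall(r'href="https?://', html, flags=re.I))
    return {"subheadings": subheadings, "external_urls": external_urls}

body = '<h2>Intro</h2><p>See <a href="http://example.org">this</a></p><h2>Lab</h2>'
print(indicators(body))  # {'subheadings': 2, 'external_urls': 1}
```

The version counts and user-supplied change reasons would come from the authoring platform's revision history rather than from the resource body itself.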
    • What This All Means
      Continuous improvement of resources toward enhanced learning
      If one of our hypotheses is correct, that resources with the highest overall quality rating on our Achieve rubric are also found to have:
      • The highest rating on dimension 5: quality of instructional and practice exercises
      • More subheadings than other resources (more modular)
      • More external URLs than other resources (more remixable)
      • More versions created (more reusable)
      This could lead to:
      • ISKME builds prompts into Open Author to encourage the creation of resources that have these components
      • This leads to the creation of new resources that potentially better meet learning standards and teaching needs
      • The newly created resources are then analyzed through the analytics
      • This creates a continuous cycle of resource and tool enhancement, towards improved teaching and learning
    • Lisa Petrides, President
      Email: lisa@iskme.org
      Twitter: @lpetrides
      Institute for the Study of Knowledge Management in Education
      Half Moon Bay, California