3. SERIOUS GAMES ASSESSMENT
Not many serious games include in-game evaluation.
Serious games with integrated assessment usually rely on
Q&A structures
… but games produce a lot of data that can be analyzed
for educational/assessment purposes.
4. LEARNING ANALYTICS + VIDEOGAMES
Web Analytics
Business Intelligence
Game Analytics
5. WHAT DATA CAN WE TRACK?
INTERACTION TRACES (low-level events):
GUI interactions, mouse clicks, keystrokes, joystick movements
LOGIC TRACES (high-level events):
Player score, phase changes, completed missions, player "deaths"
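The two trace families above can share one record format. A minimal Python sketch (the `TraceEvent` schema and field names are assumptions for illustration, not part of any standard):

```python
from dataclasses import dataclass, field
import time

@dataclass
class TraceEvent:
    """One trace record; 'level' separates interaction from logic traces."""
    level: str        # "interaction" (low-level) or "logic" (high-level)
    name: str         # e.g. "mouse_click" or "mission_completed"
    data: dict = field(default_factory=dict)
    timestamp: float = field(default_factory=time.time)

# Low-level interaction trace: a raw mouse click with screen coordinates.
click = TraceEvent("interaction", "mouse_click", {"x": 120, "y": 45})

# High-level logic trace: a completed mission with its score.
mission = TraceEvent("logic", "mission_completed", {"mission_id": "m3", "score": 87})
```

Keeping both kinds of event in the same stream lets later analysis correlate raw interaction with game logic.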
8. START GAME
Whenever a student launches the game
How many students played the game, who they were and
when they played.
9. END GAME
Whenever a student successfully completes the game.
Who accomplished the goals established for the game
Was the optimal goal attained?
10. QUIT GAME
Whenever a student quits the game, before finishing it
Who abandoned the game before finishing it and, with
the appropriate context, where they quit.
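Slides 8-10 describe three session-level events. A minimal sketch of a tracker that records them with enough context to answer those questions (the `SessionTracker` class and its method names are hypothetical):

```python
import time

class SessionTracker:
    """Records start/end/quit events: who played, when, who finished,
    and where students abandoned the game."""
    def __init__(self):
        self.events = []

    def _log(self, kind, student, **context):
        self.events.append({"kind": kind, "student": student,
                            "time": time.time(), **context})

    def start_game(self, student):
        self._log("start", student)

    def end_game(self, student, goals_met):
        self._log("end", student, goals_met=goals_met)

    def quit_game(self, student, phase):
        # The current phase tells us *where* the student quit.
        self._log("quit", student, phase=phase)

tracker = SessionTracker()
tracker.start_game("alice")
tracker.start_game("bob")
tracker.end_game("alice", goals_met=True)
tracker.quit_game("bob", phase="phase-2")

players = {e["student"] for e in tracker.events if e["kind"] == "start"}
quitters = [(e["student"], e["phase"]) for e in tracker.events if e["kind"] == "quit"]
# players -> {"alice", "bob"}; quitters -> [("bob", "phase-2")]
```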
11. PHASE CHANGES
In an educational videogame, these phases can mark
several educational sub-goals.
Identify the most time-consuming phases
Understand how each part of the game is being accessed
(if the phase exploration sequence is not linear)
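Given timestamped phase-change events, the time spent per phase falls out directly. A sketch under assumed data (the timestamps and phase names below are made up for illustration):

```python
# Phase-change events as (timestamp_seconds, phase_entered) pairs,
# in the order the student entered each phase.
phase_changes = [(0, "intro"), (30, "tutorial"), (150, "mission-1"), (600, "mission-2")]
session_end = 700  # timestamp when the session ended

# Duration of each phase = time until the next phase change (or session end).
durations = {}
for (t, phase), (t_next, _) in zip(phase_changes, phase_changes[1:] + [(session_end, None)]):
    durations[phase] = durations.get(phase, 0) + (t_next - t)

most_consuming = max(durations, key=durations.get)
```

The same event list, ordered rather than aggregated, also yields the phase exploration sequence when it is not linear.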
12. SIGNIFICANT VARIABLES
Games rely on variables to represent their state
Some of those variables can be relevant for the
assessment
Track when and with which values these variables are
updated
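One way to track when and with which values a state variable is updated is to wrap it so every assignment is logged. A minimal sketch (the `TrackedVariable` wrapper is an assumption, not a prescribed mechanism):

```python
import time

class TrackedVariable:
    """Wraps a game-state variable and logs every update with its new value."""
    def __init__(self, name, value, log):
        self.name, self._value, self.log = name, value, log

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        # Record (when, which variable, new value) before applying the update.
        self.log.append((time.time(), self.name, new_value))
        self._value = new_value

log = []
score = TrackedVariable("score", 0, log)
score.value = 10
score.value = 25
history = [(name, v) for _, name, v in log]
# history -> [("score", 10), ("score", 25)]
```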
13. USER INTERACTION
Raw user interaction (mouse clicks, screen touches, keys
pressed…) can be used to retrieve some useful
information
14. SOME REQUIREMENTS
Most games are black boxes.
No access to what is going on during game play.
We need access to game “guts”
Or… the game must communicate with the outside world,
using some logging framework
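Communicating with the outside world can be as simple as emitting one JSON record per event to a stream. A sketch of such a logging function (the event schema is an assumption; in a real deployment the stream would be a file or network client):

```python
import io
import json
import time

def make_logger(stream):
    """Return a function that writes one JSON line per game event."""
    def log_event(name, **payload):
        record = {"event": name, "time": time.time(), **payload}
        stream.write(json.dumps(record) + "\n")
    return log_event

# In-memory buffer for illustration; a file or HTTP client works the same way.
buffer = io.StringIO()
log_event = make_logger(buffer)
log_event("start_game", student="alice")
log_event("phase_change", student="alice", phase="tutorial")

lines = buffer.getvalue().strip().split("\n")
# Each line is an independent JSON record an analytics backend can parse.
```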
20. GAME DESIGN FOR LEARNING ANALYTICS
1. What do we exactly want to teach in the game?
Is it a skill, or a concept?
2. Can we measure if the students learned it?
3. If so, how are we going to measure it?
Direct methods: Q&A
Indirect methods: accomplish a mission that requires
the skill/concept
24. CAN WE RELATE THESE RESULTS WITH
TRACES COLLECTED IN THE GAME?
In-game metrics: correct and incorrect XML documents
Students with the best performance in post-tests had fewer
incorrect documents
They also sent fewer documents overall
25. CONCLUSIONS
Our "universal" traces model is useful to extract
web-analytics-style statistics.
Game design is key to enable effective learning
analytics
We need more concrete analysis tools to measure the
learning process
Simple approaches to measure simple things
Generalization and abstraction