The Pragmatic Evaluation of Tool System Interoperability

A. de Moor (2007). The Pragmatic Evaluation of Tool System Interoperability (invited paper). In Proc. of the 2nd ICCS Conceptual Structures Tool Interoperability Workshop (CS-TIW 2007), Sheffield, UK, July 22, 2007. Research Press International, Bristol, UK, pp.1-19.


  1. The Pragmatic Evaluation of Tool System Interoperability. Aldo de Moor, CommunitySense. CS-TIW, July 2007
  2. Once upon a time...
  3. Those days are gone!
  4. Tool systems
     - Tool system: the set of integrated and customized information and communication tools tailored to the specific information, communication, and coordination requirements of a collaborative community
     - No standard prescriptions
     - Communities need to evaluate the functionalities in their unique context of use
  5. Technical comparison is not enough
  6. The "orchestra metaphor"
     - How to create a well-tuned orchestra of tools able to perform a magnificent symphony?
     - Go beyond the technical abilities of the individual tools
     - Practice, by trial and error, leads to synergy and alignment
     - Where is the conductor?
  7. Tool system interoperability
     - How to assess the interoperability of a tool system in a particular usage context?
     - Interoperability:
       - The need to make heterogeneous information systems work in the networked world (Vetere & Lenzerini)
       - The ongoing process of ensuring that the systems, procedures, and culture of an organisation are managed in such a way as to maximise opportunities for exchange and re-use of information, whether internally or externally (Miller)
  8. Pragmatic evaluation
     - Much research focuses on syntactic and semantic interoperability
       - E.g. the UDDI standard (Universal Description, Discovery and Integration): rules for building service directories and facilitating top-down querying
     - Pragmatic interoperability?
       - Link standards to the context-dependent needs of user communities
       - How?
  9. Questions
     - How to conceptualize the usage context in tool system interoperability evaluation?
     - What would an evaluation procedure look like?
     - How would such a procedure influence design choices?
  10. Goals
     - Construct a minimal conceptual model of pragmatic evaluation methods for tool system interoperability
     - The model can be used for developing whole classes of methods specifically tailored to communities
     - Make pragmatics explicit
     - Find common ground for pragmati(ci)sts, tool builders, and designers
  11. Case: co-authoring a call for papers
     - 2006 International Pragmatic Web conference
     - Three co-chairs in different countries
     - Wrote the call for papers by e-mailing Word files around
     - PragWeb was a new paradigm: confusion abounded, no convergence
     - Co-evolution of requirements led to a satisfactory tool system solution
  12. Co-authoring tool system v1 [diagram: Authors 1-3 e-mailing each other their own document versions]
  13. Co-authoring tool system v2 [diagram: Authors 1-3 e-mailing versions, now also to the Conference]
  14. Co-authoring tool system v3 [diagram: Author 3/Editor maintains a version-in-progress; Authors 1-2 and the Conference contribute agreed lines, (modified) paragraphs, and chat]
  15. A conceptual model of the tool system
     - Functionality: a set of functions and their specified properties that satisfy stated or implied needs (SEI)
     - Different levels of granularity: systems, tools, modules, functions
     - Interfaces, information objects, information/communication processes
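The granularity levels on this slide (system > tool > module > function) can be sketched as a nested data structure. This is a minimal illustrative sketch; the class names and the example co-authoring entries are assumptions, not taken from the paper.

```python
# Sketch of the tool system granularity hierarchy: systems contain tools,
# tools contain modules, modules contain functions.
from dataclasses import dataclass, field

@dataclass
class Function:
    name: str

@dataclass
class Module:
    name: str
    functions: list = field(default_factory=list)

@dataclass
class Tool:
    name: str
    modules: list = field(default_factory=list)

@dataclass
class ToolSystem:
    name: str
    tools: list = field(default_factory=list)

    def all_functions(self):
        """Flatten the hierarchy down to the individual functions."""
        return [f for t in self.tools for m in t.modules for f in m.functions]

# Hypothetical co-authoring tool system in the spirit of the case study.
system = ToolSystem("co-authoring", tools=[
    Tool("e-mail", modules=[
        Module("messaging", functions=[Function("send attachment")])]),
    Tool("word processor", modules=[
        Module("editing", functions=[Function("track changes")])]),
])
print([f.name for f in system.all_functions()])  # ['send attachment', 'track changes']
```

Evaluation can then be attached at whichever level of the hierarchy a community cares about: whole tools for a quick scan, individual functions for a fine-grained assessment.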
  16. Example
  17. A conceptual model of the usage context
     - (De Moor, 2005): Patterns for the Pragmatic Web
     - Pragmatic context = common context + set of individual contexts
       - Concepts, definitions, communicative interactions, context parameters
       - Focus on the meaning negotiation process
     - Current focus: the pragmatic patterns themselves
  18. Usage context: goals
     - Goals: activities and aspects
       - Give a sense of purpose, drive people and processes, provide evaluation criteria
     - Activities
       - Operationalized goals: processes with a concrete deliverable as outcome
       - E.g. writing a call for papers, making a group assignment
       - High-level workflows; we are interested in potential functionalities, not implementation details
     - Aspects
       - Abstract goals cutting across processes and structures
       - E.g. security, interactivity, effectiveness
  19. Usage context: actors
     - "The user" does not exist
     - Many stakeholders, with their own needs, interests, and goals
     - Actor roles are increasingly important
       - Responsibilities in workflows
       - Access to functionalities and information resources
       - E.g. the Role-Based Access Control paradigm
  20. Actor role typologies
     - Currently mostly technology-focused: Administrator, Facilitator, Member, ...
     - Need to become much more contextualized: customized responsibilities and access rights
     - Examples
       - Workflow-based: Author, Reviewer, Editor, ...
       - Organization-based: Secretary, Manager, Team Leader, ...
       - Domain-specific: Env. Protection Agency, Corporation, NGO, ...
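The contextualized, workflow-based roles above can be made concrete in the Role-Based Access Control style the previous slide mentions. This is a hedged sketch only: the role names echo the slide's examples, but every permission string below is an illustrative assumption.

```python
# Minimal RBAC-style sketch: workflow roles mapped to the tool system
# functionalities they may access. All permission names are hypothetical.
role_permissions = {
    "Author":   {"edit paragraph", "submit version"},
    "Reviewer": {"read version", "comment"},
    "Editor":   {"edit paragraph", "read version", "merge agreed lines"},
}

def may(role: str, functionality: str) -> bool:
    """Check whether an actor role has access to a functionality."""
    return functionality in role_permissions.get(role, set())

print(may("Reviewer", "edit paragraph"))   # False: reviewers only read and comment
print(may("Editor", "merge agreed lines")) # True
```

The point of contextualization is that such a table is not fixed by the technology: each community redefines the roles and their rights around its own workflows.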
  21. Usage context: domain
     - Major influence on evaluation processes and tool system functionalities
     - Still ill-understood
     - Determinants
       - Structure and size: e.g. distributed, centralized, small, large
       - Setting: academic, corporate, governmental, non-governmental
       - Financial: resources for customization, or off-the-shelf software only?
       - Political: are certain software choices mandatory or prohibited?
  22. The pragmatic evaluation process
  23. The scoring process
     - The main process, in which stakeholders reflect on the role of functionalities in a complex usage context
     - Many ways to do so
       - E.g. Bedell's method for evaluating IT functionality effectiveness
       - Score functionalities on effectiveness and importance for activities
       - Problem: complex, time-consuming, many levels of aggregation
  24. A practical method for courseware evaluation
     - Questions
       - How well are the various activities supported by the various functionalities?
       - How effectively are the various functionality components used?
     - Goal scores and functionality scores
     - Users in their actor roles provide, interpret, and use scores in decision making
     - Context: courseware evaluation
       - Actors: students, software manager
       - Tool system level: module
  25. Goal and functionality scores
     - Elements
       - I(g) = importance of a goal
       - I(f,g) = importance of a functionality in supporting a goal
       - Q(f,g) = quality of a functionality in supporting a goal
     - G-Score(g) = Σ_i I(f_i, g) * Q(f_i, g), summed over all functionalities f_i
     - F-Score(f) = Σ_j I(g_j) * I(f, g_j) * Q(f, g_j), summed over all goals g_j
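The two score formulas above can be computed directly once stakeholders have supplied the I and Q ratings. In this sketch the dictionaries of example ratings (two goals, two functionalities, scores on a small integer scale) are invented for illustration; in the method they would come from the questionnaires filled in by actors in their roles.

```python
# G-Score and F-Score as defined on the slide, over hypothetical ratings.
goal_importance = {"draft": 3, "discuss": 2}           # I(g)
func_importance = {                                    # I(f, g)
    ("wiki", "draft"): 3, ("wiki", "discuss"): 1,
    ("chat", "draft"): 1, ("chat", "discuss"): 3,
}
func_quality = {                                       # Q(f, g)
    ("wiki", "draft"): 2, ("wiki", "discuss"): 2,
    ("chat", "draft"): 1, ("chat", "discuss"): 3,
}

def g_score(g, functionalities):
    # G-Score(g) = sum over all f_i of I(f_i, g) * Q(f_i, g)
    return sum(func_importance[f, g] * func_quality[f, g] for f in functionalities)

def f_score(f, goals):
    # F-Score(f) = sum over all g_j of I(g_j) * I(f, g_j) * Q(f, g_j)
    return sum(goal_importance[g] * func_importance[f, g] * func_quality[f, g]
               for g in goals)

print(g_score("draft", ["wiki", "chat"]))     # 3*2 + 1*1 = 7
print(f_score("wiki", ["draft", "discuss"]))  # 3*3*2 + 2*1*2 = 22
```

A high G-Score flags goals that are well supported by the current tool system; a low F-Score flags functionalities that contribute little across all goals and are candidates for replacement.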
  26. Experiment: group assignments
     - Two courseware tools: Blackboard, CourseFlow
     - Goal: making group assignments
     - Four activities, 11 functionality modules
     - Actors: 2nd-year Information Management students, software manager
       - 2002: 62 students, 16 groups
       - 2003: 46 students, 12 groups
     - Questions
       - Quality of the tools for the various group assignment activities?
       - Usefulness of the various functionality modules?
  27. Activity scores
  28. Functionality scores
  29. Evaluation ++
     - More advanced goal concepts, e.g. maintainability
       - However, there is a tradeoff between methodological power and ease of use!
     - Link to existing activity and quality aspect frameworks
       - Activities: e.g. BPMN, workflow patterns
       - Aspects: IS quality frameworks, e.g. Delen and Rijsenbrij, DeLone and McLean
     - Link to existing evaluation methods from the IS quality literature
     - Contrast evaluations by different actors
       - Students have different interests from the lecturer!
       - Build on techniques for multi-stakeholder dialogues
     - Better balance informal and formal approaches: hermeneutic approaches meet conceptual structures
     - Link to applied pragmatic philosophy in IS development, e.g.
       - Testbed development methodologies (Keeler and Pfeiffer)
       - Trikonic architectonic (Richmond)
       - Active knowledge systems (Delugach)
       - Goal-oriented transaction modeling (Polovina et al.)
  30. Evaluating virtual worlds
  31. Conclusions
     - Functionality selection means balancing collaborative community requirements with an interoperable tool system
     - Pragmatic evaluation of tool system interoperability: socio-technical evolution of both the tool system and the usage context
     - A conceptual framework for constructing and comparing pragmatic evaluation methods
     - Fundamental problem
       - Infinite variety of usage contexts
       - Balance needed between formal and informal interpretation
     - Conceptual structures tools could be the missing link between the human capacity to interpret context and the computational power to analyze patterns
     - Indispensable for the continuously evolving, context-sensitive collaboration systems of the future
