

Reporting MiniSpec

  • Name of MiniSpec: I want to gather, analyze, display, and preserve portfolio data to support institutional, programmatic, course, and individual assessment processes.
    • MiniSpec to be based on report samples (UDel and IU) and reporting requirements from our institutions
    • The UDel sample report is from a department and may be more like a dashboard for creating a report. It includes a curriculum map, which we have not put into the Portfolio MiniSpecs.
  • What are our data sources?
    • We want to be able to generate custom reports within the system from the following data.
    • We want to be able to export the data to tools like SPSS and relational data environments.
    • We want to be able to export qualitative data, including all types of artifacts (student artifacts: uploaded files, linked files, snapshots of websites, and inline text for assignments, reflection, feedback, and evaluation; evaluator or reviewer artifacts: files with comments).
    • Most important is making the data available for export so that institutions can create the reports they need outside of Sakai; secondary is making the data available for display within Sakai.
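As a concrete sketch of the export goal above, rubric scores could be flattened into one row per rating so that tools like SPSS or R can read the file directly. All field names and sample values here are illustrative assumptions, not an agreed Sakai OAE format.

```python
import csv
import io

# Invented sample data: one row per rating event.
scores = [
    {"participant": "p01", "outcome": "writing", "level": "intro",
     "rater_role": "evaluator", "rating": 3},
    {"participant": "p01", "outcome": "writing", "level": "intro",
     "rater_role": "peer", "rating": 4},
]

def to_csv(rows):
    """Serialize a list of flat dicts to CSV text (header + data rows)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_csv(scores))
```

A flat file like this is the lowest common denominator: SPSS, Excel, and relational databases can all import it without custom tooling.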
  • Data desired includes:
    • Rubric-based scores from evaluators, participants, and peers
    • Artifacts provided by students, including uploaded files, linked files, snapshots of websites, and inline text for assignments, reflection, feedback, and evaluation
    • Status of work (evaluated, pending evaluation, etc.)
    • A list of courses, assignments, and artifacts that have been tagged or linked to specific learning outcomes
    • Associations of the above data with other data
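One way to picture the desired data and its associations is a small relational schema for the export target. Every table and column name below is a hypothetical illustration, not an existing Sakai convention.

```python
import sqlite3

# Hypothetical relational schema covering the data listed above:
# outcomes, artifacts (with status), rubric scores by rater role,
# and taggings that associate outcomes with courses/assignments/artifacts.
SCHEMA = """
CREATE TABLE outcome  (outcome_id TEXT PRIMARY KEY, title TEXT, level TEXT);
CREATE TABLE artifact (artifact_id TEXT PRIMARY KEY, participant_id TEXT,
                       kind TEXT,     -- uploaded_file, linked_file, snapshot, inline_text
                       status TEXT);  -- evaluated, pending_evaluation, ...
CREATE TABLE score    (artifact_id TEXT, outcome_id TEXT,
                       rater_role TEXT,  -- evaluator, participant, peer
                       rating REAL);
CREATE TABLE tagging  (outcome_id TEXT, target_kind TEXT, target_id TEXT);
"""

def build_export_db(path=":memory:"):
    """Create an empty export database with the sketch schema."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn

conn = build_export_db()
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['artifact', 'outcome', 'score', 'tagging']
```

The `tagging` table is what makes the "associations of the above data with other data" bullet concrete: it can link an outcome to a course, an assignment, or an individual artifact.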
  • What are the questions we are trying to answer with these reports?
    • What is the status of the participant and evaluator workflow? (How many have completed or are in process in a particular workflow? By outcome and by level of performance? Including percentages and counts of participants for each outcome and level)
    • Which courses or assignments in a program cover a specific learning outcome at a specific level? (Micro and macro views of curriculum mapping).
    • Each question should be answerable at both a summary and a detail level.
    • What artifacts (all inclusive) have been supplied by students in accordance with program or course requirements (by outcome, level, and participant)?
    • What artifacts (all inclusive) have been supplied by students that meet outcomes not prescribed by the institution or instructors (by outcome, level, and participant)?
    • For a specific outcome, what are the mean, median, and mode of rating scores (along with counts and standard deviations) for a specific population?
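The last question above maps directly onto standard descriptive statistics. A minimal sketch, using invented sample ratings for one outcome and population:

```python
import statistics

# Invented sample ratings for a single outcome and population.
ratings = [3, 4, 4, 2, 5, 4, 3]

summary = {
    "count": len(ratings),
    "mean": statistics.mean(ratings),
    "median": statistics.median(ratings),
    "mode": statistics.mode(ratings),
    "stdev": statistics.stdev(ratings),  # sample standard deviation
}
print(summary)
```

Note that `statistics.mode` raises an error on older Python versions if the data is multimodal; a real report would need a policy for ties (e.g., report all modes).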
  • How does a report differ from a dashboard?
    • There are two kinds of reporting: one concerned with administration of work and another with evaluation of work. The first kind overlaps with a dashboard.
    • The first kind of reporting should happen within Sakai; the second does not need to happen in Sakai but needs to be exported in a usable format.
    • There should be some basic reporting of evaluation data within Sakai and then much more data available to export from Sakai.
  • Is Sakai OAE on track for being able to export data?
    • The key is standard naming conventions.
    • A UI developer can create new fields on the fly in Sakai OAE; when it comes to portfolio use, it will be important to have standard naming conventions for the data we want to report.
    • What standard naming conventions can we propose now? We need to learn more about the XForms standard (introduced by Jacques Raynauld).
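To make the naming-convention idea concrete, here is one possible rule: field names take the form `namespace.entity.attribute`, lowercase with underscores. This pattern is a proposal for discussion, not an existing Sakai OAE or XForms rule.

```python
import re

# Hypothetical convention: exactly three lowercase, underscore-separated
# segments joined by dots, e.g. "portfolio.artifact.upload_date".
FIELD_NAME = re.compile(r"^[a-z][a-z0-9_]*(\.[a-z][a-z0-9_]*){2}$")

def is_valid_field_name(name):
    """Return True if a custom field name follows the proposed convention."""
    return bool(FIELD_NAME.match(name))

print(is_valid_field_name("portfolio.artifact.upload_date"))  # True
print(is_valid_field_name("Portfolio Artifact Date"))         # False
```

Enforcing a check like this when a UI developer creates a field on the fly would keep ad hoc fields machine-parsable when the data is later exported for reporting.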
  • Additional topics to consider in formulating the reporting MiniSpec:
    • What do we need to do within Sakai and what is OK to do outside of Sakai?
    • How do we intend to analyze the data?
    • How do we want to display the data? For what purposes and audiences?
    • What does it mean to preserve the data?
    • What else needs to be done to support the assessment process?
    • Distinguish between what is most important now and what needs a placeholder in the software for the future.