Multiple strategies are being employed for reporting on portfolio data at various institutions. There is some interest in describing them in one place, to observe where there is commonality or difference between them and to improve general awareness of what is working in practice. Anyone who has specific reporting needs or is doing specific work on reporting should feel free to add an account of their project here.

These may end up on multiple pages, depending on the amount of detail included. Initially, some basic information about what kind of reports are needed and the general approach being employed would be helpful. Ideally, we will have a clear picture of the types of data being used, how it is being extracted, what summary or analysis is applied, and the tools in use.

Indiana University

Standardized evaluation form elements for generating summary and detailed reports on assessment results.

See: Solving the OSP Reporting Conundrum (Presentation from Boston 2009 Meeting)

University of Michigan

Custom online extraction, XSL, and Javascript
See: Form Data Extraction and Summary Data Presentation | SAK-13476
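The general shape of this approach (extract leaf values from form XML, then render them as a summary for presentation) can be sketched in a few lines. This is only an illustration using the Python standard library in place of an actual XSL stylesheet; the element names and HTML layout are assumptions, not the SAK-13476 implementation:

```python
import xml.etree.ElementTree as ET

def form_xml_to_html_table(xml_text):
    """Render the leaf fields of a form XML document as a simple HTML table.

    Illustrative only: a real XSL-based pipeline would express this same
    transform as an XSLT stylesheet applied to the stored form XML.
    """
    root = ET.fromstring(xml_text)
    rows = "".join(
        "<tr><td>%s</td><td>%s</td></tr>" % (elem.tag, elem.text.strip())
        for elem in root.iter()
        if len(elem) == 0 and elem.text and elem.text.strip()
    )
    return "<table>%s</table>" % rows

if __name__ == "__main__":
    # Hypothetical evaluation-form instance, not a real OSP schema.
    sample = "<evaluation><rating>4</rating><comments>Strong work</comments></evaluation>"
    print(form_xml_to_html_table(sample))
```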

Pentaho Sakai Integration


How about being able to report on your form data (evaluation forms, rubrics, feedback, etc.) using any standard off-the-shelf SQL-based reporting product?

Normally, all data that users enter into forms is stored as XML files in Resources (usually on the file system). This makes it unavailable to most off-the-shelf reporting tools (and difficult even for the Sakai reports tool). A new tool is being developed that will piggyback on the existing data warehousing infrastructure to pull data out of those XML files and store it in database tables, freeing the data for reporting with a variety of off-the-shelf tools (examples include Crystal Reports, Hyperion, Cognos, or the Sakai reports tool).
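The core idea (flatten each stored form's XML into rows in a database table that any SQL reporting tool can query) can be sketched as follows. This is a minimal illustration, not the Pentaho integration itself: the leaf-element flattening, the entity-attribute-value table layout, and the use of SQLite are all assumptions made for the example.

```python
import sqlite3
import xml.etree.ElementTree as ET

def flatten_form_xml(xml_text):
    """Return (field_name, value) pairs for every leaf element in a form XML document."""
    root = ET.fromstring(xml_text)
    return [
        (elem.tag, elem.text.strip())
        for elem in root.iter()
        if len(elem) == 0 and elem.text and elem.text.strip()
    ]

def load_into_warehouse(conn, form_id, xml_text):
    """Store one form instance as rows in a simple entity-attribute-value table.

    A real warehousing job would walk the Resources content store and run
    this for every stored form file; here we load a single document.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS form_data ("
        " form_id TEXT, field_name TEXT, field_value TEXT)"
    )
    conn.executemany(
        "INSERT INTO form_data VALUES (?, ?, ?)",
        [(form_id, name, value) for name, value in flatten_form_xml(xml_text)],
    )
    conn.commit()

if __name__ == "__main__":
    # Hypothetical evaluation-form instance, not a real OSP schema.
    sample = "<evaluation><rating>4</rating><comments>Strong reflection</comments></evaluation>"
    conn = sqlite3.connect(":memory:")
    load_into_warehouse(conn, "eval-001", sample)
    for row in conn.execute("SELECT * FROM form_data ORDER BY field_name"):
        print(row)
```

Once the data is in ordinary tables like this, any SQL-capable reporting product can aggregate across forms (counts, averages per rubric criterion, and so on) without touching the XML.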

See also:

Virginia Tech

I'm happy to start the conversation about how we might begin assessing reporting needs and the multiple tools that are currently under development, and I can speak specifically from Virginia Tech's perspective.  We currently have 55 projects (using a combination of matrix and/or presentation tools) spanning the entire university, and this number is growing rapidly.  Some of our faculty are perfectly happy using matrices as a central place for electronically storing student materials and reflections on those materials.  Matrices offer a useful space for organizing and maintaining materials, as well as for guiding students to reflect on their learning over time. Most of these faculty have to read large quantities of student submissions; they typically download a random sample of these materials and perform their assessment, and the consequent data analysis, outside of Sakai.

However, we have a growing need for faculty and administrators to pull information and data out of these matrices.  The following is a list of some of the requests we hear most frequently from our faculty and administrators, along with some of the specific eP needs that Marc and I have identified:

I realize some of these issues/requests are probably addressed in Sakai 2.6/2.7, but we are still running a version of 2.5 and will probably be on that instance for quite some time. That said, we would like to work with folks to implement these tools in our instance.

I wonder if it would be useful for other institutions to join in creating a list of the behaviors or functions that are needed; then perhaps we could compare the three reporting strategies listed here to see how each meets the different functional needs.

I hope this is a helpful way to get this conversation started.