
Return to Portfolio Reports page

Draft.

MiniSpec title: I want to gather, analyze, display, and preserve portfolio data to support institutional, programmatic, course, and individual assessment processes.

Status

Draft

Description

A MiniSpec to describe needs for reporting on portfolio data

Learning Capabilities Design Lenses and Facets

Assessment and Evaluation: Reporting
Learning Activities: Portfolio Processes
Learning and Teaching Management: Portfolio Design

Author(s)

Janice A. Smith

Endorser(s)

Portfolio Visioning Group

User goal summary:

I want to gather, analyze, display, and preserve portfolio data to support institutional, programmatic, course, and individual assessment processes. Creating a portfolio implies the development of skills and accomplishments over time and within the context of a course, a program, a degree, or a lifetime of learning. I want to track my individual development within any of these contexts. The faculty, programs, and institutions that support and guide my learning want to assess their success in helping me learn within their respective contexts.

Non goals:

  • Reports are not dashboards. They are much more specialized and relate to tasks and processes from the point of view of different roles. Dashboards for instructors would include summary data and a dynamic way to work with student submissions and evaluation processes. UC Berkeley is creating a student portal for Sakai OAE that includes a dashboard that is not specific to portfolios.
  • Reports are not intended to replace a detailed statistical analysis tool.
  • Reports are not gradebooks that submit grades to the student information system.
  • No report can be a catch-all for all desirable data. Reports must specify exactly what data is desired.

Terminology:

  • Assessment: The measurement of learning and the success of instructors, courses, programs, and institutions in supporting and guiding learning.
  • Evaluation: A summative judgment of the degree to which evidence of learning demonstrates that a particular criterion has been achieved.
  • Feedback: Formative guidance offered by reviewers in relation to specific evidence of learning.
  • Status: The degree to which a learning process is complete.
  • Rubric: One or more criteria measured against two or more performance levels, with descriptors of the desired performance for each criterion at each level.
  • Score: A predetermined calculation involving one or more levels of performance for one or more criteria.
  • Grade: An evaluation of learning based on the performance of a student in a course, which may or may not be related to the comparative performance of other students in the course.
  • Artifact: Digital evidence of learning conveyed through any appropriate medium.
  • Reflection: Learner thoughts on the meaning, history, and success of learning artifacts.
  • Comments: Textual responses to evidence of learning for formative or summative purposes.

Persona-based user stories:

  • Fatik, first year undeclared student: Fatik was introduced to the concept of portfolios in his First Year Experience class. As part of the class, he has been providing evidence of his learning, along with his reflections, in relation to the learning outcomes of the institution. He would like to view summary data of his instructor's evaluation of those artifacts and reflections.
  • Courtney, 4th year industrial engineering student: Courtney has been uploading multimedia artifacts into her portfolio and answering reflective questions on those artifacts for four years. She has recently been assigned to create a presentation portfolio for her capstone class. She would like to run a report on her artifacts and their evaluation by instructors, peers, and herself across all four years in order to determine which artifacts would be most appropriate to include in her portfolio.
  • Patrice, Writing Lecturer: As a general education instructor, he started using portfolios in his program last year. He believes in providing useful feedback and wants to do a quality job. He wants to track his own performance as an evaluator and to compare his results with those of his peers.
  • Girish, Tenure Track Engineering Professor: Girish has been working with other Engineering faculty to map the current curriculum to ABET outcomes and the institutional general education outcomes. After explaining to his students what artifacts (products of assignments) should be added to the portfolio and how these artifacts address the program's outcomes, he uses rubrics within the portfolio system to evaluate students' performance on the artifacts in relation to the ABET and general education outcomes. He wants to track his own performance as an evaluator to better understand how his students performed as a group and how his evaluative judgments compare to those of his peers.
  • Mary Kate, Instructional Designer: Mary Kate has organized each department's program outcomes and identified which courses or experiences the faculty will use to demonstrate competency for each outcome. She has installed the rubrics for faculty to use in evaluating student learning. Mary Kate monitors the progress of faculty in completing the evaluations and uses the portfolio system to report their findings to deans and department chairs.
  • Anderson, Associate Dean: Dean Anderson is getting ready for an accreditation visit and needs to determine which assessment data to pull together from each engineering program for the accreditation team. He asks the institutional assessment coordinator to run reports on the portfolio data he requires. He needs portfolio data that has been rolled up across programs, drilled down to specific programs and courses, and broken out by individual student.
  • Garrett, Institutional Assessment Coordinator: Garrett is assembling portfolio data for the Engineering accreditation report. He needs to run a portfolio data report on the evaluation of all Engineering students, as well as on students in each program and in selected courses. He also needs to use a random process to identify learning artifacts submitted for portfolio work in relation to each learning outcome. He wants to randomly select one student who was highly rated and one who received a satisfactory evaluation in relation to each learning outcome. He will use this evidence as a way to illustrate the range of student performance in each of the Engineering programs.

Use cases:

  • As a student, I need to:
    • Track my performance in relation to the evaluation of my portfolio artifacts and reflections.
  • As an instructor, I need to:
    • Track the performance of my students in relation to the evaluation of their portfolio artifacts and reflections.
  • As an evaluator or provider of feedback, I need to:
    • Track my evaluation or feedback in relation to student artifacts and reflections.
    • Calibrate evaluation or feedback in relation to student artifacts and reflections with that of other providers of feedback.
  • As a system administrator, portfolio administrator, assessment coordinator, department administrator, or institutional administrator, I need to:
    • Generate custom reports to display the following data (a minimal sketch of one such report follows this list):
      • Evaluation and/or feedback data for specific students in relation to specific artifacts.
      • Evaluation and/or feedback data for specific students in relation to all of that student's artifacts in a course, program, or institution.
      • Evaluation and/or feedback data provided by specific evaluators or providers of feedback in relation to students in a course, program, or institution.
      • Evaluation and/or feedback data for all students in a course, program, or institution.
      • Evaluation and/or feedback data for students with one or more demographic traits (gender, ethnicity, year in school) within a course, program, or institution.
      • Status of learners (individually or grouped by course, program or institution) in completing a learning process.
      • Status of evaluators or providers of feedback (individually or grouped by course, program, or institution) in completing an evaluation or feedback process.
    • Generate custom reports that allow simple statistical analysis of the data (percentages, mean, median, mode).
    • Export the data to tools like SPSS, relational data environments, and/or business reporting engines.
    • Export qualitative data including all types of artifacts (student artifacts including uploaded files, linked files, snapshots of websites, and inline text for assignments, reflection, feedback, and evaluation; evaluator or reviewer artifacts including files with comments).
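
As a rough illustration of these custom reports, the sketch below counts completion status by outcome for a filtered population. This is a minimal sketch only: it assumes a hypothetical flat extract of evaluation records, and the column names (student_id, course, program, gender, outcome, status) are illustrative rather than part of any existing Sakai OAE schema.

    # Minimal sketch, not an existing Sakai OAE report. Assumes a hypothetical flat
    # CSV extract of evaluation records with illustrative column names.
    import csv
    from collections import Counter

    def status_report(path, course=None, program=None, gender=None):
        """Count evaluation records by (outcome, status) for a filtered population."""
        counts = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if course and row["course"] != course:
                    continue
                if program and row["program"] != program:
                    continue
                if gender and row["gender"] != gender:
                    continue
                counts[(row["outcome"], row["status"])] += 1
        return counts

    # Example: completion status by outcome for one course, with percentages.
    # counts = status_report("evaluations.csv", course="IE-401")
    # total = sum(counts.values())
    # for (outcome, status), n in sorted(counts.items()):
    #     print(outcome, status, n, round(100 * n / total, 1))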

Sample user scenario:

The Industrial Engineering program was awarded a grant from the Office of Educational Assessment to design, implement, assess, and sustain a teaching, learning, and assessment portfolio as a required part of their majors' undergraduate experience. The portfolio spans every year of a student's time in the major, is focused upon integrative and reflective learning, and demonstrates student competency in achieving the program's student learning goals and at least three general education goals. General education goals at the university are embedded in each program; they are not separate courses that a student takes while attending the university. The portfolio is used to assess student learning goals and student mastery of general education competencies.

The department is working on this grant as a whole and has about six faculty currently using the portfolio. These faculty have been working with the Assessment Coordinator and Instructional Designer to map the current curriculum to ABET outcomes and the institutional general education outcomes. They have identified which assignments students must complete and submit to the portfolio. They have created several reflective questions encouraging students to think more about what they are learning, how it relates back to other courses they have taken, projects they have completed, and other experiences both within and outside of the department, and why it is important to the profession they are entering. They use general feedback in the form of comments and rubrics to evaluate student work.

At least three of the General Education goals must be evaluated as part of the portfolio. For institutional assessment, the Assessment Coordinator will randomly select several portfolios to review. A group of reviewers trained in using the AAC&U VALUE rubrics will assess whether the general education goals have been met. Access needs to be given for this set of random reviewers to evaluate the portfolios; they should not see the instructors' evaluations of those submissions.

Data will be aggregated from the portfolio to create reports needed for the department, program, and institution. Because Industrial Engineering is an Engineering discipline, many of these reports will be shared with ABET, which is the recognized accreditor for college and university programs in applied science, computing, engineering, and technology. Reports will include summary data on the evaluation of all students in the program on each of the ABET and general education criteria as well as randomly chosen examples of student work representing exemplary achievement and student work representing a satisfactory level of achievement in relation to each criterion. Evaluation data must also be analyzed in relation to key demographic characteristics such as gender, ethnicity, and time to graduation.

Individual persona user journeys within this context include:

  • Courtney, 4th year industrial engineering student:
    • Courtney has been assigned to create a presentation portfolio for her capstone class.
    • She runs a report on all artifacts she has created across her four years of school along with instructor, peer, and self-evaluations of those artifacts.
    • She uses a report to help determine what artifacts to include in her portfolio and how to write a reflective statement discussing her current strengths and weaknesses.
  • Girish, Tenure Track Engineering Professor
    • Girish and other faculty use portfolio functionality to map the current Industrial Engineering curriculum to ABET outcomes and to the general education outcomes for the institution.
    • He explains to his students what portfolio artifacts and accompanying reflections will be required to illustrate their mastery of each learning outcome.
    • He uses rubrics within the portfolio system to evaluate student performance on the artifacts in relation to the ABET and general education outcomes.
    • He uses a report to track his own performance as an evaluator to better understand how his students performed as a group and how his evaluative judgments compare to those of his peers.
  • Garrett, Institutional Assessment Coordinator:
    • Garrett is asked to aggregate, analyze, and display portfolio data for the Engineering accreditation report.
    • He runs a report on the evaluation of all Engineering students, as well as on students in each program and in selected courses.
    • He uses a randomization process to identify learning artifacts submitted for portfolio work in relation to the learning outcomes, randomly selecting one student who was highly rated and one who received a satisfactory evaluation in relation to each outcome.
    • He uses this evidence as a way to illustrate the range of student performance in each of the Engineering programs.
    • He shares these reports with other administrators to be added to the Engineering accreditation report.

Functional analysis:

We want to be able to generate customized reports to aggregate, display, and provide simple analytics for the following data sources (one possible record shape combining these sources is sketched after the list):

  • Rubric-based scores from evaluators, reviewers, participants, and peers
  • Comments offered as feedback
  • Artifacts provided by students including uploaded files, linked files, snapshots of websites, and inline text for assignments, reflection, feedback, and evaluation
  • Status of work (evaluated, pending evaluation, etc.)
  • A list of courses, assignments, and artifacts that have been tagged or linked to specific learning outcomes
  • Associations of the above data with other data
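
One possible shape for a reporting record that ties these data sources together is sketched below. It is purely illustrative: the class and field names are assumptions offered for discussion, not part of any existing Sakai OAE data model.

    # Purely illustrative record shape; names are assumptions, not a Sakai OAE model.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class PortfolioEvaluationRecord:
        student_id: str
        program: str
        course: str
        assignment: str
        artifact_id: str                # uploaded file, linked file, site snapshot, or inline text
        artifact_type: str
        outcome: str                    # learning outcome the artifact is tagged or linked to
        rubric_id: Optional[str] = None
        level: Optional[str] = None     # rubric level awarded
        score: Optional[float] = None   # calculated score, if any
        status: str = "pending"         # e.g. "pending", "evaluated"
        comments: list = field(default_factory=list)   # feedback comments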

We want to be able to export the data to tools like SPSS, relational data environments, and business reporting engines.

We want to be able to export qualitative data including all types of artifacts (student artifacts including uploaded files, linked files, snapshots of websites, and inline text for assignments, reflection, feedback, and evaluation; evaluator or reviewer artifacts including files with comments).
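
As a sketch of the quantitative export requirement, the snippet below flattens evaluation records (represented here as plain dicts) into a delimited file that SPSS, a relational database, or a business reporting engine could ingest; qualitative artifacts would be exported alongside as files. The column names are illustrative assumptions, not a defined export format.

    # Illustrative export sketch; column names are assumptions, not a defined format.
    import csv

    EXPORT_COLUMNS = ["student_id", "program", "course", "assignment", "artifact_id",
                      "outcome", "rubric_id", "level", "score", "status"]

    def export_for_analysis(records, path):
        """Write evaluation records (dicts keyed by the columns above) to a flat CSV."""
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=EXPORT_COLUMNS, extrasaction="ignore")
            writer.writeheader()
            writer.writerows(records)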

Reports answer questions like the following:

  • What is the status of the participant and evaluator workflow? (How many have completed or are in process in a particular workflow? By outcome and by level of performance? Including percentages and counts of participants for each outcome and level.)
  • Which courses or assignments in a program cover a specific learning outcome at a specific level? (Micro and macro views of curriculum mapping).
  • For each question, reports should provide both a summary and a detail level.
    • What artifacts (all inclusive) have been supplied by students in accordance with program or course requirements (by outcome, level, and participant)?
    • What artifacts (all inclusive) have been supplied by students that meet outcomes not prescribed by the institution or instructors (by outcome, level, and participant)?
    • For a specific outcome, what are the mean, median, and mode of rating scores (along with counts and standard deviations) for a specific population? (A minimal sketch of this kind of summary follows the list.)
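
A minimal sketch of that last kind of summary, assuming the same illustrative record fields used above (outcome, program, score):

    # Minimal sketch using Python's statistics module; record fields are illustrative.
    from statistics import mean, median, mode, pstdev

    def outcome_summary(records, outcome, program=None):
        """Count, mean, median, mode, and standard deviation of scores for one outcome."""
        scores = [r["score"] for r in records
                  if r["outcome"] == outcome and (program is None or r["program"] == program)]
        if not scores:
            return None
        return {"count": len(scores), "mean": mean(scores), "median": median(scores),
                "mode": mode(scores), "stdev": pstdev(scores)}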

Is Sakai OAE on track for being able to export data?

  • The key is standard naming conventions. A UI developer can create new fields on the fly in Sakai OAE. When it comes to portfolio use, it will be important to have standard naming conventions for the data we want to report.
  • We need to propose standard naming conventions now, possibly using the XForms standard (one possible shape for such a convention is sketched below).
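
As a starting point for that discussion, one possible shape for a naming convention is sketched below. The prefixes and field names are purely illustrative assumptions; they are not drawn from XForms or from any existing Sakai OAE schema.

    # Purely illustrative controlled vocabulary for portfolio reporting fields.
    PORTFOLIO_FIELD_NAMES = {
        "portfolio.artifact.id":       "unique identifier of a submitted artifact",
        "portfolio.artifact.type":     "upload | link | site-snapshot | inline-text",
        "portfolio.reflection.text":   "learner reflection attached to an artifact",
        "portfolio.outcome.id":        "learning outcome the artifact is linked to",
        "portfolio.evaluation.rubric": "identifier of the rubric used",
        "portfolio.evaluation.level":  "level awarded for a rubric criterion",
        "portfolio.evaluation.score":  "calculated score for the evaluation",
        "portfolio.feedback.comment":  "formative comment from a reviewer",
        "portfolio.workflow.status":   "e.g. pending, evaluated",
    }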

Additional questions that need to be addressed include the following:

  • What reporting needs should be addressed within Sakai and what is OK or better to do outside of Sakai?
  • In what ways do we need to analyze the data?
  • How do we want to display the data? For what purposes and audiences?
  • What does it mean to preserve the data?
  • What else needs to be done with reporting to support the assessment process?
  • How can we distinguish between what is most important now and what needs to have a placeholder in the software for the future?