
Summary of ePortfolio Use Cases at Indiana University

(D. Goodrum, J. Gosney, & L. Ward)

(0) Introduction
Over the past several weeks, there have been many discussions at IU around the issue of whether the current ePortfolio application meets now -- or will meet in the future -- IU's core needs.  There have been other discussions at IU about whether there is a shared understanding of the core requirements and scope of the project (and the resulting understanding of required resources), specifically between the developers/technical support team and those charged with guiding and implementing the pedagogical applications of the software (i.e. the campus teaching and learning centers).  All of these internal discussions have also been infused with the larger OSP community discussions regarding the need for simplified workflows and possible refactoring of the ePortfolio application.
With all of these discussions in mind, the following document and mock-up screen shots are meant to represent a restatement of requirements for ePortfolio, with specific emphasis on assessment at both the program and institutional level. A key component of this restructuring is the direct integration of several activities into course sites through the use of an Assignments tool that has been made "Goal Management Tool" aware: the attached mock-up screen shot is representative of what a student might see when completing an ePortfolio assignment in a Sakai course site.
Rather than proposing a complete abandonment of the existing ePortfolio architecture, the document instead suggests a more direct, simplified process workflow that takes advantage of existing, robust, proven tools (e.g. the Assignments tool) while still allowing the rich and varied applications of the ePortfolio (from program assessment to student presentation of work to a public audience) to be carried forward in a much simpler, yet no less comprehensive, fashion.
The goal of this document is to describe the mission-critical functionality of Indiana University's ePortfolio implementation. Rather than focus on desired changes to the OSP effort, this document attempts to delineate the core needs of the IUPUI campus, informed as well by initiatives just starting at the Kokomo and Bloomington campuses. Furthermore, it attempts to integrate many key ePortfolio activities into regular courses and their course sites, thereby greatly simplifying the workflows required for students and instructors to participate in the ePortfolio initiative.
 
(1) Institutional and Programmatic Goal Setting

The institution may have a discrete list of broadly defined goals for student learning.  These goals are global and can be used in any course or project site containing "goal-aware" tools.

Schools/departments/programs of study may also have discrete lists of goals for student learning (these goals are often driven by accreditors and professional associations).  Programmatic goals are not global to the institution.  Instead, they are available only in course and project sites associated with the school/department/program of study that defined and/or adopted the goals.  Program goals are often similar or identical in substance (if not verbiage) to global institutional goals.  In such cases, program goals can be linked or mapped to equivalent institutional goals.

Global and programmatic goal sets can be hierarchical.  Subgoals may define developmental levels of achievement (e.g., introductory, intermediate, advanced) or a list of competencies making up the parent goal.

Each institutional and programmatic goal set can define its own rating scale and developmental levels.  When mapping program goals to institutional goals, program administrators will also define rating and developmental equivalencies.
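
To make these relationships concrete, here is a minimal data-model sketch (the class and field names are hypothetical illustrations, not the existing OSP schema) showing a hierarchical goal set, per-set rating scales, and a program-to-institution mapping with rating equivalencies defined by a program administrator:

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical model of hierarchical goal sets and program-to-institution mapping;
    // names are illustrative only and do not reflect the actual OSP schema.
    public class GoalModelSketch {

        /** A goal that may contain subgoals (e.g., developmental levels or competencies). */
        static class Goal {
            String title;
            List<Goal> subgoals = new ArrayList<>();
            Goal(String title) { this.title = title; }
        }

        /** A goal set owned by the institution or by a program, with its own rating scale. */
        static class GoalSet {
            String owner;               // "institution" or a school/department/program
            List<Goal> goals = new ArrayList<>();
            List<String> ratingScale;   // e.g., ["novice", "proficient", "exemplary"]
            GoalSet(String owner, List<String> ratingScale) {
                this.owner = owner;
                this.ratingScale = ratingScale;
            }
        }

        /** Maps a program goal to its institutional equivalent, plus rating equivalencies. */
        static class GoalMapping {
            Goal programGoal;
            Goal institutionalGoal;
            Map<String, String> ratingEquivalence = new LinkedHashMap<>();
        }

        public static void main(String[] args) {
            GoalSet institutional = new GoalSet("institution",
                    List.of("developing", "competent", "mastery"));
            Goal criticalThinking = new Goal("Critical Thinking");
            criticalThinking.subgoals.add(new Goal("Introductory"));
            criticalThinking.subgoals.add(new Goal("Intermediate"));
            criticalThinking.subgoals.add(new Goal("Advanced"));
            institutional.goals.add(criticalThinking);

            GoalSet nursingProgram = new GoalSet("School of Nursing",
                    List.of("unsatisfactory", "satisfactory", "exceptional"));
            Goal clinicalReasoning = new Goal("Clinical Reasoning");
            nursingProgram.goals.add(clinicalReasoning);

            // Program administrators define the goal and rating equivalencies.
            GoalMapping mapping = new GoalMapping();
            mapping.programGoal = clinicalReasoning;
            mapping.institutionalGoal = criticalThinking;
            mapping.ratingEquivalence.put("unsatisfactory", "developing");
            mapping.ratingEquivalence.put("satisfactory", "competent");
            mapping.ratingEquivalence.put("exceptional", "mastery");

            System.out.println(mapping.programGoal.title + " maps to "
                    + mapping.institutionalGoal.title);
        }
    }

The essential point is that the equivalencies are data defined by the program, not assumptions built into the tools.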

(2) Basic Collection of Artifacts

For a specific course, what is commonly identified on the syllabus as a primary "goal" of the course is a culminating milestone activity (or activities), such as writing a term paper or delivering a final project/presentation. These kinds of artifacts can be used as evidence of having reached the established goal(s) of the course.  In most environments, the locus for the collection of such artifacts is, then, the instructor-led course.

There is already a proven, simple interface for collecting these artifacts and providing instructor feedback: the Oncourse CL/Sakai Assignments tool.  In the context of the instructor-led classroom and the use of the assignments tool:

  • Most courses will have one or two (or at most a very small number of) artifacts that will be treated as "culminating evidence" of goal mastery. 
  • With the above in mind, this small number of assignments would be linked to a school/department/programmatic goal, which through the goal hierarchy would also be linked to a larger institutional goal.  Assignments may also be linked directly to institutional goals in cases where there are no equivalent program goals.
  • Linked assignments can be both rated for goal mastery and graded as an item in the gradebook. 

In short, the most common way to add an artifact to the student's ePortfolio is to hand in a classroom assignment via the Assignments tool. Then, through the goal hierarchy, the assignment is linked to a programmatic goal as well as to an institutional goal.  Because of this linking, the program and the institution can collect, review, rate and retain artifacts of student accomplishment. In some cases, the instructor may also give an assignment that requires the student to select the most appropriate goal(s) demonstrated in a specific piece of student work. 
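
As a rough illustration of this workflow (the names below are assumptions, not the actual Assignments tool or Goal Management integration), a goal-aware assignment submission would carry a gradebook grade alongside one or more goal ratings:

    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of a goal-aware assignment submission; the real Assignments
    // tool and Goal Management integration will differ in names and structure.
    public class GoalAwareAssignmentSketch {

        /** An assignment linked to one or more program/institutional goals. */
        record Assignment(String title, List<String> linkedGoals) {}

        /** A student submission treated as culminating evidence for the linked goals. */
        static class Submission {
            Assignment assignment;
            String studentId;
            String artifactRef;                         // reference to the submitted file
            Double gradebookScore;                      // graded as a gradebook item
            Map<String, String> goalRatings = new LinkedHashMap<>(); // goal -> rating

            Submission(Assignment assignment, String studentId, String artifactRef) {
                this.assignment = assignment;
                this.studentId = studentId;
                this.artifactRef = artifactRef;
            }
        }

        public static void main(String[] args) {
            Assignment termPaper = new Assignment("Term Paper",
                    List.of("Clinical Reasoning", "Critical Thinking"));

            Submission sub = new Submission(termPaper, "student123", "/resources/term-paper.pdf");
            sub.gradebookScore = 92.0;                  // instructor grade for the course
            sub.goalRatings.put("Clinical Reasoning", "satisfactory");   // program rating
            sub.goalRatings.put("Critical Thinking", "competent");       // institutional rating

            // Because the assignment is linked through the goal hierarchy, the program and
            // institution can later collect and review this artifact as evidence.
            System.out.println(sub.studentId + " submitted evidence for " + termPaper.linkedGoals());
        }
    }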

(3) Independent Evaluation

Many programs, institutions and/or accrediting agencies will want a second assessment of an artifact, in addition to the original assessment or rating.  An internal or external evaluator may be assigned to focus on a specific goal or goal/level combination, and on a subset of artifacts drawn from one or more specific courses, groups of students, or a randomly selected quota.  Moreover, it is assumed this evaluator would still be closely familiar with the more specific programmatic requirements, and that the assigned scale, while perhaps different from the original, would remain conceptually related.
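
One way to picture the evaluator assignment described above is sketched below; the sampling helper and field names are purely illustrative assumptions:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.Random;
    import java.util.stream.Collectors;

    // Illustrative sketch of assigning an independent evaluator a random quota of
    // artifacts for one goal; names are hypothetical.
    public class IndependentEvaluationSketch {

        record ArtifactRef(String studentId, String course, String goal) {}

        /** Select up to `quota` artifacts linked to the given goal from specific courses. */
        static List<ArtifactRef> sampleForEvaluator(List<ArtifactRef> all, String goal,
                                                    List<String> courses, int quota, long seed) {
            List<ArtifactRef> pool = all.stream()
                    .filter(a -> a.goal().equals(goal) && courses.contains(a.course()))
                    .collect(Collectors.toCollection(ArrayList::new));
            Collections.shuffle(pool, new Random(seed));
            return pool.subList(0, Math.min(quota, pool.size()));
        }

        public static void main(String[] args) {
            List<ArtifactRef> all = List.of(
                    new ArtifactRef("s1", "NURS-B401", "Clinical Reasoning"),
                    new ArtifactRef("s2", "NURS-B401", "Clinical Reasoning"),
                    new ArtifactRef("s3", "NURS-B499", "Clinical Reasoning"));
            // The evaluator then applies a second, conceptually related rating scale.
            System.out.println(sampleForEvaluator(all, "Clinical Reasoning",
                    List.of("NURS-B401", "NURS-B499"), 2, 42L));
        }
    }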

(4) Guidance and Advising

Periodically, students need to gauge their overall academic progress.  In the past this progress review has primarily consisted of the following:

  • A review of courses taken (and passed at an acceptable level).
  • Grades (in a list of prescribed and elected courses) viewed as sufficient evidence of success and accomplishment, leading to the awarding of a degree.
  • Individuals outside of the institution (e.g. prospective employers) reviewing/confirming the awarded degree and the student's grade transcript.

Today there is more interest in seeing tangible evidence of accomplishment, not just the grade.  A periodic review of progress would also include an examination of key intellectual achievements (i.e. student-submitted artifacts used as culminating evidence of goal mastery).  This new process would start within the institution, most likely at the programmatic level.

In order for this more granular review of academic progress to be successful, a student -- in addition to or independently of an advisor, coach or mentor -- needs to see an overview of goal mastery as well as an organized collection of intellectual products (i.e. submitted artifacts) as they relate to the set of goals. All of those linked artifacts (assignments), though coming from many different courses, would be accessible to the student and the advisor, coach or mentor.
For the individual student, such an "overview" would need to visually highlight which goals had submitted evidence (that is, submitted artifacts) and which were rated as successful or as lacking cumulative evidence of mastery.  Additional specific requirements of this overview process would include the following (a rough sketch of how such an overview might be assembled appears after this list):

  • The student and the advisor, coach or mentor would be able to discuss overall progress towards the goals; therefore, the presentation of the student's progress would need to be highly intuitive and easily accessible.
  • The student and advisor, coach and/or mentor would be able to examine/discuss/comment on individual pieces of evidence: in other words, the overview presentation would need to allow for a "drill-down" approach so that individually submitted artifacts could be accessed and examined.
  • Based on this overview process, an advisor might recommend or require additional assignments (outside of class) that would help students create intellectual products that would provide sufficient evidence of progress towards one or more programmatic (and linked institutional) goals.
  • The student would be able to view his/her progress at any time (with or independent of an advisor) as well as add additional artifacts and link them to programmatic or institutional goals.
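
The sketch below (hypothetical names; not the actual tool implementation) illustrates how such an overview might be assembled by grouping a student's rated artifacts under each programmatic goal, so that goals lacking evidence stand out:

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of assembling a student's goal-mastery overview for advising;
    // the grouping logic is illustrative, not the actual tool's implementation.
    public class ProgressOverviewSketch {

        record Evidence(String goal, String artifactRef, String rating) {}

        /** Group a student's submitted evidence by goal so gaps are easy to spot. */
        static Map<String, List<Evidence>> overview(List<String> programGoals,
                                                    List<Evidence> submitted) {
            Map<String, List<Evidence>> byGoal = new LinkedHashMap<>();
            for (String goal : programGoals) {
                byGoal.put(goal, new ArrayList<>());       // goals with empty lists lack evidence
            }
            for (Evidence e : submitted) {
                byGoal.computeIfAbsent(e.goal(), g -> new ArrayList<>()).add(e);
            }
            return byGoal;
        }

        public static void main(String[] args) {
            List<String> goals = List.of("Clinical Reasoning", "Communication", "Ethics");
            List<Evidence> submitted = List.of(
                    new Evidence("Clinical Reasoning", "/resources/term-paper.pdf", "satisfactory"));

            overview(goals, submitted).forEach((goal, evidence) ->
                    System.out.println(goal + ": "
                            + (evidence.isEmpty() ? "no evidence yet" : evidence)));
        }
    }

The same grouped structure would support the "drill-down" requirement, since each goal's entry carries references back to the individual artifacts.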

(5) Institutional and Programmatic Reporting

At both the programmatic (i.e., school/department/program) level and the institutional level, there will be an interest in overall statistics on student progress towards institutional and programmatic goals (based upon independent evaluation of individual artifacts and on overall individual student progress from guidance and advising), as well as an interest in reviewing selected artifacts.  Institutional and program administrators would be able to sort, filter, and/or group report data based on demographic criteria, course or program membership, year, and multiple other criteria, necessitating links with student information systems and other institutional databases.
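
A small sketch of the kind of grouping such reports imply follows; the record fields, and the student-information-system join they stand in for, are assumptions made only for illustration:

    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    // Illustrative sketch of grouping goal-progress records for an institutional report;
    // field names and the join to student information systems are assumptions.
    public class ReportingSketch {

        record ProgressRecord(String studentId, String program, int cohortYear,
                              String goal, String rating) {}

        public static void main(String[] args) {
            List<ProgressRecord> records = List.of(
                    new ProgressRecord("s1", "Nursing", 2007, "Clinical Reasoning", "satisfactory"),
                    new ProgressRecord("s2", "Nursing", 2007, "Clinical Reasoning", "exceptional"),
                    new ProgressRecord("s3", "Education", 2006, "Communication", "satisfactory"));

            // Group by program and goal, then count ratings; demographic fields pulled
            // from the student information system could be added as further grouping keys.
            Map<String, Map<String, Long>> byProgramAndGoal = records.stream()
                    .collect(Collectors.groupingBy(r -> r.program() + " / " + r.goal(),
                            Collectors.groupingBy(ProgressRecord::rating, Collectors.counting())));

            byProgramAndGoal.forEach((key, counts) -> System.out.println(key + " -> " + counts));
        }
    }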

(6) Summary Tasks by Individual Role

(1) From the institutional assessment administrator POV
 
a.       I set up a list of institutional goals (or table of goals and levels) with sufficient explanation (expectations and rubrics), and publish them globally.

b.       I can see summary reports of progress towards institutional goals as well as examine a sample of actual artifacts.

(2) From the programmatic administrator POV
 a.       I set up a list of programmatic goals (or table of goals and levels) with sufficient explanation (expectations and rubrics) and publish them to course and project sites in my department, school, or program.

b.       I link/map my program goals to equivalent institutional goals, my program rating scale to the institutional rating scale, and the developmental levels for my program to institutionally defined developmental levels.
 c.       I can see summary reports of progress towards programmatic goals as well as examine a sample of actual artifacts.
 
(3) From the instructor POV
 a.       I set up a list of assignments (i.e. course goals with sufficient explanation) and link them to programmatic and/or institutional goals, and publish them within the course.
(4) From the student POV
 a.       I submit one or two "culminating evidence" assignments in each course.  I understand from the instructor that these assignments (artifacts) should show evidence of my achieving programmatic and institutional goals in addition to knowledge and skills directly related to the content of the course.  I can see a description of the goals (and associated rubrics) to which these assignments are linked.

b.       I may submit additional evidence on my own and link it to program or institutional goals.

c.       Periodically, I review my progress towards programmatic goals and see how my various courses have helped me reach larger goals related to my field of study and my institution's emphases.  I may perform this "overview process" either alone or in conjunction with a mentor and/or advisor.

d.       Related to "b" above, I carry on a dialogue with an advisor, coach or mentor about my overall progress and intellectual development; I may receive additional assignments from this person to help me achieve both my programmatic and institutional goals.
(5) From the independent evaluator POV
 a.       I rate individual artifacts against a given scale that has been assigned to me for review. As an evaluator, I understand that the institution perceives me as closely familiar with (if not an expert in) the specific programmatic requirements (in other words, I am not asked to rate artifacts outside of my field of expertise). Moreover, the assigned scale given to me for review, while perhaps different from the original, remains conceptually related.
(6) From the advisor/coach/mentor POV
 a.       Periodically, I review the progress of individual students in reaching programmatic and institutional goals and carry on a dialogue with the individual student.

b.       I may recommend additional assignments to individual students that would help them show intellectual progress towards programmatic and institutional goals.

(7) An Additional Note about Student ePortfolios for External/Public Audiences

The individual student may wish to highlight their intellectual accomplishments for outside audiences, including such groups as:

  • Potential employers
  • Potential schools (i.e. graduate or professional schools)
  • Family, friends and student colleagues
If the student is interested in highlighting their intellectual accomplishments as defined by institutional and programmatic goals, the student would at most wish to select a subset of goals to highlight and, for each of those goals, a small subset of artifacts related to that goal. Students will be interested in an aesthetic presentation with much opportunity for customization and personalization.

(8) ePortfolio tool in Sites

The following describes how most of the capabilities described above could be provided through a single ePortfolio tool in addition to the artifacts submitted through the Assignments tool:
A group of institutional assessment administrators would have a project site where they create goals (and accompanying rating scales/rubrics), create and view reports, as well as use other CL tools to communicate and collaborate on these activities. An ePortfolio tool provides areas for Goals and Reports. Evaluators could be members of this site as well if they are assigned at this level to specific goals; in the ePortfolio tool they would have an additional area of access for Evaluation; access to goals and reports would be read-only for evaluators.
Each programmatic area would have a project site where its group of programmatic assessment administrators would create and link goals (and accompanying rating scales/rubrics), create and view reports, as well as use other CL tools to communicate and collaborate on these activities. An ePortfolio tool has areas for Goals and Reports. Evaluators could be members of this site as well if they are assigned at this level to specific goals; in the ePortfolio tool they would have an additional area of access for Evaluation; access to goals and reports would be read-only for evaluators.
Each advisor in a programmatic area would have a project site with the ability to view the progress of assigned students' portfolios. The advisor also could use the Assignment tool to create additional opportunities for students to upload artifacts to their ePortfolio.  An ePortfolio tool has an area to View Progress (student views own; advisor selects from a list of students); in addition, access to Goals and Reports would be read-only for advisors and students. Evaluators could be members of this site as well if they are assigned at this level to a specific group of students; in the ePortfolio tool they would have an additional area of access for Evaluation; access to goals and reports would be read-only for evaluators (that is, Evaluators would not see the View Progress area).
A student could also go to this programmatic-related site to see his/her overview and independently add artifacts created outside of the usual course structure.
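
The role-based access described in this section could be summarized as a simple mapping from site role to visible ePortfolio areas and read/write rights. The sketch below uses assumed role, area, and access names rather than the actual Sakai authorization scheme:

    import java.util.EnumMap;
    import java.util.EnumSet;
    import java.util.Map;
    import java.util.Set;

    // Sketch of which ePortfolio tool areas each site role could see, and with what
    // access; roles, areas, and access levels here are assumptions for illustration.
    public class EPortfolioAreaAccessSketch {

        enum Area { GOALS, REPORTS, EVALUATION, VIEW_PROGRESS }
        enum Role { ASSESSMENT_ADMIN, EVALUATOR, ADVISOR, STUDENT }

        static final Map<Role, Set<Area>> VISIBLE = new EnumMap<>(Role.class);
        static final Map<Role, Set<Area>> READ_WRITE = new EnumMap<>(Role.class);

        static {
            VISIBLE.put(Role.ASSESSMENT_ADMIN, EnumSet.of(Area.GOALS, Area.REPORTS));
            READ_WRITE.put(Role.ASSESSMENT_ADMIN, EnumSet.of(Area.GOALS, Area.REPORTS));

            // Evaluators see Goals and Reports read-only, plus their own Evaluation area.
            VISIBLE.put(Role.EVALUATOR, EnumSet.of(Area.GOALS, Area.REPORTS, Area.EVALUATION));
            READ_WRITE.put(Role.EVALUATOR, EnumSet.of(Area.EVALUATION));

            // Advisors view progress for assigned students; Goals and Reports are read-only.
            VISIBLE.put(Role.ADVISOR, EnumSet.of(Area.GOALS, Area.REPORTS, Area.VIEW_PROGRESS));
            READ_WRITE.put(Role.ADVISOR, EnumSet.noneOf(Area.class));

            // Students view their own progress and may add artifacts there.
            VISIBLE.put(Role.STUDENT, EnumSet.of(Area.GOALS, Area.REPORTS, Area.VIEW_PROGRESS));
            READ_WRITE.put(Role.STUDENT, EnumSet.of(Area.VIEW_PROGRESS));
        }

        public static void main(String[] args) {
            System.out.println("Evaluator sees: " + VISIBLE.get(Role.EVALUATOR)
                    + ", can edit: " + READ_WRITE.get(Role.EVALUATOR));
        }
    }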

(9) MyPages tool in MyWorkspace

This could be an area where students put together one or more public views of their academic life and progress and provide view access to people outside the institution.  A student could base a public view on selected goals within an institutional goal set or program goal set and by selecting a subset of the artifacts related to selected goals.
Also in this space there could be an overview of the student's ePortfolio artifacts organized by an institutional goal set (if one exists) and by the one or more programmatic goal sets of their academic discipline(s).  Here, too, the student could add new artifacts.
To extend the idea beyond ePortfolio initiatives, the MyPages area could be a central area for students to upload web pages they have created and wish to share with the public, as well as podcasts, personal blogs, a personal profile, and so on.
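
As a sketch of the public-view idea (names are hypothetical), a MyPages view would simply be a student-chosen subset of goals, each with a small set of selected artifacts, flagged as visible to outside audiences:

    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of a student assembling a public MyPages view from a subset
    // of goals and artifacts; names are illustrative only.
    public class MyPagesViewSketch {

        record PublicView(String title, Map<String, List<String>> artifactsByGoal,
                          boolean visibleOutsideInstitution) {}

        public static void main(String[] args) {
            // The student picks a few goals to highlight and, for each, a small set of artifacts.
            PublicView forEmployers = new PublicView(
                    "Portfolio for prospective employers",
                    Map.of("Clinical Reasoning", List.of("/resources/term-paper.pdf"),
                           "Communication", List.of("/resources/final-presentation.ppt")),
                    true);
            System.out.println(forEmployers.title() + " highlights goals: "
                    + forEmployers.artifactsByGoal().keySet());
        }
    }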

Some possibilities (or other possibilities!) with this:

(1) What might it look like if, in the "MyPages" tool, we were to provide a template for students to use their goal sets as the basis for an ePortfolio?  Using Google Pages functionality as an example, see the attached document, "My Pages Template Screenshots.pdf".  Also, see this link (http://dgoodrum.googlepages.com/home) as an example of what the front page and navigation might look like.

(2) For another approach to a Web template for student portfolios based on the KEEP toolkit functionality, see the attached document, "Mypages template alt version.pdf" (if run locally, it is our understanding that the KEEP toolkit can allegedly be integrated with Sakai).

Most of the remaining items on this page were proposed for the 2.4 release, but didn't make it into the final release. 

Fixes to Existing Tools

Interface Clean-up

  1. Hide Instructions/Rationale/Examples Labels on Main page of Wizard.
    If the main or first page of a wizard does not contain data in the Instructions /Rationale /Examples areas, the labels for these areas should not display in the published wizard.
  2. Hide buttons/labels for unused forms.
    If a matrix cell or wizard page does not contain a reflection or evaluation or feedback form then related buttons/labels should not show up for the 'student' user.
  3. Clean up Matrix Guidance.
    Change presentation order and number (singular to plural) from Instruction, Example, Rationale to Instructions, Rationale, Examples for consistency with authoring and the main page.  Currently the link to full guidance does not display if Instructions is empty, even if the other two guidance fields contain data.  The link should display if any of the three contains data (a sketch of this display rule appears after this list). 
  4. Clean up Wizard Guidance.
    On the wizards main page, the labels for Instructions, Rationale, and Examples (25, 26, 27) display even when they do not contain content. Suppress label for items that do not contain content.
  5. Fix title display on Wizard subpages.
    Currently, the title on Wizard subpages displays twice.
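
A sketch of the display rules requested in items 1 through 4 above follows; the actual fix belongs in the tools' view layer, so this only states the intended logic:

    // Sketch of the requested display rules; rendering is actually done in the tool's
    // view layer, so this only illustrates the intended logic.
    public class GuidanceDisplaySketch {

        static boolean hasContent(String field) {
            return field != null && !field.trim().isEmpty();
        }

        /** Show the link to full guidance if any of the three guidance fields has data. */
        static boolean showGuidanceLink(String instructions, String rationale, String examples) {
            return hasContent(instructions) || hasContent(rationale) || hasContent(examples);
        }

        /** Show an individual label only when its field has content. */
        static boolean showLabel(String field) {
            return hasContent(field);
        }

        public static void main(String[] args) {
            // Instructions empty, but Rationale has data: the guidance link should still appear.
            System.out.println(showGuidanceLink("", "Why this matters...", null)); // true
            System.out.println(showLabel(""));                                     // false
        }
    }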

Enhancements

  1. Allow suppression of Select Item(s) button
    For some applications, the user is not supposed to attach items to the page. In such cases, it would be helpful to be able to remove the button from the participant view.
  2. Button naming and placement.
    A suggestion for the Published Cell/Page View for both Matrix and Wizard is to standardize placement of the completion buttons at the bottom of the page, like other Sakai tools, and to shorten the text of the 'Submit for Evaluation confirmation' button.
    For the Wizard main page, the two buttons would remain in the current position at the  bottom of the page with the submit button text slightly revised:
    Return to Wizards    Submit for Evaluation
    For the Matrix page, the Submit for Evaluation button would be moved to the bottom of the screen and paired with the 'Back to matrix' button, whose text would be revised:
    Return to Matrix       Submit for Evaluation
  3. Increased Page/Cell Status Control.
    Currently, the only way a cell can be converted to completed status is immediately following submission of an evaluation, and only the owner can put the cell into pending status. There are occasions when the coordinator needs the ability to convert a cell from ready to pending or from pending to complete.

The Manage Status link associated with each page or cell would offer the following options:

Change Status to Ready:
For this user only
For all users

Change Status to Pending:
For this user only
For all users

Change Status to Complete:
For this user only
For all users

  4. Revising Evaluations and Feedback.
    Allow reviewers/evaluators to revise feedback and evaluation forms. Once an evaluation or feedback form is created, it cannot be changed by its creator. Add a revise link (similar to the revise link for reflections) and allow revisions when the cell is not locked.
  5. Hide a published matrix/wizard.
    Allow the author to hide or archive a wizard that is no longer in use, but for which the data continues to be of value.
  6. Show/Hide tools based on permissions.
    Make changes to ePortfolio to take advantage of the ability in Sakai to hide tools based on permissions.

Key permissions for IU's configuration:
Evaluations tool: Evaluate
Glossary tool: Add
Forms tool: Create
Presentation Layouts tool: Create
Presentation Templates tool: Create
Styles tool: Create
Reports tool: View
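
A rough sketch of how these key permissions could drive tool visibility follows; the permission strings and filtering code are illustrative assumptions, not the actual Sakai permission identifiers or API:

    import java.util.List;
    import java.util.Map;
    import java.util.Set;
    import java.util.stream.Collectors;

    // Illustrative sketch: hide a tool from the site menu unless the current user holds
    // the key permission listed above. The filtering logic here is an assumption, not
    // the actual Sakai implementation.
    public class ToolVisibilitySketch {

        static final Map<String, String> REQUIRED_PERMISSION = Map.of(
                "Evaluations", "Evaluate",
                "Glossary", "Add",
                "Forms", "Create",
                "Presentation Layouts", "Create",
                "Presentation Templates", "Create",
                "Styles", "Create",
                "Reports", "View");

        static List<String> visibleTools(Set<String> userPermissions) {
            // In practice each permission is checked per tool and per site; a single
            // permission set stands in for that check to keep the sketch short.
            return REQUIRED_PERMISSION.entrySet().stream()
                    .filter(e -> userPermissions.contains(e.getValue()))
                    .map(Map.Entry::getKey)
                    .sorted()
                    .collect(Collectors.toList());
        }

        public static void main(String[] args) {
            System.out.println(visibleTools(Set.of("View", "Evaluate"))); // [Evaluations, Reports]
        }
    }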

  7. Support for optional forms on the main page of a wizard.
    Optional forms cannot be added to the first or main page of a hierarchical or sequential wizard. There are some occasions when a wizard only needs to be one page long. Rather than requiring a click to a second page to get to the form, why not simply allow optional forms on the main page?
  8. Easier term substitutions.
    Create ability for an institution to more easily substitute its own tool names and similar core terms in the application.
  9. Wizard page status.
    There is no easy method for users to see which wizard pages contain participant input and/or have been submitted without opening each page in each wizard. The Wizard manager shows how many pages of a given wizard have been submitted, but not the specific page numbers or titles. For the hierarchical wizard, the gray page bullets could be replaced by green, yellow, or blue bullets to show page status. For the sequential wizard, perhaps there could be an outline view that gives the status of each page.
  10. Location of Usernames drop down for Wizards.
    The username drop down for Matrices appears on each individual matrix, not on the Matrix Manager page. But in the case of Wizards, the username drop down is on the Wizards Manager page. This requires the evaluator/reviewer/coordinator to return to the manager page in order to view the work of a different participant. For consistency and ease of use, remove the dropdown from the main Wizards Manager page and place a drop down at the top of the first page of each individual Wizard.
  11. New reviewer role and workflow.
    According to the document "Understanding the Open Source Portfolio (OSP) version 2.1", a reviewer is someone who provides guidance and feedback of an informal or formative nature. This is a valuable concept and role, but currently there is no role or workflow support for reviewers. Review takes place before the page has been submitted for final evaluation. How is the reviewer supposed to know that a cell or page is ready for review? There needs to be a set of tools similar to those provided for evaluation (i.e., participants see a "submit for review" button, the cell turns a color associated with "review pending", and a Review tool similar to the Evaluation tool lists all pages submitted for review by a specific reviewer). DAG - Should be useful for peer review within one's peer group as well.
  12. Improve/enhance group support.
    Make all tools group aware. Currently, group membership determines whom coordinators and evaluators can see in the Matrix and Wizard tools. However, group support is not available for specific tools or in contexts where it would be helpful (e.g., when selecting evaluators or sharing portfolios).
  13. More intuitive presentation of optional forms.
    Display the empty form inline in the wizard or matrix page with a Save button. A list of the completed versions, along with links to revise, delete, and duplicate, would appear underneath. In the same area, a button or link labeled Add Existing Form to List allows the user to select a form from Resources. Currently, the add form/select existing form links are very confusing unless the wizard author provides detailed instructions on what to do with them. It would be much nicer if these forms could display inline by default or as an option, because often the form is the only element in use on the page.
  14. Global assignment of forms and evaluators in a wizard or matrix.
    Allow the author to set defaults for all pages that can be overridden on individual pages (see the sketch after this list). Currently, assignment of forms and evaluators must be done on a page-by-page basis, which is very tedious.
  15. Wizard synchronization across sites.
    For some applications, the same wizard or matrix will be used in multiple sites.  If the matrix/wizard needs to be changed in some way, manually making the changes in each site can be very time consuming and can result in errors and inconsistencies.  A tool for synchronizing these identical wizards with a master wizard would reduce the labor.
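
Item 14 above amounts to a defaults-with-override pattern; a minimal sketch (hypothetical names) for evaluator assignment, which would apply equally to forms, might look like this:

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.Optional;

    // Sketch of wizard-level defaults with per-page overrides for evaluators (the same
    // pattern would apply to forms); names are hypothetical.
    public class WizardDefaultsSketch {

        static class Wizard {
            List<String> defaultEvaluators;                             // applied to every page
            Map<String, List<String>> pageOverrides = new HashMap<>();  // pageId -> evaluators

            Wizard(List<String> defaultEvaluators) {
                this.defaultEvaluators = defaultEvaluators;
            }

            /** Use the page-specific list when present, otherwise fall back to the default. */
            List<String> evaluatorsFor(String pageId) {
                return Optional.ofNullable(pageOverrides.get(pageId)).orElse(defaultEvaluators);
            }
        }

        public static void main(String[] args) {
            Wizard wizard = new Wizard(List.of("program.assessment@iupui.edu"));
            wizard.pageOverrides.put("capstone-page", List.of("external.reviewer@example.edu"));

            System.out.println(wizard.evaluatorsFor("intro-page"));     // default evaluators
            System.out.println(wizard.evaluatorsFor("capstone-page"));  // page override
        }
    }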