Child pages
  • EVALSYS configuration page UI

The EVALSYS configuration view would be more useful if an admin could easily navigate the choices and understand the consequences of each one. The view has grown somewhat organically since the last tag, as each institution has customized things to meet specific needs. Below are some ideas.

  1. Categorize the settings - below is just a proposal
  2. Change the control labels to better explain what the control does
  3. Add an explanatory text when the label is not sufficient. Explanatory text may also expand on benefits, drawbacks.
  4. Some settings cancel or affect others. If you set "Instructors allowed to create evaluations" to false, for example, a whole set of further choices becomes moot. We really should indicate this somehow to avoid some head scratching - maybe even disable/preset the affected controls and indicate in the corresponding labels that a control depends on another one.
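
The dependency idea in point 4 could be sketched like this (a minimal illustration only; the setting names are invented for the example and do not correspond to actual EVALSYS keys):

```python
# Hypothetical sketch: when a parent setting is off, its dependent
# controls become moot and could be greyed out in the UI. Setting
# names are made up for illustration.
DEPENDENCIES = {
    "instructors_can_create_evals": [
        "instructors_can_email_students",
        "instructors_can_view_results",
    ],
}

def effective_controls(settings):
    """Return a map of control name -> whether the control should be enabled."""
    enabled = {name: True for name in settings}
    for parent, children in DEPENDENCIES.items():
        if not settings.get(parent, False):
            for child in children:
                # Moot: disable the control and note the dependency in its label.
                enabled[child] = False
    return enabled
```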

Below, for each label: if the label seems unclear, the next (emphasized) line suggests an alternate label. If the label is not enough to explain things, or additional info is needed, I have added a grey line with some additional explanation. A question mark before a suggestion indicates I was not sure of the alternate; a question mark alone indicates I did not understand the original label well enough to even suggest one. Red indicates meta-talk.

Finally - maybe the "Institution Specific Settings" group should be discarded and those settings migrated to the appropriate groupings, or to new ones. Just because Michigan initiated the importing and exporting of evaluation data does not mean that another institution might not be interested in it, or affected by the setting at any rate.

Also - I am assuming that the default control values make sense, but maybe this needs examination too.

Comments in green - added by Ellen Borkowski, University of Maryland
ACM - added by Adam Marshall, Oxford Uni

INSTRUCTOR SETTINGS

  1. Instructors allowed to create evaluations
    Instructors can create evaluations
  2. Instructors allowed to email students
    ? Instructors can set email reminders
    When an evaluation is created, the author can specify that email reminders be sent. This setting controls the ability of the instructor role to do so. Q: can the content and frequency be set as well?
  3. Instructors allowed to view results
    Instructors can view evaluation results
    ACM: Can they see just the answers to the questions pertaining to them, all questions, or all questions except the ones pertaining to other lecturers?
  4. Instructors must use evaluations from above in the hierarchy
    ? Instructors must use evaluations set for them institutionally
    ACM: I think the whole issue of hierarchy needs explaining somewhere else. We've been using the tool for a year and still haven't a clue about the hierarchy!
    ACM: Evaluations may be created and assigned at an institutional level. This setting controls whether instructors can opt out of these or not
  5. Instructors may add this number of questions to evaluations from above in the hierarchy before the release date
    ? Instructors may add this number of questions to evaluations set for them
    For evaluations that are created at an institutional level, this setting controls how many questions an instructor can add to evaluations assigned to their classes.

STUDENT SETTINGS

  1. Students may leave questions unanswered
    ACM: At the moment, 29/06/10, all questions seem to be compulsory and this switch appears to do nothing.
  2. Students may edit their responses up to the due date
  3. Students allowed to view results
    ? Q: What results? Their own answers? Everyone's answers? Comments included?

ADMINISTRATOR SETTINGS

  1. Admins below the hierarchy level of the owner may add this number of questions to the evaluation before the release date
    ?
    This helps us control how many questions can be entered at each level of the hierarchy. So, if you wanted to limit each level so that only 5 questions can be added, you would set that here.
  2. Admins allowed to view instructor added question results
    ?
    Since instructor questions are actually not part of the hierarchy, this is a separate setting where you can define how many questions an instructor can add to the overall evaluation process when using hierarchy.
  3. Admins allowed to view results from items added below them in the hierarchy
    ?
    This allows an admin to see results from below them in the hierarchy. For example, at Maryland we would want certain people who are admins at the University level to be able to view all results below them, which in this case would be all the college results in addition to the university results.
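
The visibility rule described above could be sketched as follows (illustrative Python, not actual EVALSYS code; the node names follow the Maryland example):

```python
# Hypothetical hierarchy: an admin can view results at their own node
# and at every node below it. Node names are invented for illustration.
HIERARCHY = {
    "University": ["College A", "College B"],
    "College A": [],
    "College B": [],
}

def viewable_nodes(admin_node, tree=HIERARCHY):
    """All nodes at or below admin_node, depth-first."""
    nodes = [admin_node]
    for child in tree.get(admin_node, []):
        nodes.extend(viewable_nodes(child, tree))
    return nodes
```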

HIERARCHY SETTINGS

  1. Use Hierarchy menus, options, and data in the system
    ?
    This option essentially "turns on" hierarchy. If it is not selected, the menu option "Control Hierarchy" and all other pieces of the hierarchy are disabled.
  2. Display hierarchy node headers on the take eval and preview eval views
    ?
    This option allows one to choose to display the actual hierarchy node headers (or names, as I like to think of them) when displaying the evaluation. At Maryland we use this because it automatically displays the university name and college names for the relevant sections of the evaluation (all University questions are grouped and labeled as such, and each set of college questions is grouped and labeled as such).
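
The grouping behaviour could be sketched as below (illustrative only; the question text and college name are invented, and contiguous questions from the same node are grouped under that node's header):

```python
# Sketch of displaying hierarchy node headers on the take-eval view:
# each run of questions from the same node gets that node's header.
from itertools import groupby

QUESTIONS = [
    ("University of Maryland", "Rate the course overall"),
    ("University of Maryland", "Rate the workload"),
    ("College of Engineering", "Rate the lab facilities"),
]

def render_with_headers(questions):
    lines = []
    for node, group in groupby(questions, key=lambda q: q[0]):
        lines.append(node)                       # the node header ("name")
        lines.extend(text for _, text in group)  # its grouped questions
    return lines
```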

GENERAL SETTINGS

Creating templates/evaluations

  1. Not Applicable allowed when creating questions
    Not Applicable (NA) option allowed when creating questions
  2. Enable the optional use of text comments on non-text items
    ? Enable allowing comments when creating non-text item questions
    A question author may want to allow respondents to add a comment to a question. This setting controls the question author's ability to do this.
  3. Maximum number of questions allowed in a question group
    An evaluation creator may group questions together so that they share a common set of labels (color). This setting controls how many questions can be grouped together this way. Q: What ramifications does this have? Why would an evaluation author want to do this?
  4. Template sharing and visibility setting
    I think it is the control choices that can use some clarification
    Template visibility set by creator
    Template is visible and editable only by creator
    Template is visible to anyone, editable by any admin
  5. Enable the use of a category key for grouping when setting up evaluations
    ?
  6. All questions default to course category only (instructor category not allowed)
    ?
  7. Use expert created templates
    ?
  8. Use expert created questions
    ?
    ACM: 'Expert' questions can be made available to all users for use in their own surveys

Assigning evaluations

  1. Allow use of adhoc groups
    ? Allow targeting an evaluation to a site group
    ACM: If 'Login to Sakai' is selected, then these users must have an account in the system; otherwise they will not be allowed to take the survey. Accounts are not created on the fly as in other tools.
  2. Allow use of adhoc users
    ? Allow targeting an evaluation to specific selected users
    ACM: See above.

Timing evaluations

  1. Allow evaluations to be closed early
  2. Allow evaluations to be reopened after they close
    Please note that the original close dates are silently lost when an evaluation is reopened
  3. Use dates and times for evaluations, otherwise use dates only
    ?
  4. Use stop date (grace period) for evaluations, otherwise just use the due date
    ? Use stop date (grace period) for closing evaluations, otherwise just use the due date
    The suggested wording change makes this option clearer.
  5. Use view date for evaluations, otherwise just use the due date
    ?
    View date is the date when users are allowed to see the results.  So, if one wants the date that results are available to be the same date as the due date, then this option would be turned on.
  6. Use same view date for all users (or use separate dates for students, instructors, admins)
    ?
    View date is the date when results are available to view so if you wanted results to be available at different times for students vs. instructors/administrators this option would be used.
  7. Number of days old can an eval be and still be recently closed
    ?
    ACM: There's a panel labelled 'recently closed surveys'; this controls what is displayed therein.
  8. Minimum time difference (in hours) between start date and due date of an evaluation
    ?
    ACM: What is the minimum time a survey can be open
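
The last check above could be sketched as follows (assumed semantics, not actual EVALSYS code; the configured value is hypothetical):

```python
# Sketch of the minimum-open-time check: the due date must fall at least
# the configured number of hours after the start date.
from datetime import datetime, timedelta

MIN_OPEN_HOURS = 4  # hypothetical configured value

def dates_valid(start, due, min_hours=MIN_OPEN_HOURS):
    """True if the evaluation is open for at least min_hours."""
    return due - start >= timedelta(hours=min_hours)
```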

Interface options

  1. Show the "Evaluations I am creating or administering" box (Administrator widget) on the main screen
    An explanation of why one would do or not do this, performance issues, etc. would be good
  2. Show the sites summary box on the main screen
    An explanation of why one would do or not do this, performance issues, etc. would be good
  3. Show the instructor summary box on the main screen
    An explanation of why one would do or not do this, performance issues, etc. would be good
  4. Show links to My Evaluations, My Templates, My Items, My Scales and My Email Templates
    An explanation of why one would do or not do this, performance issues, etc. would be good

Other

  1. Number of responses required before results can be viewed
    ACM: This is to do with anonymity. Responses are supposed to be anonymous, and if you allow the results to be viewed after just one response this would be lost. 2 is the barest minimum; 3 or 4 may be safer.
  2. Allow responses to evaluations to be removed
    Explain: Why would one (who?) want to do this?
    We have instances where students email saying they thought they were evaluating a different professor, or that they switched the scales, and ask us to remove their submission. This option will allow the admin to do this rather than having the IT staff do it on the backend in the database.
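
The anonymity threshold from item 1 could be sketched as below (illustrative only; the threshold value is hypothetical):

```python
# Sketch of the anonymity threshold: results stay hidden until at least
# the configured number of responses has been submitted, so no single
# response can be attributed to a respondent.
MIN_RESPONSES_BEFORE_VIEW = 3  # 2 is the bare minimum; 3 or 4 is safer

def results_viewable(response_count, threshold=MIN_RESPONSES_BEFORE_VIEW):
    return response_count >= threshold
```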

INSTITUTION SPECIFIC SETTINGS

  1. Enable option to flag specific items in evaluations for database export and sharing (UMD)
  2. Enable importing options and controls (UM)
  3. Disable the Item Bank
  4. Disable Question Blocks
    ACM: This should be a general setting. We have found that QBs can be confusing to the user; they don't really need to have direct access via a link since they are able to reuse their questions anyway.
  5. Enable Teaching Assistant item category (UMD/UCT)
  6. Enable Instructor/Assistant Selections for grouping instructor related items (UCT)
    ACM: When we enabled this it prevented any surveys from being submitted - in other words, it completely broke the tool!