Instructor FAQ/Helptext and Troubleshooting


Instructions on using the CATME system and common questions asked about using CATME.

To learn more about what CATME is, visit our "About" page


Account Access

What do I do if I forgot my password?

If you enter your email address into this form, the system will send you an email containing a special link that will allow you to reset your password. This email is similar to the one you received when your account was initially created in the system (although the link in the email is different). If you lost (or never received) the email sent when your account was created, you may use this “Forgot Password” form to create your initial password in the system.

You MUST enter the exact email address that was used when your account was originally created. The system will not alert you if an incorrect email address is used – you will simply not receive any email. If you have multiple email addresses and are not sure which one was used to create your account, simply try entering each address into the form until one of them works. Student users should also be able to ask their instructor for the correct email address.

I can’t sign in. What do I do?

You must have an account to access this system. If you already have an account, simply enter your email address and password to log in. Please remember the following:

  • Email addresses in the system are NOT case-sensitive (“bailey.edu” is the same as “BAILEY.EDU”), but passwords are case-sensitive (“password” and “PASSWORD” are different).
  • If you have multiple valid email addresses, please make sure to use the email address that was used to create the account originally (if you are a student user and unsure what email address was used to create your account, please ask your instructor).
  • If you have forgotten (or don’t know or never set up) your password, you may use the “Forgot your password?” link to set a new password.

The process for creating new accounts is different, depending on whether you are a student or faculty user of the system.

My students can’t log in.

When you upload a CSV file with your student data, students receive an email from the CATME system telling them how to set up their account. If a student doesn’t receive the email, these are the most common reasons:

  • The message is in the junk or spam folder of the student’s email account. Ask the student to look in their spam folder first. If they don’t find the e-mail, they should click the “Forgot Password” button and type in the same e-mail address that you used to load the student into the system.
  • The student received the e-mail but didn’t know what it was and deleted it. If this happens, ask the student to click the “Forgot Password” button and type in the same e-mail address that you used to load the student into the system.
  • The student’s ISP must permit email from catme.org. This is most relevant if the student is working from a home computer or an off-campus server.
  • The student’s software or email server is blocking the CATME IP addresses. If this is the case for a lot of your students, contact your school’s Technical Support Staff and ask them to approve the IP addresses registered for CATME. Our IP addresses for catme.org are 128.46.32.80 and 128.46.32.81.

Have your students go to the main page and click the “Forgot your password” link. They should enter the email you uploaded using the CSV file, and CATME will email them a URL link to reset their account password. This is a good test to see if your students can receive CATME emails.

How do I request a Faculty Account?

Faculty

Please request a new account using the “request faculty account” link in the upper-right corner of the login screen. If you would like to review the format of the on-line instrument the students will see, you may also view a sample instrument. A link to the system’s informational web site is also provided (“Find out more”).

Student Problems


How can I allow a student to re-enter a survey (using Student Editor)?

Allow Survey Re-Entry

 Normally the system prevents the student from re-entering the survey once it has been completed. This button preserves the student’s original responses, but allows them to re-enter the survey and change some of their responses.

Note that clicking any of the Student Editor action buttons will perform the selected action and then automatically take you back to the “Survey Editor” screen. If you have also made changes in some of the other informational fields on this screen, those changes will be LOST unless you first click the Save button to save your changes.

My students claim that the people in their teams are not right.

A likely possibility is that they are looking at a rater practice exercise and a hypothetical team. If they are viewing team members that are not in your class, chances are they are viewing a rater practice exercise. Rater practice exercises are provided to familiarize the students with the CATME peer rating system. If the student is seeing people in their class but not on their team, then the student data was uploaded incorrectly. Please review the “Data Uploading” part of the Instructor FAQ for instructions on how to format and upload student data.

How do my students create an account?

Students cannot create their own CATME accounts. When you create a CATME survey and load your course CSV file, you create their CATME accounts. If a student has already participated in a previous CATME survey, their account already exists.

To access their student account, your students must open the link that was emailed to them by CATME. Once they do this, they must enter, as their username, the email address that you used to create their CATME student account in the CATME database. Then they enter their personal password into CATME. You do not give them a password.

If your student ignored the initial CATME message, they can enter their email address on the login screen and click “Forgot Password”. They will then receive another email from CATME with a link to create their password.

When are my students contacted by CATME?

When you create a class and upload your student roster, the CATME system immediately sends the students an email asking them to create their password. CATME creates each account with the student’s email address as the username. It is a good idea to tell students that their CATME username is the email address you uploaded in the CSV file, and that they will create their own password when they click the link in the CATME welcome email. Students need to know that they must enter their own password into CATME; the system does not create their password, nor do you.

At midnight of the start date, CATME sends students an email asking them to complete the survey. There is a link in the email that takes them directly to the survey. If you set the start date for your survey to the current date, then the survey request also goes out immediately. Two days before the survey end date, CATME sends a reminder email to all students who have not yet completed the survey, asking them to complete it before the due date. You can also click the “Send Reminder” button next to any survey that is currently open. This will immediately send reminder emails only to those students who have not yet completed the survey. These reminders also contain a direct link to the survey.

My student changed his/her email. How do I update the account?

The student can log in to their CATME student account and change their email address without losing access to your survey, or you can add the student to your survey under the new email address, either by clicking “Edit Students” on the Activity Editor page or by adding a row to your CSV file and reloading it.

My student claims they can’t see my survey but other students can access it. Why?
  • This is usually a problem with the student’s email address.  Verify that the email you uploaded for the student is correct and that this is the same email address the student is using to log in. 

  • Sometimes, students use multiple email accounts.  If the student can see another instructor’s CATME surveys, but not yours, the student may be in the CATME system with more than one email address. 

  • To correct any problems related to incorrect student email addresses, go to your CATME faculty account class page and edit the student’s email address.

  • If the student is logging into their CATME student account using the email address you uploaded in the CSV file, they will be able to see your survey. If they claim this is not the case but they still have access to CATME, they must be logging into a different CATME account under another email address. For a walkthrough of the student login process, see the CATME Student Introductory Video.

How do I reopen an activity for a student?

First, go to the Summary page and click on the activity name to open the Activity Editor page. Next, click the Edit Students button. Then, click the checkbox by the student’s name. There is a button at the bottom of the list labeled “Allow Student Re-entry”; clicking it will re-open the survey for the student. Make sure the end date leaves the activity open long enough for the student to respond. Remember to tell the student(s) that the survey has been reopened.

Data Uploading


What format should the student file be in order to upload to CATME?

Import File Format

The file import action expects to read either a “Comma-Separated Values” (CSV) file or a tab-delimited text file. The easiest way to create these files is to get all the student data into a Microsoft Excel spreadsheet and then select “File…Save As…” to save the worksheet in one of the two accepted formats.

The file import action looks for a single “header” line at the top of the input file that maps out where the various required fields are. After the header line come the individual student records. For example:

First, Last, Email, ID, Team, Section

Chase, Miller, chase@sysiphus.com, 1010111, Squirrels, Tues

[… additional student records as needed…]

Some notes about the file format:

  • The different columns containing the various fields can appear in any order you like. Just make sure you get the field names in the “header” line in the right spots.
  • The “First”, “Last”, “Email”, and “ID” fields are required in all input files.
  • The “Team” field is required when loading data for CATME Peer Evaluation surveys. This field is simply ignored when importing data into Team-Maker surveys.
  • The “Section” field is optional and only used when loading class lists for “multi-section” surveys. You can skip this field entirely if you do not need to define multiple sections (see the Multi-Section Survey information above for more details).
  • When importing student lists for Team-Maker surveys, you may optionally provide any or all of the “Sex”, “GPA”, and “Pre” columns containing the student’s gender information, overall GPA, and grade in an associated previous course for this class. If you do not have this information at the time you’re loading the student list, the Team-Maker survey can always request that each student enter this information.
  • The capitalization of the “header” line field names is unimportant. Thus, all of the following are equivalent: “Team”, “team”, “TEAM”, “tEAM”.
  • In fact, in most cases you can just use the first letter of each field name and the interface will recognize the field properly – the exception is the “Sex” field name, which must be fully spelled out in order to distinguish it from the “Section” field.
  • Do not include anything in the header fields except the expected field names. While “First” and “f” are both valid header field names, “First Name” is not. Similarly, “email” is valid, but “e-mail” is not.
  • Your input file is allowed to have the “First” and “Last” name fields combined together in a single field. For example:

Last:First, Email, ID, Team, Section
Miller:Chase, chase@sysiphus.com, 1010111, Squirrels, Tues
[… etc …]

  • When the first and last names are combined in a single field, the names don’t have to be separated by colons – ANY character or set of characters will work. Also, the first and last names can appear in any order. What’s important here is that the “header” line expresses whatever character(s) are used to separate the first and last names, and the order you expect the names to appear in.
  • You are allowed to import a file that contains fields other than the fields used by the interface – these “extra” fields will simply be ignored. Just make sure the “header” line has entries for the unused fields that DON’T match one of the standard fields (i.e., something other than “First”, “Last”, “Email”, “ID”, “Team”, “Section”, “Sex”, “GPA”, or “Pre”).
  • Spaces at the beginning and end of each field are automatically removed as the file is being imported.

A description of the various fields follows:

“First”

 

Student’s first name
Max Length: 20 characters
Allowed characters: letters, numbers, hyphen, space, apostrophe, period

“Last”

 

Student’s last name
Max Length: 20 characters
Allowed characters: letters, numbers, hyphen, space, apostrophe, period

“Email”

 

Student’s email address
Max Length: 40 characters
Allowed characters: something that appears to be a valid email address

“ID”

 

Student’s personal university ID number
Max Length: 15 characters
Allowed characters: letters, numbers, hyphen, and space
NOTE: If you are concerned about privacy issues, you may make up a different ID number for the student purely for use in this system. However, please be consistent across multiple surveys – the student ID number is the only tool the research team has to track students from survey to survey.

“Team” (CATME Peer-evaluation surveys only)

 

The name of the team the student has been assigned to
Max Length: 30 characters
Allowed characters: letters, numbers, hyphen, space

“Section”

 

The name of the class section the student is enrolled in
Max Length: 10 characters
Allowed characters: letters, numbers, hyphen, space

“Sex” (Team-Maker surveys only)

 

The student’s gender
Max Length: N/A
Allowed characters: the value in this field must begin with an ‘M’ or an ‘F’
(case is unimportant)

“GPA” (Team-Maker surveys only)

 

The student’s current overall GPA
Max Length: N/A
Allowed characters: must be a valid decimal number between 0.0 and 5.0

“Pre” (Team-Maker surveys only)

 

The student’s grade in an associated previous or prerequisite course for this class
Max Length: N/A
Allowed characters: must either be a numeric grade (a valid decimal number between 0.0 and 5.0), a letter grade (“A”-“F” with an optional “+”/”-” afterwards; “E” and “F” are equivalent), or “P”/”F” (representing pass/fail grades)
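If you prefer to build the roster file with a script instead of Excel, the short Python sketch below writes a file in the layout shown above. It is only an illustration (the file name roster.csv and the sample row are placeholders taken from the earlier example), not an official CATME tool; any method that produces the same header and comma-separated rows will work.

import csv

# The header names are what the importer keys on; the column order is up to you.
header = ["First", "Last", "Email", "ID", "Team", "Section"]
students = [
    ["Chase", "Miller", "chase@sysiphus.com", "1010111", "Squirrels", "Tues"],
    # ... one row per student ...
]

with open("roster.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    for row in students:
        # Leading and trailing spaces are stripped on import anyway, but trimming keeps the file tidy.
        writer.writerow([value.strip() for value in row])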

Screen 1 – Instructions

 

Basic instructions for navigating the interface. Click the Next >> button to proceed.

Screen 2 — Pick Survey Type

 

This screen allows you to select between a Team-Maker or a Peer Evaluation survey. If this is the first survey for this class, then the Team-Maker choice will be selected by default, though you are free to bypass the Team-Maker functionality entirely and just use the Peer Evaluation instrument.

Screen 3 — Basic Survey Information

You must enter a survey name in the field provided. Survey names may be up to 30 characters and contain letters, numbers, parentheses, colon, period, comma, space, hyphen (“-“), and the octothorpe or “number sign” (“#”).

By default the survey start date is set for tomorrow, and the end date a week later, but you may change these dates as necessary. Note that the survey becomes active at 12am on the start date and closes at 11:59pm on the end date. The interface will allow you to change the start and end dates for the survey whenever you wish, even when the survey is “active”.

Screen 4 — Survey Content

This screen allows you to select the content of your Team-Maker or Peer Evaluation survey.

For Team-Maker surveys there are a number of pre-defined questions to choose from (if you wish to suggest additional questions, please contact catme@catme.org). By default, the set of questions you used on your last Team-Maker survey will be selected. If you have not previously used the system for Team-Maker surveys, then the “Schedule”, “Gender”, “Race”, and “GPA” questions will be selected since research has shown that these are the most critical questions for forming optimal teams. You may customize the survey by selecting or de-selecting questions from the list.

The standard Peer Evaluation instrument assesses student performance based on five behavioral characteristics: Contributing to the Team’s Work, Interacting with Teammates, Keeping the Team on Track, Expecting Quality, and Having Related Knowledge, Skills, and Abilities. However, you may customize the survey by only selecting those characteristics for which you are interested in gathering data. There are also a number of different sets of follow-up questions that you may choose to add after the standard Peer Evaluation survey. These additional questions may give you additional perspectives on how the teams are functioning. If you have previously administered Peer Evaluation surveys via the system then the categories and follow-up questions you used on your last survey will be selected. Otherwise all behavioral categories are selected by default, but none of the follow-up questions.

Regardless of the type of survey, each student will see an introductory page with survey instructions before the survey begins. The Edit Survey Intro button will take you to a separate screen where you may change this instructions text. However, the default text provided by the system is appropriate for most situations.

Screen 5 — Load Students

You must provide a list of students (grouped into teams) for this survey:

  • For Peer Evaluation surveys, if you have already created other surveys for this class then you may simply import the student list and team assignments from a previous survey using the left-hand drop list. If you have previously used the Team-Maker interface to create team assignments for this class, you should see the name of the Team-Maker survey in this drop list, allowing you to import these team assignments into the current survey.
  • Otherwise you can Import the data from a specially formatted file (file format issues are described in the notes above).

Note that once you have added students to the survey using either of the above methods, you will still be able to adjust the student list by adding additional students from another survey or a file using the Append button or by using Replace to discard your initial student list and replace it with a different list of students. As you import student lists, you will see student information updated in a table at the bottom of the screen.

 

Screen 6 — Additional Information

 

This screen only appears for Team-Maker surveys where you have selected the “Grade in prerequisite course” and/or “Software skills” questions. Both of these questions expect you to provide a value to be filled into the question when the survey is presented to the student. In the case of the “Grade in prerequisite course” question you should enter the name of the prerequisite course. For the “Software skills” question, enter the name of the software package on which you wish the students to rate their competency.

Screen 7 — Delegate Faculty

By default the settings, student lists, and results of all of your surveys are only accessible to you. However, for team-taught courses and large multi-section courses you may wish to allow other faculty (or teaching assistants) access to your survey information. As the creator of the survey, you have the option of sharing this survey with one or more additional faculty. In fact, faculty access can even be delegated by section if you have created a multi-section survey by importing a student list with the “Section” field defined.

Clicking the Delegate Faculty button will take you to two additional screens that will help you assign additional faculty access. Otherwise you may simply skip this delegation step and just click the Next >> button to continue to the final screen of the wizard.

Optional Screen 1 — Delegate Faculty

The appearance of this screen varies slightly depending on whether this is a “multi-section” survey (see commentary above for more information) and whether you reached this screen from the survey creation “wizard” or directly from the “Survey Editor” page (see below). However, the basic functionality of this screen is the same in any event:

  • If this is a multi-section survey, you will see a control with a list of the different sections defined for this survey. Select the section or sections you wish to assign faculty to.
  • There will also be a control that lists other faculty with accounts in the system that you may delegate access to. The first part of this list shows faculty at your institution or faculty you have previously shared survey access with. The remainder of the list shows all faculty accounts in the system in alphabetical order. Select one or more faculty members from the list. Note that if you do not include your own name in the group then you will not have access to whatever section(s) you are delegating access to, although as the survey creator you will always have access to the “master” survey view that allows you to see all students in all sections and set the default start/end dates for the survey (though not necessarily the start/end dates on specific sections).
  • Click the Delegate button to update the system with your changes. This will also update the information at the bottom of the screen showing the current delegation settings.

If you wish to assign different faculty to different sections, simply repeat the above steps for each different section.

Clicking the Reset Delegation button in the bottom-left corner of the screen will reset all delegation information back to the default, which means only you will have access to the survey information. Click Finished (upper-right corner) when you are satisfied with the delegation settings.

Optional Screen 2 — Additional Access Control

If you have chosen to delegate survey access to other faculty members besides yourself, then you must also choose exactly what authority those faculty members have as far as managing the survey. Delegated faculty always have the ability to view survey results and send automated reminder messages to students. However, the checkboxes on this screen allow you to grant these faculty members additional authority as follows:

Modify Survey Start/End Dates

 

Allows the delegated faculty members to change the start and end dates for the survey. This is particularly useful for “multi-section” surveys where the sections meet on different days.

Import New Student Lists

 

This would allow the delegated faculty to re-populate the student lists for the survey by importing student lists from a file. It’s usually rare that the delegated faculty members would need to make wholesale changes like this.

Release Survey Data (Peer Evaluation surveys only)

 

Students are unable to see the results of the survey until the data has been reviewed and “released” by the survey owner. Clicking this checkbox allows the delegated faculty to review and release the survey data themselves (possibly on a section-by-section basis if this is a “multi-section” survey). This checkbox will not appear for Team-Maker surveys, since Team-Maker surveys require that team assignments be created before releasing the survey, and only the faculty owner of the survey may create team assignments.

Remember that for “multi-section” surveys, the additional authority granted only applies to the specific sections delegated to a given faculty member.

 

Screen 8 – Completion

 

At this point the survey creation process is complete. This screen merely provides some final helpful information. Click Done when you’ve finished reading the information.

My CSV file isn’t loading correctly. What should the format be?

The file import action expects to read either a “Comma-Separated Values” (CSV) file or a tab-delimited text file. The easiest way to create these files is to get all the student data into a Microsoft Excel spreadsheet and then select “File…Save As…” to save the worksheet in one of the two accepted formats.

The file import action looks for a single “header” line at the top of the input file that maps out where the various required fields are. After the header line, the individual student records follow below. For example:
First, Last, Email, ID, Team, Section
Chase, Miller, chase@sysiphus.com, 1010111, Squirrels, Tues [… additional student records as needed …]

Some notes about the file format:

  • The different columns containing the various fields can appear in any order you like. Just make sure you get the field names in the “header” line in the right spots.
  • The “First”, “Last”, “Email”, and “ID” fields are required in all input files.
  • The “Team” field is required when loading data for CATME Peer Evaluation surveys. This field is simply ignored when importing data into Team-Maker surveys.
  • The “Section” field is optional and only used when loading class lists for “multi-section” surveys. You can skip this field entirely if you do not need to define multiple sections.
  • When importing student lists for Team-Maker surveys, you may optionally provide any or all of the “Sex”, “GPA”, and “Pre” columns containing the student’s gender information, overall GPA, and grade in an associated previous course for this class. If you want access to this information but do not have it at the time you’re loading the student list, you can set up your Team-Maker survey to request that each student enter this information.
  • The capitalization of the “header” line field names is unimportant. Thus, all of the following are equivalent: “Team”, “team”, “TEAM”, “tEAM”.
  • In fact, in most cases you can just use the first letter of each field name and the interface will recognize the field properly – the exception is the “Sex” field name, which must be fully spelled out in order to distinguish it from the “Section” field.
  • Do not include anything in the header fields except the expected field names. While “First” and “f” are both valid header field names, “First Name” is not valid. Similarly, “email” is valid, but “e-mail” is not.
  • Your input file is allowed to have the “First” and “Last” name fields combined together in a single field. For example:
    Last:First, Email, ID, Team, Section
    Miller:Chase, chase@sysiphus.com, 1010111, Squirrels, Tues [… etc …]
  • When the first and last names are combined in a single field, the names don’t have to be separated by colons – ANY character or set of characters will work (for example, commas or semicolons). Also, the first and last names can appear in any order. What’s important here is that the “header” line expresses whatever character(s) are used to separate the first and last names, and the order in which the names will appear.
  • You can import a file that contains fields other than the fields used by the interface – these “extra” fields will simply be ignored. Just make sure the “header” line has entries for the unused fields that DON’T match one of the standard fields (i.e., something other than “First”, “Last”, “Email”, “ID”, “Team”, “Section”, “Sex”, “GPA”, or “Pre”).
  • Spaces at the beginning and end of each field are automatically removed as the file is being imported.

A description of the various fields follows:
“First”
Student’s first name
Max Length: 20 characters
Allowed characters: letters, numbers, hyphen, space, apostrophe, period
“Last”
Student’s last name
Max Length: 20 characters
Allowed characters: letters, numbers, hyphen, space, apostrophe, period
“Email”
Student’s email address
Max Length: 40 characters
Allowed characters: something that appears to be a valid email address
“ID”
Student’s personal university ID number
Max Length: 15 characters
Allowed characters: letters, numbers, hyphen, and space
NOTE: If you are concerned about privacy issues, you may make up a different ID number for the student purely for use in this system. However, please be consistent across multiple surveys.
“Team” (CATME Peer Evaluation surveys only)
The name of the team the student has been assigned to
Max Length: 30 characters
Allowed characters: letters, numbers, hyphen, space
“Section”
The name of the class section the student is enrolled in
Max Length: 10 characters
Allowed characters: letters, numbers, hyphen, space
“Sex” (Team-Maker surveys only)
The student’s gender
Max Length: N/A
Allowed characters: the value in this field must begin with an ‘M’ or an ‘F’
(case is unimportant)
Students can also identify their own gender within the Team-Maker Survey if you ask for that information.
“GPA” (Team-Maker surveys only)
The student’s current overall GPA
Max Length: N/A
Allowed characters: must be a valid decimal number between 0.0 and 5.0
“Pre” (Team-Maker surveys only)
The student’s grade in an associated previous or prerequisite course for this class
Max Length: N/A
Allowed characters: must either be a numeric grade (a valid decimal number between 0.0 and 5.0), a letter grade (“A”-“F” with an optional “+”/”-” afterwards; “E” and “F” are equivalent), or “P”/”F” (representing pass/fail grades)
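If an upload keeps being rejected and the cause isn’t obvious, it can help to check the file against the limits above before importing it again. The Python sketch below is one possible pre-check, not part of CATME itself; it assumes the header uses the full field names (First, Last, Email, ID, Team, Section, and optionally Sex, GPA, Pre) and simply encodes the maximum lengths and grade formats documented above.

import csv
import re

# Maximum field lengths documented above.
MAX_LEN = {"First": 20, "Last": 20, "Email": 40, "ID": 15, "Team": 30, "Section": 10}
LETTER_GRADE = re.compile(r"^[A-F][+-]?$", re.IGNORECASE)  # letter grades; "E" and "F" are equivalent

def check_row(row):
    problems = []
    for field, limit in MAX_LEN.items():
        value = (row.get(field) or "").strip()
        if len(value) > limit:
            problems.append(f"{field} is longer than {limit} characters: {value!r}")
    sex = (row.get("Sex") or "").strip()
    if sex and sex[:1].upper() not in ("M", "F"):
        problems.append(f"Sex must begin with M or F: {sex!r}")
    gpa = (row.get("GPA") or "").strip()
    if gpa:
        try:
            if not 0.0 <= float(gpa) <= 5.0:
                problems.append(f"GPA outside 0.0-5.0: {gpa!r}")
        except ValueError:
            problems.append(f"GPA is not a number: {gpa!r}")
    pre = (row.get("Pre") or "").strip()
    if pre and not LETTER_GRADE.match(pre) and pre.upper() not in ("P", "F"):
        try:
            if not 0.0 <= float(pre) <= 5.0:
                problems.append(f"Pre grade outside 0.0-5.0: {pre!r}")
        except ValueError:
            problems.append(f"Pre grade not recognized: {pre!r}")
    return problems

with open("roster.csv", newline="") as f:
    for line_number, row in enumerate(csv.DictReader(f), start=2):  # line 1 is the header
        for problem in check_row(row):
            print(f"Line {line_number}: {problem}")

The allowed-character rules (letters, numbers, hyphen, space, and so on) could be checked the same way with additional regular expressions.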

How do I access my students’ data?

First, you need to access the Class Editor page by clicking on the class on the Summary Page. Then, click on the Data & Teams button next to the activity on the Class Editor page. This button is not available before the start date. Only the faculty members who created the survey or are delegated into a course can see the student data. If you wish to see which students have entered data and/or what data they have entered, click on the Raw Data button.

Creating and Modifying a Survey


How can I modify survey start and end dates?

Start and End Date

 Note that you are allowed to adjust the survey start and end date even when the survey is currently “active”. For example, you may want to extend the survey end date to allow more students to complete the survey. If you extend the end date to re-open a survey that has ended, students who have not completed the survey will automatically receive a notice (via email) from the system that the survey has been re-opened. Similarly, they will receive the standard survey close warning 48 hours before the new end date is reached.

How can I delegate faculty so that others can have access to the survey information?

Screen 7 — Delegate Faculty

By default the settings, student lists, and results of all of your surveys are only accessible to you. However, for team-taught courses and large multi-section courses you may wish to allow other faculty (or teaching assistants) access to your survey information. As the creator of the survey, you have the option of sharing this survey with one or more additional faculty. In fact, faculty access can even be delegated by section if you have created a multi-section survey by importing a student list with the “Section” field defined.

Clicking the Delegate Faculty button will take you to two additional screens that will help you assign additional faculty access. Otherwise you may simply skip this delegation step and just click the Next >> button to continue to the final screen of the wizard.

Optional Screen 1 — Delegate Faculty

The appearance of this screen varies slightly depending on whether this is a “multi-section” survey (see commentary above for more information) and whether you reached this screen from the survey creation “wizard” or directly from the “Survey Editor” page (see below). However, the basic functionality of this screen is the same in any event:

  • If this is a multi-section survey, you will see a control with a list of the different sections defined for this survey. Select the section or sections you wish to assign faculty to.
  • There will also be a control that lists other faculty with accounts in the system that you may delegate access to. The first part of this list shows faculty at your institution or faculty you have previously shared survey access with. The remainder of the list shows all faculty accounts in the system in alphabetical order. Select one or more faculty members from the list. Note that if you do not include your own name in the group then you will not have access to whatever section(s) you are delegating access to, although as the survey creator you will always have access to the “master” survey view that allows you to see all students in all sections and set the default start/end dates for the survey (though not necessarily the start/end dates on specific sections).
  • Click the Delegate button to update the system with your changes. This will also update the information at the bottom of the screen showing the current delegation settings.

If you wish to assign different faculty to different sections, simply repeat the above steps for each different section.

Clicking the Reset Delegation button in the bottom-left corner of the screen will reset all delegation information back to the default, which means only you will have access to the survey information. Click Finished (upper-right corner) when you are satisfied with the delegation settings.

How can I edit the survey introduction?

Edit Instruction Text

This screen allows you to modify the instructions text the students will see at the beginning of the survey. You may modify the text in the box in the middle of this screen (or simply delete this text and replace it with your own). The Preview Text button will update the bottom half of the screen and allow you to see how your text will appear to the students.

Note that the text should be formatted using the standard HTML markup language. Common HTML formatting commands include:

  • Create paragraph breaks by putting a <P> tag at the end of the paragraph
  • To make the text Bold type <B>, to end bold type </B>
  • To make the text Italicized type <I>, to end italics type </I>
  • To Underlinethe text type a <U>, to end underline type </U>
  • To change the font color type <font color="color">, to end the font color type </font>
    (where color equals red, blue, purple, green, etc.)

You can make the text bold, italicized, underlined, and a different color by putting your commands next to each other (<B><I><U><font color="green">your text</font></U></I></B>). Don’t forget to end your formatting by placing the appropriate closing tag (for example, </B>) for each format that you have added.

When you are satisfied with your changes, click the Save button in the upper-right corner. Save and Return will save your changes and return to the previous screen. Cancel returns to the previous screen without saving your changes.

How can I modify class information?

Class Editor

This screen allows you to modify information about an existing class. You may also use this interface to create new classes quickly, instead of using the multi-screen “wizard” interface (see the discussion of the “My Profile” page for further information).

The informational fields on this screen include:

Class

 

Enter the name of the class in this field. Class names may be up to 30 characters long and may contain letters, numbers, space, and hyphen (“-“).

Term

Choose appropriate values from the drop-down lists.

Type

 

Indicate the class format by making the appropriate selection from the drop-list.

Institution

 

This field is normally pre-filled with the name of the institution you provided when signing up for your faculty account (this information may be changed via the My Profile button on your system home page). However, you may change this information on a class-by-class basis simply by modifying this field. Institution names may be up to 35 characters long and contain letters, numbers, period, comma, space, and hyphen (“-“).

Time Zone

 

The time zone information is used so that student survey start/end times happen at the correct times for your institution’s local time zone. Again, this field is normally populated automatically from the information you provided when setting up your faculty account but may be changed on a class-by-class basis by modifying this field. If you’re not sure what time zone is appropriate, you may find this link useful.

Enable Extra Messages

 

The CATME Peer Evaluation instrument uses the data it collects from students to recognize certain “exceptional conditions” – dysfunctional teams, over- or under-performing students, etc. When the system recognizes one of these conditions, the system normally provides the affected students with an additional informational message at the top of their survey results. You may choose to have the system NOT display these messages by unchecking this box. More information about these “exceptional” conditions and the associated student messages can be found here. This setting has no effect on Team-Maker surveys.

 

After making changes to any of the above fields, be sure to click the Save button in the upper-right corner to save your changes. Save and Return saves your changes and automatically takes you back to your personal home page in the system. Cancel takes you back to your home page without saving any changes.

At the bottom of this screen you will see a list of the surveys for this class (if any) with start/end date information and a completion gauge that indicates how many students have completed the survey. Click on a survey name to modify information about that survey. While the survey is active, the Send Reminder button will appear next to each survey and allow you to send reminder emails to students who have not yet completed the survey (remember that the system automatically sends a reminder email 48 hours prior to the end date of the survey, so you rarely have to use this button).

Once the survey start date has been reached, another button will appear depending on the type of survey. For Team-Maker surveys, the Data & Teams button will let you view the survey data collected from students and use the Team-Maker interface to create teams of students based on their survey responses. CATME surveys will have a View Results button that allows you to view the collected survey data, along with student comments and notes about “exceptional conditions”.

Use the Add Survey button above the list of surveys to add a new survey for this class. If no surveys are currently defined for this class, you will see a Delete Class button in the lower-right corner instead of the usual survey list. In other words, if you wish to delete a class from the system you must first make sure that all surveys associated with this class have been deleted (you may delete a survey by clicking on the survey name link and using the “Survey Editor” form as described below).

What are “multi-section” surveys and how can I use them for classes with multiple sections?

A Word About “Multi-Section” Surveys

Surveys may be made up of several distinct sections, which can be administered separately, and administration may even be delegated across many different faculty users. Or you may quickly create surveys with just a single group of students and not bother delegating access to other faculty members.

The “multi-section” survey functionality in the interface is primarily designed for large, team-taught classes with multiple sections that may meet at different times. Special functionality that is available to “multi-section” surveys includes:

  • The ability to set different survey start/end dates for different sections of the class.
  • The ability to grant different faculty members access to modify survey parameters and student lists, and to access survey data, on a section-by-section basis.

Frankly, if you do not need to set different survey start/end dates and you are the only faculty member at your institution that is interested in working with this system, then there is no reason to complicate things by using “multi-section” surveys. In fact, even if you don’t set up a “multi-section” survey, you can still allow other faculty members to access survey information, assuming they also have accounts in the system (useful for team-taught classes that do not actually have multiple sections).

A survey is created as a “multi-section” survey when you load a student list that defines the special “Section” field (see notes below for information on importing student lists). Another alternative to “multi-section” surveys is to create a separate survey for each section and manage each such survey individually. Feel free to experiment with both “multi-section” and basic survey functionality and decide which makes the most sense for your particular application.

 

At any time in the interface you can click the Help link in the upper-right corner of the screen to view this help text. Logout logs you out of the system (as will exiting your web browser). Contact allows you to quickly send email to our research and development team if you have questions or other feedback.

The standard action buttons on most screens in the interface are located in the upper-right corner as well. For example, on your default “home page” in the interface there are buttons to start the process of creating a new class and set of surveys in the system (Add New Class) and edit your personal information, including your system password (My Profile).

As you create surveys in the system, they will be displayed on your personal home page. Each survey entry will display the class name, the survey name, the start and end dates during which the survey is available to students, and a gauge indicating what percentage of the students have completed the survey. If you created the survey with multiple sections, then you will see one entry for the survey as a whole, which allows you to administer all sections simultaneously, and then one additional entry per section (look for the section names in parentheses after the survey name), which allows you to administer a given section as an individual unit.

 

Clicking on a class name in the list of surveys will take you to a page that allows you to modify information about that class, while clicking the survey name takes you to a page for modifying information about that survey. Typically, the information you are most likely to change is on the survey page, since this is where the start and end dates for the survey are defined, as well as the list of students to be surveyed.

In addition, once the start date is reached and the survey becomes “active”, you will see two additional buttons next to the survey listing. The Send Reminder button will send a reminder email to all students who have not yet completed the survey asking them to log in and complete the survey. Note that reminder messages are automatically sent by the system 48 hours prior to the end of the survey period, so you may never need to use this button.

The second button varies depending on the type of survey instrument. Team-Maker surveys will have a Data & Teams button, which allows you to view survey results and form teams based on the student survey responses, and save those team assignments so that they can be used when creating future Peer Evaluation surveys. CATME surveys will display a View Results button that gives you access to the survey data that has been collected to date, along with student comments and notices of “exceptional conditions”.

In some cases, a second list of surveys will appear above the normal survey list. These are surveys that require your attention for some reason – the reason will be listed to the right of the survey, and clicking on this text will take you to a screen where you can address the problem. The conditions that may require your attention include:

No surveys defined

 

You have created a class, but have not yet created any surveys for this class. Go to the class page for this class and click the Add Survey button in the bottom-left corner to add one or more surveys.

Please add students

 

You have entered the basic survey information, but not actually assigned any students to this survey. Go to the survey page and add students manually or by importing them from a file or previous survey.

Addtl info required

 

Your Team-Maker survey includes the “Grade in prerequisite course” and/or “Software skills” questions, but you have not provided the course and/or software package name for the survey. Enter the required information via the survey page.

Please define teams

 

The end date for this Team-Maker survey has passed, but you have not yet used the Team-Maker interface to create team assignments based on the survey results. Click on the link provided or use the Data & Teams button for this survey to create teams.

Survey data not released

 

The end date for this survey has passed, but you have not yet released the survey data to the students in the class and/or the research team. Clicking on the link will take you to the appropriate page to release the data, depending on the type of survey (Team-Maker or Peer Evaluation). Alternatively, you can go to the survey page for this survey and extend the end date – for example, when not enough students have completed the survey. Note that the system will send you reminder messages every week until you release the survey data.

How do I create surveys for a class?

Creating Surveys with Multi-Screen “Wizard”

By default, the system provides a simple multi-screen “wizard” interface for creating new surveys (similar to the wizard used for creating classes that is described above). At any time you can abort the wizard by hitting the Cancel button in the upper-right corner of each screen. You may use the buttons to navigate within the wizard or use the navigation bar to move to the next step or any completed step.

As described on the initial screen of the wizard, it is a good idea to have the student list for this survey already prepared before proceeding with the wizard. While you can import a student list from any survey you have previously created for this class, in most cases student lists are loaded from a file. The format of this file is described in the following notes. Specific instructions for the survey wizard appear after these notes.

How do I create classes in CATME?

Creating Classes with Multi-Screen “Wizard”

By default, the system provides a simple multi-screen “wizard” interface for creating new class entries. At any time you can abort the wizard by hitting the Cancel button in the upper-right corner of each screen. You may use the buttons to navigate within the wizard or use the navigation bar to move to the next step or any completed step.

Screen 1 – Instructions

 

Basic instructions for navigating the interface. Click the Next >> button to proceed.

Screen 2 — Basic Class Information

 

Most of the information on this screen will already be pre-filled from the personal information you entered when requesting your faculty account (if you wish to change these defaults, use the My Profile link off of your system “home page”). You have the option of changing any of the pre-filled values you want, but are required to at least enter a class name in the appropriate fields. The class name may be up to 30 characters and contain letters, numbers, space, and/or hyphen (“-“).

Screen 3 — Extra Messages to Students

 

The CATME Peer Evaluation instrument uses the data it collects from students to recognize certain “exceptional conditions” – dysfunctional teams, over- or under-performing students, etc. When the system recognizes one of these conditions, the system normally provides the affected students with an additional informational message at the top of their survey results. You may choose to have the system NOT display these messages by unchecking the box on this screen. More information about these “exceptional conditions” and the associated student messages can be found here. This setting has no effect on Team-Maker surveys.

Screen 4 – Completion

 

At this point the class creation process is complete. This screen merely provides some final helpful information. Click Done when you’ve finished reading the information.

CATME is creating duplicate surveys. Why?

An apparent duplicate survey is created when you have one class but use the Section column. One apparent survey is the master survey for a class with multiple sections, and there will also be one survey listed for each section. Unless you have more than one section in a class taking a CATME survey, DO NOT use the Section column. When you delete the Section column from your CSV file, only one survey will be created.

How do I delete an activity?

Open the survey and scroll down to the bottom of the page. Click on the Delete button. Note that the system is designed to make it unlikely that you would delete an activity by accident.

Survey Question Editing


How can I access advanced question settings in Question Editor?

Advanced Question Settings

This screen is used to select advanced options for questions; these options are all defaulted to the most common settings, but you may use this to tailor the question for more complex circumstances.

Student Surveys

 “Student must answer” vs. “Student may skip” the question — this control defaults to “Student must answer” (i.e., question is mandatory), but if it is actually an optional question, you may set this to “Student may skip” and the student will have the option of bypassing the question on the survey.

Survey Answers — the answers may be displayed to the students in a dropdown list (the default) or with radio buttons. An example is displayed to the right of this control, using the first two answers you provided for the question.

Faculty Results

 

Weight range — Team-Maker scoring has a numeric range associated with it, and this is where you can control the range for this question. More details can be found here.

Scoring labels — this control can be used to shift between two different labeling schemes for the weight range, Distribute/Ignore/Don’t Outnumber, or Group Dissimilar/Ignore/Group Similar. Like weight range, a fuller discussion of the effects of this control can be found here.

Results shown — results can be shown based on either the answers’ display values or their storage values. When those are different from each other, one is usually human-readable (and more suitable for display in results), and one may consist of alphanumeric codes more suitable for storage in the database.

Answers listed — for the survey results shown to faculty, the answers can either be displayed in the inherent answer order (the order they were entered on the answer-creation page) or in order of the frequency with which they were chosen.

Append gender to answers — the example included on the screen, Sports, is the most useful for understanding this setting. For Sports, which has this attribute set, the choices are effectively subdivided by Gender automatically, so the Gender of the respondent is crossed with the choice for this question. Thus, the choice of Lacrosse would be documented as either Men’s Lacrosse or Women’s Lacrosse, based on the answer for Gender.

Loading data

 

Use column name

Minority- and Gender-related

Minority weighting — this is one of our most complex controls, and the most prominent example in the system for explaining it is Gender. One uses this option to help the system evenly distribute team members who have similar answers to a question with minority weighting.

Majority answer — the setting of this attribute is only pertinent when minority weighting is turned on and discrete answers are being requested of those surveyed. That is, this is irrelevant if the question is requesting a numeric or text answer. For Gender, the majority answer would be Male; for Race, it would be White.

Segments other minority groups — this controls the interplay between multiple minority groups for the purposes of forming teams. The system question regarding Gender has this set, for example, and it means that another question with this set would multiply its options by those of Gender. If a question had two possible answers, A and B, and a survey included both this question and Gender, there would be four minority groups: Male & A, Male & B, Female & A, and Female & B.
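To make that multiplication of groups concrete, here is a tiny Python illustration (a hypothetical sketch using the A/B question described above, not CATME’s internal code):

from itertools import product

# Hypothetical answer sets: the Gender question crossed with a two-answer question.
gender_answers = ["Male", "Female"]
other_answers = ["A", "B"]

# With "segments other minority groups" set, every combination becomes its own group.
groups = [f"{g} & {a}" for g, a in product(gender_answers, other_answers)]
print(groups)  # ['Male & A', 'Male & B', 'Female & A', 'Female & B']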

How can I create my own questions for surveys?

Creating Questions with Multi-Screen “Wizard”

 

Screen 1 – Instructions

 

Basic instructions for navigating the interface. Click the Next >> button to proceed.

Screen 2 – Basic Question Information

 

This screen contains three fields: the Title displays back on the Question Manager and on the Survey Editor, the Question displays to students taking the survey, and the Type allows you to choose a question whose answer is Numeric, Multiple-Choice, or Textual. If you have chosen Multiple-Choice, you also have the opportunity of creating one where a student can select only one answer, or one where they can indicate all answers that apply.

Screen 3 – Answers

 

This screen only displays if the question is one of the two Multiple-Choice types. You should provide the possible answers to your question, in display order, in the left column fields, “Display Text”. The right column, “Stored As”, is the value that the system should store when this answer is chosen.

Screen 4 – Community-Created Listing

 

This screen contains a single checkbox, which controls whether your question is to be listed on other faculty’s Community Question Import screens.

Screen 5 – Completion

 

At this point the question creation process is complete. This screen provides some final helpful information. Click Done when you’ve finished reading the information.

How can I customize the questions in my survey?

Survey Editor

This screen allows you to modify information about an existing survey. You may also use this interface to create new surveys quickly, instead of using the multi-screen “wizard” interface (see the discussion of the “My Profile” page for further information). Information on importing student lists via the “Import Students from File” control can be found here.

Note that some of the input fields and other controls described below may not appear on the screen if you are using this interface to view a survey that was created by another user. For example, the start and end date fields would not be editable unless the survey owner explicitly granted you access to change the survey dates.

The informational fields on this screen include:

Survey Name

 

Enter the name of the survey in this field. Survey names may be up to 30 characters and contain letters, numbers, parentheses, colon, period, comma, space, hyphen (“-“), and the octothorpe or “number sign” (“#”).

Start and End Date

 

Note that you are allowed to adjust the survey start and end date even when the survey is currently “active”. For example, you may want to extend the survey end date to allow more students to complete the survey. If you extend the end date to re-open a survey that has ended, students who have not completed the survey will automatically receive a notice (via email) from the system that the survey has been re-opened. Similarly, they will receive the standard survey close warning 48 hours before the new end date is reached.

Surveyed Categories

For Team-Maker surveys there are a number of pre-defined questions to choose from (if you wish to suggest additional questions, please contact catme@catme.org). You may customize the survey by selecting or de-selecting questions from the list. If you are using this form to create a new survey then by default the set of questions you used on your last Team-Maker survey will be selected. If you have not previously used the system for Team-Maker surveys, then the “Schedule”, “Gender”, “Race”, and “GPA” questions will be selected since research has shown that these are the most critical questions for forming optimal teams.

The standard CATME Peer Evaluation instrument assesses student performance based on five behavioral characteristics: Contributing to the Team’s Work, Interacting with Teammates, Keeping the Team on Track, Expecting Quality, and Having Related Knowledge, Skills, and Abilities. However, you may customize the survey by only selecting those characteristics for which you are interested in gathering data. There are also a number of different sets of follow-up questions that you may choose to add after the standard Peer Evaluation survey. These additional questions may give you additional perspectives on how the teams are functioning. If you have previously administered Peer Evaluation surveys via the system then the categories and follow-up questions you used on your last survey will be selected. Otherwise all behavioral categories are selected by default, but none of the follow-up questions.

The following controls will also appear as checkboxes if you have chosen to delegate access to this survey to other faculty users in the system:

Modify Start/End Dates

 

Allows delegated faculty members to change the start and end dates for the survey. This is particularly useful for “multi-section” surveys where the sections meet on different days. Note that even if you are the survey owner, this control will become inactive as soon as the start date for any section of the survey is reached (too many strange error conditions can happen if the state of this control is changed once a survey has commenced). You could theoretically change the start date for all sections of the survey in order to reactivate this control, but that could also potentially cause difficulties.

Import Student Lists

 

This allows the delegated faculty to re-populate the student lists for the survey by importing student lists from a file. It is rare that delegated faculty members need to make wholesale changes like this.

Release Survey Data (Peer Evaluation surveys only)

 

Students are unable to see the results of the survey until the data has been reviewed and “released” by the survey owner. Clicking this checkbox allows the delegated faculty to review and release the survey data themselves (possibly on a section-by-section basis if this is a “multi-section” survey). This checkbox will not appear for Team-Maker surveys, since Team-Maker surveys require that team assignments be created before releasing the survey, and only the faculty owner of the survey may create team assignments.

After making changes to any of the above fields, be sure to click the Save button in the upper-right corner to save your changes. Save and Return saves your changes and automatically takes you back to the previous screen (either your personal home page or the “Class Editor”). Cancel takes you back to the previous screen without saving any changes.

The Edit Survey Intro button to the right of the above input fields will take you onto a separate screen where you can edit the introductory text students will see at the beginning of the survey (however, the default intro text provided by the system is appropriate for most situations). The Delegate Faculty button will take you to another screen where you can grant access to your survey information to other faculty users in the system. Both of these screens are more fully documented elsewhere.

Below the standard input fields you will see a list of the students assigned to this survey (sorted by section name, team name, last name, and first name). Clicking on a student’s name will take you to a “Student Editor” screen where you can change basic information about that student– fix typos in the student’s name, change the email address that the system uses, etc. (clicking on the student’s email address will allow you to quickly send an email to that student). The “Student Editor” also has controls that allow you to delete the student or the data they submitted via the on-line survey, or to re-open the survey for that student so that they can go back and change some of their answers.

The Edit Students button next to the student list takes you to a different screen where you can delete multiple students, erase their survey answers, or allow survey re-entry to multiple students simultaneously. The Add Student button will allow you to add individual student records manually.

However, in most cases you will load (or re-load) student lists using the “Import …” controls at the bottom of the screen. “Import Students From Survey” is only active for Peer Evaluation surveys, and only if you have previously created other surveys for this class. This control allows you to quickly re-use student and team assignments from a previous survey. If you have created team assignments using a Team-Maker survey, then the name of the Team-Maker survey should appear in the drop-down list, allowing you to import those team assignments into this survey. Otherwise you may “Import Students From File” (the format for these input files is described in the notes above).

Note that once you have added students to the survey using either of the above methods, you will still be able to adjust the student list by adding additional students from another survey or a file using the Append button or by using Replace to discard your initial student list and replace it with a different list of students. As you import student lists, you will see student information being updated in the table above the “Import …” controls.

It is important to note that importing student lists has no impact on survey data that may have already been collected for a given student. So if you notice some mistakes in your student list after the survey has commenced, you can simply correct your original student list file and re-import it using the Replace action to quickly fix the errors. Similarly the interface is careful to not “spam” the students with repetitive emails when the user list is re-imported, so students will only receive one copy of the standard system “Welcome” message, etc. The only caveat is that the system tracks unique student users by their email address. So if you change a student’s email address and then re-import the student list, that student will appear to be a “new student” as far as the system is concerned. You will effectively “lose” any survey data collected for that student, and the student will receive a new “Welcome” and survey reminder message at their new email address. The “safe” way to change a student’s email address is to click their name in the student list and use the “Student Editor” screen to update their email address in the system.

In the lower-right corner of the screen, below the student list and “Import …” controls, you will find a “Delete Survey” button. If you click this button, the survey AND ALL DATA ASSOCIATED WITH IT will be deleted from the system. There is no way to recover the data after clicking this button, so please take care!

What are the Follow-Up questions in the survey?

Information on Follow-Up Questions

The different measures below may optionally be included after a CATME Peer Evaluation survey.

Follow-Up Questions

From Ohland, M. W., Bullard, L. G., Felder R. M., Finelli, C. J., Layton, R.A., Loughry, M. L., Pomeranz, H. R., Schmucker, D. G., & Woehr, D. J. (working paper) “Comprehensive Assessment of Team Member Effectiveness: A Behaviorally Anchored Rating Scale.” The “Liking” scale (I like this person as an individual, I consider this person to be a friend, I enjoy spending time with this person) was adapted from Jehn, K. A., & Mannix, E. A. (2001), “The dynamic nature of conflict: A longitudinal study of intragroup conflict and group performance”, Academy of Management Journal, 44, 238-251.

Scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree Nor Disagree, 4 = Agree, 5 = Strongly Agree

  • I would gladly work with this individual in the future
  • If I were selecting members for a future work team, I would pick this person
  • I would avoid working with this person in the future (scale reversed)
  • I like this person as an individual
  • I consider this person to be a friend
  • I enjoy spending time with this person
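
Some of the items above, and in the measures that follow, are marked “(scale reversed)”. If you export the follow-up data and analyze it yourself, check whether reversed items appear as raw responses or have already been reversed (this page does not say which); if they are raw, the conventional correction on a 1-5 Likert scale is shown in this small sketch:

    def reverse_score(response: int, scale_min: int = 1, scale_max: int = 5) -> int:
        """Reverse-score a Likert response: on a 1-5 scale, 1<->5, 2<->4, 3 stays 3."""
        return scale_max + scale_min - response

    print(reverse_score(2))   # -> 4
    print(reverse_score(5))   # -> 1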

Team Conflict

From Jehn, K. A. & Mannix, E. A. (2001), “The dynamic nature of conflict: A longitudinal study of intragroup conflict and group performance”, The Academy of Management Journal, 44, 238-251.

Scale: 1 = None or Not at all, 2 = Little or Rarely, 3 = Some, 4 = Much or Often, 5 = Very Much or Very Often

Task Conflict

  • How much conflict of ideas is there in your work group?
  • How frequently do you have disagreements within your work group about the task of the project you are working on?
  • How often do people in your work group have conflicting opinions about the project you are working on?

Relationship Conflict

  • How much relationship tension is there in your work group?
  • How often do people get angry while working in your group?
  • How much emotional conflict is there in your work group?

Process Conflict

  • How often are there disagreements about who should do what in your work group?
  • How much conflict is there in your group about task responsibilities?
  • How often do you disagree about resource allocation in your work group?

Team Satisfaction

 

From Van der Vegt, G. S., Emans, B. J. M., & Van de Vliert, E. (2001), “Patterns of interdependence in work teams: A two-level investigation of the relations with job and team satisfaction”, Personnel Psychology, 54, 51-69 (with minor modifications).

Scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree Nor Disagree, 4 = Agree, 5 = Strongly Agree

  • I am satisfied with my present teammates
  • I am pleased with the way my teammates and I work together
  • I am very satisfied with working in this team

Team Interdependence

From Van der Vegt, G. S., Emans, B. J. M., & Van de Vliert, E. (2001), “Patterns of interdependence in work teams: A two-level investigation of the relations with job and team satisfaction”, Personnel Psychology, 54, 51-69 (with minor modifications).

Scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree Nor Disagree, 4 = Agree, 5 = Strongly Agree

  • My teammates and I have to obtain information and advice from one another in order to complete our work
  • I depend on my teammates for the completion of my work
  • I have a one-person job; I rarely have to check or work with others (scale reversed)
  • I have to work closely with my teammates to do my work properly
  • In order to complete our work, my teammates and I have to collaborate extensively

Team Cohesiveness

From Carless, S. A., & de Paola, C. (2000), “The Measurement of Cohesion in Work Teams”, Small Group Research, 31, 71-88 and Loughry, M. L., & Tosi, H. L. (2008), “Performance Implications of Peer Monitoring”, Organization Science, 19 (6): 876-890.

Scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree Nor Disagree, 4 = Agree, 5 = Strongly Agree

Task Attraction

  • Being part of the team allows team members to do enjoyable work
  • Team members get to participate in enjoyable activities
  • Team members like the work that the group does

Interpersonal Cohesiveness

  • Team members like each other
  • Team members get along well
  • Team members enjoy spending time together

Task Commitment

  • Our team is united in trying to reach its goals for performance
  • I’m unhappy with my team’s level of commitment to the task (scale reversed)
  • Our team members have conflicting aspirations for the team’s performance (scale reversed)

Peer Influences

 

From Loughry, M. L., & Tosi, H. L. (2008), “Performance Implications of Peer Monitoring”, Organization Science, 19 (6): 876-890 (with some modification).

Scale: 1 = Almost Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Almost Always

How often do members of your team…

Notice

  • See what team members do on the team’s work?
  • Notice what team members are doing on the team’s work?
  • Notice how team members behave?
  • Observe how team members do their part of the team’s job?

Praise

  • Congratulate team members if they are recognized for doing good work?
  • Let others know that a team member is doing good work?
  • Tell team members that they did a good job?

Correct

  • Take action if a team member is doing the job incorrectly?
  • Correct team members when they make mistakes?
  • Let team members know if they are doing something wrong?

Report

  • Tell the instructor if a team member is doing something wrong?
  • Let the instructor know if a team member is not meeting expectations?
  • Tell the instructor if a team member is not keeping commitments to the team?
  • Talk with the instructor about a team member who is letting the team down?
  • Mention to the instructor that a team member is doing a poor job?

Discuss

  • Talk within the team about how team members do the job?
  • Discuss with team members how everyone performs team tasks?
  • Have team conversations about team members’ performance?
  • Communicate openly within the team about members’ performance?

Gossip

  • Gossip about team members who do not perform like the rest of the team?
  • Gossip about team members?
  • Gossip about team members’ performance?
  • Gripe about team members’ performance when they are not present?
  • Complain about team members behind their backs?

Avoid Underperformers

  • Refuse to socialize with teammates who perform poorly?
  • Avoid team members who perform poorly?
  • Exclude poorly performing team members from social interactions?
  • Avoid speaking to poorly performing team members?

Avoid Overachievers

  • Limit contact with team members who have higher expectations for the team’s work?
  • Limit communications with team members who pressure the team to exceed expectations?
  • Avoid contact with team members who push teammates for higher performance?

Urge

  • Pressure other team members to work harder?
  • Push other team members to do better work?
  • Urge other team members to do the best possible job?
  • Try to get other team members to do more work or better work?

Team Transition Processes

 

From Marks, M. A., Mathieu, J. E., & Zaccaro, S. J. (2001), “A temporally based framework and taxonomy of team process”, Academy of Management Review, 26, 356-376 (with minor modifications).

Scale: 1 = Not At All, 2 = Very Little, 3 = To Some Extent, 4 = To A Great Extent, 5 = To A Very Great Extent

To what extent does our team actively work to…

Mission Analysis

  • Identify our main tasks?
  • Identify the key challenges that we expect to face?
  • Determine the resources that we need to be successful?
  • Develop a shared understanding of our purpose or mission?
  • Understand the requirements for our team’s work products?

Goal Specification

  • Set goals for the team?
  • Ensure that everyone on our team clearly understands our goals?
  • Link our goals with the project specifications provided by the instructor or client?
  • Prioritize our goals?
  • Set specific timelines for each of our goals?

Strategy Formulation & Planning

  • Develop an overall strategy to guide our team activities?
  • Prepare contingency (“if-then”) plans to deal with uncertain situations?
  • Know when to stick with a given working plan, and when to adopt a different one?
  • Periodically re-evaluate the quality of our working plans?
  • Specify the sequence in which work products should be accomplished?

Team Action Processes

 

From Marks, M. A., Mathieu, J. E., & Zaccaro, S. J. (2001), “A temporally based framework and taxonomy of team process”, Academy of Management Review, 26, 356-376 (with minor modifications).

Scale: 1 = Not At All, 2 = Very Little, 3 = To Some Extent, 4 = To A Great Extent, 5 = To A Very Great Extent

 

To what extent does our team actively work to…

Monitoring Progress Towards Goals

  • Regularly monitor how well we are meeting our team goals?
  • Use clearly defined metrics to assess our progress?
  • Seek timely feedback from outside the team (e.g., professors or other knowledgeable people) about how well we are meeting our goals?
  • Know whether we are on pace for meeting our goals?
  • Let team members know when we have accomplished our goals?

Resource and Systems Monitoring

  • Monitor and manage our resources?
  • Monitor important aspects of our work environment?
  • Monitor events and conditions outside the team that influence our operations?
  • Ensure the team has access to the right information to perform well?
  • Manage our personnel resources?

Team Monitoring and Backup

  • Develop standards for acceptable team member performance?
  • Balance the workload among our team members?
  • Assist each other when help is needed?
  • Inform team members if their work does not meet standards?
  • Seek to understand each other’s strengths and weaknesses?

Coordination

  • Communicate well with each other?
  • Smoothly integrate our work efforts?
  • Coordinate our activities with one another?
  • Re-establish coordination when things go wrong?
  • Have work products ready when others need them?

What are the default Team-Maker questions?

Information on Team-Maker Questions

Here is the current set of default Team-Maker questions. Information shown in parentheses indicates how the responses will be displayed in the “detailed data” view of the Team-Maker results.

Note that we owe a debt of gratitude to Dave Stienstra of Rose-Hulman Institute of Technology for some of the early work on the list of questions below.

If you would like to suggest additional questions or changes to existing questions (including amendments to the list of possible responses to each question), please email catme@catme.org.

Gender
Question: What is your gender?
Responses: Student selects one of: ‘Female’ (F), ‘Male’ (M)
Race
Question: Please indicate the racial/ethnic group with which you most identify:
Responses: Student selects one of: ‘White, Hispanic origin’ (Hispanic), ‘White, not of Hispanic origin’ (White), ‘Black, African-American’ (Black), ‘American Indian or Alaskan Native’ (Native), ‘Asian or Pacific Islander’ (Asian), ‘Other/Mixed-heritage’ (Other)
GPA
Question: Your overall GPA is:
Responses: Student enters appropriate value.
Previous Course Grade
Question: Your grade in the previous course <supplied by faculty> was:
[Letter grades automatically converted to numbers (A = 4.0)]
Responses: Student enters appropriate value.
Schedule
Question: Please check the times that you are busy and unavailable for group work:
Responses: Student indicates times that they are busy by selecting checkboxes in a weekly schedule grid.
Weekend Meetings
Question: How willing are you to participate in team activities on the weekend?
Responses:
  • ‘I have no time for team activity during the weekend’ (No)
  • ‘I would prefer to avoid working with my team during the weekend’ (Avoid)
  • ‘The weekend is as good as any other time’ (OK)
  • ‘Weekends are the best time for me to meet with my team’ (Prefer)
Commute
Question: How long does it take you to get to campus?
Responses: Student selects one of: ‘I live on campus’ (On-Campus), ‘Less than 15 minutes’ (15min), ’15-30 minutes’ (30min), ‘More than 30 minutes’ (Longer)
Credits
Question: How many credit hours are you attempting this term?
Responses: Student enters appropriate value.
On-Campus Job
Question: On average, how many hours do you work at an on-campus job each week?
Responses: Student enters appropriate value.
Off-Campus Job
Question: On average, how many hours do you work at an off-campus job each week?
Responses: Student enters appropriate value.
Class Year
Question: What is your class year?
Responses: Student selects one of: ‘Freshman’, ‘Sophomore’, ‘Junior’, ‘Senior’, ‘Masters student – 1st year’ (1yr Masters), ‘Masters student – 2+ years’ (2yr+ Masters), ‘Doctoral student’ (Doctoral)
Age
Question: What is your age, in years?
Responses: Student enters appropriate value.
Major
Question: What is your major or primary area of study?
Responses:
  • ‘Arts and humanities (art, dance, literature, music, English, foreign languages, religion, philosophy)’ (Humanities)
  • ‘Business or legal studies (accounting, economics, information systems, finance, law, management, etc.)’ (Business)
  • ‘Education (elementary, secondary, special, physical, etc.)’ (Education)
  • ‘Health-related (dentistry, medicine, nursing, physical therapy, veterinary medicine, etc.)’ (Health)
  • ‘Science, Technology, Engineering, or Math (agriculture, biology, chemistry, physics, construction, computer science, engineering tech)’ (STEM)
  • ‘Social science (history, journalism, psychology, sociology, political science)’ (Soc Science)
Discipline

(STEM)

Question: Please indicate one or more areas of disciplinary study:
Responses: Student selects one or more of: ‘Aerospace Engineering’ (Aerospace), ‘Agricultural Engineering’ (Agriculture), ‘Biology’, ‘Bioengineering/Biomedical Eng’ (Bioeng-Biomed), ‘Chemical Engineering’ (Chemical Eng), ‘Chemistry’, ‘Civil Engineering’ (Civil), ‘Computer or Software Engineering’ (Comptr/Softwr Eng), ‘Computer Science’ (Comp Sci), ‘Electrical Engineering’ (EE), ‘Engineering Design’ (Design), ‘Environmental Engineering’ (Environmental), ‘Industrial Engineering’ (Industrial), ‘Mathematics’ (Math), ‘Mechanical Engineering’ (Mechanical), ‘Mining Engineering’ (Mining), ‘Nuclear Engineering’ (Nuclear), ‘Optical Engineering’ (Optical), ‘Petroleum Engineering’ (Petroleum), ‘Physics’, ‘Robotics’
CivE Subdiscipline
Question: Please indicate your emphasis within Civil Engineering:
Responses: Student selects one of: ‘Coastal & Ocean’ (Ocean), ‘Construction’, ‘Environmental’, ‘Geotechnical’, ‘Hydro’, ‘Materials’, ‘Municipal’, ‘Structural’, ‘Transportation’ (Transport)
Discipline (Business)
Question: Please select the option that best describes your area of disciplinary study
Responses:
  • ‘Numbers (Accounting, Finance, etc)’ (Numbers)
  • ‘People (Management, Human Resources, etc)’ (People)
  • ‘Systems (MIS, Operations, Logistics, etc)’ (Systems)
  • ‘Marketing (Advertising, Sales, etc)’ (Marketing)
Software Skills
Question: Your ability to use <supplied by faculty> is:
Responses:
  • ‘Never used it before’ (None)
  • ‘Some experience, basic skills’ (Basic)
  • ‘Lots of experience, basic skills’ (Good)
  • ‘Lots of experience, advanced skills’ (Expert)
English Skills
Question: Please rate your facility with the English language:
Responses: Student selects one of: ‘Very Comfortable’ (1), ‘Comfortable’ (2), ‘Uncomfortable’ (3), ‘Very Uncomfortable’ (4)
Writing Skills
Question: Rate your writing skills:
Responses: Student selects one of: ‘Need improvement’ (None), ‘Marginal’ (Basic), ‘Average’, ‘Above Average’ (Good), ‘Exceptional’ (Expert)
Hands-On Skills
Question: Rate your skill with hands-on build or repair tasks:
Responses: Student selects one of: ‘Need improvement’ (None), ‘Marginal’ (Basic), ‘Average’, ‘Above Average’ (Good), ‘Exceptional’ (Expert)
Shop Skills
Question: Rate your skill with tools or in the machine shop:
Responses: Student selects one of: ‘Need improvement’ (None), ‘Marginal’ (Basic), ‘Average’, ‘Above Average’ (Good), ‘Exceptional’ (Expert)
Commitment Level
Question: In this course, you intend to work how many hours per week outside of class (not counting lectures or labs):
Responses: Student selects one of: ‘1 hour per week’ (1), ‘2-4 hours per week’ (2), ‘5-7 hours per week’ (5), ‘8-10 hours per week’ (8), ‘Whatever it takes’ (11)
Leadership Role
Question: What is your preferred leadership role?
Responses:
  • ‘Strongly prefer to be a follower rather than a leader’ (Follower)
  • ‘Prefer to be a follower, but will lead when necessary’ (Pref Following)
  • ‘Enjoy leading and following equally’ (Balanced)
  • ‘Prefer to be a leader, but will follow when necessary’ (Pref Leading)
  • ‘Strongly prefer to be the leader; do not enjoy being a follower’ (Leader)
Leadership Preference
Question: Which of the following team leadership structures do you prefer?
Responses:
  • ‘Teams with one strong leader’ (Single Leader)
  • ‘Teams with one leader who gets lots of team input’ (One Leader w/ Input)
  • ‘Teams where leadership is shared equally among all team members’ (Shared Leadership)
Big Picture/ Detail-Oriented
Question: Please select the statement you most closely identify with:
Responses:
  • ‘I have more ideas in 5 minutes than most folks have all day, but hate to do the detail’ (Visionary)
  • ‘I prefer the idea phase but can do details’ (Prefers Ideas)
  • ‘I am balanced between ideas and details’ (Balanced)
  • ‘I prefer the details but can come up with ideas’ (Prefers Detail)
  • ‘While the visionaries are dreaming, I can get the project done and the report written’ (Details)
Fraternity/ Sorority
Question: Please select the fraternity/sorority to which you belong (or choose ‘None’):
Responses: Too numerous to list
Sports
Question: If you belong to a sports team this term, select all to which you belong:
Responses: Student selects zero or more of: ‘Baseball’, ‘Basketball’, ‘Bowling’, ‘Cricket’, ‘Cross Country’, ‘Dressage’, ‘Field Hockey’, ‘Football’, ‘Golf’, ‘Gymnastics’, ‘Hockey’, ‘Lacross’, ‘Polo’, ‘Rifle/Shooting’ (Shooting), ‘Rugby’, ‘Soccer’, ‘Softball’, ‘Swimming’, ‘Tennis’, ‘Track and Field’ (Track), ‘Volleyball’, ‘Water Polo’, ‘Wrestling’

Team-Maker Setup

Team-Maker Set-Up

Can I add custom questions to the CATME Peer Evaluation Survey?

CATME does not allow users to add their own questions to the Peer Evaluation system. There are, however, a variety of optional follow-up questions about team members and teams. These have been selected by the CATME development team based on scientific evidence of their relevance, reliability, and validity. If there is a published measure that you would like added to the system, you can make a request and the development team will consider it.

Some users create custom questions using Team-Maker’s Question Manager function and merge the data with their CATME Peer Evaluation data. Other faculty members ask their students to put answers to specific questions in the comments box at the end of CATME Peer Evaluations.

Analyzing Peer Results

Analyzing Peer Review Results

If I release the Peer Evaluation survey results to my students, do they see numbers for their scores or for adjustment factors?

Only faculty members see numbers in the results. Student feedback is displayed as arrows indicating the student’s self rating, the average of how teammates rated the student, and the team’s average rating on each of CATME’s behaviorally anchored rating scales. You can see the specific feedback results that any particular student will receive by going to the Summary Report Page for the Peer Evaluation, looking for the text that says “Preview results page for student:” and clicking on the drop-down menu to find the particular student whose feedback you wish to preview, then clicking the “Preview” button.

Do the students see comments?

CATME only shows students’ comments in CATME’s Team-Maker and CATME Peer Evaluation surveys to the faculty member who created the survey and any other faculty members to whom they delegated access to the survey. Comments are not released to students, nor are they released to researchers if you choose to release the ratings data to researchers.

You can see the specific feedback results that any particular student will receive by going to the Summary Report Page for the Peer Evaluation, looking for the text that says “Preview results page for student:” and clicking on the drop-down menu to find the particular student whose feedback you wish to preview, then clicking the “Preview” button.

What results do the students see from the CATME Peer Evaluation Surveys?

After the CATME Peer Evaluation data has been collected and the survey end date has passed, you (the instructor) have the option to release feedback to the students. If and when you choose to release the data to students, they will receive an e-mail inviting them to log into the system to see their feedback. This feedback includes:

A display of each of the five CATME Peer Evaluation behaviorally anchored rating scales (or just the ones you assigned if you didn’t assign all five dimensions), which includes:

a. The student’s self-rating – indicated by an arrow in the first column labeled “How You Rated Yourself”

b. The average rating assigned to the student by the team – indicated by an arrow in the second column labeled “How Your Teammates Rated You”

c. The average rating on the team – indicated by an arrow in the third column labeled “Average Rating For You and Your Team”

d. The complete behaviorally anchored rating scale on which the Peer Evaluation ratings were made.

e. (optional) If you checked the box on the Class Editor page to include extra messages to students, they will see research-based suggestions to improve their performance on each dimension that you surveyed.

You can see the specific feedback results that any particular student will receive by going to the Summary Report Page for the Peer Evaluation, looking for the text that says “Preview results page for student:” and clicking on the drop-down menu to find the particular student whose feedback you wish to preview, then clicking the “Preview” button.

How do I export the CATME Peer Evaluation Survey Results?

There is a CSV export function in CATME on the Raw Data page. You get to the Raw Data page by clicking the Raw Data button on the Summary page, then click the button to export to CSV. Another option is to copy the cells on the Summary page or Raw Data page and paste them into Excel. To use the latter procedure, you must first uncheck the pop-up display option to disable pop-up messages in the data fields.

How do I view a summary of the data collected by a Peer Evaluation Survey?

CATME Summary Report

This screen is a high-level summary of the data collected to date for a CATME Peer Evaluation survey, plus some additional fields calculated from this data (see below). This screen is reachable by clicking the View Results button next to the survey entry on your faculty “home page” or from the Class Editor page. However, this button is only available once the survey period has started.

If you wish to manipulate the data on this screen using another application, simply select the data and “cut and paste” the information into a spreadsheet– Microsoft Excel should preserve the tabular formatting. Note that various informational “pop-up” texts encoded on the web page can interfere with this “cut and paste” operation, so be sure to uncheck the Enable pop-up texts display option at the top of the form and click the Re-Display button before attempting the “cut and paste” operation.

The table of data shows the average rating for each student for the standard behavioral categories covered by the survey: “Contributing to the Team’s Work”, “Interacting with Teammates”, “Keeping the Team on Track”, “Expecting Quality”, and “Having Task-Related Knowledge, Skills, and Abilities”. To get detailed information on the survey data collected to date, click the View Raw Data button in the upper-right corner of the screen.

In addition to this basic data, an “Adjustment Factor” value is also calculated that attempts to show the contribution of a given student relative to the other members of their team. Specifically, this “Adjustment Factor” is the average rating of the student divided by the overall average rating for all members of the team. In fact, two different “Adjustment Factor” values are calculated: one with the student’s self-ratings factored in, and one without. In some cases, the “exceptional conditions” described below may have an impact on the validity of these “Adjustment Factor” values, and in these cases the “Adjustment Factor” columns will be highlighted.

Note that the “Adjustment Factor” value is “capped” at a maximum value of 1.05. Also, values above 0.95 but below 1.00 are rounded up to 1.00 (our findings are that values in this range are just “noise”). If you wish to see the “raw” Adjustment Factor values without these corrections, simply click the appropriate checkbox (located just above the table of raw data) and hit the Re-Display button.
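
For readers who want to reproduce this arithmetic outside CATME, the sketch below follows the description above (and the rounding rule restated in the grading answer later in this FAQ: values between 0.95 and 1.00 are rounded up to 1.00). It is an approximation of the published description, not CATME’s own code, so treat small discrepancies as expected.

    def adjustment_factor(student_avg: float, team_avg: float,
                          apply_corrections: bool = True) -> float:
        """Student's average rating divided by the overall team average.

        In practice this is computed twice: once from ratings that include the
        student's self-ratings and once from ratings that exclude them.
        """
        raw = student_avg / team_avg
        if not apply_corrections:
            return raw                      # the "raw" value shown via the checkbox
        capped = min(raw, 1.05)             # capped at a maximum of 1.05
        if 0.95 < capped < 1.00:
            capped = 1.00                   # values just below 1.00 are treated as 1.00
        return capped

    # Example: a student whose average rating is 3.9 on a team averaging 4.0
    print(adjustment_factor(3.9, 4.0))      # -> 1.0 (0.975 is rounded up)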

The interface also uses the raw data to detect various “exceptional conditions” that may occur. If one of these conditions occurs, you will see a highlighted note in the far right column of the table. You can roll your mouse over each entry to get more information, or consult the table below. This table not only provides information on how these exceptional conditions are defined, but also shows the informational messages that will appear on the student results when one of these exceptional conditions is encountered (see the discussion of the “Class Editor” page above).

If a student submitted a comment at the end of the survey, then their name in the table will appear as a link. You can simply roll your mouse over the student’s name to view the comment. Clicking on the student’s name or clicking the View Comments button in the upper-right corner of the screen will take you to a separate page where all student comments are displayed. The “raw data” report (click the View Raw Data button in the upper-right corner) also contains a table with all of the student comments, located at the end of the report.

After the survey period has ended (the end date for all sections of the survey has passed), you must manually “release” the data before the students in the class (and the research team) can view the data from the survey. This manual process allows you to review the data, and possibly confer with one or more students, before making the data available more widely. After the end of the survey period, you will find a special control at the bottom of the screen that allows you to release the data. First select one of the choices from the drop-down list:

Students and Researchers

 

This choice means that both the students and the research team will have access to the data. This is the default choice, and we ask that you make your data available to the research team unless you have a compelling reason to do otherwise. Remember that the system will strip out any “personally identifiable information” from the report seen by the research team.

Researchers only

 

There may be times when you are reluctant to release survey data to your students– for example when small team sizes may compromise the anonymity of student results. However, if you feel comfortable making the data (with “personally identifiable information” removed) available to our research team, we ask that you choose this option.

 

Students only

 

The students will be able to retrieve their results, but the survey data will not be made available to the research team.

Nobody

 

You will still be able to see the data, but neither the students nor the research team will be able to view the results. Please do not use this option without good reason. This option will stop the weekly messages from the system reminding you to release the survey data.

After you have selected one of the above choices, click the Release button. The survey data will be released according to your specifications and the control will disappear. In other words, you can’t change your mind once you’ve released the data.

Note that you may use the Preview controls to see the results that will be shown to the students before choosing your release method from the list above.

Raw Data

This screen is simply a raw dump of the data collected for a given survey, plus some additional fields calculated from this data (see below). If you wish to manipulate the data further, simply select the data and “cut and paste” the information into a spreadsheet– Microsoft Excel should preserve the tabular formatting. Note that various informational “pop-up” texts encoded on the web page can interfere with this “cut and paste” operation, so be sure to uncheck the Enable pop-up texts display option at the top of the form and click the Re-Display button before attempting the “cut and paste” operation.

The first table of data shows the ratings collected for the five standard behavioral categories: “Contributing to the Team’s Work”, “Interacting with Teammates”, “Keeping the Team on Track”, “Expecting Quality”, and “Having Task-Related Knowledge, Skills, and Abilities” (a key to the one-letter abbreviations is provided to the right of the raw data table). Each row of the table displays how a given student was rated by themselves and their fellow team members, whereas the “vertical blocks” in the table show the ratings by a given student for themselves and the other students in the team– an “empty block” indicates that the student never completed the instrument. Notice that each team member is assigned a “Rater number” from 1-<n> in order to correlate rating data across the various team members.

In addition to the raw data in the table, an “Adjustment Factor” value is also calculated that attempts to show the contribution of a given student relative to the other members of their team. Specifically, this “Adjustment Factor” is the average rating of the student divided by the overall average rating for all members of the team. In fact, two different “Adjustment Factor” values are calculated: one with the student’s self-ratings factored in, and one without. In some cases, the “exceptional conditions” described below may have an impact on the validity of these “Adjustment Factor” values, and in these cases the “Adjustment Factor” columns will be highlighted.

Note that the “Adjustment Factor” value is “capped” at a maximum value of 1.05. Also, values above 0.95 but below 1.00 are rounded up to 1.00 (our findings are that values in this range are just “noise”). If you wish to see the “raw” Adjustment Factor values without these corrections, simply click the appropriate checkbox (located just above the table of raw data) and hit the Re-Display button.

The interface also uses the raw data to detect various “exceptional conditions” that may occur. If one of these conditions occurs, you will see a highlighted note in the far right column of the table. You can roll your mouse over each entry to get more information, or consult the table below. This table not only provides information on how these exceptional conditions are defined, but also shows the informational messages that will appear on the student results when one of these exceptional conditions is encountered (see the discussion of the Class Editor page above).

If the survey also included follow-up questions, then there will be an additional table of ratings data for each of the included follow-up questions. The format of these tables varies depending on the type of follow-up questions chosen (note that in some cases students do not enter follow-up question data about themselves, so it is normal to see a “diagonal” pattern of empty cells in this table for some question groups). Look at the small key to the right of this table to see the actual text of these follow-up questions.

After the follow-up questions data (if any) there will be another table showing the comments from all students.

View Comments

This is just a simple screen which displays all of the student comments in one easy-to-view table. When you’re done reviewing the comments you can leave this page by clicking the appropriate button in the usual spot in the upper-right corner of the screen.

Question Manager

This screen allows you to control the display order of the System questions, as well as questions that you author. You can reach it from the “home page” of the faculty interface.

There are two sets of tools for reordering: a series of numeric entry fields for larger movements of questions, and up- and down-arrow buttons for smaller movements. To move a question quickly to the top, you can enter a 0 into the order field for that question. After modifying the numeric fields, you are expected to press one of the Save buttons at the top of the screen. The fields will be redisplayed in their new order.

The questions will then be displayed on the Survey Editor in the new order.

How do I release Peer Evaluation Survey data to students?

Release Survey Data (Peer Evaluation surveys only)

 

Students are unable to see the results of the survey until the data has been reviewed and “released” by the survey owner. Clicking this checkbox allows the delegated faculty to review and release the survey data themselves (possibly on a section-by-section basis if this is a “multi-section” survey). This checkbox will not appear for Team-Maker surveys, since Team-Maker surveys require that team assignments be created before releasing the survey, and only the faculty owner of the survey may create team assignments.

After making changes to any of the above fields, be sure to click the Save button in the upper-right corner to save your changes. Save and Return saves your changes and automatically takes you back to the previous screen (either your personal home page or the “Class Editor”). Cancel takes you back to the previous screen without saving any changes.

The Edit Survey Intro button to the right of the above input fields will take you onto a separate screen where you can edit the introductory text students will see at the beginning of the survey (however, the default intro text provided by the system is appropriate for most situations). The Delegate Faculty button will take you to another screen where you can grant access to your survey information to other faculty users in the system. Both of these screens are more fully documented elsewhere.

Below the standard input fields you will see a list of the students assigned to this survey (sorted by section name, team name, last name, and first name). Clicking on a student’s name will take you to a “Student Editor” screen where you can change basic information about that student– fix typos in the student’s name, change the email address that the system uses, etc. (clicking on the student’s email address will allow you to quickly send an email to that student). The “Student Editor” also has controls that allow you to delete the student or the data they submitted via the on-line survey, or to re-open the survey for that student so that they can go back and change some of their answers.

The Edit Students button next to the student list takes you to a different screen where you can delete multiple students, erase their survey answers, or allow survey re-entry to multiple students simultaneously. The Add Student button will allow you to add individual student records manually.

However, in most cases you will load (or re-load) student lists using the “Import …” controls at the bottom of the screen. “Import Students From Survey” is only active for Peer Evaluation surveys, and only if you have previously created other surveys for this class. This control allows you to quickly re-use student and team assignments from a previous survey. If you have created team assignments using a Team-Maker survey, then the name of the Team-Maker survey should appear in the drop-down list, allowing you to import those team assignments into this survey. Otherwise you may “Import Students From File” (the format for these input files is described in the notes above).

Note that once you have added students to the survey using either of the above methods, you will still be able to adjust the student list by adding additional students from another survey or a file using the Append button or by using Replace to discard your initial student list and replace it with a different list of students. As you import student lists, you will see student information being updated in the table above the “Import …” controls.

It is important to note that importing student lists has no impact on survey data that may have already been collected for a given student. So if you notice some mistakes in your student list after the survey has commenced, you can simply correct your original student list file and re-import it using the Replace action to quickly fix the errors. Similarly the interface is careful to not “spam” the students with repetitive emails when the user list is re-imported, so students will only receive one copy of the standard system “Welcome” message, etc. The only caveat is that the system tracks unique student users by their email address. So if you change a student’s email address and then re-import the student list, that student will appear to be a “new student” as far as the system is concerned. You will effectively “lose” any survey data collected for that student, and the student will receive a new “Welcome” and survey reminder message at their new email address. The “safe” way to change a student’s email address is to click their name in the student list and use the “Student Editor” screen to update their email address in the system.

In the lower-right corner of the screen, below the student list and “Import …” controls, you will find a “Delete Survey” button. If you click this button, the survey AND ALL DATA ASSOCIATED WITH IT will be deleted from the system. There is no way to recover the data after clicking this button, so please take care!

How do I use CATME Peer Review results for grading?

Using CATME for grading depends on the instructor’s preference. We suggest that you refer to the Adjustment Factor (w/o Self) for grading. The adjustment factor compares each student’s average rating to the team’s average rating across all five CATME teamwork dimensions. A value of 1.0 represents the team average, so students whose peers judged them to have performed better than the team average can receive an Adjustment Factor higher than 1.0. Adjustment Factors between 0.95 and 1.0 are rounded up to 1.0.
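
As one concrete (and entirely optional) way to apply this, many instructors multiply the team’s project grade by each student’s Adjustment Factor (w/o Self) to get individual grades. The sketch below shows that approach; the optional ceiling is a hypothetical policy choice, not something CATME requires.

    def individual_grade(team_grade, adj_factor_wo_self, max_grade=None):
        """Scale the team grade by the Adjustment Factor (w/o Self)."""
        grade = team_grade * adj_factor_wo_self
        if max_grade is not None:
            grade = min(grade, max_grade)   # hypothetical ceiling, e.g. 100
        return grade

    # Example: a team earned 90 on the project
    print(individual_grade(90, 1.05))       # -> 94.5 (above the team average)
    print(individual_grade(90, 0.80))       # -> 72.0 (below the team average)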

How do I further analyze the CATME Peer Review results?

You can further analyze the CATME Peer Review results by first exporting the Peer Evaluation Survey Results into a CSV file. Refer to the question “How do I export the CATME Peer Evaluation Survey Results?” under the Analyzing Peer Results section of the FAQ.

This file can be loaded into Excel. The exported CSV file includes the rating that each person gave themselves and each of their teammates on all five CATME teamwork dimensions.
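
As a starting point for that analysis, here is a minimal sketch using Python and pandas. The file name and column names below are placeholders; open your own export first and substitute the actual header names, which this FAQ does not specify.

    import pandas as pd

    # Placeholder file and column names; replace them with the ones in your export.
    df = pd.read_csv("catme_peer_eval_export.csv")
    dimensions = ["Contributing", "Interacting", "Keeping on Track",
                  "Expecting Quality", "Knowledge, Skills, and Abilities"]

    # Average rating each student *received* across the surveyed dimensions,
    # assuming one row per (rater, ratee) pair with a "Rated Student" column.
    received = df.groupby("Rated Student")[dimensions].mean().round(2)
    print(received)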

What are peer-to-peer comments?

Peer-to-peer comments are not generally released to CATME instructors at this time.

If you wish to use this new CATME system feature, please email Dr. Ferguson at dfergus@purdue.edu.

Peer-to-peer comments are comments students make about each team member, including themselves. If you check the “Peer-to-Peer Comments” box below, students will see a separate peer-to-peer comment box column for each team member, including themselves. This is in addition to the “Confidential Comments to Instructor” comment box that all students see at the end of all peer-evaluation surveys. After the peer-evaluation period ends, if you release the peer-evaluation results to students, students will see the peer-to-peer comments each team member made about themselves and the other members of the team, as well as their own comments. This feedback will clearly show who made each comment. The “Confidential Comments to Instructor” box is not released to students.

The “Release Peer-to-Peer Comments to Students” box is unchecked by default. If you choose to check the box to release peer-to-peer comments, we suggest that you inform your students about this choice before they complete the peer evaluation, and encourage them to be professional in their comments.

The “Anonymize Peer-to-Peer Release to Students” box is unchecked by default as well. If you choose to anonymize the release, students will see the peer-to-peer comments made by all team members about everyone on the team, including themselves, but will NOT see the name of the student who made the comments. Columns of comments prepared by a team member are randomly ordered, and rows within columns [comments about a specific team member] are also randomly ordered. If you choose to enable this option, you may not disable it later. Students taking the survey will see a note explaining that the peer-to-peer comments they make about themselves and other students will be viewable, but not the name of the commenter.

IMPORTANT: If you do not check the “Release Peer-to-Peer Comments to Students” box when you create the Peer Evaluation Activity, you will not be able to change your survey decision later.

Technical Difficulties

Technical Difficulties

I encountered a blank screen when using Mozilla Firefox. Why?

While a few problems with a couple of versions of Mozilla Firefox have been reported, the design team has not been able to replicate them. If you have an issue when using Mozilla Firefox, please use a different browser, such as Chrome or Internet Explorer.

Why can’t I receive messages from CATME?

If you don’t receive email messages from CATME, these are the most common reasons.

  • CATME messages are in the junk or spam folder of your email account.
  • Your software or email server is blocking the CATME IP addresses. Your ISP has to permit catme.org. This is relevant if you are working from a home computer or an off-campus server. Try moving to a campus computer and logging in from there.

 

In any case, go to the main page and click the “Forgot your password?” link. You should enter the email your professor/institution uses to contact you, and CATME will email you a URL link to reset your account password. This is a good test to see if you can receive CATME emails.

I clicked on the “Forgot your password?” link on the login page, but I still haven’t received any emails. Why not?

If you don’t receive email messages from CATME, these are the most common reasons.

  • CATME messages are in the junk or spam folder of your email account.
  • Your software or email server is blocking the CATME IP addresses. Your ISP has to permit catme.org. This is relevant if you are working from a home computer or an off-campus server. Try moving to a campus computer and logging in from there.
  • The e-mail you are using is a different account than the one the professor used to input you into the CATME system.  

General Questions

General Questions

How does the CATME system accommodate individuals with disabilities?

Development of the CATME system began more than 10 years ago, when website accessibility guidelines were just being developed. We are aware that the CATME system does not meet current accessibility guidelines, but we have not yet been able to secure funding to upgrade the system in this area. We have posted a Word Document version of the instrument on the Instructor Support tab of our website. Some people have found this useful because it can be viewed in a larger print size.

What does the “enable extra messages” option do?

Enable Extra Messages

The CATME Peer Evaluation instrument uses the data it collects from students to recognize certain “exceptional conditions”– dysfunctional teams, over- or under-performing students, etc. When the system recognizes one of these conditions, it normally provides the affected students with an additional informational message at the top of their survey results. You may choose to have the system NOT display these messages by unchecking this box. More information about these “exceptional” conditions and the associated student messages can be found here. This setting has no effect on Team-Maker surveys.

 

After making changes to any of the above fields, be sure to click the Save button in the upper-right corner to save your changes. Save and Return saves your changes and automatically takes you back to your personal home page in the system. Cancel takes you back to your home page without saving any changes.

At the bottom of this screen you will see a list of the surveys for this class (if any) with start/end date information and a completion gauge that indicates how many students have completed the survey. Click on a survey name to modify information about that survey. While the survey is active, the Send Reminder button will appear next to each survey and allow you to send reminder emails to students who have not yet completed the survey (remember that the system automatically sends a reminder email 48 hours prior to the end date of the survey, so you rarely have to use this button).

Once the survey start date has been reached, another button will appear depending on the type of survey. For Team-Maker surveys, the Data & Teams button will let you view the survey data collected from students and use the Team-Maker interface to create teams of students based on their survey responses. CATME surveys will have a View Results button that allows you to view the collected survey data, along with student comments and notes about “exceptional conditions”.

Use the Add Survey button above the list of surveys to add a new survey for this class. If no surveys are currently defined for this class, you will see a Delete Class button in the lower-right corner instead of the usual survey list. In other words, if you wish to delete a class from the system you must first make sure that all surveys associated with this class have been deleted (you may delete a survey by clicking on the survey name link and using the “Survey Editor” form as described below).

How much does CATME cost?

There are no user charges for CATME in 2016, as has been true for the past 10 years, thanks to NSF funding. After 7-1-2017, there will be very modest student user fees for all students loaded into CATME surveys.

How do I get rid of ‘release survey’ reminder emails?

Reminder emails can be eliminated by simply releasing the survey data; the options include releasing to students, researchers, or no one.

How do I implement the Rater Practice tool in a survey?

As you are setting up a class on the “Class Editor” page, you can set the Rater Calibration setting to Calibrate Once. If you would like to assign rating practice for a class that already exists in the CATME system, go to the “Class Editor” page and change the Rater Calibration setting to Calibrate Once. Once Rater Practice is turned on, students must complete the rater practice once before they can complete any more Peer Evaluation activities.

Can CATME be used for other things besides students?

If the CATME account holder is a staff or faculty member at a university, the surveys that they create are not monitored. So yes, CATME can be used to survey people other than students. We ask that faculty members DO NOT RELEASE the results to researchers if the surveys were not used to collect data from students in higher education classes.

If two or more CATME activities are occurring simultaneously in one or more classes, is that a problem?

This is not a problem. Two different surveys are two different processes. All of the surveys required by each student’s instructors will appear in that student’s account.

How does CATME protect the confidentiality of student data?

The Family Educational Rights and Privacy Act (FERPA) guidelines require the protection of confidential student data. Even the system designers and user support members don’t have access to instructors’ identifyable student data. If instructors release their data for research purposes, all of the identifying information about students (their names, e-mail addresses, and ID numbers) are stripped from the data and replaced with codes that cannot be traced to specific students.