Project History
The CATME SMARTER Teamwork system is the result of the collaborative efforts of researchers on multiple projects. This page describes the system’s development.
CATME Peer Evaluation
Richard Felder (at North Carolina State University) developed a simple, one-item, behaviorally anchored peer-evaluation scale and scoring approach based on the work of Robert Brown of the Royal Melbourne Institute of Technology. From 1999 to 2003, Felder’s peer-evaluation instrument gained some popularity and was used in several studies, but it was not subjected to psychometric evaluation. During that time, in discussions about the benefits and shortcomings of the instrument, Matthew Ohland (a member of the Clemson University faculty at the time), Douglas Schmucker (then at Valparaiso University), and Rich Felder agreed that there was an opportunity to develop a more robust instrument and to computerize data collection and processing to make using the instrument more convenient. This was the genesis of the National Science Foundation (NSF) grant “Designing a Peer Evaluation Instrument that is Simple, Reliable, and Valid” (NSF Award 0243254, $644,590, 6/1/2003-5/31/2008).
Matt Ohland was the Principal Investigator (PI) on the grant. Lisa Bullard (at North Carolina State University), Richard Layton (at Rose-Hulman Institute of Technology), and Cynthia Finelli (then at Kettering University) were recruited to the project, which required a diverse group of colleagues with classroom experience teaching in teams and the ability to contribute to the research needed to develop the instrument. Misty Loughry (a member of Clemson University’s Management faculty at the time) was recruited for her knowledge of research on peer influences and teams.
The Comprehensive Assessment of Team-Member Effectiveness (CATME)
Misty Loughry and Matt Ohland compiled a large list of team-member behaviors from the published literature and conducted two studies to determine the factors to be measured by the new instrument, which was called the Comprehensive Assessment of Team-Member Effectiveness (CATME). The results were published in 2007 in Educational and Psychological Measurement with co-author DeWayne Moore (a member of the Psychology faculty at Clemson University), who consulted on statistics. The resulting Likert-type instrument has 87 items measuring five broad (second-order) factors of team-member behavior, and a 33-item short version is also available. The 29 first-order factors, each measured by three items, can be used to measure specific types of team-member behavior.
In 2004, work began on creating a behaviorally anchored rating scale (BARS) instrument mapping to the same five factors. The BARS instrument was developed by Matt Ohland, Misty Loughry, Lisa Bullard, Rich Felder, Cindy Finelli (by then at the University of Michigan), Richard Layton, and Doug Schmucker (by then at Western Kentucky University). Three studies provided evidence for the validity of the new instrument, including its consistency with the earlier Likert version. Additional co-authors on the resulting paper included Hal Pomeranz, the system developer, and David Woehr (at the University of North Carolina at Charlotte), who analyzed the data. The paper was published in December 2012 in Academy of Management Learning & Education and won the 2013 Maryellen Weimer Scholarly Work on Teaching and Learning Award.
The initial functional requirements for the CATME web interface were drafted primarily by Matt Ohland and Hal Pomeranz during the ASEE Conference in Portland in June 2005. The first online version of the new instrument was deployed on August 30, 2005, with the assistance of Harlan Feinstein (a subcontractor to Deer Run Associates). The other research team members reviewed and commented on the interface design to make sure that it met the needs of a broad base of potential users. Since it was first introduced, the interface has been updated in many ways, but it has always maintained a focus on the design principles of keeping the system simple and robust: we avoid system-specific functionality when possible and make sparing, purposeful use of color.
CATME Peer Evaluation has a number of features that make it convenient to use and well suited to the classroom. Students rate themselves and their teammates using a secure, web-based interface, which provides confidentiality, especially compared with paper-and-pencil evaluations completed during class. At the same time, the system allows instructors to view each student’s ratings of every team member, which increases students’ accountability for their ratings. Students can also make confidential comments in the system, which go only to their instructor. The system flags a number of “exceptional conditions” in the rating patterns to alert instructors to teams or students who might benefit from their attention. The system also allows instructors to release feedback to their students. The feedback shows students their self-rating, the average rating that their teammates gave them, and the team-average rating for each of the five dimensions of the CATME Peer Evaluation scale. In addition, the feedback suggests behaviors that could improve students’ ratings in each of the five dimensions. Instructors can also choose to release messages alerting students to any exceptional conditions in their rating patterns.
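To illustrate how the three feedback values described above relate to the underlying ratings, the short Python sketch below computes them for one student. This is only an illustration under stated assumptions, not the CATME system’s actual code (the system is written in Perl); the data layout, the function and dimension names, and the interpretation of the team average as the mean of all ratings given within the team are assumptions made for this example.

    from statistics import mean

    # Assumed layout (illustrative only): ratings[rater][ratee][dimension] holds
    # the score that 'rater' gave 'ratee' on one dimension of the scale.

    def feedback_for(student, ratings):
        """For each dimension, return the student's self-rating, the average
        rating the student received from teammates, and the team average
        (interpreted here as the mean of all ratings given within the team)."""
        feedback = {}
        for dim in ratings[student][student]:  # students rate themselves, too
            peers = [r for r in ratings if r != student]
            feedback[dim] = {
                "self": ratings[student][student][dim],
                "teammates_avg": mean(ratings[p][student][dim] for p in peers),
                "team_avg": mean(ratings[r][e][dim]
                                 for r in ratings for e in ratings[r]),
            }
        return feedback

    # Example: a three-person team rated on one hypothetical dimension label.
    example = {
        "Ann": {"Ann": {"Contributing": 4}, "Ben": {"Contributing": 3}, "Cat": {"Contributing": 5}},
        "Ben": {"Ann": {"Contributing": 5}, "Ben": {"Contributing": 4}, "Cat": {"Contributing": 4}},
        "Cat": {"Ann": {"Contributing": 4}, "Ben": {"Contributing": 3}, "Cat": {"Contributing": 5}},
    }
    print(feedback_for("Ann", example))
    # {'Contributing': {'self': 4, 'teammates_avg': 4.5, 'team_avg': 4.11...}}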
Team-Maker
Richard Layton created the first version of Team-Maker shortly before the CATME project began. Layton designed an automated team-formation system and the questions for the first student team-formation surveys. The initial version of the Team-Maker application was written as part of the class projects in Rose-Hulman’s Software Engineering I and II courses during the 2002-2003 school year. Layton served as the client for the students on the project: Ryan Cavanaugh, David Aramant, Mark Newheiser, Brian Klimaszewski, Brian Kopecky, and Robert Drake. Don Bagert advised the students. Layton obtained funding from the Educational Research and Methods Division of the American Society for Engineering Education (ASEE) through its mini-grant program for September 2003 through June 2004. Student Matt Ellis and Professor Mark Ardis joined the project to continue development of the Team-Maker 1.0 interface and algorithm. On May 12, 2005, the code for Team-Maker version 1.0 was posted on SourceForge, a repository of open-source software, and it is still available there. The Rose-Hulman server that housed the Team-Maker system was disabled by a virus attack, and Team-Maker 1.0 ceased operation. The original Team-Maker 1.0 source code was later incorporated into iPeer version 1.6; iPeer is a separate line of development that is not directly related to this project.
By the time the Team-Maker 1.0 server was taken offline, the CATME research project was thriving, and the National Science Foundation agreed to supplement the original award to incorporate the Team-Maker functionality into the CATME system. The functional requirements for Team-Maker 2.0, including changes to the algorithm, were developed by Richard Layton, Matt Ohland (still at Clemson University), and Hal Pomeranz.
Based on the functional requirements, Hal Pomeranz and Harlan Feinstein developed the interface. Team-Maker became publicly available on the CATME system on June 24, 2007. Team-Maker 2.0 was written in Perl, making no use of earlier source code, and the current source code is not open source. A study supporting the validity of the Team-Maker system, together with an explanation of the program’s algorithms, was published in 2010 in Advances in Engineering Education with co-authors Richard Layton, Misty Loughry (by then at Georgia Southern University), Matt Ohland (by then at Purdue University), and George Ricco (a doctoral candidate at Purdue University).
Growth of the CATME/Team-Maker System
Once the original CATME system went public in August 2005, the user community grew by word of mouth. After one year, the system had 34 faculty users at 23 institutions. By the end of year two, it had 170 faculty users at 52 institutions. Current information about the growth of the CATME system and the institutions where it is used is provided under the “Our User Base” link on this website.
The original CATME research team designed, developed, and tested CATME using standard psychometric techniques. We published evidence that Team-Maker could form teams faster and more consistently than an experienced faculty member. The system grew in popularity.
SMARTER Teamwork: Expanding Our Mission
As work on the original NSF grant was winding down, Matt Ohland, Misty Loughry, and Richard Layton discussed how to take the CATME system into a new phase of development. Because the CATME system was meeting the needs of a growing number of faculty and students, we wanted to provide more tools to support the effective use of teamwork in higher education and to disseminate our work to a broader audience of users.
We recruited Eduardo Salas (at University of Central Florida) and David Woehr (then at University of Tennessee, Knoxville) to assist with the next phase of development. Eduardo Salas was invited because of his strong record in teamwork research. Dave Woehr was invited to provide expertise in analyzing the complex, multi-level data collected through the team formation and peer evaluation systems.
Ohland, Layton, Loughry, Salas, and Woehr submitted a CCLI Phase 3 proposal to the National Science Foundation with three goals: 1) to equip students to work in teams, 2) to equip faculty to manage teams, and 3) to equip researchers to understand student teams. NSF funded the proposal: “SMARTER Teamwork: System for Management, Assessment, Research, Training, Education, and Remediation for Teamwork” (NSF Award 0817403, 8/15/2008-7/31/2013, expected total award $2,000,000).
The SMARTER Teamwork grant funded a number of enhancements to the CATME system. Based on the results of a formal usability study conducted by Eduardo Salas’ Ph.D. student, Davin Pavlas, we made system modifications to enhance the user experience for faculty and students. One finding from this study was that users referred to both the CATME peer-evaluation tool and the Team-Maker team-formation tool as “CATME.” We therefore decided to use “CATME” as an umbrella brand for the broader system, even though the name originated as an acronym for the peer-evaluation instrument.
Matt Ohland’s Ph.D. students, George Ricco and Daniel Ferguson, provided high-quality system support for CATME faculty and student users and brought the problems, concerns, and requests of the CATME user community to the attention of the development team. This facilitated continuous improvement of the system.
In December 2011, we launched a new tool, CATME Rater Practice, to familiarize students with CATME Peer Evaluation’s behaviorally anchored rating scale and to improve their ability to rate team-member contributions accurately using the instrument. Students complete the rater calibration exercise by rating descriptions of fictitious team members. To develop these descriptions, a large pool of actual student comments was collected from a sample of students at Purdue University and categorized into the five dimensions measured by the CATME Peer Evaluation scale, ensuring that the descriptions reflected the language of students. Eduardo Salas’ Ph.D. students Wendy Bedwell and Rebecca Lyons directed the categorization effort. Hal Pomeranz, Richard Layton, and Matt Ohland provided technical direction for the Rater Practice tool. Misty Loughry provided suggestions to enhance the faculty and student experience and pilot tested the tool. Dan Ferguson identified and resolved scenario descriptions that created difficulties for students. Further upgrades to the Rater Practice tool will be released in August 2013.
A number of optional follow-up questions were added to the CATME Peer Evaluation tool in December 2011. These questions, taken from published studies (citations are provided in the system’s Help text), allow instructors to gather additional information about their students’ team experiences and include variables shown to be important in the teamwork literature, such as team conflict and cohesiveness. Misty Loughry chose the follow-up questions in consultation with Dave Woehr and Matt Ohland. Dave Woehr and his doctoral students, Andrew Loignon and Paul Schmidt, are analyzing the data collected by the system.
We added a new feature called “Question Editor” to CATME Team-Maker in August 2012; many faculty users had requested this capability in the years since Team-Maker was first launched. Question Editor allows instructors to create their own questions for Team-Maker surveys and, if they choose, to share them with the CATME user community. The new functionality also allows instructors to manage the order of questions in their Team-Maker surveys. Hal Pomeranz, Harlan Feinstein, and Julie Baumler (a new subcontractor to Deer Run Associates) performed the development work on this feature.
A New Informational Website
As the functionality of the CATME system continued to improve, the www.CATME.org website that housed the system remained bare-bones and out of date. In December 2012, we replaced the old website with a new one that is more attractive and informative, even though it is not yet complete.
The new website is the result of several years of planning and collaboration, with the goal of a site that would disseminate our work in a positive way and provide useful information to faculty and students. We worked with designer Grá Linnaea to develop a logo and color scheme that create a unified feel for our system while differentiating the various CATME tools and system components from one another. We decided that all of our tools would use CATME in their names. Linnaea created similar logos for each tool using a stylized font in “CATME green.” This helped to resolve user confusion about the team-formation tool, now known as CATME Team-Maker.
We outlined a structure for the new website at a meeting in Charlotte in June 2012. Matt Ohland, Richard Layton, Misty Loughry, Dave Woehr, and Eduardo Salas’ Ph.D. students Wendy Bedwell and Kyle Heyne developed an organization scheme for the information that we wanted to present. Kyle Heyne recorded our decisions in a graphical format.
Amy Masson and Susan Sullivan of Sumy Designs built the new website around the design and color scheme chosen earlier. Matt Ohland and Dan Ferguson gathered and sorted information about our faculty users, including an alphabetical list of institutions where the CATME system is used and a second list sorted by country. Using these data, Richard Layton created maps of the institutions in North America and throughout the world where the CATME system is used. Ohland and Layton created a graph to display growth in the number of CATME users.
Misty Loughry and Matt Ohland wrote the other content for the web pages and made decisions about website details. Misty Loughry worked with Amy Masson to ensure that all of the website content was correct and displayed in an effective and attractive manner.
Misty Loughry developed the CATME Meeting Support documents after noticing that many students had difficulty planning for and following up on team meetings. The documents include templates for exchanging information about team members, preparing a team charter, creating agendas, and recording minutes of team meetings. Matt Ohland converted the documents into fill-in forms.
Dissemination Efforts and Recognition
An important objective of the SMARTER Teamwork grant was to increase awareness of the CATME system among faculty in higher education. Our dissemination efforts have included publications in peer-reviewed journals as well as workshops, seminars, symposia, and poster sessions at academic conferences. These are detailed on this website’s “Research” link under the heading “Publications and Presentations.”
Our dissemination efforts have been aided by three awards that recognize the quality of our contributions to education. The journal article published in Academy of Management Learning & Education describing the development and validation of the CATME Peer Evaluation system received the 2013 Maryellen Weimer Scholarly Work on Teaching and Learning Award, which “recognizes outstanding scholarly contributions with the potential to advance college-level teaching and learning practices.” The CATME Peer Evaluation development team was recognized with the 2009 Premier Award for Excellence in Engineering Education Courseware. In addition, our symposium at the 2011 Academy of Management Conference won the 2011 MED Best Symposium in Management Education and Development Award, which recognizes the symposium offering the most significant contribution to advancing management education and development.
In November 2011, we created the CATME Users Group on LinkedIn to help users connect with one another and share ideas about how to use the CATME SMARTER Teamwork system to enhance teamwork. We expect that this group will also increase awareness of the CATME system among higher-education faculty who are active on LinkedIn.
In Development: Student Teamwork Training Modules
One component of the SMARTER Teamwork grant that is still in development is a set of web-based training modules to teach students team skills. The system will use video demonstrations of effective and ineffective team-member behaviors in the five dimensions of team-member contributions measured by the CATME Peer Evaluation system.
The research team discussed the content of the training modules, the format of the training materials, the colors and formatting of the visual presentations, and the length of the training segments. Wendy Bedwell worked with the University of Central Florida Legal Office to arrange the rights to use video clips from movies and television shows in the training materials. Rebecca Lyons identified appropriate scenes from various movies and television shows. Kyle Heyne used a video-editing program to create clips of just the portions needed for the training program. Wendy Bedwell, Rebecca Lyons, and Davin Pavlas prepared sample training materials and conducted a small pilot test. Eduardo Salas’ Ph.D. student Tripp Driskell and post-doc Shirley Sonesh assisted with the development of the training materials. The research team reviewed the prototype training materials at a meeting in Orlando, Florida, in January 2013 and provided direction for future development work on this project.
Development Team
Faculty
Matt Ohland, Ph.D.
Professor of Engineering Education
Purdue University
Lisa G. Bullard, Ph.D.
Teaching Professor and Director of Undergraduate Studies
Chemical and Biomolecular Engineering
North Carolina State University
Richard M. Felder, Ph.D.
Hoechst Celanese Professor Emeritus of Chemical Engineering
North Carolina State University
Daniel M. Ferguson, Ph.D.
Engineering Education
Purdue University
Cynthia J. Finelli, Ph.D.
Director, Center for Research on Learning and Teaching in Engineering and Research Associate Professor of Engineering Education
University of Michigan
Richard A. Layton, Ph.D., P.E.
Associate Professor of Mechanical Engineering
Rose-Hulman Institute of Technology
Misty L. Loughry, Ph.D.
Professor of Management
Crummer Graduate School of Business
Rollins College
Hal R. Pomeranz
Principal, Deer Run Associates
Douglas G. Schmucker, Ph.D., P.E.
Associate Professor (Lecturer) of Civil Engineering
University of Utah
David J. Woehr, Ph.D.
Professor and Chair, Management
University of North Carolina at Charlotte
Students and Postdocs
Rebecca Lyons
Doctoral Student (with Salas)
Institute for Simulation and Training
University of Central Florida
Jonathan Maier, Ph.D.
(as postdoc with Ohland)
Lecturer, General Engineering
Clemson University
Davin Pavlas, Ph.D.
(as doctoral student with Salas)
University of Central Florida
Amy Yuhasz, Ph.D.
(as postdoc with Ohland)
Department of Homeland Security
National Cyber Security Division
Operations and Service Center