Tuesday, 22 January 2008: 4:15 PM
Classroom Response Systems in Statistics Courses
209 (Ernest N. Morial Convention Center)
Michael B. Richman, Univ. of Oklahoma, Norman, OK; and T. J. Murphy, C. C. McKnight, and R. Terry
Education research has provided evidence that active learning strategies have a considerable positive impact on student understanding of concepts. Classroom response system technology can facilitate some of these strategies, such as student-student discussion of topics. The technology includes a handset (known as a clicker) that allows a student to respond anonymously to multiple-choice questions. Once the students click their answers, the signals are picked up by a receiver and processed by software that records each response and displays a bar graph of class responses for student and instructor feedback (a minimal sketch of this tallying step appears after the questions below). Use of this technology enables timely and frequent feedback to both students and instructors, which leads to cognitive gains through increased student engagement in active learning. Moreover, published research on these systems shows increased student attendance and enthusiasm. The current project involves three departments at the University of Oklahoma (Mathematics, Meteorology, and Psychology). The four principal investigators meet weekly to develop items in the form of multiple-choice questions and annotations for the response system. We have developed conceptual and data-intuition items crossed with novice and advanced items, which allows the mental maps of both students and instructors to be characterized. In writing the questions, we have expended considerable effort on forming distractors that measure specific misconceptions. The project has begun to answer the following questions:
1. What impacts does the use of clicker questions have on student learning, including changes in mental maps, of specific statistics topics?
2. What impacts does the use of clicker questions have on instructor decisions about content or pedagogy?
3. Which clicker questions lend themselves to which kinds of pedagogical value (e.g., some questions will be useful in generating discussions, whereas others will be useful for providing feedback about content mastery)?
4. How has the process of writing questions and distractors affected the topics taught by each of the PIs? We have noted some convergence in topics, since parallels have been identified in areas previously thought to be unique to each discipline (e.g., both meteorology and psychology use receiver operating characteristic curves!).
Insights gained to date concerning these questions will be presented.
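The tallying and bar-graph step described above is handled by the vendor's response-system software; the following is only a minimal illustrative sketch, assuming Python with matplotlib, in which the handset IDs, responses, and answer choices are all hypothetical:

# Illustrative sketch only: collect one response per handset, tally the
# choices, and display a bar graph of the class distribution.
from collections import Counter

import matplotlib.pyplot as plt

# Hypothetical responses, keyed by handset ID (students respond anonymously;
# only the handset ID is recorded by the receiver software).
responses = {
    "handset_01": "A", "handset_02": "C", "handset_03": "B",
    "handset_04": "C", "handset_05": "C", "handset_06": "D",
    "handset_07": "B", "handset_08": "C", "handset_09": "A",
    "handset_10": "C",
}

choices = ["A", "B", "C", "D"]
counts = Counter(responses.values())

# Bar graph of class responses, shown to both students and instructor.
plt.bar(choices, [counts.get(c, 0) for c in choices])
plt.xlabel("Answer choice")
plt.ylabel("Number of responses")
plt.title("Class response distribution (hypothetical item)")
plt.show()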