Physical Review Physics Education Research
The increasing and diversifying student enrollments in introductory physics courses make reliable, valid, and usable instruments for measuring student skills and gains ever more important. In introductory physics, in addition to teaching facts about mechanics, we also seek to teach our students the skills of “thinking like a physicist,” that is, expertise in and intuition for physical problem solving. How and when these expert, intuitive problem-solving skills emerge during a STEM education, and which teaching methods foster them most effectively, remain uncertain. A brief, easily administered survey that measures students’ “physics-thinking” skills in a pretest and post-test format is therefore desirable for evaluating different pedagogical approaches. Prior investigators codified these skills as “epistemic games” (e.g., order-of-magnitude estimation, evaluating extreme cases) and developed and validated the math epistemic games survey (MEGS) to measure students’ ability to employ these techniques. The original survey instrument is reliable and valid but has drawbacks in its length and in students’ ability to recall questions between administrations. We employed factor analysis to split the MEGS into two mutually exclusive subtests and found each to be as reliable and valid as the full-length MEGS as originally formulated. The “split MEGS” is well suited for use as a pretest and post-test instrument to measure gains in problem-solving expertise in introductory physics courses.
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Stephen Hackler et al.
"Development and Reliability Analysis of a Split-Administration Test of the Math Epistemic Games Survey".
Physical Review Physics Education Research.