Abstract:
In the fall semester of 2008, students in a first-year engineering course at Purdue University completed three Model-Eliciting Activities (MEAs): Paper Airplane Challenge, Just-In-Time Manufacturing, and Travel Mode Selection. MEAs are realistic, open-ended, client-driven engineering problems designed to foster students' mathematical modeling abilities. The primary artifact produced by each team (N=295 teams, 1166 students) is a memo to the client describing a procedure for solving the engineering problem. Within each MEA, teams produced three iterations of their procedure, receiving feedback after each iteration. Between successive drafts of student work, we compute a normalized measure of the words added, substituted, and deleted. We found that student drafts changed by an average of 105.4% from draft 1 to draft 2, and by 43.8% from draft 2 to draft 3. Of this change, all but 37.7% of the change from draft 1 to 2, and all but 23.6% of the change from draft 2 to 3, can be attributed to increased submission length. Knowing how much change each iteration of feedback induces, and how this change relates to the source and number of feedback iterations, has important implications for instructors planning feedback activities in the classroom.
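The change measure described above is a word-level edit distance between successive drafts. As an illustrative sketch only, the following Python code computes a word-level Levenshtein distance and normalizes it by the length of the earlier draft; the choice of normalizing denominator is an assumption here, and the paper's exact normalization may differ (values above 100% simply indicate more edit operations than words in the earlier draft).

```python
def word_edit_distance(draft_a: str, draft_b: str) -> int:
    """Word-level Levenshtein distance: the minimum number of word
    insertions, deletions, and substitutions turning draft_a into draft_b."""
    a, b = draft_a.split(), draft_b.split()
    # prev[j] holds the distance between the first i-1 words of a and the first j words of b
    prev = list(range(len(b) + 1))
    for i, word_a in enumerate(a, start=1):
        curr = [i] + [0] * len(b)
        for j, word_b in enumerate(b, start=1):
            cost = 0 if word_a == word_b else 1
            curr[j] = min(prev[j] + 1,         # deletion of word_a
                          curr[j - 1] + 1,     # insertion of word_b
                          prev[j - 1] + cost)  # substitution (or match)
        prev = curr
    return prev[len(b)]


def normalized_change(draft_a: str, draft_b: str) -> float:
    """Edit distance normalized by the earlier draft's word count
    (an assumed normalization), expressed as a fraction; 1.054 ~ 105.4%."""
    return word_edit_distance(draft_a, draft_b) / max(len(draft_a.split()), 1)
```

For example, comparing a 200-word draft 1 to a revised draft 2 that requires 211 word-level edits would yield a normalized change of 105.5% under this assumed normalization.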
Keywords:
educational courses; user modelling; Levenshtein distance; engineering course; feedback; mathematical modeling; model-eliciting activities