A. Baker
Anonymity: To ensure that anonymization methods are applied consistently, the Director of Composition will anonymize all essays after they are submitted and before they are scored. Method: When an instructor submits an essay, in print or electronically, it will likely include identifying features (e.g., names in headers or headings). Each essay will be assigned an alphanumeric code upon receipt, and scissors, whiteout, or black marker will be used to remove or obscure the original identifying features (names and section numbers). The key linking the coded essays to the original essays will be kept in the Director of Composition’s locked office.
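For electronic submissions, the coding step could be scripted. The following is a minimal Python sketch, not a required procedure: it assumes a hypothetical folder of submitted files, assigns each a random alphanumeric code, saves a copy under that code, and writes the linking key to a single CSV that the Director of Composition would keep in the locked office. All file and folder names are placeholders, and print submissions and in-text identifying features would still be handled by hand as described above.

    import csv
    import secrets
    import string
    from pathlib import Path

    SUBMISSIONS = Path("submissions")          # hypothetical folder of submitted essay files
    ANONYMIZED = Path("anonymized")            # copies saved under their codes only
    KEY_FILE = Path("anonymization_key.csv")   # key kept in the Director's locked office

    def new_code(length=6):
        """Generate a random alphanumeric code, e.g. 'K4T9ZQ'."""
        alphabet = string.ascii_uppercase + string.digits
        return "".join(secrets.choice(alphabet) for _ in range(length))

    def anonymize_submissions():
        ANONYMIZED.mkdir(exist_ok=True)
        with KEY_FILE.open("w", newline="") as key:
            writer = csv.writer(key)
            writer.writerow(["code", "original_file"])
            for essay in sorted(SUBMISSIONS.glob("*.docx")):
                code = new_code()
                # Save a copy under the code; names and section numbers inside
                # the document still need to be obscured by hand.
                (ANONYMIZED / (code + essay.suffix)).write_bytes(essay.read_bytes())
                writer.writerow([code, essay.name])

    if __name__ == "__main__":
        anonymize_submissions()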
Evaluation/Scoring: Instructors will meet for a single-day scoring session after the spring semester. The Director of Composition will lead scorers through one or more sample essays to calibrate them to use the scoring system reliably (see attached Score Sheet/rubric). Each essay will be evaluated by two scorers. Discrepancies between scorers will not be settled by additional evaluation; rather, the sum of the two scores will serve as the student’s score for each category. On a 0-3 scale, a student will earn from 0 to 6 points for each outcome. Other strategies would be far more complicated and would impose an artificial and unnecessary standard of inter-rater reliability.
Each student essay will receive a numerical score of up to 6 points for each of the five outcomes, along with a total score of up to 30 points. We will use these scores to make claims about students’ proficiency in each of the outcome areas.
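To make the arithmetic concrete, here is a brief Python sketch of how the two scorers’ ratings combine. The outcome names are placeholders (the actual categories come from the Score Sheet/rubric): each scorer’s 0-3 rating per outcome is summed to a 0-6 outcome score, the five outcome scores sum to a 0-30 total, and no adjudication of discrepancies is performed.

    # Placeholder outcome names; the real categories come from the Score Sheet/rubric.
    OUTCOMES = ["outcome_1", "outcome_2", "outcome_3", "outcome_4", "outcome_5"]

    def combine_scores(scorer_a, scorer_b):
        """Sum the two scorers' 0-3 ratings per outcome (0-6 each) and total them (0-30)."""
        combined = {}
        for outcome in OUTCOMES:
            a, b = scorer_a[outcome], scorer_b[outcome]
            if not (0 <= a <= 3 and 0 <= b <= 3):
                raise ValueError(f"scores for {outcome} must be on the 0-3 scale")
            combined[outcome] = a + b
        combined["total"] = sum(combined[o] for o in OUTCOMES)
        return combined

    # Example: one coded essay rated by two scorers.
    scorer_a = {"outcome_1": 2, "outcome_2": 3, "outcome_3": 1, "outcome_4": 2, "outcome_5": 3}
    scorer_b = {"outcome_1": 3, "outcome_2": 2, "outcome_3": 2, "outcome_4": 2, "outcome_5": 3}
    print(combine_scores(scorer_a, scorer_b))
    # -> {'outcome_1': 5, 'outcome_2': 5, 'outcome_3': 3, 'outcome_4': 4, 'outcome_5': 6, 'total': 23}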
The Director of Composition will write a summary/explanation of the various assignments or projects that generated the essays for the data set.
It is also possible to use student demographic information to help us make distinctions among students’ achievement levels. We would need to collect this information when instructors submit the essays from their sections: gender, age, race, major, whether the student has taken the course previously, and perhaps course grade (although that information will not be available until the end of the semester, after the instructors have submitted essays). We may also look at students’ achievement scores by type of assignment/project, by individual instructor, by time of section, by rank/level of instructor, by locale (on-campus and dual-credit), and perhaps other variables.
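If the combined scores and the demographic/section information collected at submission time were compiled into a single table, the breakdowns mentioned above could be produced with a few lines of Python (pandas). The file and column names below are hypothetical; this is only a sketch of the kind of analysis intended.

    import pandas as pd

    # One row per scored essay: the combined 0-6 outcome scores, the 0-30 total,
    # and whatever demographic/section variables are collected at submission time.
    scores = pd.read_csv("scored_essays.csv")   # hypothetical compiled data set

    # Mean total score and essay count, broken out by each variable of interest.
    for variable in ["assignment_type", "instructor", "section_time", "instructor_rank", "locale"]:
        print(scores.groupby(variable)["total"].agg(["mean", "count"]))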