The research team wrestled with the question of methods over much of the project's implementation phase, striving to determine what would prove most useful in the context of the classroom, the larger wiki research project, and the TechMediated research team's efforts to improve usability heuristics. Our methods were designed to generate measurable data about what happens in the knowledge-construction process during collaborative writing. We hoped to measure the extent to which the wiki influenced the degree and frequency of collaboration. As instructors, we also imagined that wikis might provide a tool for assessing individual contributions when wiki-based collaborative writing tasks are introduced into Writing Across the Curriculum (WAC) courses.
To ensure our methodology was appropriate to the theoretical questions driving our work, the research team began by brainstorming ways wikis might help solve two of the biggest challenges that group work presents: one concerning epistemology, the other assessment. Concerning epistemology, we identified a typical problem that accompanies incorporating group projects into class assignments. When asked to produce one “group essay,” students usually break the work into component parts and then assemble the pieces into a single document just before it is submitted. Although this process seems “efficient” and sensible to students, it often circumvents the dialogue necessary to reach an engaged and critical consensus about a group-authored paper at every step of the writing process: determining key contributions or ideas (invention or ideation), strategizing organization (arrangement), choosing and writing with a unified tone or voice (style), and agreeing upon the overall format (arrangement and delivery). Instructors who assign group-authored projects may implicitly hope that students will engage with and learn from one another, collaboratively building the group’s knowledge base in order to produce more sophisticated final projects. But students are usually more concerned with getting the job done quickly and painlessly, while ideally earning a good grade in the process.

This last part, grading, leads to a second key problem instructors face in assigning group work: how to identify who wrote what and how to assess or evaluate individual students’ contributions. Our team wanted to see whether wikis’ ability to track multiple versions of a document could be used to counter the “race to consensus,” encouraging students to adopt a more iterative, reflective, and interactive process of deep collaboration while also providing a means to assess individual student contributions more fairly.
Learning from our experiences in the Fall 2007 Pilot Study, in the Spring 2008 semester we created a new assignment that better accounted for the technical shortcomings of the proprietary wiki software while paying more explicit attention to the deep collaboration we hoped to foster. The assignment defined deep collaboration and explained to students that the goal was to use the wiki, with its comment and revision/history features, to help them write a specific part of their project within a specified amount of time. Modified versions of the assignment were introduced in two writing-intensive courses: a required Introduction to Engineering Design (IED) course and a junior-level Product Design and Innovation (PDI) course (IED Assignment, PDI Assignment). Both courses already included a substantial writing component with semester-long group writing projects that culminated in a final group-authored product: a final prototype, presentation, and report in the engineering course and a final business proposal for a new concept or idea in the PDI course.
To establish a framework for generating measurable data, we used three separate instruments. First, we distributed a basic paper survey at the beginning of the class (before the students began work on the wiki) to determine how well the students knew one another and how much experience they had using wikis or other computer-mediated communication technologies (instant messaging, email, text, etc.). A total of 54 surveys were distributed (24 in the PDI course and 30 in the IED course) and 44 returned (21 in the PDI course and 23 in the IED course), for a response rate of 81.5 percent (Survey 1).
Second, we distributed a follow-up paper survey after the students completed the “deep collaboration” task. The questionnaire asked 16 five-point Likert-style questions, 3 yes-or-no questions, and 3 open-ended questions. The survey was distributed to 54 students; 19 students responded in the PDI course and 27 in the IED course, for a total of 46 responses and a response rate of 85.2 percent (Survey 2).
Third and finally, at the end of the semester, once the students had completed their final projects, we selected two groups (one from each class) and interviewed each group member by telephone for 15 minutes to learn more about how they engaged in the task and what they found useful, or less useful, about the activity, the wiki, or both. Based on the results of the first set of surveys, we developed a phone questionnaire totaling 33 questions. The questionnaire was composed in Google Documents in collaboration with Dustin Kirk and Samantha Good (graduate students in Human-Computer Interaction at RPI) and was then refined and revised after feedback from Drs. Montoya and Chi. The phone interviews were designed to solicit in-depth, post-assignment responses to open-ended questions. They were conducted with the members of two complete teams, one from PDI and one from IED, for a total of 8 students interviewed (Phone Survey).
Continue reading the Case Studies section.