UNIVERSITY OF CALIFORNIA, MERCED
CENTER FOR RESEARCH ON TEACHING EXCELLENCE (CRTE)

SPRING 2014 COUNCIL OF GRADUATE STUDIES PROJECT: Adding value and expectancy to the student learning experience in ENVE/ENGR 120 Fluid Mechanics Laboratory

Brandi McKuin
6/9/2014
Contents

1. Abstract
2. Introduction
3. Methods
   3.1 Lesson plans
   3.2 Active learning activity – Exit Quiz
   3.3 Needs Assessment Survey
   3.4 Targeted Feedback
   3.5 Transparency in Evaluation
   3.6 Articulation of Expectations
   3.7 Mid-semester survey
4. Results
   4.1 Needs Assessment Survey Results
   4.2 Mid-Semester Survey Results
   4.3 Lab Report Re-grades
   4.4 Signature Assignment Grades
5. Conclusion
6. References
7. Appendix
   7.1 Spring 2014 ENGR 120 Fluid Mechanics Course Syllabus
   7.2 Needs Assessment Survey
   7.3 Lab Report Grading Rubric
   7.4 Detailed lab grade point distribution
   7.5 Example Abstract
   7.6 Example Introduction
   7.7 Example Methods
   7.8 Example Results
   7.9 Example Conclusion
   7.10 Mid-Semester Survey
   7.11 Needs Assessment Survey Results
   7.12 Mid-Semester Survey Results
   7.13 Lab Report Re-grade Results
   7.14 Signature Assignment Results

1. Abstract

While teaching the Fluid Mechanics laboratory component of the class (ENGR 120) and participating in the UC Merced CGS project this spring, I developed and refined teaching skills with two goals:

• identifying what motivates students to improve their ability to conduct experiments, analyze and interpret data, and write scientific reports
• adding value and expectancy to the students' learning experience

To achieve these goals, feedback from students and evaluation of student lab reports were used to learn about student needs and guide teaching practices. Value and expectancy were added by making the standards of excellence clear, making the methods of evaluation transparent, and helping students improve performance through targeted feedback. Results show that grade improvement was an effective incentive for students to improve writing skills. However, student survey results indicate that students place a higher value on writing skills for career development than for grades. Students who acted on written feedback for extra points had higher gains in final lab scores than the cohort that did not participate: 12% and 4%, respectively. Based on the average gain in lab scores of 7 ± 14% (percent change between the first and final lab for all 38 students), the tools used to add value and expectancy enhanced students' abilities to conduct experiments, analyze and interpret data, and write scientific reports.

2. Introduction

The upper division course ENGR 120 Fluid Mechanics is one of the courses selected for direct assessment of the Mechanical Engineering program learning outcomes (PLOs). PLO B is defined as the ability to design and conduct experiments, as well as to analyze and interpret data. PLO B is directly assessed with embedded assignments in the form of a laboratory report and is directly aligned with one of the key course learning outcomes identified on the course syllabus, the "ability to design and conduct experiments, analyze data, and communicate results in written and oral technical reports". In every lab exercise, in addition to key concepts related to introductory fluid dynamics, the assignment learning outcome was directly aligned with the course and program learning outcomes. The signature assignment selected for the certificate program was an embedded assignment in the final laboratory report, which follows both the program-level and the course-level assessment methods. The final lab report is an appropriate signature assignment because it is an authentic, real-world task (Ambrose et al., 2010). The student grades for the final laboratory report (signature assignment) provide a way to measure gains in student performance.

It has been shown that the discrepancy between initial performance and desired performance can be reduced by providing appropriate and specific goals, clarifying those goals, and increasing effort to reach them through written feedback (Hattie and Timperley, 2007). However, an important component of student improvement is student motivation. Authentic, real-world tasks are a way to add value to lessons, and adding value to lessons can, in turn, increase student motivation (Ambrose et al., 2010). Additional strategies that can add value and expectancy include creating assignments that provide the appropriate level of challenge, articulating expectations, providing rubrics, and providing targeted feedback (Ambrose et al., 2010).

The tools described in the preceding paragraph form the basis of my pedagogical project, namely, to (a) identify what students value about their learning experience and use this information as a motivational tool to improve student performance in conducting experiments, analyzing and interpreting data, and writing scientific reports; and (b) add value and expectancy to the students' experience by making the standards of excellence clear, making the methods of evaluation transparent, and helping students improve performance through targeted feedback.

3. Methods

Surveys were used to identify student needs, learn what students value about their learning experience, and assess teaching strategies. Several strategies were used to add value and expectancy, including: the lesson plans, active learning activities, making what is expected of students clear by providing examples of excellent work, transparency in evaluation with a detailed point-distribution rubric, the provision of targeted feedback, and providing opportunities for students to act on feedback.

3.1 Lesson plans

To add value and expectancy, careful preparation went into the lesson plans. The lesson plans were based on the concept of backward design (Wiggins & McTighe, 2005). Backward lesson planning begins with the end goal in mind: planning starts from the assessment and works backward to the beginning. The focus throughout each lesson plan was on student understanding of the key concepts necessary to complete the lab report. Student assessment for the lab lecture was accomplished through feedback at the end of the lesson in the form of an exit quiz.

3.2 Active learning activity – Exit Quiz

It has been shown that active learning and highly structured lessons that provide practice with problem-solving lead to learning gains (Haak et al., 2011). Active learning, in the form of an exit quiz, was used to promote analysis, synthesis, and evaluation of the class content. As the name implies, the exit quiz was administered after the laboratory procedure was complete and was the final component of the in-class activities. The exit quiz questions generally covered the key equations or concepts necessary for completing the data analysis component of the lab report. The students were instructed to write their answers on paper and submit the quiz for a brief review before leaving the laboratory. In some instances, I observed students working together to answer the questions (which was never discouraged). Students who had recorded the correct answer were permitted to leave. Students who recorded incorrect answers were required to stay and discuss the concepts covered in the quiz. For the students who stayed behind, the review usually involved a group discussion in which the notes on the whiteboard were revisited. It was my personal observation that in this setting the students were more engaged in discussion than when I addressed the entire class at the beginning of the class period. Because the exit quiz wasn't assigned a point value, and because of the collaborative nature of the quiz, it was a way to reinforce student learning without penalizing students for being wrong. I also liked the timing of the assessment: students received immediate feedback rather than waiting a week or more for a graded assignment. To encourage the process of taking action following feedback, I would have students revise their written responses. On several occasions students asked if they could take the quiz home because it would be helpful in their preparation of the lab report. To assess student perception of how helpful the exit quiz was, a question in the mid-semester survey was devoted to the topic (see question 9, Appendix 7.10).

3.3 Needs Assessment Survey

The needs assessment survey was administered by SATAL after the students had completed their first lab report but before they commenced work on their second lab assignment. The needs assessment survey was designed to be a collaborative effort: feedback from students was used as guidance for my teaching strategy, and the survey also provided feedback to the students about the importance of the lab grading rubric. Two (yes or no) questions were devoted to the lab grading rubric: "Have you read the lab grading rubric?" and "I use the lab grading rubric as a guide when I am working on lab write ups". The third question was designed to provide information about what part, if any, of the lab rubric is confusing. The fourth question asked the students how frequently they write technical laboratory reports, so that I could gauge the skill level of the class. Finally, the last question asked them to reflect on the first report and rate the difficulty of its components on a scale of 1-10: data collection (lab procedure); data analysis (computation/statistics; drawing conclusions about data); and technical writing (written lab report).

3.4 Targeted Feedback

Targeted feedback was provided to the group in oral form and to individuals in written form. The feedback delivered to the group addressed the common mistakes students made in the previous lab, as well as how the group was improving. The written feedback included a score for each section, critical comments that detailed how the lab did not meet the criteria for excellent work, and highlighted grammar and spelling errors. Drawing on my own experience in writing workshops offered by the University, the process of acting on written feedback has significantly improved the quality of my writing. Hoping to encourage action on written feedback, an incentive was provided to the students with the instructor's permission: the students were offered the opportunity to rewrite labs for additional points. For the first two labs, students were permitted to re-submit labs for a complete re-grade. This offer was also extended to students for the third and fourth labs, but with a reduced incentive; the re-grade was an average of the original and resubmission scores.

3.5 Transparency in Evaluation

One of the strategies that can add value and expectancy is providing rubrics. While the ENGR 120 (Fluid Mechanics) lab grading rubric provided detailed criteria for excellent work, it did not include a point distribution. To enhance the course grading rubric, a detailed lab grade point distribution was implemented during the grading of the third lab report and was also provided as a resource on CROPS. The detailed point distribution not only streamlined the targeted feedback process, but also made the assessment more transparent because students could see why they lost points. In addition to this point system, some written feedback was also provided, but it was generally limited to one or two sections of the report.

3.6 Articulation of Expectations

Another strategy to add value and expectancy is the articulation of expectations. To clearly articulate what was expected, examples of excellent work were modeled for the students both in class and in the form of supplements to review on their own time.

One example of how expectations were articulated was an in-class review of a peer-reviewed journal article. During the lab discussion a printed copy was passed around the room and we discussed how the organization of the article is similar to that of our lab reports, including the headings: abstract, introduction, methods, results, conclusion, and references. We discussed how the peer-reviewed journal article exemplified the criteria for the "distinguished" category for the results and discussion section of the lab grading rubric. It was pointed out how the data and analytical outcomes are presented logically, the results are presented in graphical and tabular form with captions describing the figures, and the discussion provides relevant and significant insights. Similarly, we discussed how the professional tone of the article and the abstract exceed the rubric standards. The class was encouraged to model the writing style of the peer-reviewed paper in their lab report (lab 4) and every report thereafter.

In addition, I took action to articulate what was expected from students by changing the format of the lab notes written on the whiteboard. Prior to the fourth lab, the majority of the lecture notes were devoted to data analysis (based on the needs assessment survey results). The new format for the lecture notes resembled a lab report, including the headings "introduction", "methods", "results and discussion", "conclusion", and "references". A written description of what was expected for each section was provided, and in some cases the writing for that section was provided in the form of a template. In addition to the templates for each section, some questions were written (and discussed in class) to stimulate thinking about relevant and significant insights for the results and conclusion sections of the lab report.

Finally, I articulated what was expected of students by providing "examples of excellent work" (Huba and Freed, 2000). A sample lab report from a heat transfer class was provided on CROPS by the course instructor. This lab report was an excellent example of the organization and writing style that was expected of student lab reports. However, in response to student feedback, additional written examples based on assignments the students had previously completed were provided: the abstract was based on the fourth lab assignment, the introduction on the third, the methods on the fourth, the results on the first, and the conclusion on the third. The example supplements were made available to students on CROPS a week before their first voluntary rewrite was due. Because the written examples were based on assignments that students were familiar with, the examples were more relevant and may have had more impact than the heat transfer lab report.

3.7 Mid-semester survey

The mid-semester survey was administered by SATAL before students began work on their fourth lab assignment. The survey was designed to identify what motivates students to improve their laboratory performance, to communicate the importance of the lab grading rubric, and to receive feedback on how well some of the teaching strategies were working. The timing of the survey allowed for some fine-tuning in the weeks leading up to the signature assignment.

The first question of the survey gets to the heart of student motivation by asking the open-ended question, "In one phrase or sentence, please state why technical writing skill is important to you". While the answer to this question may provide insights into what might motivate a student to improve their scientific writing, it also provides an opportunity for the student to do some reflection and self-evaluation.

The second and third questions of the survey were devoted to the lab grading rubric. Here, the questions were designed to provide information (whether and how students are using the rubric) but also to communicate to the student the importance of the lab grading rubric. The (yes or no) question "Do you regularly use the grading rubric when preparing your lab reports?" re-appears from the needs assessment survey. However, an open-ended follow-up question asks, "If you answered yes, how do you use the grading rubric?".

Several questions were devoted to assessment of group and written feedback. Question four was designed to determine whether students were reading written feedback, with the (yes or no) question, "Have you read the written feedback on your lab report?". Questions five and six used a Likert scale (response choices: strongly agree, agree, disagree, strongly disagree) and were designed to assess the effectiveness of written feedback: "Written feedback in this class has been helpful" and "What I learned from the written feedback will influence how I write reports in the future". Question seven asks students their preference for group feedback with the question, "Please check the box for which form of feedback you would find most useful".

The eighth question was designed to learn about student writing needs. The question asks, "What aspects of technical writing do you need to work on?" with a response prompt to select all that apply from the options: "knowing the audience", "organization", "tone/style", and "citing references". This question not only provides information about student needs, but also gives the student an opportunity to reflect and self-evaluate.

The last two questions were designed to assess the effectiveness of the exit quiz and my helpfulness with the data analysis component of the lab. These questions used a 4-point scale response (very helpful, helpful, unhelpful, very unhelpful). Question nine asks, "To what extent have the exit quizzes helped you understand the key concepts necessary for completing the lab report?" and question ten asks, "To what extent are the TA's explanations of the data analysis procedures helpful to your completion of this section of the report?".

4. Results

Results were obtained from the Needs Assessment Survey, the Mid-Semester Survey, laboratory report re-grades, and signature assignment grades.

4.1 Needs Assessment Survey Results

The needs assessment survey results were tabulated by SATAL. I used the results to better understand my audience and to learn whether students were using the lab grading rubric, how well they understood it, how frequently they write formal laboratory reports, and what sections of the formal laboratory report they found most difficult (based on their experience with the first lab report).

The responses related to the lab grading rubric reveal that 91% of the students reported having read the lab grading rubric (Figure 2), and that 70% of the students reported using the lab grading rubric as a guide when working on lab write-ups (Figure 3). The responses to the third question of the survey revealed that the sections of the lab grading rubric students find confusing are: the results section (30%), writing and tone (27%), abstract (18%), methods (18%), introduction (15%), conclusion (15%), and no response (3%) (Figure 4). Because the majority of the students had reviewed the lab grading rubric, I did not review the lab grading rubric as an in-class exercise. To address student needs with respect to the confusing aspects of the lab grading rubric, I articulated expectations (see Section 3.6).

The fourth question of the survey addressed how frequently the students write formal laboratory reports. A majority of the students reported completing formal laboratory reports frequently (more than twice a year) (57%) or often (at least twice a year) (27%) (Figure 5). However, based on the student lab report grades, the students may have overestimated the frequency with which they prepare formal lab reports. Another possible reason why student grades did not reflect that much writing experience may be that there is a distinct difference in expectations for lab reports prepared in lower division courses versus upper division courses. Here is some student feedback from the instructor evaluations:

"Perhaps in the beginning, provide a sample of a lab report that is 100% what the instructor is looking for. That might help minimize that awkward period where we realize that these lab reports have to be written in a professional manner (and not like the boilerplate check-off-the-checklist format of earlier physics/chemistry classes)".

"She could be a little more lenient when grading labs, this is my first upper division course and first time I had to write a lab and every time I did them I was super stressed out.. Wish she would give more partial points".

The last question of the survey asked students to rate on a scale of 1-10 (one very easy; ten very difficult) the difficulty of the components of the lab exercise: data collection (lab procedure), data analysis, and technical writing. The results indicate the majority of the students found data collection on the easier end of the spectrum (1-4) (Figure 6), data analysis in the more difficult range of the spectrum (6-9) (Figure 7), and technical writing in the moderate range of the spectrum (3-7) (Figure 8). Based on the results of the survey, more of the class discussion time (for the first three labs) was focused on the data analysis than on technical writing or the methods and procedures.

4.2 Mid-Semester Survey Results

The mid-semester survey results were tabulated by SATAL. I used the results to identify what motivates students to improve their laboratory performance, to communicate the importance of the lab grading rubric, and to receive feedback on how well some of the teaching strategies were working.

The first question of the survey was designed to learn what motivates students to improve their writing skills. This information was used as a tool to add value to the lab writing exercise: if students see value in the writing exercise, they may make the connection that improving writing skill is worthwhile and be motivated to increase effort. The results indicate that approximately 62% of the students reported that technical writing skill is important for career-related reasons (Figure 9). After learning this information, I frequently stressed the importance of improving writing skill for professional reasons.

The second and third questions of the survey were devoted to the lab grading rubric. 63% of students reported that they regularly use the grading rubric when preparing their lab reports, down 7 percentage points from the 70% reported in the Needs Assessment Survey (Figure 10). The answers to the (free response) question asking how students use the lab grading rubric included several variations of the same answer; in general, the students were using the rubric as a standard (Figure 11).

Questions four, five, six, and seven of the survey were devoted to assessment of group and individual written feedback. In response to question four, 94% of the students reported that they read the written feedback on their lab reports (Figure 12). In response to question five, 61% agree and 39% strongly agree that written feedback in this class has been helpful (Figure 13). In response to question six, 50% agree, 47% strongly agree, and 3% disagree that what they learned from the written feedback will influence how they write reports in the future (Figure 14). In response to question seven, the majority of the students (59%) indicated they preferred an in-class discussion of common mistakes made in the last laboratory report, with a written version provided on CROPS (Figure 15). Based on the results, it was clear that the students place a high value on targeted feedback. As a result, I continued to make targeted feedback an important part of the assessment process throughout the semester.

The eighth question of the survey was designed to learn about student writing needs, to give students a chance to reflect, and to prompt a self-evaluation. According to the results, students report they need to work on: citing references (59%), tone/style (56%), knowing the audience (41%), organization (25%), presenting results (3%), and conclusion, methods, and data analysis (3%) (Figure 16). Based on these results, I took action to articulate expectations and provide targeted feedback to help students improve their writing.

The ninth question of the survey was designed to assess the effectiveness of the exit quiz. The results indicate 69% of students found the exit quiz somewhat helpful, 19% very helpful, 9% somewhat unhelpful, and 3% helpful (Figure 17). Because the majority of the students found the exit quiz at least somewhat helpful, I continued to provide the exit quiz throughout the semester.

The last question of the survey was designed to assess my effectiveness in helping students complete the data analysis component of the lab report. The results indicate 48% found my explanations of the data analysis procedure very helpful, 24% somewhat unhelpful, 21% somewhat helpful, 4% helpful, and 3% gave no response (Figure 18). While the majority of the students found my data analysis explanations at least somewhat helpful, it was clear that there was some room for improvement on my part. To address this, I made an extra effort in lesson planning to anticipate where students might have trouble, and spent extra time developing exit quiz questions that targeted the more difficult aspects of the data analysis. During the lab discussion I would share with the students which steps of the data analysis I predicted they would find difficult. To do this, I would emphasize the importance of the procedure I was about to explain with phrases such as "this is the most difficult part of the data analysis", "pay attention to this step, because it was tricky", or "the trick to this step was...".

4.3 Lab Report Re-grades

Students were provided an opportunity to act on the individual written feedback on the laboratory reports for the first four labs. They could re-submit the first and second reports for a new grade; for the third and fourth labs, the new grade was an average of the revised lab score and the original lab score. A total of 18 students participated in the re-grade opportunities. Of that total, 11% participated in all four re-grade opportunities, 11% participated in 3 out of 4, 39% participated in 2 out of 4, and 39% participated in 1 out of 4.

For the first lab, 13 out of 38 students (34%) participated in the re-grade opportunity. The original score and the score following feedback (re-grade) for the first lab are provided in Figure 19; the original lab score and the re-grade are represented in the graph as vertical bars, and the overall course grade is presented on the horizontal axis. The average percent increase in re-grades for the first lab is 17 ± 9%. The majority of the students that participated in the first opportunity for a re-grade were average students based on course grades (62% fell within the range 74-63; C+ is 70), 23% were above-average students (range 75 and greater), and 15% were below average (range 62 and lower) (Figure 20).

For the second lab, 14 out of 38 students (38%) participated in the re-grade opportunity. The original and re-grade scores for the second lab are provided in Figure 21. The average percent increase in lab scores for re-grades for the second lab is 20 ± 10%. The majority of the students that participated in the second opportunity for a re-grade were average students based on course grades (57% fell within the range 74-63; C+ is 70), 29% were above-average students (range 75 and greater), and 14% were below average (range 62 and lower) (Figure 22).

For the third lab, 5 out of 38 students (13%) participated in the re-grade opportunity. The original score, the re-grade score, and the average of the two scores for the third lab are provided in Figure 23.

The average percent increase in lab scores for the third lab is 8 ± 5%. The students that participated in the third opportunity for a re-grade were average students based on course grades (57% fell within the range 74-63; C+ is 70), 29% were above-average students (range 75 and greater), and 14% were below average (range 62 and lower) (Figure 24).

For the fourth lab, 3 out of 38 students (8%) participated in the re-grade opportunity. The original score, the re-grade score, and the average of the two scores for the fourth lab are provided in Figure 25. The average percent increase in lab scores for the fourth lab is 8 ± 5%. Two of the students that participated in the fourth opportunity for a re-grade were average students based on course grades (74-63; C+ is 70) and one student was below average (range 62 and lower) (Figure 26).

To investigate the subsequent performance of the students that participated in feedback-for-credit opportunities, all lab scores (original scores for labs 1-6, and any re-grades for labs 1-4) of students that participated in only one lab re-grade were graphed in Figure 27. The general trend is an increase in scores over the semester. The average percent difference between the first and last labs of this group was 16%. The lab scores of the cohort that participated in two re-grade opportunities were also graphed (Figure 28). The trend for the average scores of the cohort that participated in two re-grades is less clear. This may be because the re-grades for labs three and four were averages of the original and resubmission scores, resulting in lower gains. Also, one student participating in a re-grade received a lab score of 4, which lowered the average of this cohort considerably. Nonetheless, the average percent change between the first and last labs for the cohort participating in two re-grades is an increase of 10%. Two students participated in three re-grade opportunities; the lab scores for this cohort are presented in Figure 29, and the average percent change between the first and last labs for this cohort is 8%. Two students participated in four re-grade opportunities; the lab scores for this cohort are presented in Figure 30, and the average percent change between the first (original) and last lab for this cohort is 5%.

Finally, the average percent change between the first (original) and last lab for the students that participated in any re-grade was compared to that of students that did not participate in re-grade opportunities. Students that participated in re-grades not only benefited from higher course grades due to the increased points from the re-grades, but the average unadjusted percent change in lab scores was also higher in this cohort than in the cohort that did not participate: 12% and 4%, respectively.

4.4 Signature Assignment Grades

The key results for the signature assignment are the students' grades over the semester, the percent difference in grades between assignments, and the percent difference between the initial (lab one) and final (lab six) assignments.

In Figure 31, the lab score frequency (percent) for each lab (1-6) is graphed on the y-axis, and the range of lab scores is plotted on the x-axis. This graph includes only the original lab scores (not re-grades) to capture the change over the course of the semester. A general trend that can be observed is a decrease in the frequency of lab scores in the range 7-8 and an increase in the frequency of scores in the range 8-9. This is especially evident when looking at the scores for labs 4, 5, and 6. This trend coincides with the point in the semester at which I took action to articulate expectations.
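For reference, the percent-change statistics reported in this section (for example, the 7 ± 14% average gain) can be computed as in the minimal sketch below. The scores shown are invented placeholders; the real analysis used the original lab scores of all 38 students.

```python
import statistics

def percent_change(first, last):
    """Percent change between the first and final lab scores."""
    return 100 * (last - first) / first

# Hypothetical gradebook rows (first lab score, final lab score) out of 10.
scores = [(7.5, 8.5), (8.0, 8.0), (6.5, 8.0), (9.0, 8.5)]
changes = [percent_change(a, b) for a, b in scores]
mean = statistics.mean(changes)
sd = statistics.stdev(changes)
print(f"average percent change: {mean:.0f} ± {sd:.0f}%")
```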

A graph that presents the percent change between each lab, and between the initial and final labs, is presented in Figure 32. Again, this figure does not include re-grade scores, in order to capture the change in scores over the course of the semester. The general trend is a gain in percent difference for all labs except the second. The average percent change (N=38) between the initial and final lab is 7 ± 14%.

The average score for each lab is presented in Figure 33. This figure does not include re-grades, in order to capture the trend over the course of the semester. The average scores for the first three labs show that aggregate student performance improved on the second lab compared with the first, but regressed on the third lab compared with the second. A possible explanation is that lab two was an easier lab assignment (in my opinion) than the first or the third. Another consideration is that my assessment methods were under development at this point in the semester; recall that the point distribution rubric (which applied to all labs) was implemented with the third lab. When the lowest lab score was dropped and the re-grades were included, the following average scores (out of 10 possible points) were calculated for the three lab sections: lab one 8.4, lab two 8.7, lab three 7.9, lab four 7.9, lab five 8.5, and lab six 8.5. The average lab score for my three lab sections across all labs (including re-grades and dropping the lowest score) was 8.3.

5. Conclusion

Based on my experience this semester, I have gained insight into what motivates students to improve the skills required to succeed in the ENGR 120 Fluid Mechanics Lab, and into the effectiveness of strategies used to add value and expectancy to the student experience. The strategies used to add value and expectancy included making what is expected of students clear, making the methods of evaluation transparent, and collaborating with students through targeted feedback.

The observed evidence for student motivation came from the results of the re-grade activities and the mid-semester survey. Grade improvement was an effective incentive for students to act on written feedback: students were more willing to participate in re-grade activities when the incentive was high (a complete re-grade) than when the incentive was lower (a re-grade that averaged the original and revised lab scores). However, when students reported why technical writing skill was important (in the mid-semester survey), career-related reasons outranked grades. Thus, while grades are an important motivator, relating the lab writing exercise to career development should not be overlooked.

Despite the frequency with which the majority of the students reported preparing formal lab reports, most students were not prepared to write a professional lab report. And while a writing sample (from a heat transfer lab) and a lab grading rubric were provided at the beginning of the semester, students expressed (in office hours, in the mid-semester survey results, and in my instructor evaluations) that they wished they had known what was expected of them from the beginning. Based on this feedback, I recommend providing the resources and activities I used to articulate expectations in the first class meeting. Another recommendation is to make a scientific writing course a prerequisite.

Another strategy to add value and expectancy is to make the methods of evaluation transparent. While the course lab grading rubric provides detailed descriptions of a "distinguished" lab report, it does not include point information. A lab grading rubric that includes a detailed point distribution not only makes evaluation more transparent to the student, it also streamlines the grading process for the teaching assistant. I recommend adding a detailed point distribution to the course lab grading rubric.

Finally, collaboration with students through targeted feedback was highly effective, based on the mid-semester survey results and the lab scores of the cohort that acted on written feedback in the re-grade activities. All of the students responded that written feedback on lab reports was at least helpful. Students that responded to written feedback for points in the re-grade activities not only benefited from higher course grades due to the increased points, but the average unadjusted percent change in lab scores was also higher in this cohort than in the cohort that did not participate in re-grades. I recommend making written feedback on lab reports a high priority for future teaching assistants, and reducing the turn-around time for graded lab reports as much as possible.

6. References

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., and Norman, M. K. (2010). How Learning Works: Seven Research-Based Principles for Smart Teaching. Jossey-Bass, San Francisco.

Haak, D. C., HilleRisLambers, J., Pitre, E., and Freeman, S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332, 1213-1216.

Hattie, J. and Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.

Huba, M. E. and Freed, J. E. (2000). Learner-Centered Assessment on College Campuses: Shifting the Focus from Teaching to Learning. Allyn & Bacon, Boston.

Wiggins, G. and McTighe, J. (2005). Understanding by Design (expanded 2nd edition). Association for Supervision & Curriculum Development, Alexandria, VA.

7. Appendix

7.1 Spring 2014 ENGR 120 Fluid Mechanics Course Syllabus

7.2 Needs Assessment Survey

7.3 Lab Report Grading Rubric

7.4 Detailed lab grade point distribution

Overall lab score: 10.0

Title: 1.0

Abstract: 2.0
  +0.30 Introduction (recap why the experiment was conducted)
  +0.30 Methods (recap how the experiment was conducted)
  +0.50 Results (numerical values with units)
  +0.30 Conclusions (percent error; key insights)
  +0.30 Tone/style
  +0.30 Grammar/spelling

Introduction: 1.0
  +0.10 Objective/application stated
  +0.60 Background on principles used in the methods
  +0.10 Tone/style
  +0.10 Grammar/spelling
  +0.10 References (cite the reference in text)

Methods: 2.0
  +0.60 Procedures discussed using complete sentences (not in the form of a list)
  +0.60 Equations presented (including equation numbers)
  +0.20 Discussion of how the equations were used to conduct the data analysis
  +0.20 Use of equation editor
  +0.20 Tone/style
  +0.20 Grammar/spelling

Results: 3.0
  +2.00 Results presented in figures and tables with captions
  +0.50 Discussion of figures and tables in text (refer to figure or table numbers); discuss trends in data and sources of error; answer questions in the lab manual
  +0.25 Tone/style
  +0.25 Grammar/spelling

Conclusion: 1.0
  +0.30 Restate key findings (numerical values with units provided here again)
  +0.30 Recommendations based on findings
  +0.10 Tone/style
  +0.10 Grammar/spelling
  +0.20 References provided in list format
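As an illustration of how this point distribution can be applied mechanically, the sketch below encodes the section weights above in a small data structure and totals the points earned. The weights are taken from the rubric; the function and data layout are illustrative, not part of the course materials.

```python
# Section weights from the detailed point distribution above (max 10.0).
RUBRIC = {
    "title": {"title page": 1.0},
    "abstract": {"introduction recap": 0.30, "methods recap": 0.30,
                 "results with units": 0.50, "conclusions": 0.30,
                 "tone/style": 0.30, "grammar/spelling": 0.30},
    "introduction": {"objective stated": 0.10, "background": 0.60,
                     "tone/style": 0.10, "grammar/spelling": 0.10,
                     "in-text citation": 0.10},
    "methods": {"complete sentences": 0.60, "equations numbered": 0.60,
                "equation use discussed": 0.20, "equation editor": 0.20,
                "tone/style": 0.20, "grammar/spelling": 0.20},
    "results": {"figures/tables with captions": 2.00,
                "figures/tables discussed": 0.50,
                "tone/style": 0.25, "grammar/spelling": 0.25},
    "conclusion": {"key findings restated": 0.30, "recommendations": 0.30,
                   "tone/style": 0.10, "grammar/spelling": 0.10,
                   "reference list": 0.20},
}

def score_report(earned):
    """Sum the points earned for each satisfied criterion."""
    return sum(RUBRIC[section][criterion]
               for section, criteria in earned.items()
               for criterion in criteria)

# A report that satisfies every criterion earns the full 10 points.
full_marks = {section: list(criteria) for section, criteria in RUBRIC.items()}
print(round(score_report(full_marks), 2))  # 10.0
```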

7.5 Example Abstract

Abstract

The feasibility of using noninvasive vibration sensors to measure flow in pipes was investigated. Vibration data for five different flow rates were collected using an accelerometer attached to the outside surface of three different pipe materials connected in series. A comparison was made of the curve fit of the standard deviation of the vibrations in the test section with mass flow for the various test section materials. A second-order least-squares fit was determined for each pipe material (PVC, galvanized steel, and copper) and each axis of the accelerometer (0, 1, and 2), respectively. The coefficients of determination (R²) for each axis are (1) PVC: 0.750, 0.821, and 0.984; (2) galvanized steel: 0.878, 0.986, and 0.981; and (3) copper: 0.850, 0.990, and 0.963. There appears to be a deterministic relationship with flow rate. However, the curve fit would be more quadratic if only fully developed turbulent flow were included in the data (i.e., excluding the flow rate at zero cubic meters per hour). In future experiments, care should be taken to determine the minimum and maximum flow rates that keep the pipe full of water during the sampling intervals, and to isolate each source of vibration.

7.6 Example Introduction

Introduction

The wind tunnel can be used to model flow fields in both qualitative and quantitative manners (UCM SOE, 2014). Pressures and velocities in the wind tunnel can be measured using manometers and pitot-static tubes. A manometer consists of a column or tube of fluid with one end of the tube open to a known pressure and the other end connected to the system containing the pressure to be determined. The difference in fluid height in the manometer gives the change in pressure, as described by the incompressible hydrostatic equation (Munson et al., 2009). Manometers can be used in conjunction with the Bernoulli equation to characterize fluid flow (UCM SOE, 2014). Bernoulli's equation can be used to characterize flows where the viscous effects are assumed to be negligible, the flow is assumed to be steady and incompressible, and the equation is applied along a streamline; it is useful in situations with stagnation and dynamic pressures, such as a pitot-static tube (Munson et al., 2009). A pitot-static tube consists of an opening that is aligned with the flow direction. At the opening, the fluid decelerates to zero velocity and a corresponding stagnation pressure is measured. Pitot-static tubes also have openings perpendicular to the flow direction where the static pressure is measured. The total pressure measured at the mouth of the pitot-static device is the sum of the static and dynamic pressures (UCM SOE, 2014).

References

ENGR 120 Fluid Mechanics Laboratory Manual (2014). "Introduction to the Wind Tunnel", pp. 13. School of Engineering, University of California, Merced, CA.

B.R. Munson, D.F. Young, T.H. Okiishi, and W.W. Huebsch (2009). Fundamentals of Fluid Mechanics, 6th Ed., John Wiley & Sons, Hoboken, NJ.
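As a supplement to this example, the pitot-static relationship it describes can be written compactly. This is the standard Bernoulli result for a stagnation point, not an equation reproduced from the lab manual:

```latex
% Stagnation (total) pressure at the pitot opening is the sum of the
% static and dynamic pressures along the streamline, which gives the
% flow velocity in terms of the measured pressure difference:
\[
  p_0 = p + \tfrac{1}{2}\rho V^2
  \qquad\Longrightarrow\qquad
  V = \sqrt{\frac{2\,(p_0 - p)}{\rho}}
\]
```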

7.7 Example Methods

1. Methods

1.A. Experimental Procedure

Three pipes of equal diameter and length but differing material (PVC, galvanized steel, and copper) were connected, in series, to a water source. The flow rates were controlled by an adjustable flow meter at the water source. An accelerometer was attached to each pipe using a mounting bracket. A USB data interface and software were used to collect vibration data for five different flow rates for each pipe material. For each flow rate, the signal noise output, sampled at a rate of 60 Hz, was collected for approximately 40 seconds. This procedure was repeated for five different flow rates and for each pipe material.

1.B. Data Analysis

The signal noise data were collected and an empirical relationship between the standard deviation of the pipe vibrations and the mean flow rate of the fluid in the pipe was developed. The relationship between the standard deviation of the pipe vibrations and the mean flow rate is outlined in Evans et al. (2004). Briefly, beginning with the definition of turbulent flow given in Eq. 1,

$u = \bar{u} + u'$   (1)

where $\bar{u}$ is the mean velocity (in units meters per second) and $u'$ is the fluctuation of the turbulent velocity about the mean (in units meters per second). For dynamically similar flows, the ratio of the flow fluctuations to the average flow is a constant, $k$, as given in Eq. 2,

$\dfrac{u'}{\bar{u}} = k$   (2)

This relationship is used to define the "intensity of turbulence", which is a measure of the magnitude of the turbulent disturbance. Following Evans et al. (2004), the intensity-of-turbulence expression can be arranged as given in Eq. 3,

$\dfrac{u'}{\bar{u}} = \dfrac{1}{\bar{u}}\sqrt{\dfrac{\sum_{i=1}^{N}(u_i - \bar{u})^2}{N}} = k$   (3)

Multiplying both sides by the number of points $N$ and by $\bar{u}$, and dividing by $N-1$, results in the following equation, where the left-hand side is the definition of the sample standard deviation, and where the mean velocity has been written in terms of the mean flow rate $\dot{m}$, as given in Eq. 4,

$s = \sqrt{\dfrac{\sum_{i=1}^{N}(u_i - \bar{u})^2}{N-1}} = C\,\dot{m}$   (4)

where $C$ is defined in Eq. 5,

$C = \dfrac{k}{A\gamma}\sqrt{\dfrac{N}{N-1}}$   (5)

where $A$ is the cross-sectional area (in units square meters) and $\gamma$ is the specific weight (in units Newtons per cubic meter). Eq. 4 is based on the argument that flow fluctuations are proportional to pressure fluctuations and that pressure fluctuations are proportional to pipe vibrations; it follows that the standard deviation of the pipe vibrations is proportional to the average flow rate. Thus Equations 1-5 allow the relationship between the standard deviation of the pipe vibrations and the mean flow rate of the fluid in the pipe to be determined empirically.

For the purposes of this report, the standard deviation of the pipe vibrations was calculated for each pipe material and each axis of the accelerometer (0, 1, and 2) in Excel. The raw data for each flow rate and pipe material were sorted by axis. The standard deviation of each axis was plotted against each flow rate, and a regression analysis was conducted. The best fit was determined by how close the coefficient of determination (R²) value was to 1.

References

R.P. Evans, J.D. Blotter, and A.G. Stephens (2004). "Flow Rate Measurements Using Flow-Induced Pipe Vibration," Journal of Fluids Engineering, vol. 126, Mar. 2004, pp. 280-285.

7.8 Example Results

Results and Discussion

The five measured depths, with the respective transducer depth and water height used to compute the actual pressure gage height, are presented in Table 2.

Table 2. Pressure transducer calibration data.

Measurement point | Depth of pressure transducer [cm] | Water height [cm] | Height of water above transducer [cm]
1                 | 90.8                              | 102.78            | 11.98
2                 | 79.38                             | 102.83            | 23.45
3                 | 68.45                             | 102.83            | 34.38
4                 | 57.97                             | 102.83            | 44.86
5                 | 46.79                             | 102.83            | 56.04

The barometrically compensated height, or height as a function of gage pressure, for each meter stick measurement is presented in Table 3.

Table 3. Barometrically compensated gage measurements.

Measurement point | H(ptotal) [cm] | H(patm) [cm] | H(pgage) [cm]
1                 | 87.01          | 75.41        | 11.6
2                 | 98.08          | 75.41        | 22.67
3                 | 108.85         | 75.41        | 33.44
4                 | 119.3          | 75.41        | 43.89
5                 | 130.48         | 75.41        | 55.07

Using the distance between the transducer depth and the surface of the water from Table 2 and the height as a function of gage pressure from Table 3, a calibration curve was constructed, as shown in Figure 1. Figure 1 expresses the relationship of the barometrically compensated depths vs. the corrected scale-recorded depths.

[Figure 1. Pressure transducer calibration curve: h(pgage) [cm] vs. barometrically compensated depth, with linear fit y = 0.9874x - 0.3784, R² = 0.99995.]

In Table 4, the regression statistics for the calibration curve are provided, namely the slope, y-intercept, standard errors for the slope and y-intercept, and the R² value. The uncertainty for H is the standard error of y, SEy, 0.1459 m.

Table 4. Regression statistics for the calibration curve.

slope m = 0.98742    intercept b = -0.3784
SEm = 0.00421        SEb = 0.15792
R² = 0.99995         SEy = 0.1459
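The Table 4 statistics can be recomputed directly from the data in Tables 2 and 3. The sketch below does this with scipy's linregress; it is an illustrative re-computation, not the report's Excel workflow.

```python
import numpy as np
from scipy import stats

# x: height of water above the transducer (Table 2, cm)
# y: barometrically compensated gage height (Table 3, cm)
x = np.array([11.98, 23.45, 34.38, 44.86, 56.04])
y = np.array([11.60, 22.67, 33.44, 43.89, 55.07])

fit = stats.linregress(x, y)
residuals = y - (fit.slope * x + fit.intercept)
se_y = np.sqrt(np.sum(residuals**2) / (len(x) - 2))  # standard error of y

print(f"m = {fit.slope:.5f}, b = {fit.intercept:.4f}")
print(f"SEm = {fit.stderr:.5f}, SEb = {fit.intercept_stderr:.5f}")
print(f"R^2 = {fit.rvalue**2:.5f}, SEy = {se_y:.4f}")
```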

The average value of the hydraulic conductivity, K, is 1.75E-04 m/s. The standard deviation of the three measurements, 4.51E-05 m/s, is taken as the uncertainty of the hydraulic conductivity. The distance between wells, L, was calculated using Eq. 2 and was found to be 160.527 m. The uncertainty associated with the distance between wells was calculated using Eqs. 3-7 and found to be 0.07071 m. The value for Darcy's velocity, q, calculated with Eq. 1, was found to be 3.48E-07 m/s. The error for Darcy's velocity was propagated using Eq. 8 and was found to be 9.31E-08 m/s.

Table 5. Calculated values, absolute uncertainty, and relative uncertainty for the variables, including Darcy's velocity, the hydraulic conductivity, the change in height, and the distance between wells.

Variable | Value    | Absolute uncertainty | Relative uncertainty
q (m/s)  | 3.48E-07 | 9.31E-08             | 26.75%
K (m/s)  | 1.75E-04 | 7.81E-05             | 44.73%
ΔH (m)   | -0.32    | 0.00146              | 0.45%
ΔL (m)   | 160.527  | 0.07071              | 0.04%

As shown in Table 5, Darcy's velocity is (3.48 ± 0.93) x 10^-7 m/s with a relative uncertainty of 26.75%. The variable that contributes most to the uncertainty is the hydraulic conductivity, with a relative uncertainty of 44.73%. The average hydraulic conductivity value is within the range of values found in the literature. Representative values of hydraulic conductivity, K, are 0.002-0.005 m/s for gravel (which has big pores that are easy for water to flow through) and 1.28 x 10^-6 m/s for clay (which is composed of very fine particles and thus has very small, tight pores that are difficult to force water through). Based on the range of values for the three hydraulic conductivity samples, the soil samples are within the range of sand (1.76 x 10^-4 m/s) and sandy loam (3.47 x 10^-3 m/s) (Dingman, 2002).
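The report's propagation formula (Eq. 8) is not reproduced in this excerpt. A standard first-order (quadrature) propagation for a quantity of the form q = K·ΔH/ΔL, assuming independent errors, is the following; this is offered as an assumption about the general approach, not as the report's exact equation:

```latex
% First-order uncertainty propagation for q = K * (Delta H) / (Delta L),
% assuming independent errors (a standard form; the report's Eq. 8 may differ):
\[
  \frac{\sigma_q}{|q|}
  = \sqrt{\left(\frac{\sigma_K}{K}\right)^2
        + \left(\frac{\sigma_{\Delta H}}{\Delta H}\right)^2
        + \left(\frac{\sigma_{\Delta L}}{\Delta L}\right)^2}
\]
```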

References

Dingman, S. (2002). Physical Hydrology, 2nd Ed. Waveland Press, Inc., Long Grove, IL.

7.9 Example Conclusion

Conclusion

Using a wind tunnel, a manometer, and a pitot-static tube, pressures and velocities were estimated. For the static head measurements, the percent difference is highest for a wind velocity of 5 m/s (79.95%) and lowest for a wind velocity of 35 m/s (0.057%). A graph of the change in pressure versus velocity demonstrates a quadratic relationship. The percent difference for the dynamic head measurements follows the same general trend found in the static head measurements: the percent difference is greatest, 35.0%, at the smallest wind velocity, 7.30 m/s, and the smallest percent difference, 0.089%, occurs at a velocity of 50% of the maximum, or 38.1 m/s. The effect of yaw angle on velocities was also examined, and the range of percent differences is as follows: high value 24.5% at yaw angle -30, and low value 0.005% at yaw angle -5. The ranges of values for the percent differences associated with the "actual" and "theoretical" pressures (when the pitot-static tube was oriented at, above, and below the horizontal), respectively, are high values of 24.01% and 43.00%, each at yaw angle -30, and low values of 0.215% at yaw angle 5 and 0.010% at yaw angle -5. It was observed that the lowest percent differences occurred when the wind tunnel operated at or near 50% of maximum capacity. Experimental error could be attributed to the assumption that the static pressure is zero even when the pitot-static tube is oriented at an angle above or below the horizontal. Thus, it is recommended that in future experiments the contribution from the static pressure be included in the calculated velocities and pressures.

References

B.R. Munson, D.F. Young, T.H. Okiishi, and W.W. Huebsch (2009). Fundamentals of Fluid Mechanics, John Wiley & Sons, Hoboken, NJ.
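The percent differences quoted throughout this example are, presumably, computed in the usual way against the theoretical value (the example itself does not restate the formula):

```latex
% Percent difference between a measured and a theoretical value:
\[
  \%\,\text{difference}
  = \frac{\left| x_{\text{measured}} - x_{\text{theoretical}} \right|}
         {x_{\text{theoretical}}} \times 100\%
\]
```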

7.10 Mid-Semester Survey

7.11 Needs Assessment Survey Results

[Figure 2. Results of needs assessment survey question 1, "I have read the lab grading rubric": Yes 91%, No 9%.]

[Figure 3. Results of needs assessment survey question 2, "I use the lab grading rubric as a guide when I am working on lab write ups": Yes 70%, No 30%.]

[Figure 4. Results of needs assessment survey question 3, "Which criteria of the lab rubric is confusing?": bar chart of % response by rubric section; see Section 4.1 for the reported percentages.]

[Figure 5. Results of needs assessment survey question 4, "How often have you completed a formal laboratory report with the following elements?": 3 Frequently 58%, 2 Often 27%, 1 Rarely 12%, 0 Not at all 3%.]

[Figure 6. Response to needs assessment survey question five, part one: difficulty of data collection (lab procedure), rated on a scale of 1-10; histogram of % response.]

[Figure 7. Response to needs assessment survey question five, part two: difficulty of data analysis (computation/statistics, etc.), rated on a scale of 1-10; histogram of % response.]

[Figure 8. Response to needs assessment survey question five, part three: difficulty of technical writing (written lab report), rated on a scale of 1-10; histogram of % response.]

7.11 Mid-Semester Survey Results

 

Q1.  In  one  phrase  or  sentence,  please  state  why   technical  wriJng  skill  is  important  to  you   Increase  the  chance  of  promoOon   To  improve  wriOng  skills  in  general   To  explain  or  clear  up  any  confusion   To  use  for  the  rest  of  my  life   They  are  necessary  to  get  a  job   To  convey  and  apply  theoreOcal   To  get  a  good  grade   Data  presentaOon   Good  communicaOon   To  express  the  results  of  the  experiment   Engineers  need  to  write  technical  reports   For  my  profession  

0  

10  

20  

30  

40  

%  Response  

 

Figure  9  Response  to  mid-­‐semester  survey,  question  one.  

Q2.  Do  you  regularly  use  the  grading  rubric  when   preparing  your  lab  reports?  

6%   Yes   31%   63%  

No   No,  I  use  the   outline  format  

  Figure  10  Response  to  mid-­‐semester  survey,  question  two.  

Figure 11 Response to mid-semester survey, question three: "If you answered yes, how do you use the grading rubric?" [Horizontal bar chart; x-axis: % response, 0-20. Response categories: Glance over it; Follow what the rubric requires; To know what is important to be added to my report; To answer every question for full credit; As a guideline to know what topics and sections I need to cover to receive a good score; No response; As a guide, but I find the concise rubric in the individual lab more helpful; After completing the report, I go back to the rubric and check what needs improving; As a standard for my lab report; To cover all the requirements for each section; Read the excellent column and base the lab structure off of what each topic suggests; To understand what is expected in the lab report; Looking over the rubric with each section of the written report; To provide the format for the lab report.]

Figure 12 Response to mid-semester survey, question four: "Have you read the written feedback on your lab report?" [Pie chart; Yes/No responses with a 94%/6% split.]

Figure 13 Responses to mid-semester survey, question five: "Written feedback in this class has been helpful." [Pie chart; responses Agree and Strongly Agree, with a 39%/61% split.]

Figure 14 Responses to mid-semester survey, question six: "What I learned from the written feedback will influence how I write reports in the future." [Pie chart; responses Agree, Strongly Agree, and Disagree; shares of 50%, 47%, and 3%.]

Figure 15 Responses to mid-semester survey, question seven: "Please check the box for which form of group feedback you find the most useful." [Horizontal bar chart; x-axis: % responses, 0-100. Options: Be more specific about how the lab reports should be written; Example report; Discuss upcoming lab/report before it is due; TA discusses common mistakes in the last lab report in the pre-lab discussion; Discussion in pre-lab and written version of mistakes on CROPS.]

Figure 16 Responses to mid-semester survey, question eight: "What aspects of technical writing do you need to work on?" [Horizontal bar chart; x-axis: % responses, 0-70. Response categories: Conclusion, methods, and data analysis; Presenting results; Using and applying our vocabulary to real world events; Organization; Knowing the audience; Tone/style; Citing references.]

Figure 17 Responses to mid-semester survey, question nine: "To what extent have the quizzes helped you understand the key concepts necessary for completing the lab report?" [Pie chart; responses Very Helpful, Helpful, Somewhat Helpful, and Somewhat Unhelpful; shares of 3%, 9%, 19%, and 69%.]

Figure 18 Responses to mid-semester survey, question ten: "To what extent are the TA's explanations of the data analysis procedures helpful to your completion of this section of the report?" [Pie chart; responses Very Helpful, Helpful, Somewhat Helpful, Somewhat Unhelpful, and No response; shares of 3%, 4%, 21%, 24%, and 48%.]

7.12 Lab Report Re-grade Results  

Figure 19 Original scores and re-grade scores for the first lab. [Chart titled "Grade distribution, first regrade"; y-axis: lab score (10 points possible), 0-10; series: Original Score, Regrade Score; x-axis: participants' overall course grades (50, 57, 63, 64, 65, 67, 70, 70, 71, 71, 81, 83, 84).]

Figure 20 Course grades of re-grade 1 participants. [Bar chart; y-axis: number of participants, 0-4; x-axis: course letter grade (A+ through D-).]

 

Figure 21 Original scores and re-grade scores for the second lab. [Chart; y-axis: lab score, 0-10; series: Original Score, Regrade Score; x-axis: participants' overall course grades (50, 57, 62, 63, 63, 64, 65, 67, 70, 71, 76, 81, 83, 84).]

Figure 22 Course grades of re-grade 2 participants. [Bar chart; y-axis: number of participants; x-axis: course letter grade (A+ through D-).]

Figure 23 Original scores, re-grade scores, and the average of the two for the third lab. [Chart titled "Lab Score Regrade 3"; y-axis: lab score, 0-10; series: Original Score, Regrade Score, Average Score; x-axis: participants' overall course grades (57, 64, 67, 70, 71).]

Figure 24 Course grades of re-grade 3 participants. [Bar chart; y-axis: number of participants; x-axis: course letter grade (A+ through D-).]

Figure 25 Original scores, re-grade scores, and the average of the two for the fourth lab. [Chart titled "Lab score re-grade 4"; y-axis: lab score, 0-10; series: Original, Regrade, Average; x-axis: participants' overall course grades (62, 70, 71).]

Figure 26 Course grades of re-grade 4 participants. [Bar chart; y-axis: number of participants; x-axis: course letter grade (A+ through D-).]

Figure 27 All lab scores (original and re-grades) of students participating in only one re-grade opportunity. [Chart titled "Lab scores - cohort 1"; y-axis: lab score, 0-10; series: Lab 1, Regrade 1, Lab 2, Regrade 2, Lab 3, Lab 4, Lab 5, Lab 6; x-axis: overall course grade.]

Figure 28 All lab scores (original and re-grades) of students participating in two re-grade opportunities. [Chart titled "Lab scores - cohort two"; y-axis: lab score, 0-10; series: Lab 1, Regrade 1, Lab 2, Regrade 2, Lab 3, Regrade 3, Lab 4, Regrade 4, Lab 5; x-axis: overall course grade.]

Figure 29 All lab scores (original and re-grades) of students participating in three re-grade opportunities. [Chart titled "Lab scores - cohort three"; y-axis: lab score, 0-10; series: Lab 1, Regrade 1, Lab 2, Regrade 2, Lab 3, Regrade 3, Lab 4, Regrade 4, Lab 5; x-axis: overall course grades (57 and 64).]

Figure 30 All lab scores (original and re-grades) of students participating in four re-grade opportunities. [Chart titled "Lab scores - cohort four"; y-axis: lab score, 0-10; series: Lab 1, Regrade 1, Lab 2, Regrade 2, Lab 3, Regrade 3, Lab 4, Regrade 4, Lab 5; x-axis: overall course grades (70 and 71).]

7.13 Signature Assignment Results

Figure 31 Frequency of lab scores (all students, N=38). [Bar chart titled "Frequency of lab score for all students, N=38"; y-axis: % of students, 0-50%; series: Lab 1 through Lab 6; x-axis: lab score range (0, 0-5, 5-6, 6-7, 7-8, 8-9, 9-10).]

Figure 32 Change in lab scores, all students. [Bar chart; y-axis: % change, -6% to 10%; categories: % change between labs 1 and 2, labs 2 and 3, labs 3 and 4, labs 4 and 5, labs 5 and 6, and labs 1 and 6.]
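The report does not state exactly how the percent changes in Figure 32 were computed. A plausible reading, assuming each bar compares class-average lab scores and expresses the difference relative to the earlier lab, is the standard percent-change formula:

\[ \%\,\text{change}_{i \rightarrow j} = \frac{\bar{s}_j - \bar{s}_i}{\bar{s}_i} \times 100\% \]

where \( \bar{s}_i \) denotes the average score on lab \(i\) across all 38 students. With hypothetical averages of 8.0 on lab 1 and 8.4 on lab 2, for example, the first bar would read (8.4 - 8.0)/8.0 x 100% = +5%.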

 

Figure 33 Average original lab scores, all students (N=38). [Bar chart titled "Average lab score for all students (N=38)"; y-axis: average score, 7.4-8.6; x-axis: Lab_1 [10] through Lab_6 [10].]

   

 
