Friday, February 27, 2009

reducing plagiarism

In the mid-semester projects the students show amazing creativity, especially in deriving and building their physics models and explanations. However, we have noticed over the years that some groups cut and paste whole paragraphs (or more) from the internet for the introduction, i.e. the section that describes the device.

I have described the down-side of this type of plagiarism in lecture, mainly with the argument that in their future careers, writing a report or developing a project that builds on an existing idea is an efficient and good thing to do, but the source of the idea should be acknowledged, and doing so does not harm your case/project/pitch/report. Presenting an idea as your own when it is not, however, leaves a negative impression of you.

There are other arguments why students should not plagiarize, but I've found that this practical argument of "use the good ideas that you find, acknowledge them, then extend the ideas" resonates with the students.

This semester we are also trying a new tool that searches the web for plagiarized text. The tool is SafeAssign, a new part of WebCT. The students submit their draft project to SafeAssign, and then the students and TAs see a report that contains an estimate of the amount of plagiarized text, along with color-coded highlighting of the internet sources the text appears to have been cut and pasted from.

So far the tool has worked, with reports ranging from 0% to over 90% plagiarized text :( Since the students see the reports, the goal is for them to redo these sections before they submit their final project.
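
For readers curious about how such an estimate can be produced at all, tools of this kind broadly work by matching chunks of the submitted text against indexed source documents. The snippet below is only a minimal sketch of that general idea, using word 5-gram overlap against a made-up source text; it is not SafeAssign's actual algorithm, and the texts are invented for illustration.

```python
# Minimal sketch of overlap-based matching (NOT SafeAssign's real algorithm).
# Estimates the fraction of a draft's word 5-grams that also appear in a
# set of hypothetical source documents.

def ngrams(text, n=5):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_fraction(draft, sources, n=5):
    draft_grams = ngrams(draft, n)
    if not draft_grams:
        return 0.0
    source_grams = set()
    for src in sources:
        source_grams |= ngrams(src, n)
    return len(draft_grams & source_grams) / len(draft_grams)

# Invented example texts:
draft = "a microwave oven heats food by exciting water molecules with microwaves"
source = "a microwave oven heats food by exciting water molecules with microwaves at high frequency"
print(f"estimated overlap: {overlap_fraction(draft, [source]):.0%}")
```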

Tuesday, February 24, 2009

quantitative modeling/calibrated peer review

A key part of the course is for students to grow in their ability to solve complex problems. One vital skill is to take a complex system and develop an approximate model description of it that captures the main features but is simple enough to be tractable.

In the mid-semester projects students must come up with a reasonable model description of a device they regularly encounter and try to quantitatively calculate some aspect of it, e.g. how long it takes for a pizza to cook. In that case this means estimating the rate of heat transfer to the pizza, what it might depend on, all the way through to the temperature changes, etc. The educational goal is for students to develop skill in figuring out how to approximately model a complex device so that the science involved is correct enough to reasonably accurately describe how it performs. The key is often knowing what to leave in or out of the model.
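
As an illustration of the kind of back-of-the-envelope estimate I have in mind, one very simple model treats the pizza as a thin slab heated by conduction, so the characteristic cooking time scales like the square of the half-thickness divided by the thermal diffusivity. The numbers below are rough assumptions for illustration, not a worked solution.

```python
# Rough estimate of pizza heating time, treating the pizza as a thin slab
# heated by conduction from both faces. All numbers are illustrative assumptions.

thickness = 0.02   # slab thickness in m (~2 cm, assumed)
alpha = 1.4e-7     # thermal diffusivity of a dough-like material in m^2/s (assumed)

# Heat must diffuse roughly half the thickness from each face,
# so the characteristic time is t ~ (L/2)^2 / alpha.
t_seconds = (thickness / 2) ** 2 / alpha
print(f"characteristic heating time ~ {t_seconds / 60:.0f} minutes")
```

The point of such a sketch is deciding what to leave out (e.g. radiation, moisture loss), which is exactly the judgement the project is meant to exercise.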

It is also a good chance to encourage student writing skills.

The challenge has been how to get good feedback to the students on their work. The TAs are the main source of feedback, but this semester I am also trying peer feedback. Students submit their draft work to a web-site, http://cpr.molsci.ucla.edu/ ; the website then shuffles the papers, and students are asked to give feedback on 2-3 other projects. Hopefully the benefit goes both ways: by reading other projects students will develop a stronger understanding of modeling, and they will also get specific feedback on their own project.
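
For the curious, the shuffling step is conceptually simple. The snippet below is just a minimal sketch of one way to assign each student a few other projects to review without anyone reviewing their own; it is not the site's actual implementation.

```python
import random

# Minimal sketch of peer-review assignment: shuffle the students into a
# circle and have each one review the next few drafts in the circle,
# which guarantees nobody reviews their own work. Illustration only.

def assign_reviews(student_ids, reviews_per_student=3):
    order = list(student_ids)
    random.shuffle(order)
    n = len(order)
    return {
        order[i]: [order[(i + k) % n] for k in range(1, reviews_per_student + 1)]
        for i in range(n)
    }

print(assign_reviews(["ann", "ben", "cal", "dee", "eve"]))
```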

So far it has been relatively smooth, with the largest concern being that the only format you can submit is plain text, i.e. all equations, figures, tables, and graphs are lost. I will suggest to the designers of the site that they consider PDF uploads as well.

Friday, February 20, 2009

Exam results

The first exam came in with a class average of 55% :( There were many concerns raised on the course discussion board; one was the lack of time, so I will extend the time in the next exam by 30 min.

A more subtle issue is which questions the students did well on, and which ones they did not. Across pretty much all the content areas, questions that were direct applications of a single core equation or skill scored very high. I grouped these, and the class average on them is ~70%.

However, for questions that required combining two ideas, the percentage of correct answers dropped; the class average for this group of questions is ~40%.
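
The grouping itself was just bookkeeping: tag each question as single-idea or multi-idea and average the scores within each tag. A tiny sketch with made-up numbers:

```python
# Sketch of grouping exam questions by type and averaging scores.
# Question tags and scores here are made up for illustration.

question_type = {"Q1": "single", "Q2": "single", "Q3": "multi", "Q4": "multi"}
mean_score = {"Q1": 0.72, "Q2": 0.68, "Q3": 0.42, "Q4": 0.38}  # fraction correct

groups = {}
for q, kind in question_type.items():
    groups.setdefault(kind, []).append(mean_score[q])

for kind, scores in groups.items():
    print(f"{kind}-idea questions: average {sum(scores) / len(scores):.0%}")
```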

There are many calls from the students to provide a larger formula sheet, or more example problems in recitation, or... I understand this, but I do not think it will help address the core challenge; in fact, it may make it worse. Students taking the course are heading to a variety of quantitative careers: engineering, science, business. The number one goal is for students to develop skills that go beyond the straightforward application of ideas and to combine information in novel ways. I understand that this is a challenge for students, and that this is not their typical experience of a university course. But I want students to strive for this higher goal.

All the components of the course are designed to give students multiple opportunities to develop these skills: the complex problem sets, the multi-faceted problems in Tuesday recitations, the group projects, and the multi-content questions on the exams. I will continue to work with the students to help them as much as I can. I also hope that students will strive for these higher goals as well.

Tuesday, February 17, 2009

first exam!

The first exam is Wed. I think it is a fair exam, with a mix of conceptual and quantitative questions. Some questions are designed so that the majority of students should get them correct, to establish a reasonable floor in the score. Other questions are harder, and hopefully provide some discriminatory power.

But as always I am nervous to see how much the students have understood and how well they do. I hope well.

Thursday, February 5, 2009

hard problem sets

It has been a fascinating week. Each week students complete two problem sets: the first covers basic ideas and key skills, while the second is far more complex. These are hard problems and have been a cause of frustration for the students.

A mini-maelstrom of complaints occurred on the course discussion board. Some of the comments were rather negative, and I had to make sure I did not take them personally. However, after thinking about this for a few days I decided to make two changes: a) shift the due date from Mon 8am to Tue 8am so the students have an extra day to get help from the TAs, the help-room, or myself, and b) increase the number of attempts for each problem from 2 to 3.

I also extracted the time when students started the problem sets and compared it with their scores. Students who started the work within 24 hours of the due date scored 25%, while those who started 3-4 days before the due date scored ~75%, with a smooth trend between these two extremes. So I encouraged students to start early and use all the resources we make available.
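
For anyone who wants to run the same kind of check on their own gradebook export, a rough sketch of the binning is below. The file format and column names ("start_hours_before_due", "score_percent") are hypothetical; adjust them to whatever your course system exports.

```python
import csv
from collections import defaultdict

# Sketch of binning problem-set scores by how many days before the deadline
# each student started. Column names are hypothetical.

def average_score_by_start_day(path):
    bins = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            days_before = int(float(row["start_hours_before_due"]) / 24)
            bins[days_before].append(float(row["score_percent"]))
    for day in sorted(bins):
        scores = bins[day]
        print(f"started {day}-{day + 1} days early: "
              f"avg {sum(scores) / len(scores):.0f}% (n={len(scores)})")

# average_score_by_start_day("problem_set_log.csv")  # hypothetical file name
```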

At lecture I also discussed that the goal of these problem sets is to build confidence, so that when students see a tough problem they can't immediately do, they have the approach, skills, and confidence to say: OK, how do I start? What principles are at work in this case? How can I make progress towards a solution? These are critical skills for the future.