A student's point of view

by Paul Denny
The University of Auckland

Amongst the numerous posts on the PeerWise Community blog are accounts by instructors of their experiences, descriptions of features within PeerWise, ideas for helping students use PeerWise effectively, and even a few curiosities.

However, one thing missing from this blog has been the student voice - that is, until now!

The idea for this began after reading an excellent post on the Education in Chemistry blog written by Michael Seery (Chemistry lecturer at the Dublin Institute of Technology). In the post, Seery outlined three main reservations that have kept him from (yet) using PeerWise (although, as indicated in his post's title, it appears he is slowly warming to the idea - and even has an account!). The post itself is very thoughtfully written and worth a read if you haven't yet seen it. And, I should add, Seery maintains a fantastic blog of his own (the wonderfully titled "Is this going to be on the exam?") covering all kinds of teaching and education-related topics, which I thoroughly recommend.

A few days after Seery's post was published, a student named Matt Bird wrote a comment on the post describing his experiences using PeerWise in a first year Chemistry course at the University of Nottingham. (Incidentally, this course was taught by Kyle Galloway, who has previously spoken about PeerWise and who, I see, gave a talk yesterday entitled "PeerWise: Student Generated, Peer Reviewed Course Content" at the European Conference on Research in Chemistry Education 2014 - congratulations, Kyle!)

Matt's comment briefly touched on the reservations expressed by Seery - question quality, plagiarism and assessment - but also discussed motivation and question volume. I thought it would be interesting to hear more from Matt and to include a student perspective on the PeerWise Community blog, so I contacted him by email. He was good enough to agree to expand a little on the points originally outlined in his comment.

Included below are Matt's responses to the five questions I sent him - and, in the interests of trying to be balanced, the last question specifically asks Matt to comment on what he liked least about using PeerWise.

Tell us a little about the course in which you used PeerWise. How was your participation with PeerWise assessed in this course? Do you think this worked well?
We used PeerWise for the Foundation Chemistry module of our course. It was worth 5% of the module mark, and was primarily intended as a revision resource. To get 2% we were required to write 1 question, have an answer score of 50 or more, and comment/rate at least 3 questions. Exceeding these criteria would get 3%, and being above the median reputation score would get the full 5%. Despite only being worth a small amount of the module, I think this system worked well to encourage participation as it was easy marks, and good revision.

What did you think, generally, about the quality of the questions created by your classmates? How did you feel about the fact that some of the content, given that it was student-authored, may be incorrect?
In general the questions were good quality. Obviously some were better than others, but there were very few bad questions. There were cases where the answer given to the question was incorrect, or the wording of the question itself was unclear, but other students would identify this and suggest corrections in a comment. In almost all cases the question author would redo the question.

Were you concerned that some of your fellow students might copy their questions from a text book?
I wasn't concerned about questions being copied from textbooks. At the end of the day it is a revision resource, and textbook questions are a valid way of revising. The author still had to put the question into multiple choice format and think about potential trick answers they could include (we all enjoyed making the wrong answers mistakes people commonly made!), so they had to put some effort in. Obviously lecturers may have a different opinion on this!

How did you feel about the competitive aspects of PeerWise (points, badges, etc.)?
The competitive aspects were what kept me coming back. It was an achievement to earn the badges (especially the harder ones), and always nice to be in the top 5 on one or more of the leader-boards. If you knew your friends' scores then you could work out whether you were beating them on the leader-boards or not, which is kind of 'fun'. I fulfilled the minimum requirements fairly quickly, so most of my contributions were done to earn badges and work my way up the leader-boards (and to revise, of course!).

Do you feel that using PeerWise in this course helped you learn? What did you personally find most useful / the best part about using PeerWise? What did you personally find least useful / the worst part about using PeerWise?
I got 79% for the first year of the course, so something went right! PeerWise probably contributed somewhat to that, as it did help me with areas I was less strong on. It's hard to say what the most useful part of PeerWise was, but the sheer number of questions was certainly useful. I guess that's more to do with the users rather than the system though. As previously mentioned, the competitive aspect was fun. The worst part of PeerWise would be the rating system. No matter how good the question was, and how good the comments about it were, hardly anybody rated questions above 3/5, with most coming in at around 2. I guess nobody wanted to rate a question too highly and be beaten on that leader-board! It would also have been nice to create questions where multiple answers were correct, so that you needed to select 2 answers. Overall, I enjoyed using PeerWise and hope it is used again later on in my course.

Many sincere thanks to Matt Bird for taking the time to respond to these questions - particularly during his summer break - enjoy the rest of your holiday, Matt!

Although his feedback represents the opinion of just one student, it highlights several interesting points. For one thing, one of the most common instructor concerns regarding PeerWise (the lack of expert quality control) did not seem to trouble him. In fact, Matt appears fairly confident in the ability of his classmates to detect errors and suggest corrections.

When commenting on the aspects of PeerWise that did concern him, Matt mentioned that the student-assigned ratings did a poor job of differentiating between questions. Indeed, this does appear to be something of an issue in this course. The histogram below illustrates the average ratings of all questions available in the course.

Of the 363 questions in the repository, 73% were rated in a narrow band between 2.5 and 3.5, and 96% of all questions had average ratings between 2.0 and 4.0. While there are some techniques that students can use to find questions of interest to them (such as searching by topic or "following" good question authors), this seems worth investigating further.
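For anyone curious how these band percentages were arrived at, the calculation is straightforward. The short Python sketch below shows the idea using a small made-up list of per-question average ratings (illustrative only - it is not the actual data from Matt's course).

    # Sketch: what fraction of questions have average ratings within a given band?
    # The ratings below are invented for illustration, not the real course data.
    average_ratings = [2.1, 2.4, 2.6, 2.8, 3.0, 3.1, 3.3, 3.4, 3.7, 4.1]

    def band_percentage(ratings, low, high):
        """Percentage of questions whose average rating falls in [low, high]."""
        in_band = sum(1 for r in ratings if low <= r <= high)
        return 100.0 * in_band / len(ratings)

    print(f"2.5-3.5: {band_percentage(average_ratings, 2.5, 3.5):.0f}%")
    print(f"2.0-4.0: {band_percentage(average_ratings, 2.0, 4.0):.0f}%")

Running the same calculation over the 363 questions in Matt's course is what produced the 73% and 96% figures quoted above.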

Below are two example questions pulled out of the repository from Matt's course - only the question, student answers and the explanation are shown, but for space reasons none of the comments that support student discussion around the questions are included. I selected these questions more or less at random, given that I am completely unfamiliar with the subject area! It is, of course, difficult to pick just one or two questions that are representative of the entire repository - but these examples go a small way towards illustrating the kind of effort that students put into generating their questions.

And finally, one other thing Matt mentioned in his feedback was that he would have liked to see other question formats (in addition to single-answer multiple choice). Watch this space...
