Does PeerWise aid pupil progress in the secondary school?


by Matthew Downie

As part of my PGCE in UK secondary schools, I have had to produce some research into educational methods. Having first seen PeerWise mentioned in the Royal Society of Chemistry's Education in Chemistry (EIC) magazine, I felt it would be interesting to investigate the usefulness of PeerWise in secondary education. Below I have reproduced the research report and findings, which I hope will be of interest to many; I look forward to expanding on this research in the near future.

1 The research premise

Does self-generated content enhance pupil progress by giving pupils a greater understanding of the subject content? This question is the focus of this research, but before detailing how it is to be answered, it requires dissection to ascertain exactly what a suitable answer demands. This dissection will occur through the following questions:

1. What is self-generated content?
2. Why is self-generated content of any use in education?
3. What methods will be used to enable pupils to self-generate content?
4. How can progress be measured against the use (or lack thereof) of self-generated content?

The following sections will answer each of these questions in turn and end with a summary of the aims and objectives of this research.

1.1 What is self-generated content?

By far the most widely used model of cognitive development in education is Bloom's taxonomy1,2. In summary, learning objectives are developed so that learners progress from simple factual recall (low-order thinking/cognitive skills) to application and evaluation (high-order thinking/metacognitive skills). As metacognitive skills are more highly valued, efforts to push learners towards fulfilling these learning objectives are increased2-4.

Self-generated content has been used for some time in education. One simply has to consider assignments such as essays and lab reports to recognise that these must, by definition, be produced by the student as part of their studies5. Much research has been performed on self-regulated learning, of which self-generated content is an integral part, much of it centred on Bandura's theory6. This theory suggests that self-regulated learning is a consequence of personal, environmental and behavioural influences7.

The value of self-generated content varies from subject to subject: artistic work, for example, tends to place more value on self-generated content than mathematical work does5. This tendency feeds into a conception of so-called 'academic' subjects, in which textbooks and the like are more highly valued, students are expected to reproduce information, and self-generated content serves mainly as a means of assessing a particular student's learning5.

1.2 Why is self-generated content of any use?

So if self-generated content is considered to be mostly of use as a form of assessment, why bother with it as a mode of imparting knowledge and information? In short: previous research on student-generated content has shown a significant correlation between summative assessment scores and levels of participation in generating self-assessment content3,8-10.

Clearly, the ability to produce one's own resources, based on one's level of understanding, reinforces learning and allows greater levels of metacognition. If such content is to be used by others, the process of developing it demands even more metacognitive work, since the content must be accessible to others, not just its author. This, in turn, leads to greater engagement and achievement in the overall learning process5,10.

1.3 Methods for pupils to self-generate content

Consequently, there is a challenge for educators to enhance metacognitive skills through application. A range of methods is available, but this research will discuss two in detail. The first involves multiple choice questions (MCQs). While answering these questions can be relatively easy – tending to rely on recall of factual information – writing them requires a much broader skill set. A good understanding of the subject content is a prerequisite: a good MCQ has an optimal number of answer options (between three and five)11, and its incorrect answers must be plausible, ideally reflecting common misconceptions or mistakes3. Writing MCQs is therefore more time-consuming than answering them, and when learners produce their own multiple choice questions they are challenged to use higher cognitive levels than simply answering them would require.

Science is a highly conceptual subject, and some concepts can be explained more easily using analogies. In the context of a lesson, this involves explaining a new concept by describing it in a more familiar context12. Such comparisons allow new knowledge to be understood, or existing understanding to be modified13,14. Indeed, the National Curriculum requires pupils to learn about several different models – the heliocentric model of the Solar System and the Bohr model of the atom, to give two examples. It has therefore been argued that analogies are akin to models, and therefore inseparable from understanding scientific concepts15.

However, the issues surrounding models – recognising that they provide a simplified means of understanding 'real-life' systems – are not necessarily appreciated by pupils. One of the main issues identified is the creation of misconceptions14,16, which is ironically what a model attempts to avoid. It is therefore crucial that any analogy is presented in a manner that makes explicitly clear it does not necessarily describe the 'full picture'.

Another issue with analogies concerns those categorised as Model I analogies17. These require low levels of input from pupils and offer little opportunity to monitor pupil understanding; they usually arise when a teacher simply provides a model to the pupils. Within Oliva et al.'s17 teaching model constructs, matching analogies to descriptions of 'real-life' processes corresponds to Model IIA. The subsequent level, in which pupils construct their own analogies – for example, for the effects of different parameters on reaction rate – matches Model IIB. The final approach, requiring high input from pupils and close monitoring of progress, focuses on pupils sharing their analogies and generating discussion; this produces Model III analogies, which demand the hardest cognitive work from pupils but result in the greatest development of understanding of the concepts being taught.

1.4 How to measure progress?

By far the easiest method of determining pupil progress is through end-of-topic or end-of-year tests. Comparisons can be made between differing groups, such as those given the opportunity to self-generate content and those not, a method similar to that used by other research groups2,9,18-20. Consequently, if a group of pupils were split so that one half could use PeerWise, and thus generate their own learning repository, while the other half could not, comparisons could be made to determine how much progress each group made over a period of time. Ideally, such a comparison would run over 2-3 years, but in the context of this research it will be attempted over one topic (covering approximately 6-7 weeks).

2 Aims and objectives

This research will aim to address the following question:

Does the ability to self-generate content on PeerWise improve pupil progress?

This will be answered using the following criteria:

1. What level of participation is achieved when pupils are given the opportunity to generate their own content?
2. How effective were pupils at generating content over an entire topic?
3. What impact did self-generated content have on pupils’ attainment?
4. Did pupils believe that the option to produce their own learning resources was beneficial to them?

3 Research methodology

3.1 The method

This research was performed at a co-educational independent boarding school and focused on the progress of pupils in one topic within the school's Year 9 specification. The topic chosen was Oxygen, Oxides, Hydrogen and Water – a unit within the Edexcel IGCSE specification.

The year is split into eight sets arranged in four pairs – sets 1 and 2 being the 'top' sets and sets 7 and 8 containing the weakest pupils. Two sets – sets 4 and 5 – were selected to be given the task of using PeerWise to aid their studies. A specific course was set up on the PeerWise software for these pupils to use.

The first lesson of the topic was used to introduce PeerWise to the pupils, including a short demonstration of how to produce and answer MCQs. Upon completion of the demonstration, pupils were given the task of logging in and completing the following tasks as homework: (1) write three MCQs, (2) answer one MCQ and (3) comment on one MCQ. Upon completion of this task, pupils were informed that they would be free to use the software as much as they liked. Pupils were also informed that their activity on PeerWise would be monitored – unsuitable questions or comments would be removed and sanctions applied accordingly. Reference was also made to previous research showing that greater activity on PeerWise coincided with higher attainment, so that pupils' desire to be successful could motivate them to increase their activity. Finally, pupils were informed that this would be a 'use it or lose it' process: the more activity observed on PeerWise, the more likely it would be opened up to the rest of the year's cohort for revision and use throughout their IGCSE studies, whereas minimal activity, or failure to use PeerWise, would result in the courses being closed down and the opportunity to use it denied to their peers. This too was aimed at motivating pupils to use PeerWise.

The data collected would be both quantitative and qualitative. The quantitative data would focus on the number of questions and answers uploaded per day, as well as comparisons between the mean end-of-topic test results for each set in the cohort. These results would be used to determine the effectiveness of pupil-generated content in developing pupils' understanding of the subject content and, thus, their overall progress.
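
For readers wishing to reproduce this kind of analysis, a minimal sketch of the daily activity count is given below in Python. It assumes the PeerWise activity log has been exported to a CSV file with 'date' and 'type' columns; the column names, file name and layout are illustrative assumptions rather than PeerWise's actual export format.

    import csv
    from collections import Counter

    # Count PeerWise events per day, split by event type.
    # Assumed (hypothetical) CSV columns: 'date' (e.g. 2015-03-24) and
    # 'type' (one of 'question', 'answer', 'comment').
    def daily_activity(path):
        counts = {"question": Counter(), "answer": Counter(), "comment": Counter()}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                counts[row["type"]][row["date"]] += 1
        return counts

    if __name__ == "__main__":
        activity = daily_activity("peerwise_export.csv")  # hypothetical file name
        for day, n in sorted(activity["question"].items()):
            print(f"{day}: {n} question(s) uploaded")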

The qualitative data would take the form of a questionnaire given to the pupils in sets 4 and 5 after completing the test, asking for feedback on their experience with PeerWise and focusing on its ease of use and whether they would continue to use it throughout their studies. This would be used to determine whether pupils felt they had benefited from its use and what improvements they felt could be made to aid their use of PeerWise.

For both, the results obtained and the questionnaire answers would be anonymised. The end-of-topic test results would be averaged, with no names attributed to any particular score, while the questionnaire would be completed without any requirement to add the pupil's name; pupils were also told that only the author would see their answers, to encourage honesty about their experience of using PeerWise.

4 Results and discussion

4.1 Use of PeerWise

PeerWise was introduced to the pupils in the two sets during their first lesson on Oxygen, Oxides, Hydrogen and Water. They were given the task of producing three MCQs for their peers, as well as answering and commenting on a minimum of one other question. To aid the pupils, two questions had been uploaded in advance. Figure 1 below shows the number of questions uploaded each day during the topic.

Figure 1 Number of questions uploaded per day during the IGCSE topic. Note that the topic lasted longer than the two-week period shown in this plot; after 24th March, no further questions were uploaded. In total, 30 questions were uploaded, against a minimum expectation of 111.

A total of 30 questions were uploaded over the entire topic – considerably fewer than expected. The two sets combined had a total of 37 pupils, so 111 questions would have been expected if all pupils had completed the minimum requirement of three questions each. The reasons for this shortfall emerged from feedback gained after the research had been completed, once the opportunity to use PeerWise for the pupils' other topics had been offered: namely, that writing questions was 'hard' and pupils would prefer the teacher to write the questions. It was explained to them that this would defeat the point of PeerWise, which is based on pupils assisting one another by producing MCQs on areas in which they are confident and answering MCQs on areas in which they feel less confident.

In looking through the questions, one had been deleted shortly after uploading because the pupil recognised it as unacceptable. This supports reports in the literature that teacher/instructor input can be minimal because pupils/students self-regulate their activity on PeerWise. On reviewing the questions, the vast majority (25 out of 30) focused on air composition and acid rain – two areas covered in the very first few lessons of the topic. The remaining five were spread over tests for gases and identifying the elements present in specific compounds, e.g. water. This pattern was anticipated – activity decreased as more pupils completed their minimum requirements – although it had been hoped that pupils would see the benefits of the program and continue to use it throughout the topic, resulting in questions covering every aspect of it.

The number of answers and comments uploaded, however, showed a marked difference (Figure 2).

Figure 2 Number of answers uploaded per day during the IGCSE topic. A total of 214 answers were uploaded over the entire topic. The green-ringed column comprises 11 answers given by one pupil during the Easter holidays, when no homework is usually set.

The 30 questions were answered a total of 214 times over the course of the topic, with the majority answered shortly after the MCQs were uploaded. Of particular interest are the green-ringed results on 11th April: these answers were all given by one pupil during the Easter holidays, when no homework had been set. Thus, for at least one pupil, it had been recognised that the program is useful for revision and can be accessed at any time, even outside normal school time. The comments uploaded with the answers were usually along the lines of 'this is a useful question for revision'. Several different pupils gave this comment, so the value of writing and answering MCQs as an aid to revision was clearly recognised. Within individual questions, the number of answers and the feedback given were useful in assessing the learning of individual pupils: more answers to particular questions coincided with areas of lower understanding, and higher-rated questions were generally very well written. The consequences of these questions are discussed below.

4.2 Comparisons of end-of-topic results

Upon completion of the topic, the pupils were given a week to revise for their end-of-topic test. The mean mark for this test would be compared, on a set-by-set basis, with the remainder of the Year 9 cohort. It was hoped that the process of self-generating content in the form of MCQs would have a positive effect on the overall mark achieved by the pupils in the two sets given the opportunity to use PeerWise. The average marks for each set are shown below in Figure 3.

Figure 3 Comparative average scores for the end-of-topic test on Oxygen, Oxides, Hydrogen and Water. The results for set 6 had not been received at the time of submission. The green circle highlights the average score for the pupils in the sets who had access to PeerWise. All scores are ± 1 standard deviation.

From the results shown in Figure 3, the average results were not greatly affected by the use of PeerWise. As an argument for the use of PeerWise, this does not provide much supporting evidence; it does, however, demonstrate that the school's setting by ability is effective. Despite this, the sections of the test in which the PeerWise-using sets scored highest covered the composition of the air and acid rain – the two main areas on which pupils had produced their MCQs. As a tool to aid pupil understanding, then, self-generated content is beneficial. In areas where MCQs were not produced, or where the test did not address the areas the MCQs covered, pupils were relying on their 'normal' revision techniques.

This could pose an issue: why give pupils the task of generating their own content if it is not assessed in a test or exam? The answer derives from the holistic understanding of the subject in question. Science can be a very difficult subject – some topics can be abstract, with little or no obvious link to previous study, whereas others can quickly become tedious for pupils because they cover the same ground as other topics, albeit in a different context. An example of the latter comes from the topic in which this research was performed, where pupils learnt about testing for gases. In the subsequent topic (Salts and Chemical Tests), many of these tests were covered again; in particular, the chemical reaction between calcium carbonate and acid was covered three times in three different topics over the Year 9 scheme of work, and on each occasion was addressed in a different manner. By self-generating content, pupils must develop their understanding of the subject to a level at which they can present it clearly and concisely to their peers, enabling their peers to develop their understanding while demonstrating their own.
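
As an illustration of the comparison plotted in Figure 3, the short Python sketch below computes the mean mark ± 1 standard deviation for each set. The score lists are hypothetical placeholders: the real per-pupil marks were anonymised and are not reproduced here.

    from statistics import mean, stdev

    # Hypothetical placeholder marks; the real per-pupil scores were anonymised.
    scores_by_set = {
        "set 3": [70, 64, 75, 69, 61],
        "set 4 (PeerWise)": [62, 58, 71, 66, 54],
        "set 5 (PeerWise)": [60, 55, 68, 63, 57],
    }

    # Mean and sample standard deviation per set, as plotted in Figure 3.
    for name, scores in sorted(scores_by_set.items()):
        print(f"{name}: {mean(scores):.1f} ± {stdev(scores):.1f} (n={len(scores)})")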

4.3 Results from the questionnaire

Having analysed the results from the end-of-topic test, the opinions of the pupils were considered. How effective did they find PeerWise? Would they use it of their own accord? How difficult did they find it to use? All of these questions were addressed in some form using the questionnaire in Appendix A. Pupils were asked to grade several aspects of PeerWise, and their experience of using it, on a Likert scale. The results are summarised below in Figure 4.

Figure 4 Percentage scores for pupil responses to the questionnaire (see Appendix A).
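
The percentage summaries in Figure 4 can be derived from the raw Likert responses by simple counting. The Python sketch below uses made-up responses for a single questionnaire item rather than the actual data.

    from collections import Counter

    # Made-up Likert responses for one questionnaire item
    # (1 = strongly disagree ... 5 = strongly agree).
    responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 5]

    counts = Counter(responses)
    for level in range(1, 6):
        pct = 100 * counts.get(level, 0) / len(responses)
        print(f"level {level}: {pct:.0f}%")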

From the results of the questionnaire, it can be seen that there was a generally positive response to the use of PeerWise, with the majority of pupils describing the program as easy to use and useful for revision, and saying they would like PeerWise to be available throughout their studies. A sixth question (not included in Figure 4) asked how pupils would prefer the use of PeerWise to be regulated: set purely as homework, used as and when the pupils wanted, or counted towards their end-of-topic/year mark. The responses are summarised below in Figure 5.

Figure 5 Percentage of pupils preferring PeerWise to be (a) given purely as homework, (b) used as and when needed and (c) counted towards end-of-topic or end-of-year exam marks.

From Figure 5, it can clearly be seen that the majority of pupils would not want activity on PeerWise to be included in their end-of-topic or end-of-year exam marks. This idea had been introduced because, before the recent reforms of the UK education system, pupils would have had to complete some form of coursework that drew on several areas of the subject. Additionally, as PeerWise could be used to link topics together, having it contribute to pupils' end-of-year marks would, in a sense, push them towards developing this holistic approach to the subject and developing their understanding of each topic. In later years, 'gaps' in pupils' knowledge would be filled which, when compared against the course list on PeerWise, would enable them to make more links between topics and thus progress and develop their knowledge and understanding further.

Two pupils noted that they would prefer both options (a) and (b) – that PeerWise be set as homework and also used as and when needed – but the majority opted for one or the other. Among the additional comments, several pupils stated that they would prefer PeerWise to have several questions uploaded by their teacher, or even set as revision homework – a method which would enable the teacher to gauge more accurately how much revision individual pupils are doing. This last point was surprising and had not been considered until after the questionnaires had been collected and read.

4.4 Providing pupils further opportunities

Upon collecting the questionnaires and reviewing and analysing the data, the two sets were given the chance to vote on whether, for the next topic and for all other topics covered in Year 9, additional courses on PeerWise should be provided for them to use of their own accord. The response was heavily in favour, and subsequent courses were set up. Surprisingly, within six hours of pupils being given access for the next topic, four questions had already been uploaded, answered and commented on. Use of PeerWise is now being monitored on an occasional basis to ensure no unsuitable activity takes place. The questions uploaded have been well written and will, if this use continues, result in a strong repository of questions for pupils to use throughout their studies.

5 Conclusions and future work

5.1 Conclusions

Thirty-seven pupils were given the opportunity to enhance their learning through the use of an MCQ forum, PeerWise. Their activity was monitored, and comparisons were made between their end-of-topic test results and those of their peers in the rest of the cohort. Uptake of the task of writing MCQs was lower than expected and heavily weighted towards content covered in the first few lessons. Pupils were observed to prefer answering questions, describing the writing of MCQs as difficult and wanting their teacher to produce them instead.

The results from the end-of-topic test did not show a marked difference in overall pupil attainment; rather, the results were as expected for each set's ability. For the pupils using PeerWise, the majority of the marks they gained tied very closely to the content they had produced on PeerWise. This was repeated across all 37 pupils' tests and demonstrates that self-generated content can be used to reinforce learning in lessons.

The overall lack of questions arose mainly from the fact that activity on PeerWise was described as voluntary. As a result, a good number of pupils opted not to participate in writing questions, preferring to answer questions produced by their peers. Feedback from the questionnaire showed that pupils recognised the usefulness of PeerWise, especially for revision. One pupil in particular remarked that PeerWise could be used to assess whether or not pupils are actually revising for their tests, be they end-of-topic tests or end-of-year exams. A small minority of pupils said they would prefer PeerWise to contribute to their results, but overall the ability to use PeerWise on an ad hoc basis was more important.

Comparison between the results of this research and those of research at the tertiary level reveals a clear difference. At the tertiary level, students have chosen to study the subject further and so have a vested interest in high attainment, whereas at the secondary level, especially at Key Stage 4, most pupils have no choice over certain subjects, and their interest may not be as high. Consequently, they may not feel as strong a vested interest in high attainment.

This last statement obviously comes with a caveat: the pupils selected form a small group compared with the rest of their year group and, indeed, the whole school population. The generalisation may be unfounded, and further research is required to determine the true extent of pupils' vested interests.

5.2 Future work

Further research could therefore follow one of the routes detailed below, or a combination of them:

1. Expand the pupil numbers to include an entire year group – this would enable more informed discussion of the effects of PeerWise on pupil progress, as a result of having a larger sample group, and allow closer observation of the effects of pupils' personal vested interests in their education.

2. Apply the use of PeerWise across the entire content of the academic year – this would enable greater assessment for learning (AfL), since monitoring which topics generate more answers indicates where pupils have difficulties, and it would ensure pupils could use PeerWise continuously throughout their studies rather than having it introduced at a comparatively odd time of the year (as was the case in this research).

3. Perform comparative studies between KS4 and KS5 students – this would enable a detailed review of pupils' vested interests since, at KS4, pupils do not have the option of whether to study the sciences, whereas at KS5 they do. It would therefore be of interest to observe whether the decision to study the subject further influences students' motivation to self-generate content.

Undoubtedly, the short timescale of this research will have influenced the results obtained. Subsequent research would therefore need to extend over a period of at least three years to generate data comparable with that from tertiary-level research groups.

6 References

1. Krathwohl DR. A revision of Bloom's taxonomy: an overview. Theory Into Practice. 2002;41(4):212-218.
2. Bates SP, Galloway RK, Riise J, Homer D. Assessing the quality of a student-generated question repository. Physical Review Special Topics – Physics Education Research. 2014;10(2):020105.
3. Galloway KW, Burns S. Doing it for themselves: students creating a high-quality peer-learning environment. Chemistry Education Research and Practice. 2015;16(1):82-92.
4. Draper SW. Catalytic assessment: understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology. 2009;40(2):285-293.
5. Sener J. In search of student-generated content in online education. E-edukacja na swiecie. 2007;21(4).
6. Schraw G, Crippen KJ, Hartley K. Promoting self-regulation in science education: metacognition as part of a broader perspective on learning. Research in Science Education. 2006;36(1-2):111-139.
7. Bandura A. Self-Efficacy: The Exercise of Control. New York: Freeman; 1997.
8. Frase L, Schwartz B. Effect of question production and answering on prose recall. Journal of Educational Psychology. 1975;67(5):628.
9. Denner PR, Rickards JP. A developmental comparison of the effects of provided and generated questions on text recall. Contemporary Educational Psychology. 1987;12(2):135.
10. Sanchez-Elez M, Pardines I, Garcia P, et al. Enhancing students' learning process through self-generated tests. Journal of Science Education and Technology. 2014;23(1):15-25.
11. Vyas R, Supe A. Multiple choice questions: a literature review on the optimal number of options. National Medical Journal of India. 2008;21(3):130-133.
12. Gershon M. Analogies. In: How to Use Differentiation in the Classroom: The Complete Guide. 2013:185.
13. Mozzer NB, Justi R. Students' pre- and post-teaching analogical reasoning when they draw their analogies. International Journal of Science Education. 2012;34(3):429-458.
14. Haglund J. Collaborative and self-generated analogies in science education. Studies in Science Education. 2013;49(1):35-68.
15. Coll RK, France B, Taylor I. The role of models and analogies in science education: implications from research. International Journal of Science Education. 2005;27(2):183-198.
16. Gilbert J, Osborne R. The use of models in science and science teaching. European Journal of Science Education. 1980;2(1):1-11.
17. Oliva JM, Azcárate P, Navarrete A. Teaching models in the use of analogies as a resource in the science classroom. International Journal of Science Education. 2007;29(1):45.
18. Denny P, Luxton-Reilly A, Hamer J. Student use of the PeerWise system. New York: Association for Computing Machinery; 2008:77.
19. Hardy J, Bates SP, Casey MM, et al. Student-generated content: enhancing learning through sharing multiple-choice questions. International Journal of Science Education. 2014;36(13):2180-2194.
20. Bates SP, Galloway RK, McBride KL. Student-generated content: using PeerWise to enhance engagement and outcomes in introductory physics courses. 2011 Physics Education Research Conference. 2012;1413:123-126.
