American Journal of Applied Sciences 4 (10): 828-833, 2007
ISSN 1546-9239
© 2007 Science Publications
Adoption of Peer-To-Peer Assessment in a Computing Skills Course
Mansoor Al-A'ali
Department of Computer Science, College of Information Technology,
University of Bahrain, P.O. Box 32038, Kingdom of Bahrain
Abstract: The process of peer-to-peer assessment brings great benefits to the teaching and learning process. This paper presents a peer assessment model that was applied to students enrolled in a generic computing skills course. To measure the effectiveness of the proposed model, students evaluated their colleagues against predefined criteria, and a comparison is presented between the lecturer's assessments and the peer assessments. The peer assessment model was evaluated and the results presented demonstrate that it is a successful process to adopt in teaching generic skills.
Key words: Peer-to-Peer assessment; Group project; Student perception
INTRODUCTION

Peer assessment is the process of assessment of students by other students, covering both formative reviews that provide feedback and summative grading. Peer assessment is one form of innovative assessment, which aims to improve the quality of learning and to empower learners where traditional forms can by-pass learners' needs. It can include student involvement not only in the final judgements made of student work but also in the prior setting of criteria and the selection of evidence of achievement[2,16]. Peer assessment can be both formative and summative, and it can be a useful way of enabling students to think critically about their own work. For peer assessment it is essential to develop clear guidelines about giving feedback to others.

Peer-to-peer assessment is a major issue in many universities and is widely applied in some[7]. If organized, delivered and monitored with care, peer assessment can yield gains in the cognitive, social, affective, transferable-skills and systematic domains that are at least as good as those from staff assessment[3,17]. A range of studies has contested the value of assessment in general and of teacher-only assessment in particular. Teacher-only assessment limits the opportunity for students to develop individual evaluation skills and to understand the educational objectives and how these relate to their educational experiences. It also restricts students from taking responsibility for their own learning.

Students need to gain awareness of their own learning strategies in the light of the educational objectives. The assessment process needs to be a learning tool. Ideally, assessment is intended to help students plan their learning, identify their strengths and weaknesses and develop transferable skills.

The implementation of peer assessment, an alternative form of assessment for teachers, has received much attention in recent years due to its effectiveness for students' learning[1,9,18,6,8]. This assessment and learning strategy has been used extensively in diverse fields[5,6,8]. It helps students plan their own learning, identify their own strengths and weaknesses, target areas for remedial action, develop meta-cognitive and professional transferable skills, and enhance their reflective thinking and problem-solving abilities during the learning experience[11,12]. Peer assessment has also been found to improve students' interpersonal relationships in the classroom[10].

Peer assessment has been applied in many different fields, such as writing composition, business, science, electrical engineering, medicine, information science and social science. In reviewing past studies of peer assessment, Topping (1998) found it to be a reliable and valid method for assessment and teaching. Peer assessment schemes have been modelled after the authentic journal publication process of an academic society, in which the editors of the journal provide the authors with anonymous comments and suggestions for further modification, thus making the papers more mature[13,14,15].

A continuing challenge for educators using group work is to ensure that it is a positive learning experience for students. Group work is an important teaching strategy within the Generic Computing Skills and Professional Issues course curriculum as it can
facilitate both knowledge gaining and the development
of teamwork skills, which are essential for the
professional practitioner. Students often enjoy working
in a group and they value learning from and with other
people. However, problems can arise when group work
is assessed and the same mark is awarded to individual
students irrespective of their contribution to the group
work.
The problem of 'free-riders' within group work is well recognized, and educators need to consider the impact this has on students' attitudes to group work. Students view group work assessment as unfair if there is equal reward for unequal contributions. Such negative experiences can leave students dissatisfied with group work and resenting further group work when the assessment system is perceived as unfair and inequitable. The challenge for educators is therefore to develop new systems of assessment that are recognized and accepted by students as ensuring equity in group work assessment. Handing over control of the assessment process may be difficult for many teachers, but it is seen as important if 'academic and professional conformity' is to be avoided[4].
The Peer Assessment Model: The Generic Computing Skills and Professional Issues course consists of a number of topics: ethics, research methods, technical report writing, presentation skills, and CV writing and interviewing skills. In order to obtain realistic and beneficial results, the peer assessment model was applied only to the presentation skills part of the course. The model adopted for students enrolled in the course required them to grade their colleagues as they presented their topics. As the introduction of the group project was a new risk, students were monitored and evaluated on a continuous basis by the course instructor. The work to be assessed by the students was the presentation skills of the other students. Many aspects were taken into consideration, such as:

Presentation skills, which focused on voice projection, eye contact, confidence, time keeping, generating interest, level of improvement, and the clarity of the presentation material provided by the student during his/her presentation.

Cooperation techniques, such as listening to each other, helping the group address individual differences, and attending meetings and completing tasks on time.

Criteria that require a careful examination of the work in question are likely to produce valuable peer feedback, rather than reliance on judgment based on personal reactions. The assessment criteria for the peer assessment model were carefully reviewed and kept confidential to eliminate personal opinions as much as possible. Each student in the group evaluated every other member of the group by answering 13 questions developed by the course instructor. The results of these assessment questions form part of this paper: students evaluated the other members of the group and gave an overall average for presentation skills, which was later compared with the instructor's marking. At the completion of the project, all 26 students enrolled in the course were asked to complete a questionnaire consisting of 7 questions about the peer assessment process, and 10 questionnaires were returned.

RESULTS AND DISCUSSION

The Generic Computing Skills and Professional Issues course covered a number of topics: ethics, research skills, technical report writing skills, presentation skills and preparing tenders. Each topic received between one and two weeks of lectures and an assignment. The presentation skills topic covered issues such as preparing PowerPoint slides on the research topic assignment set in the research skills section of the course. Each student was given an opportunity to practise once before his peers and once before the lecturer, which meant that each student gave two practice presentations and sat through all other presentations. The lecturer gave feedback to every student about his/her seminar, and these comments and criticisms were given in front of all the other students. In this way every student benefited equally from the practice sessions, so that when the real judgements were made by the peers and the lecturer, all had to compete seriously and put in their full effort to score as high a grade as possible.

The concept of peer-to-peer assessment was explained to the students, and blind marking was used as a way of avoiding copying and personal issues. The thirteen criteria relating to presentation skills were made clear to all peers, and the grading scheme was given by the lecturer.

After evaluating their colleagues under the peer assessment model, the students produced the very high scores shown in Fig. 1. Further, the average peer assessment was given a real weighting in the final marks of the students, which made the process more serious for them: a weighting of 20% was allocated to peer-to-peer assessment. The criteria provided by the course instructor were taken into consideration during the evaluation process.
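As a concrete illustration of the marking mechanics described above, the following sketch aggregates one presenter's peer scores across the 13 criteria and blends the result with the lecturer's mark at the 20% weighting. The function names and the assumption that the remaining 80% of the presentation mark comes from the lecturer's mark are illustrative only and are not taken from the course's actual marking scheme.

    # Illustrative sketch (not the course's actual marking code): averaging one
    # presenter's peer scores and blending them with the lecturer's mark at 20%.

    def peer_average(scores_by_peer):
        """scores_by_peer: one 13-item list of marks (out of 10) per peer rater."""
        per_criterion = [sum(col) / len(col) for col in zip(*scores_by_peer)]
        overall = sum(per_criterion) / len(per_criterion)
        return per_criterion, overall

    def final_mark(peer_overall, lecturer_mark, peer_weight=0.20):
        """Blend the averaged peer mark with the lecturer's mark (both out of 10).
        The 80/20 split is an assumption consistent with the 20% weighting."""
        return peer_weight * peer_overall + (1 - peer_weight) * lecturer_mark

    # Example: three peers each rate one presenter on the 13 criteria.
    ratings = [
        [10, 10, 9, 9, 9, 9, 9, 10, 9, 9, 9, 10, 9],
        [10, 9, 10, 9, 9, 10, 9, 10, 9, 9, 9, 9, 10],
        [9, 10, 9, 10, 9, 9, 9, 9, 10, 9, 9, 10, 9],
    ]
    _, peer_overall = peer_average(ratings)
    print(round(final_mark(peer_overall, lecturer_mark=8.9), 2))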
In order to show the results in a summative way, we present the overall average of the presentation skills from the students' perspective, which was later compared with the lecturer's marking (Fig. 2) to test the reliability of peer assessment. The results are summarized below:

1. Clarity of slides (font, colour, graphics, etc.): Students gave a score of 9.94 out of 10 for slide clarity, reflecting the students' effort to show their improved presentation skills. Students organized the points of their topics so that the information was delivered through meaningful graphics, graphs, tables, etc.
2. Distribution of contents of slides: A score of 9.76 was given to the way the slide content was distributed. The students aimed to adopt a structured and systematic way of presenting their topics, focusing on giving a brief introduction, some facts, results and conclusions.
3. Flow of presentation: This criterion scored 9.44 out of 10. Presentations had a very logical flow, with the student talking about each topic and moving to the next in a smooth way.
4. Voice projection: The overall average for students' voice projection was 9.40. This skill improved through the multiple practice sessions and the group work, as students got used to the audience and tried to catch their attention by changing the pace and tone of their voice.
5. Eye contact and attention to the participants: The score for this criterion was 9.06. Students built self-confidence through presentation training, so it was evident that they were confident enough to face the audience, make eye contact and keep their attention.
6. Presenter confidence in topic presented: The score for the presenters' confidence in the topic presented was 9.29. Owing to their enhanced searching, scanning and skimming skills, and the feedback from the course instructor, students were very confident about their topics.
7. Level of interest expressed by presenter: Students scored 9.13 on this criterion; they tried to show their interest in the topics they presented and to give the audience their point of view on the topic.
8. Time keeping: Students kept their presentations within the specified time frame, and this scored 9.74 out of 10.
9. Beginning and finishing presentation: A score of 9.36 was given for this criterion. Students began and ended their presentations well.
10. Presenter generated interest in topic presented: Presenters generated very high interest among the other students, as indicated by the score of 9.00. Students gained new skills such as changing the tone of their voice, keeping eye contact with the audience and focusing on certain points to show specific details.
11. Presenter kept your attention during presentation: Students gave a score of 9.05 for presenters keeping the audience's attention.
12. Level of improvement in presentation skill since beginning of course: This criterion scored 9.52. Students gained new skills and therefore improved their presentation skills, and even developed new skills by observing themselves and the other members of the group.
13. Confidence by presenter throughout presentation: The score for overall presenter confidence throughout the presentation was 9.34. Students developed high confidence in standing in front of an audience and trying to persuade them about the topic presented.

Fig. 1: Presentation Skills Test (peer model marking). Average marks out of 10 for criteria 1-13: 9.94, 9.76, 9.44, 9.40, 9.06, 9.29, 9.13, 9.74, 9.36, 9.00, 9.05, 9.52, 9.34.

The lecturer's marking is shown in Fig. 2 and indicates that the lecturer also gave high marks for presentation skills. The results of the peer assessment model marking were compared with the lecturer's marking results, as shown in Fig. 3. Figure 3 indicates that most students marked their colleagues on a reasonable basis, closely matching the lecturer's marking. On this basis the lecturer can adopt the peer assessment technique, because it stands on a sound footing and students gain more experience by marking each other.

Fig. 2: Presentation Skills Test (lecturer marking). Average marks out of 10 for criteria 1-13: 8.88, 8.92, 8.62, 8.50, 8.77, 8.96, 9.00, 9.42, 8.62, 8.77, 9.15, 9.00, 9.08.
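The agreement between the two mark series is presented graphically in Fig. 3. As an illustration of how that agreement could be quantified, the following sketch computes the mean difference and the Pearson correlation between the per-criterion peer averages (Fig. 1) and the lecturer's marks (Fig. 2); the choice of these two statistics is illustrative and goes beyond the visual comparison made here.

    # Illustrative comparison of the Fig. 1 (peer) and Fig. 2 (lecturer) criterion
    # averages; the choice of statistics is illustrative.
    from statistics import mean

    peer     = [9.94, 9.76, 9.44, 9.40, 9.06, 9.29, 9.13, 9.74, 9.36, 9.00, 9.05, 9.52, 9.34]
    lecturer = [8.88, 8.92, 8.62, 8.50, 8.77, 8.96, 9.00, 9.42, 8.62, 8.77, 9.15, 9.00, 9.08]

    def pearson(xs, ys):
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    diff = [p - l for p, l in zip(peer, lecturer)]
    print("mean peer mark:", round(mean(peer), 2))          # 9.39
    print("mean lecturer mark:", round(mean(lecturer), 2))  # 8.90
    print("mean difference:", round(mean(diff), 2))         # about 0.49
    print("Pearson r:", round(pearson(peer, lecturer), 2))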
Students, in their evaluation of the peer assessment model, were asked to complete a questionnaire to obtain their opinion of the peer assessment method. The results, detailed in Fig. 4, indicate that 50%
of the students involved in the peer assessment model
agreed that the weight of peer assessment should be
30%. This may be due to the fact that peer assessment
enhances the participation of group members in
cooperative work and that students regarded the
exercise as effective for learning and group work. It is
also important that students understand the assessment
criteria that they are being asked to apply. This is very
interesting since it indicates that students have trust in
being assessed by their peers.
Fig. 3: Comparison of results (peer and lecturer marking), marks out of 10 for criteria 1-13. Peer assessment marking: 9.94, 9.76, 9.44, 9.40, 9.06, 9.29, 9.13, 9.74, 9.36, 9.00, 9.05, 9.52, 9.34. Lecturer marking: 8.88, 8.92, 8.62, 8.50, 8.77, 8.96, 9.00, 9.42, 8.62, 8.77, 9.15, 9.00, 9.08.
According to the results shown in Fig. 5, 90% of the students involved in the peer assessment model expressed a high level of interest in peer assessment, while 10% gave it the highest rating. There was strong agreement among the members of the groups, who found the peer assessment process an enjoyable and clear learning experience. This suggests that final-year university students have learnt to take responsibility and are ready to take decisions. Further, it indicates that students began to appreciate the role played by the lecturer in the assessment process.

Students in this study were asked whether they were worried about the possibility of pay-back by other students. A total of 70% of the students indicated that they were only partially worried about the possibility of pay-back, while 30% of the students were genuinely worried about it. Fig. 6 shows their responses. All students felt they had put time and effort into their assessments and were therefore not greatly worried about the possibility of pay-back.

One of the key issues to arise during the development of the peer assessment model was its value compared with traditional "lecturer-only" assessment. Students were asked if they would prefer the peer assessment model for marking their work rather than the lecturer. As shown in Fig. 7, almost 60% wanted the peer assessment model to mark their work, while 20% preferred the lecturer's marking. Although a range of studies has questioned the reliability of students' perceptions of peer assessment and suggested that it may lead to a loss of quality in the marking, the students found the process to be fair, valuable, enjoyable and helpful in developing transferable skills in research and communication, and in this way they enriched their experience.
Fig. 4: Peer assessment weight (Q1: How much, in your point of view, should the weight of peer assessment be?)

Fig. 5: Interest in peer assessment (Q2: How interesting is peer assessment?)

Fig. 6: Possibility of "pay-back" (Q3: Were you worried about the possibility of "pay-back" by other students?)

(In Figs. 4-10 the vertical axis is the percentage of students and the horizontal axis is the rating from 1 to 5.)
Fig. 7: Peer assessment preference over lecturer marking (Q4: Do you prefer peer assessment model marking over the lecturer marking?)

Fig. 10: Using peer assessment for other courses (Q7: Do you think that peer assessment is suitable for other courses?)

Fig. 11: Overall questionnaire analysis. Average rating out of 5 for questions 1-7: 2.9, 4.1, 3.4, 3.8, 4.0, 4.4, 3.7.
When students were asked about their agreement with such an assessment method, Fig. 8 shows that 60% of the students replied with high agreement. Students noted the fairness of this model and the opportunities for learning afforded by having to actually apply the assessment criteria to other students.

When asked if they liked this type of assessment, students were very positive. This can be seen in Fig. 9, where 80% agreed to a high extent and 20% agreed to a very high extent. This indicates that students had a good experience as a result of the process and that confidence and integrity can be developed and taught.

When students were asked whether they think this kind of assessment is suitable for other courses in computer science, 50% replied with high agreement. Students gained knowledge from this work, and it was a learning style that may really help in other courses. This is shown in Fig. 10 above. Peer assessment may not be ideal for some courses in computer science, but it can certainly be put to the test for course projects and assignments.

Fig. 11 shows the overall questionnaire averages, with each question given a mark out of 5. The results of this questionnaire are intended to feed back into a later stage of development of the peer assessment model in the Generic Computing Skills and Professional Issues course and to extend the model to assess the teamwork process.
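Taking the unweighted mean of the seven per-question averages in Fig. 11 reproduces the overall rating of roughly 3.75 out of 5 quoted in the conclusion. The short sketch below shows this arithmetic; the assumption that the overall rating is the simple mean of the question averages is illustrative.

    # Illustrative check (assumption): overall questionnaire rating taken as the
    # unweighted mean of the per-question averages reported in Fig. 11.
    q_averages = {1: 2.9, 2: 4.1, 3: 3.4, 4: 3.8, 5: 4.0, 6: 4.4, 7: 3.7}

    overall = sum(q_averages.values()) / len(q_averages)
    print(round(overall, 2))  # 3.76, close to the 3.75 quoted in the conclusion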
Fig. 8: Students' agreement about peer assessment (Q5: Do you agree about such an assessment method?)
Fig. 9: Student preference of this method (Q6: In your opinion, do you think that the group liked this kind of assessment?)

CONCLUSION

Peer assessment models provided a challenge, and one of the key challenges was to adopt a new assessment method for a new course in the Department of Computer Science at the University of Bahrain. The success of the peer assessment method was based upon the following findings. Students felt they learnt a great deal throughout the assessment process. Students felt that peer assessment should be given a reasonable weight (2.9/5.0). Students enjoyed assessing the work of their peers (4.2/5.0). An average number of students were worried about the possibility of pay-back ratings by their peers (3.4/5.0). A significant proportion (3.8/5.0) of students preferred this type of assessment to "lecturer-only" assessment. A number of cautions must be made. First, the sample of Generic Computing Skills and Professional Issues students was quite small and focused. Second, the results record only the students' perceptions at the time of the study. Overall, on the basis of the students' feedback, the peer assessment method scored a rating of 3.75 out of 5, and this may help other courses in the University to benefit from the peer assessment method and improve assessment and teaching within the faculty.
REFERENCES

1. Al-Bahadly, I., 2006. Integrating assessment and learning into engineering education. WSEAS Transactions on Advances in Engineering Education, 3 (3): 209-216.
2. Biggs, J., 1999. Teaching for Quality Learning at University. Buckingham: SRHE and Open University Press.
3. Bulman, T., 1998. Peer assessment between students in colleges and universities. Review of Educational Research, 168 (3): 249-277.
4. Elliott, N. and Higgins, A., 2005. Self and peer assessment - does it make a difference to student group work? Nurse Education in Practice, 5 (1): 40-48.
5. Falchikov, N. and Goldfinch, J., 2000. Student peer assessment in higher education: a meta-analysis comparing peer and teacher marks. Review of Educational Research, 70: 287-322.
6. Freeman, M. and McKenzie, J., 2002. SPARK, a confidential web-based template for self and peer assessment of student teamwork: benefits of evaluating across different subjects. British Journal of Educational Technology, 33: 551-569.
7. http://www.keele.ac.uk/depts/cs/Stephen_Bostock/docs/bostock_peer_assessment.htm
8. Lin, K.-C., Yang, S.-H., Hung, J.C. and Wang, D.-M., 2006. Web-based appreciation and peer-assessment for visual-art education. International Journal of Distance Education Technologies, 4 (4): 5-14.
9. Rada, R. and Hu, K., 2002. Patterns in student-student commenting. IEEE Transactions on Education, 45: 262-267.
10. Sluijsmans, D., Brand-Gruwel, S. and van Merriënboer, J.J.G., 2002. Peer assessment training in teacher education: effects on performance and perceptions. Assessment and Evaluation in Higher Education, 27: 443-454.
11. Smith, H., Cooper, A. and Lancaster, L., 1998. Improving the quality of undergraduate peer assessment: a case study from psychology. Innovations in Education and Teaching International, 39: 71-81.
12. Topping, K.J., 1998. Peer assessment between students in colleges and universities. Review of Educational Research, 68: 249-276.
13. Tsai, C.-C., 2001. The interpretation construction design model for teaching science and its applications to Internet-based instruction in Taiwan. International Journal of Educational Development, 21: 401-415.
14. Tsai, C.-C., Lin, S.S.J. and Yuan, S.M., 2002. Developing science activities through a networked peer assessment system. Computers and Education, 38: 241-252.
15. Tsai, C.-C., Liu, E.Z.F., Lin, S.S.J. and Yuan, S.M., 2001. A network peer assessment system based on a Vee heuristic. Innovations in Education and Training International, 38: 220-230.
16. van Hattum-Janssen and Lourenço, J.M., 2006. Explicitness of criteria in peer assessment processes for first-year engineering students. European Journal of Engineering Education, 31 (6): 683-691.
17. Wenzel, T.J., 2007. Evaluation tools to guide students' peer-assessment and self-assessment in group activities for the lab and classroom. Journal of Chemical Education, 84 (1): 182-186.
18. Woolhouse, M., 1999. Peer assessment: the participants' perception of two activities on a further education teacher education course. Journal of Further and Higher Education, 23: 211-219.