
Plagiarism Detection and its Effect on the Learning Outcomes
1st Jonnathan Berrezueta-Guzman, Technical University of Munich, [email protected]
2nd Markus Paulsen, Technical University of Munich, [email protected]
3rd Stephan Krusche, Technical University of Munich, [email protected]

Abstract—The consistent effort and autonomous problem-solving required in programming instruction present unique educational challenges, with plagiarism being a significant factor negatively correlated with independent programming performance. The prevalence of digital communication and information exchange further complicates this issue.
This research paper aims to investigate and discern patterns of plagiarism among first-year computer science undergraduates. To investigate the research questions, we conducted three controlled experiments. The first experiment demonstrated that students were more likely to commit plagiarism when provided with the opportunity to do so. The second experiment corroborated this, showing a tendency for plagiarism among students who had not previously been exposed to any plagiarism mitigation strategies. The third experiment revealed that early awareness of plagiarism mitigation strategies leads to enhanced student performance, indicating that awareness of these measures acts as a stimulant for autonomous learning.
Index Terms—programming education, interactive learning, online training and education, software engineering education for novices, vision for education in the future

I. INTRODUCTION

Plagiarism poses a substantial challenge in introductory programming courses, presenting both immediate and long-term consequences for students [1]. Extensive research has established a strong correlation between plagiarism and academic failure, with students who engage in plagiarism consistently achieving the poorest performance in individual examinations [2]. Moreover, the implications of plagiarism extend beyond the academic realm, impacting students' professional lives and imposing limitations on their subsequent performance [3], [4].
The task of plagiarism detection in assignments within large-scale courses presents a formidable challenge. This complexity arises from the fact that the required effort for plagiarism detection is directly proportional to the number of students, while the number of reviewers available for the task usually remains constant [5]. Without the aid of specialized tools, determining instances of plagiarism, as well as identifying the originators/creators (providers) and recipients (takers) of plagiarized content, becomes a non-trivial problem within the domain of computer science education [6].
This paper makes a significant contribution by conducting three rigorous experiments to validate key hypotheses. Firstly, the findings confirm that the presence of an opportunity to plagiarize significantly increases the likelihood of students engaging in plagiarism. Secondly, the results demonstrate that students who lack prior exposure to plagiarism control measures are more prone to committing plagiarism. Lastly, the study reveals that early communication of plagiarism control measures at the beginning of the course serves as a motivational factor for students, leading to improved academic performance.
Notably, the outcomes of the third experiment hold substantial implications. The data unequivocally indicate that students who are aware of and adhere to plagiarism control measures achieve their intended learning outcomes more effectively (achieving higher grades). This accomplishment in turn helps them to avoid complications in subsequent subjects throughout their academic journey, safeguarding the quality of their education. Moreover, these positive outcomes extend beyond the realm of academia, benefiting students in their future professional opportunities.
This paper systematically presents its content in the following way: Section II is dedicated to reviewing and summarizing relevant existing related work, along with their respective conclusions and limitations. Section III explains the methodology adopted in this research. It outlines the organizational aspects of the conducted courses and describes the specific tool employed for plagiarism control. Section IV presents a meticulous overview of the experimental design, data collection procedures, and the subsequent results derived from the experiments.
In Section V, the paper offers a discussion to critically analyze and interpret the obtained results. This section serves to emphasize the significance of the findings, considering their implications and potential ramifications. Finally, Section VI concludes the research by summarizing the key insights and conclusions drawn from the study. It also presents potential avenues for future research, highlighting areas that warrant further investigation and suggesting potential directions for improving plagiarism control.

II. RELATED WORK

According to Jenkins and his colleagues, plagiarism happens mostly when the student encounters the opportunity [7]. Additionally, some researchers experimented and concluded that plagiarism is related to high levels of failure in academic courses, especially those involving programming exercises [8], [9]. Therefore, including a plagiarism control system for the source code of programming assignments has motivated
students to work autonomously and to improve their academic performance [4], [10].
Kaya and Özel extended the Moodle system with the GNU Compiler Collection (GCC)¹ and the Measure of Software Similarity (MOSS)² source code plagiarism detection tool to decrease the effort for the assessment of programming assignments and to prevent plagiarism in the Data Structures course [13]. The experiment was conducted with two courses of the same subject, taught by the same professor but in different years.
¹ It is a free and open-source compiler system developed by the GNU Project. It is used to compile various programming languages such as C, C++, Objective-C, Fortran, Ada, and others [11].
² It is an automatic system for determining the similarity of programs. To date, the main application of MOSS has been focused on detecting plagiarism in programming classes [12].
The results indicated that the first group (which was not aware of the plagiarism check) was the one with lower performance (29.8 % of students failed) in comparison to the second group (which was aware of the plagiarism check), where 21.6 % of students failed (an 8.2 percentage point lower failure rate). As expected, the first group was the one with a higher number of plagiarism cases, taking into account a threshold of 50 % similarity. However, the experiment had only 3 programming assignments and also included 1 midterm and 1 final exam.
Pawelczak used a tool that tokenizes the source code and averages certain of its characteristics [14]. He analyzed 5 years of data from an introductory programming course in the C language with 7 programming exercise assignments. The typical threshold used in this study was 80 % due to the average length of the exercises.
The results established that since the implementation of the tool, the performance of the students in examinations is about 8.6 % better. They also found that about 84 % of the students who failed had plagiarized in the programming assignments. Additionally, the tool detected that about 92 % of the students who abandoned the course had been involved in committing plagiarism.
Pawelczak also conducted an experiment in which he compared two introductory programming courses with respect to their assignments and learning outcomes [15]. For the first course, he checked the students' submissions (7 assignments in total) for plagiarism directly after the submission deadline, while for the second course the plagiarism check was conducted after the semester had ended.
The results indicate that the average percentage of students plagiarizing in the first group is 11.1 %, while in the second group it is 35.1 % (24 percentage points higher than in the first group). Additionally, the second group was the one with lower performance in the final examination. However, the students in this group presented better skills in fundamental coding. Pawelczak believes this is due to the students being forced to edit their implementations (adding loops, outsourcing code into functions, etc.) in order not to be caught by the plagiarism detection tool.
On the other hand, Halak and El-Hajjar presented two techniques to prevent plagiarism [16]. The first is assigning a unique exercise to each student, while the second is conducting individual presentations. The first technique led to a considerable reduction in detected plagiarism across different classes, with inter-report similarity dropping by 13 %. The second technique suggested that its application motivated the students to avoid plagiarism. However, Halak and El-Hajjar did not mention the effort required from instructors to apply these techniques.
Finally, Moss and his colleagues conducted a systematic review of 83 empirical papers to clarify the psychological causes of plagiarism [17]. The findings of this study established that the possible causes can be the emphasis on competition and success rather than development and cooperation, coupled with impaired resilience, limited confidence, impulsive tendencies, and biased cognitions.
These studies and their results collectively contribute to the understanding of plagiarism in programming courses and provide insights into effective approaches for plagiarism detection, prevention, and fostering academic integrity among students.

III. METHODOLOGY

This research study aims to address three primary research questions:
• RQ1: Do students exhibit a higher propensity to engage in plagiarism when presented with an opportunity?
• RQ2: Does prior experience with plagiarism checks enhance students' motivation to work autonomously, compared to those without such experience?
• RQ3: Does the announcement of plagiarism checks at the beginning of the course yield beneficial outcomes for students?
To address these questions, we devised and executed three experiments spanning three semesters within the Information Engineering bachelor's degree program: winter semester 2021-2022 (WS21/22), summer semester 2022 (SS22), and winter semester 2022-2023 (WS22/23).
During the winter semesters, we conducted an introductory programming course (InProg) targeted at first-semester students, facilitating their progression from beginner to intermediate-advanced programming skills. In the summer semester, we offered an Introduction to Software Engineering (ISE) course, intended for second-semester students to develop advanced to professional-level software engineering and programming skills.
While it is not mandatory to have completed InProg to enroll in ISE, students are expected to leverage their acquired knowledge from InProg to successfully tackle the proposed exercises and the final exam.
These courses employ interactive learning, which aims to minimize the temporal gap between theoretical learning and practical application by students [18]–[21].
A. Course Organization

InProg. The introductory programming course takes place over 12 weeks, in which students not only attend the lecture but also work on exercises that prepare them for the final exam.
Quiz exercises. At the beginning of each lecture, students are presented with a set of 3-5 questions associated with the topics covered in the previous lecture. They are given 5 minutes to solve these quiz exercises. When the time is up, the instructor proceeds to explain the solution and then addresses any doubts or objections raised by the students afterwards.
Tutor exercises. These exercises consist of two assignments per week and primarily focus on programming tasks of easy to medium difficulty. Students collaboratively work on these exercises during tutoring sessions, fostering the development of soft skills such as effective communication and teamwork. The collaborative nature of these exercises prepares students for independent problem-solving in subsequent homework exercises.
Homework exercises. These assignments comprise two weekly programming exercises of medium to hard difficulty. The submissions of these exercises contribute to the practical part (Prog) of the InProg course's final grade and aid in preparing students for the final exam, which determines the theoretical part (In) grade of the course.
Presentations. Homework exercises and tutorial exercises are presented by students each week during the tutorial sessions. This is a suitable environment to discuss the solutions and resolve any remaining questions, and it serves as an additional check of whether students can understand and present their own solutions.
ISE. The course Introduction to Software Engineering follows a similar methodology to InProg, but with the inclusion of other exercise types, such as text exercises and modeling exercises, in addition to programming tasks. Students take it in their second semester, as it builds on InProg.
B. Plagiarism Detection Process and Tool

Artemis is an open-source learning and research platform³. It is designed to facilitate the creation and automated assessment of programming, modeling, text, and file uploading exercises [22]. Students participate in programming exercises by committing and pushing code in a version control repository. They receive feedback from automatically executed tests and static code analysis in a continuous integration environment. Artemis incorporates a semi-automatic system for detecting similarity among submissions of the same exercise [23].
³ https://github.com/ls1intum/Artemis
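To make this feedback loop concrete, the following is a minimal sketch of the kind of behavior test an instructor could attach to a programming exercise so that a continuous integration run can grade every push automatically. It is illustrative only: the class and method names (SortExercise, sortAscending) are our own assumptions and not part of the Artemis API.

```java
import static org.junit.jupiter.api.Assertions.assertArrayEquals;

import java.util.Arrays;
import org.junit.jupiter.api.Test;

// Stand-in for the student's submission; in a real exercise this class lives in the
// student's repository and is rebuilt by the CI pipeline on every push.
class SortExercise {
    static int[] sortAscending(int[] values) {
        int[] copy = Arrays.copyOf(values, values.length);
        Arrays.sort(copy);
        return copy;
    }
}

// Instructor-provided behavior test; the platform executes it automatically and
// reports pass/fail feedback to the student.
class SortExerciseTest {
    @Test
    void sortsNumbersAscending() {
        assertArrayEquals(new int[] {1, 3, 5, 8},
                SortExercise.sortAscending(new int[] {5, 3, 8, 1}));
    }
}
```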
In order to achieve this, Artemis integrates JPlag, a plagiarism detection service equipped with a Greedy-String-Tiling algorithm [24]. JPlag is capable of identifying pairs of similar programs within a given set of programs and supports multiple programming languages, including C, C++, Java, and more. It compares the abstract syntax tree of two programs. This way, the comparison still matches even if students copy the code and make simple changes to variable names or the order of statements.
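The following is a deliberately simplified sketch, under our own assumptions, of the token-based greedy string tiling idea behind JPlag [24]: both submissions are reduced to token streams (so identifier names and formatting disappear), maximal common tiles are marked greedily, and the similarity is the share of tokens covered by tiles. It omits JPlag's language front ends and runtime optimizations and is not the tool's actual code.

```java
// Simplified greedy string tiling over token sequences (illustrative, not JPlag's implementation).
public class GreedyStringTiling {

    // Percentage of the two token sequences covered by common tiles of at least minMatchLength tokens.
    static double similarity(String[] a, String[] b, int minMatchLength) {
        boolean[] markedA = new boolean[a.length];
        boolean[] markedB = new boolean[b.length];
        int covered = 0;

        while (true) {
            int bestLen = 0, bestA = -1, bestB = -1;
            // Find the longest common run of still-unmarked tokens (the next "tile").
            for (int i = 0; i < a.length; i++) {
                for (int j = 0; j < b.length; j++) {
                    int k = 0;
                    while (i + k < a.length && j + k < b.length
                            && !markedA[i + k] && !markedB[j + k]
                            && a[i + k].equals(b[j + k])) {
                        k++;
                    }
                    if (k > bestLen) {
                        bestLen = k;
                        bestA = i;
                        bestB = j;
                    }
                }
            }
            if (bestLen < minMatchLength) {
                break; // no sufficiently long tile is left
            }
            for (int k = 0; k < bestLen; k++) { // mark the tile so it cannot be reused
                markedA[bestA + k] = true;
                markedB[bestB + k] = true;
            }
            covered += bestLen;
        }
        return 200.0 * covered / (a.length + b.length); // 2 * matched / total, in percent
    }

    public static void main(String[] args) {
        // Renaming variables does not change the token stream: both of these
        // hypothetical submissions tokenize to the same sequence.
        String[] submission1 = {"FOR", "LPAREN", "VAR", "ASSIGN", "NUM", "SEMI", "VAR", "LT", "NUM", "RPAREN", "BLOCK", "APPLY", "RETURN"};
        String[] submission2 = {"FOR", "LPAREN", "VAR", "ASSIGN", "NUM", "SEMI", "VAR", "LT", "NUM", "RPAREN", "BLOCK", "APPLY", "RETURN"};
        System.out.printf("Similarity: %.1f %%%n", similarity(submission1, submission2, 3));
    }
}
```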
While other tools may employ different detection methodologies, such as Neural Networks [25], JPlag has proven to be highly efficient in terms of analysis time and accuracy when identifying instances of plagiarism among submissions.
When students upload their solutions through Artemis, instructors have the option to manually initiate a similarity analysis run by setting a threshold based on the exercise's length. For instance, the threshold can be determined by the average number of lines of code required to complete a programming-based exercise, the length of words in a text-based exercise, or the components used in a model-based exercise.
Figure 1 illustrates the sequential steps involved in the automatic checking, manual analysis, student notification, student statement, and verdict of a plagiarism case, carried out throughout the utilization of the Artemis platform. These steps contain the following actions:

[Figure 1: 1. Automatic check, 2. Manual analysis, 3. Student notification, 4. Student statement, 5. Final verdict]
Fig. 1. Procedure of the plagiarism check: The instructors evaluate the results of the automatic check for plausibility and notify the identified students, who can submit a statement on their case. Depending on the severity of the plagiarism and the student's statement, the instructors decide on the final verdict.

1. Automatic check: Once the exercise instructors activate the automatic plagiarism checker in the plagiarism detection screen in Artemis by clicking on the "Detect Plagiarism" button, all student submissions are compared against each other using JPlag's greedy string tiling algorithm in order to obtain a measure of the similarity of a submission pair (in percent) and a plagiarism report describing the similarities found. Artemis now displays to the exercise instructors every submission pair above the aforementioned threshold and an overall similarity distribution.
Overall, this step preprocesses and discards a significant number of submission pairs that are not suspected to be involved in plagiarism (indicated by a similarity measure below the threshold). Therefore, the exercise instructors can focus exclusively on the cases which are suspected to be plagiarism cases.
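A rough sketch of what such an automatic run amounts to, reusing the similarity function from the sketch above (the names and the length-based threshold heuristic are our own assumptions, not Artemis internals): every pair of submissions is scored, and only pairs at or above the exercise-specific threshold are kept for manual review.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Illustrative pairwise pre-filter for one exercise; not the platform's actual implementation.
public class PlagiarismPreFilter {

    record SuspiciousPair(String studentA, String studentB, double similarityPercent) {}

    // Hypothetical heuristic: short exercises get a stricter threshold because
    // short solutions tend to look alike by chance.
    static double thresholdFor(int averageLinesOfCode) {
        return averageLinesOfCode < 30 ? 80.0 : 50.0;
    }

    static List<SuspiciousPair> preFilter(Map<String, String[]> tokenizedSubmissions, int averageLinesOfCode) {
        double threshold = thresholdFor(averageLinesOfCode);
        List<String> students = new ArrayList<>(tokenizedSubmissions.keySet());
        List<SuspiciousPair> suspects = new ArrayList<>();
        for (int i = 0; i < students.size(); i++) {
            for (int j = i + 1; j < students.size(); j++) {
                double similarity = GreedyStringTiling.similarity(
                        tokenizedSubmissions.get(students.get(i)),
                        tokenizedSubmissions.get(students.get(j)), 3);
                if (similarity >= threshold) { // only these pairs reach the instructors
                    suspects.add(new SuspiciousPair(students.get(i), students.get(j), similarity));
                }
            }
        }
        return suspects;
    }
}
```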

2. Manual analysis: The exercise instructors inspect the plagiarism report, which displays the two similar code files side by side. The part of the code that was not included in the template and overlaps is highlighted in blue. Subsequently, the exercise instructors decide whether the similarities found are indeed indicative of plagiarism.
Therefore, they try to spot typical plagiarism patterns and other recognizable indications like removing/adding white spaces, renaming variables, identical variable naming strategies, changing between similar types of control statements, identical comments, identical text patterns, identical mistakes, and identical elements within a UML diagram. In case there is sufficient reasonable evidence of plagiarism, the exercise instructors will mark the submission pair as a "plagiarism case". Otherwise, they will mark the submission pair as a "no plagiarism case".
This step is necessary as automatic checking is not yet able to accurately detect all cases and would produce too many false positives. Therefore, the exercise instructors manually review all cases with high similarities and accept or deny each case.
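To illustrate the kind of pattern reviewers look for, here is a small fabricated pair of submissions: they differ only in identifier names and formatting, exactly the sort of superficial change that survives tokenization and stands out during manual inspection.

```java
// Two fabricated submissions that differ only in identifier names and formatting.
// A token-based comparison reports them as nearly identical, and a reviewer would
// recognize the systematic renaming at a glance.
class SubmissionA {
    static int sum(int[] values) {
        int total = 0;
        for (int value : values) {
            total += value;
        }
        return total;
    }
}

class SubmissionB {
    static int addAll(int[] numbers) {
        int acc = 0;
        for (int n : numbers) { acc += n; }
        return acc;
    }
}
```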

3. Student notification: If the exercise instructors (after a second screening) decide that a particular pair of submissions is a case of plagiarism, they will issue a notification email in Artemis to both involved students. The email will inform them of the suspected plagiarism and the consequences they can expect (e.g., failing the course and only getting one more chance), and the opportunity to submit comments within the following seven days. In either case, both students are suspected of plagiarism, as there is usually no way to clearly identify the giver and the taker of a deception before contacting them. This step is required to inform the student of the suspected plagiarism, which may or may not subsequently result in a plagiarism conviction.
4. Student statement: The student responds to the notification by using the link embedded in the email. Using this link, the student can view the comparison between their own work and the anonymized work of the other involved student. The student explains in detail why the work is supposed to be original and why the suspicion is therefore unfounded.
The student must demonstrate that the assumptions made by the exercise instructors regarding the plagiarism patterns detected were incorrect or that the accuracy of the detection was too low. Likewise, they have the opportunity to admit their misconduct and facilitate further proceedings. This step allows the student to justify the submission against the suspicion of plagiarism, which may be warranted in the case of a false positive.
5. Final verdict: The exercise instructors meticulously assess and evaluate whether the justifications made by the student are steadfast. This involves analyzing whether the student's justifications demonstrate that they fully mastered the concepts taught in the lecture and exercises, and whether there are any unresolved gaps or problems in the reasoning of the exercise instructors.
Finally, the exercise instructors decide on a verdict of either "plagiarism" (in case the justifications were not steadfast), resulting in a failing grade, or "no plagiarism" (in case the justifications were steadfast), which has no immediate consequences, except that students are now generally more careful with their submissions. A third alternative would be to deduct points in the respective exercise and to issue a warning.
In this last step, the exercise instructors pronounce their final verdict for the two students involved, whereupon the appropriate consequences are announced. This final verdict is documented in Artemis together with the plagiarism report in case the data is needed later, e.g., if the student complains to the examination board or sues the university in court.
The identification of the taker (the student who engaged in copying) and the giver (the student who facilitated the copying) can sometimes be ascertained through the discussions conducted with the involved students in a plagiarism case. The determination of their roles relies partially on the admissions provided by the students who acknowledge their involvement.
Verification is feasible due to the availability of precise data in Artemis, including the specific number and timing of commits made by the students. Typically, the giver is characterized as the student who initiates and completes the task first, while also demonstrating a higher frequency of commits. Conversely, the taker often exhibits contrasting behavior, typically commencing the task later and generating fewer commits. By considering these observed patterns and leveraging the data provided by Artemis, instructors can discern the roles of the taker and giver in the plagiarism case.
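This role assignment can be sketched as a simple heuristic over the commit metadata that Artemis stores. The rule below (earlier start, earlier finish, and more commits point to the giver) is our own illustrative formalization of the description above, not code from the platform, and it only supports, never replaces, the instructors' judgment.

```java
import java.time.Instant;

// Illustrative giver/taker heuristic based on commit metadata; field and method
// names are assumptions, not the Artemis data model.
public class RoleHeuristic {

    record SubmissionStats(String student, Instant firstCommit, Instant lastCommit, int commitCount) {}

    // Earlier start, earlier completion, and more commits each count as one vote
    // for the "giver" profile described in the text.
    static String likelyGiver(SubmissionStats a, SubmissionStats b) {
        int votesForA = 0;
        votesForA += a.firstCommit().isBefore(b.firstCommit()) ? 1 : -1;
        votesForA += a.lastCommit().isBefore(b.lastCommit()) ? 1 : -1;
        votesForA += a.commitCount() > b.commitCount() ? 1 : -1;
        return votesForA > 0 ? a.student() : b.student();
    }

    public static void main(String[] args) {
        SubmissionStats s1 = new SubmissionStats("student-1",
                Instant.parse("2022-01-10T09:00:00Z"), Instant.parse("2022-01-11T18:00:00Z"), 14);
        SubmissionStats s2 = new SubmissionStats("student-2",
                Instant.parse("2022-01-12T21:00:00Z"), Instant.parse("2022-01-12T23:30:00Z"), 2);
        // Prints "student-1": earlier start and finish plus more commits fit the giver profile.
        System.out.println(likelyGiver(s1, s2));
    }
}
```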
IV. EXPERIMENT AND RESULTS

A total of three experiments were conducted over three semesters. The organization of these experiments, along with their respective pre-conditions, student groups, and the corresponding courses involved, is presented in Table I.

TABLE I
ORGANIZATION OF THE EXPERIMENTS. AWARENESS MEANS WHETHER STUDENTS HAVE BEEN INFORMED ABOUT PLAGIARISM CONTROL.

Experiment | Students' groups                         | Awareness                         | Course
WS21/22    | HN01 (70 students)                       | Since the middle of the course    | InProg
SS22       | HN01 (44 students), GA01 (1621 students) | Since the beginning of the course | ISE
WS22/23    | HN02 (61 students)                       | Since the beginning of the course | InProg

A. First Experiment - WS21/22

The objective of this experiment was to investigate whether students (HN01 group) in the InProg course engaged in plagiarism when they were not initially informed about the implementation of plagiarism checks. Additionally, we focused on assessing their behavior after the announcement of the plagiarism checks.
In order to achieve this, we chose not to inform the students about the plagiarism control at the beginning of the semester but rather in the middle of the course period, specifically in week 8. During that week, we provided detailed information on how the checks were conducted (as described in the previous section) and on the penalties.
Figure 2 illustrates that the plagiarism control was initiated in exercise H03E01 (where the x-axis starts), as the initial two weeks primarily involved simple exercises with minimal lines of code (conducting a plagiarism analysis on these exercises
would have resulted in an unnecessarily high number of plagiarism accusations).
We observed an average of 20 instances of similarity exceeding the predetermined threshold per exercise during the first 8 weeks (the number of possible cases of similarity likely depends on the difficulty level of each exercise). However, in week 8, during the lecture, we explicitly announced to the students that plagiarism checks would be conducted for assignments starting from week 9.
As a result, we subsequently observed a significant decrease of 84 % in the number of submissions with similarity above the threshold for homework H09E01. Despite the prior notification during the lecture, we had to issue plagiarism notifications to 4 students, indicating their involvement in plagiarism (First notification).

[Figure 2: similarity cases over the threshold per exercise, with markers for first notifications (warning) and second notifications (failed Prog) and a logarithmic trendline.]
Fig. 2. Incidents of code similarity identified through the use of Artemis, in conjunction with notifications of potential instances of plagiarism throughout the course assignments.

Additionally, it is worth noting that certain submissions for homework H09E02, H10E01, and H10E02 exhibited a similarity level surpassing the predefined threshold. However, upon a comprehensive review by the instructors, it was concluded that no instances of plagiarism were present in these submissions (false-positive cases). Consequently, the students involved were not notified of any plagiarism concerns.
In week 11, during the assessment of assignments H11E01 and H11E02, the plagiarism control was conducted, leading to the identification of 10 new cases of clear plagiarism involving students. Out of these 10 students, 4 were notified for the second time, resulting in immediate course failure. Additionally, 6 students received their first notification regarding plagiarized submissions. Overall, there were 8 students who received a first notification, while 4 students failed the course upon receiving the second notification.
Through an analysis of the time invested and commits made in each exercise, Figure 3 reveals that assignment H09E01, which included the initial 4 plagiarism cases, required the highest amount of time and had the most pushed commits by students. The instructors conclude that this particular exercise was adequately complex, presented a specific context, and required students to propose their own solutions. Furthermore, the notification of the plagiarism control served as an alert to refrain from copying, resulting in increased time investment and commits compared to the other exercises.
Similar patterns were observed in exercises H09E02 and H13E01, where a substantial number of hours were invested and only a few instances of similarity were found. However, in the case of H13E01, these instances were confirmed as instances of complete plagiarism. Hence, it can be inferred that the occurrence of plagiarism in these exercises can be attributed to their complexity. The students were confronted with challenging exercises that pressured them to resort to plagiarism in order to solve them.

[Figure 3: average commits and average invested time in hours per exercise, each with a logarithmic trendline.]
Fig. 3. Quantitative representation of the average commit frequency and the quantified time investment required to solve course exercises, as captured from the Artemis learning platform data.

Out of the 12 students who received at least one notification for a plagiarism case, 8 exhibited characteristics indicative of being takers. These students displayed less commit activity and invested less time in solving the exercises compared to their peers who were also involved in plagiarism cases.
Figure 4 illustrates that exercises H09E01 and H13E01 had the lowest levels of participation. With the exception of exercise H06E01, which had a high level of complexity (as indicated by the number of attempts in Figure 3), the complexities of the other exercises were generally at a medium level.

[Figure 4: participation and average score per exercise, each with a trendline.]
Fig. 4. Quantitative representation of student participation, alongside corresponding average grade distributions, throughout the duration of the course.

It was evident that, following the plagiarism control notification, students either strived to work independently or gave up. This finding confirmed our initial expectation that the takers would be inclined to avoid making any effort, while the givers preferred not to share their solutions, not counting the usual tendency of students to quit this kind of course.
At the conclusion of the course, a final exam was administered, with the programming component accounting for 70 % of the total grade. The analysis of the results revealed that the 4 students who received notifications for committing plagiarism and subsequently failed the practical part of the course also failed the final exam. Similarly, the 8 students
who exhibited characteristics consistent with being takers also failed the exam.
Hence, we can infer that these 8 students were indeed takers. This outcome validates the efficacy of our plagiarism analysis in the exercises, as it aligns with our initial expectations.

Finding 1: In the absence of plagiarism control, students are more likely to engage in plagiarism when presented with the opportunity.

Furthermore, it suggests that students who commit plagiarism, referred to as takers, generally exhibit lower performance in autonomous examinations. In contrast, students who facilitate plagiarism, referred to as givers, tend to demonstrate better performance.

Finding 2: Following a plagiarism control notification, students either strive to work independently or simply give up.

B. Second Experiment - SS22

This experiment involved the comparative analysis of plagiarism behavior between two distinct groups of students within the same subject but in different locations. The first group consisted of students from Experiment 1, referred to as the HN01 group, while the second group comprised students who had not previously undergone a plagiarism control process, denoted as the GA01 group.
In this experiment, both groups were immediately informed about the implementation of a plagiarism check for all assignments at the beginning of the SS22 semester. The subject taught during this period was ISE, encompassing programming tasks, modeling exercises, and text-based assignments.
The average grade between the HN01 and GA01 groups exhibited minimal imbalance. The HN01 group achieved an average grade of 77.4 %, whereas the GA01 group obtained an average percentage of 78.3 %, resulting in a difference of less than 1 %. However, a notable divergence was observed in the participation rates of the two groups. The HN01 group experienced a reduction in participation of 22.6 % (as shown in Figure 5), while the GA01 group exhibited a more significant decrease of 40.7 %, nearly double the reduction (as shown in Figure 6).

[Figure 5: participation and average score per exercise for the HN01 group, each with a trendline.]
Fig. 5. Quantitative representation of student participation, alongside corresponding average grade distributions, of the HN01 group throughout the duration of the ISE course.

[Figure 6: participation and average score per exercise for the GA01 group, each with a trendline.]
Fig. 6. Quantitative representation of student participation, alongside corresponding average grade distributions, of the GA01 group throughout the duration of the ISE course.

The occurrence of detected plagiarism cases exhibited a substantial disparity between the HN01 and GA01 groups. In the HN01 group, only 3 exercises indicated a similarity level surpassing the established threshold, as shown in Figure 7. In contrast, the GA01 group indicated 15 exercises with a similarity level above the threshold, which suggests a quadruple increase compared to the HN01 group, as shown in Figure 8. The distribution of plagiarism cases was more evenly spread across the exercises in the GA01 group. In the HN01 group, a total of 12 students were notified, out of which 2 students received a second notification and subsequently failed the course. On the other hand, in the GA01 group, a larger number of students, specifically 86 students, were notified of plagiarism cases, and among them, 22 students failed the course.
[Figure 7: similarity cases over the threshold, first notifications, and second notifications (fail the course) for the HN01 group, for exercises H07E03, H08E01, and H08E02.]
Fig. 7. Quantitative representation of the plagiarism cases, alongside corresponding notifications of the HN01 group throughout the duration of the course.

[Figure 8: similarity cases over the threshold, first notifications, and second notifications (fail the course) for the GA01 group, per exercise.]
Fig. 8. Quantitative representation of the plagiarism cases, alongside corresponding notifications of the GA01 group throughout the duration of the course.

Finding 3: Students without previous plagiarism control exposure tend to plagiarize more than those with plagiarism control experience. Awareness of plagiarism control also increases submission caution.

C. Third Experiment - WS22/23

The third experiment, conducted in the WS22/23 semester, followed a similar approach to the first experiment (WS21/22), but with some modifications. In this experiment, all students (HN02 group) in the InProg course were informed about the presence of plagiarism control right from the beginning of the course. The student cohort consisted of 61 individuals, with 11 of them repeating the subject. Among these 11 students, 4 had previously failed the course in the WS21/22 semester due to a plagiarism verdict.
In line with the findings from the previous experiment, the determination of the similarity threshold in this experiment was based on the number of lines of code. Additionally, exercises that tended to have a limited number of possible solutions were excluded from the plagiarism control analysis. Furthermore, the number of assignments in this experiment was reduced to 22, with students receiving 2 weekly assignments throughout the course duration (due to the semester planning, week 12 had no assignments).
Figure 9 shows a notable decline in the number of similarity cases, reaching zero starting from week three. In contrast to experiment 1 with the HN01 group (Figure 2), there is no provision for a first notification in this experiment, and students fail the course directly upon confirmation of a plagiarism case.
Consequently, assignments (from week 4 onwards) exhibit no instances of similarity surpassing the established threshold. This new condition has led to a significant reduction in the instructor's effort required for reviewing similarity cases.

[Figure 9: similarity cases over the threshold and definitive notifications (fail the course) for the HN02 group, per exercise, with a similarity trendline.]
Fig. 9. Quantitative representation of the plagiarism cases, alongside corresponding notifications of the HN02 group throughout the duration of the course.

Upon analyzing Figure 10, it is evident that the distribution of grades in the practical component of the course (Prog) exhibits a higher level of performance, as indicated by the average score of 1.7. This performance surpasses that of the HN01 group in the first experiment, as depicted in Figure 11, where the average score was 2.7. The high number of 5.0 grades stems from students who failed the course because of insufficient grades as well as from students who quit the course.
The implementation of plagiarism control measures, coupled with an early announcement of its enforcement at the beginning of a course, not only serves to mitigate instances of plagiarism but also facilitates enhanced comprehension of programming concepts among students. This improvement in understanding is subsequently reflected in the grades achieved by the students.
Upon comparing the performance of the HN02 group, which experienced the early announcement and implementation of plagiarism control, with the HN01 group, it is evident that the former demonstrates superior performance in the final exam (In). The HN02 group achieved an average score of 2.7 (see Figure 12), whereas the HN01 group obtained an average score of 3.7 (see Figure 13).
[Figure 10: histogram of grades from 5.0 to 1.0 for the practical part (Prog) in WS22/23; the average grade is highlighted.]
Fig. 10. Graphical representation of grade distribution for the practical programming course (Prog) during the winter semester WS22/23. The average grade of the course is highlighted in blue.

[Figure 11: histogram of grades from 5.0 to 1.0 for the practical part (Prog) in WS21/22; the average grade is highlighted.]
Fig. 11. Graphical representation of grade distribution for the practical programming course (Prog) during the winter semester WS21/22. The average grade of the course is highlighted in blue.

[Figure 12: histogram of final examination grades (In) from 5.0 to 1.0 for WS22/23; the average grade is highlighted.]
Fig. 12. Graphical representation of final examination grades (In) for the winter semester WS22/23. The average grade of the course is highlighted in blue.

[Figure 13: histogram of final examination grades (In) from 5.0 to 1.0 for WS21/22; the average grade is highlighted.]
Fig. 13. Graphical representation of final examination grades (In) for the winter semester WS21/22. The average grade of the course is highlighted in blue.

Finding 4: Early disclosure of plagiarism control within the course significantly fostered student engagement in autonomous work.

This finding was validated by enhanced performance in individual exams. The improvement underscores the effectiveness of plagiarism control awareness in boosting programming skill acquisition. At this point, it is important to highlight that these three findings have the potential for broad applicability across various courses within the domain of software engineering and programming education.

V. DISCUSSION

The research questions raised in this article have been addressed through a series of three experiments.
1. Opportunity to commit plagiarism. The findings of the first experiment shed light on some intriguing patterns of student behavior. It appears that students exhibit an elevated propensity to engage in plagiarism when they perceive an opportunity to do so. This inclination to engage in such practices is a notable observation and further emphasizes the need for vigilant plagiarism control measures in academic environments.
However, when the plagiarism control was announced and communicated clearly, the dynamics shifted noticeably. There was a significant reduction in the number of suspected cases of plagiarism by as much as 72 %. This statistic shows the deterrent effect that communication about plagiarism control measures can have on students' predisposition to plagiarize.
2. No experience with plagiarism control. The results obtained in the second experiment shed significant light on the effects of prior exposure to plagiarism control measures on student behavior. It was found that students in group GA01, who had no previous experience with such controls, were more prone to commit plagiarism across a variety of exercises.
Conversely, the group that had previous experience with plagiarism control measures (group HN01) exhibited markedly lower instances of attempted plagiarism. This distinct contrast between the two groups underscores the importance of regular and consistent enforcement of anti-plagiarism policies. It appears that experience with such measures fosters a culture
of academic integrity among students, leading to decreased incidences of plagiarism.
However, it is important to note that this experiment does not facilitate a direct comparison of results with the final exam. The GA01 group completed the final exam without supervision, whereas the HN01 group underwent a supervised examination. Consequently, it would be unfair to make a comparison between these groups in this particular context.
3. Initial warning. The third experiment conducted as part of this study offers compelling evidence about the effectiveness of the early implementation of plagiarism control measures. Mirroring the design of the first experiment but targeting a new cohort of students (group HN02), this investigation revealed an overwhelmingly positive impact of early plagiarism control introduction. Indeed, the instances of similarity in submitted assignments, which could potentially indicate plagiarism, saw a drastic reduction by 96.5 %. This statistically significant finding underscores the preventive power of plagiarism control measures when instituted at the start of the semester, making students acutely aware of the scrutiny their work will undergo.
Furthermore, when comparing student performance across two different academic years (WS21/22 and WS22/23), the data revealed an encouraging upward trend. The overall average grades for exercises witnessed an improvement of 24.6 %.
These findings offer a compelling suggestion that the introduction of plagiarism control mechanisms can act as a powerful catalyst in motivating students to work autonomously. Rather than resorting to shortcuts such as copying others' work, students, when faced with the awareness of plagiarism checks, show a tendency to invest their time and energy in understanding the intricacies of programming, practicing their skills, and enhancing their grasp of the subject matter.
Importantly, the initial notification of plagiarism control did not hinder the formation of study groups, as students were permitted to collaborate during tutor sessions and outside of class (with restrictions on sharing code or solving homework together). Nevertheless, we cannot rule out the possibility that the existence of plagiarism control discourages the formation of study groups. Beyond addressing the primary research questions, this manuscript also examines the following salient dimensions:
Tool Validation. The plagiarism control procedure inherently carries the risk of false positive outcomes (as observed in the first experiment of this study), which incorrectly flag non-plagiarized work as suspect. In response to this issue, systematic bilateral communication is instituted between students and instructors. This dialogue enables students to construct a well-substantiated response that unequivocally demonstrates the error made by the plagiarism control. Additionally, this discussion provides instructors with insights into the students' comprehensive understanding of the subject matter to which the assignment pertains.
Generalization. The scope of this article is expressly confined to experimental investigations conducted within courses that comprise the study and implementation of fundamentals of programming and software engineering. As a consequence, it is not possible to guarantee the generalizability of the results obtained from these experiments to other learning topics.
Stress Presence. One student who failed the WS21/22 course due to plagiarism expressed: "I independently completed all the assignments this semester (WS22/23), as I did not wish to fail again. It was not worth repeating the course because I had shared code for two exercises in the previous semester." This indicates that students understand the severe consequences of committing plagiarism. Therefore, they tend to avoid any misconduct during the development of independent assignments. New students are also aware of this and likewise tend to avoid such misconduct.

VI. CONCLUSION AND FUTURE WORK

The primary contribution of this article is its provision of empirically backed findings that illustrate the propensity of students to commit plagiarism when given an opportunity. The data also reveal an elevated likelihood of plagiarism among students who have not encountered plagiarism control initiatives in previous courses.
Additionally, this study suggests that the mere act of informing students about plagiarism control measures significantly deters plagiaristic activities. This not only leads to a decrease in plagiaristic instances but also fosters a positive impact on students' overall academic performance, as evidenced by their improved results in both assignments and examinations.
Given the deductions drawn from this investigation, it is highly recommended that anti-plagiarism mechanisms be instituted and explicitly communicated to the students for every assignment throughout the course's duration. Evidence from the third experiment further corroborates this recommendation by showcasing how such measures foster independent work ethics among the students.
Moving forward, the validation of these findings with substantial statistical evidence is a critical step. Two-sided significance tests should be conducted once a larger dataset is obtained over a more extended period. Furthermore, it would be of high value to evaluate and compare plagiarism behavior across different academic fields such as computer science and economics. This comparative analysis would enable the identification of potential contributing factors to observed variations in plagiarism behavior. These may include distinct learning and collaboration strategies or unique academic traditions associated with each field of study.
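As an illustration of what such a test could look like for the plagiarism rates of two cohorts (our sketch, not an analysis performed in this paper): with x_1 notified students out of n_1 in one group and x_2 out of n_2 in the other, a two-sided two-proportion z-test uses

```latex
\hat{p}_1 = \frac{x_1}{n_1}, \qquad
\hat{p}_2 = \frac{x_2}{n_2}, \qquad
\hat{p}   = \frac{x_1 + x_2}{n_1 + n_2}, \qquad
z = \frac{\hat{p}_1 - \hat{p}_2}
         {\sqrt{\hat{p}\,(1 - \hat{p})\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}}
```

and rejects the null hypothesis of equal plagiarism rates at significance level α when |z| exceeds the corresponding two-sided critical value.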
However, plagiarism checking should not be used in every case. When it comes to creativity in project-based courses, a multitude of rules, guidelines, and instructions can limit students' thought processes and thus reduce student learning [26]. It is important to weigh the use of plagiarism control and use it where it makes sense.
In the future, we will implement a continuous plagiarism checker in Artemis that can be activated at the exercise level. It will inform students about possible plagiarism immediately after submission, i.e., while work on the exercise is still in progress.
This way, students will be made aware of the possible consequences as early as possible and will have the opportunity to correct the accusation in a later submission. Since JPlag works on the abstract syntax tree, it will not be sufficient to apply simple refactoring such as automatic renaming of variables. A solution will have to be inherently different in subsequent attempts to remove the plagiarism allegation. Interviews will reveal the psychological impact on student learning outcomes and how early notification affects their cheating behavior.

REFERENCES

[1] I. Albluwi, "Plagiarism in programming assessments: a systematic review," Transactions on Computing Education, vol. 20, no. 1, pp. 1–28, 2019.
[2] M. Joy and M. Luck, "Plagiarism in programming assignments," Transactions on Education, vol. 42, no. 2, pp. 129–133, 1999.
[3] E. Luquini and N. Omar, "Programming plagiarism as a social phenomenon," in Global Engineering Education Conference, pp. 895–902, IEEE, 2011.
[4] R. J. Youmans, "Does the adoption of plagiarism-detection software in higher education reduce plagiarism?," Studies in Higher Education, vol. 36, no. 7, pp. 749–761, 2011.
[5] G. Hill, J. Mason, and A. Dunn, "Contract cheating: an increasing challenge for global academic community arising from COVID-19," Research and Practice in Technology Enhanced Learning, vol. 16, pp. 1–20, 2021.
[6] T. Foltýnek, D. Dlabolová, A. Anohina-Naumeca, S. Razı, J. Kravjar, L. Kamzola, J. Guerrero-Dib, Ö. Çelik, and D. Weber-Wulff, "Testing of support tools for plagiarism detection," International Journal of Educational Technology in Higher Education, vol. 17, pp. 1–31, 2020.
[7] B. D. Jenkins, J. M. Golding, A. M. Le Grand, M. M. Levi, and A. M. Pals, "When opportunity knocks: College students' cheating amid the COVID-19 pandemic," Teaching of Psychology, 2022.
[8] N. Tahaei and D. C. Noelle, "Automated plagiarism detection for computer programming exercises based on patterns of resubmission," in Conference on International Computing Education Research, pp. 178–186, 2018.
[9] J. Pierce and C. Zilles, "Investigating student plagiarism patterns and correlations to grades," in Technical Symposium on Computer Science Education, pp. 471–476, 2017.
[10] J. C. Paiva, J. P. Leal, and Á. Figueira, "Automated assessment in computer science education: A state-of-the-art review," Transactions on Computing Education, vol. 22, no. 3, pp. 1–40, 2022.
[11] R. Stallman, "GNU Compiler Collection internals," Free Software Foundation, 2002.
[12] J. Hage, P. Rademaker, and N. Van Vugt, "A comparison of plagiarism detection tools," Utrecht University, Utrecht, The Netherlands, vol. 28, no. 1, 2010.
[13] M. Kaya and S. A. Özel, "Integrating an online compiler and a plagiarism detection tool into the Moodle distance education system for easy assessment of programming assignments," Computer Applications in Engineering Education, vol. 23, no. 3, pp. 363–373, 2015.
[14] D. Pawelczak, "Benefits and drawbacks of source code plagiarism detection in engineering education," in Global Engineering Education Conference, pp. 1048–1056, IEEE, 2018.
[15] D. Pawelczak, "Effects of plagiarism in introductory programming courses on the learning outcomes," in 5th International Conference on Higher Education Advances, pp. 623–631, 2019.
[16] B. Halak and M. El-Hajjar, "Plagiarism detection and prevention techniques in engineering education," in 11th European Workshop on Microelectronics Education, pp. 1–3, IEEE, 2016.
[17] S. Moss, B. White, and J. Lee, "A systematic review into the psychological causes and correlates of plagiarism," Ethics & Behavior, vol. 28, no. 4, pp. 261–283, 2018.
[18] S. Krusche, A. Seitz, J. Börstler, and B. Bruegge, "Interactive learning: Increasing student participation through shorter exercise cycles," in Proceedings of the 19th Australasian Computing Education Conference, pp. 17–26, 2017.
[19] S. Krusche and A. Seitz, "Increasing the interactivity in software engineering MOOCs - A case study," in 52nd Hawaii International Conference on System Sciences, HICSS, pp. 1–10, ScholarSpace, 2019.
[20] S. Krusche, N. von Frankenberg, and S. Afifi, "Experiences of a software engineering course based on interactive learning," in 15. Workshop Software Engineering im Unterricht der Hochschulen, pp. 32–40, 2017.
[21] S. Krusche, N. von Frankenberg, L. M. Reimer, and B. Bruegge, "An interactive learning method to engage students in modeling," in 42nd International Conference on Software Engineering: Software Engineering Education and Training, pp. 12–22, ACM/IEEE, 2020.
[22] S. Krusche and A. Seitz, "Artemis: An automatic assessment management system for interactive learning," in Proceedings of the 49th Technical Symposium on Computer Science Education, pp. 284–289, ACM, 2018.
[23] S. Krusche, "Interactive Learning - A Scalable and Adaptive Learning Approach for Large Courses," Habilitation, Technical University of Munich, 2021.
[24] L. Prechelt, G. Malpohl, M. Philippsen, et al., "Finding plagiarisms among a set of programs with JPlag," Journal of Universal Computer Science, vol. 8, no. 11, p. 1016, 2002.
[25] S. Engels, V. Lakshmanan, and M. Craig, "Plagiarism detection using feature-based neural networks," in 38th Technical Symposium on Computer Science Education, pp. 34–38, 2007.
[26] S. Krusche, B. Bruegge, I. Camilleri, K. Krinkin, A. Seitz, and C. Wöbker, "Chaordic Learning: A Case Study," in 39th International Conference on Software Engineering: Software Engineering Education and Training, pp. 87–96, IEEE, 2017.
