Overview page of a VPL assignment
When you open a VPL activity, you are taken to its overview page. The overview page has four views that can be accessed via tabs: Description, Submissions list, Similarity and Test activity.
When opening a VPL activity, you will automatically land on the "Description" tab. It gives an overview of the most important settings, such as Due date, Requested files, Type of work, Grade settings and Execution files.
In the "Submissions list" view you can see an overview of the students' submissions. You can filter them by group, submission status and evaluation status. You can also display an Assessment report and download all or selected submissions.
In the "Similarity" view you can perform plagiarism checks. VPL contains a tool for checking plagiarism in the entire source code. The main purpose of this tool is to detect plagiarism among submissions for an assignment in one course room, but it can also include other sources such as submissions for the same assignment in previous semesters or similar assignments from other course rooms that are likely sources of plagiarism.
The process for finding similarities between source files consists of three steps: tokenization, comparison, and clustering.
Tokenization is the process of obtaining a normalized signature from each file in order to perform an efficient comparison and find similarities between them. It consists of three phases: lexical analysis, filtering and normalization.
- The lexical analysis extracts the tokens representing the elements of a program (they depend on the programming language).
- These tokens are then filtered to delete those that are not relevant for comparison.
- Finally, expressions are normalized to a canonical form, which generates the program signature.
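The three tokenization phases can be sketched for Python source with the standard library's `tokenize` module. This is only an illustration of the idea, not VPL's actual, language-dependent tokenizer: comments and layout tokens are filtered out, and identifiers are normalized to a single canonical token so that renaming variables does not change the signature.

```python
import io
import keyword
import token
import tokenize

def signature(source: str) -> list[str]:
    """Produce a normalized token signature for a piece of Python source."""
    sig = []
    # Lexical analysis: split the source into tokens.
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        # Filtering: drop tokens that are irrelevant for comparison.
        if tok.type in (token.COMMENT, token.NL, token.NEWLINE,
                        token.INDENT, token.DEDENT, token.ENDMARKER):
            continue
        # Normalization: map every identifier to the same canonical token,
        # so renamed variables yield the same signature.
        if tok.type == token.NAME and not keyword.iskeyword(tok.string):
            sig.append("ID")
        else:
            sig.append(tok.string)
    return sig
```

With this normalization, `total = a + b` and `sum_ = c + d` produce identical signatures, even though no character of the two lines matches.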
Signatures are normalized representations for the source code files, which are extracted from them to optimize the comparison process. The form of the signature depends on the metric to be used in the comparison.
VPL uses three different metrics that each yield a number in the range 0.0 to 1.0 when comparing two signatures, where 0.0 means "identical" and 1.0 means "completely different". Using three metrics exploits the fact that each metric is affected in a different way by modifications to the code.
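The exact metrics VPL uses are not specified here; as a minimal stand-in with the same scale, a distance can be derived from a sequence-similarity ratio so that identical signatures score 0.0 and unrelated ones score 1.0:

```python
from difflib import SequenceMatcher

def distance(sig_a: list[str], sig_b: list[str]) -> float:
    """Distance between two token signatures.

    0.0 = identical, 1.0 = completely different. Illustrative only;
    VPL combines three metrics of its own.
    """
    return 1.0 - SequenceMatcher(None, sig_a, sig_b).ratio()
```

For example, comparing a signature with itself returns 0.0, while two signatures with no tokens in common return 1.0.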
Experience with the tool has shown that some cases of plagiarism are not a one-to-one relationship but a group relationship, in which usually nobody has detailed knowledge of all participants. For example, student A may lend their work to students B and C without B and C being aware of each other; student C may in turn lend the work to student D without A or B knowing, and so on. To give evaluators information about such cases, the system contains algorithms for identifying clusters of similar files. The visualization system of the anti-plagiarism tool can display lists of pairs of similar files, clusters of similar files, or file-to-file similarities.
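How such clusters can emerge from pairwise similarities alone is easy to sketch with a union-find structure; this is an assumption about the general technique, not VPL's actual clustering algorithm. Files flagged as similar pairs are merged transitively into groups:

```python
def cluster(pairs: list[tuple[int, int]], n: int) -> list[list[int]]:
    """Group n files into clusters, given pairs of similar files.

    Union-find sketch: any two files linked by a chain of similar
    pairs end up in the same cluster, even if they were never
    compared directly.
    """
    parent = list(range(n))

    def find(x: int) -> int:
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for i, j in pairs:
        parent[find(i)] = find(j)  # merge the two groups

    groups: dict[int, list[int]] = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    # Only groups with more than one member indicate possible plagiarism.
    return [g for g in groups.values() if len(g) > 1]
```

Using the example above (A lends to B and C, C lends to D), the pairs (A, B), (A, C) and (C, D) place all four students in one cluster, although A and D were never directly linked.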
In the "Test activity" view, program code can be created or uploaded online. This is the most important view for students. As a manager you can check your sample solutions and the automatic evaluation here.
4.1 Submission view
When you open the "Test activity" view, the "Submission" sub view appears first. Here you can see an overview of the source code already created, as well as the comments for the evaluation. These comments include the "Summary of tests", i.e. the output of the automatic evaluation.
4.2 Previous submissions list
The "Previous submissions list" provides statistics on the progress of the file size and the progress of the grading.
In the "Submission" view you can upload locally created source code and add a comment. You can then further edit and evaluate it in the "Edit" view.
The "Edit" sub view is the main view for creating a solution program: in this editor, you or your students write the required program.
The buttons in the toolbar have these functions:
- Show more...
- Automatic Evaluation
In the extended view the following buttons appear additionally:
- File list
- Select all
Clicking "Automatic Evaluation" runs the automatic evaluation and shows feedback on the solution directly in the browser, for example:
    1 test run / 1 test passed