Set up automatic solution evaluation
The Virtual Programming Lab allows you to create test cases that can be used to automatically evaluate student submissions. Four steps are necessary to set up the automatic evaluation:
- Create test cases
- Set execution options
- Create required files
- Test sample solution
This tutorial explains the procedure using a simple programming assignment: a "Hello World" program written in Python. We recommend starting with such a simple task so that students can familiarize themselves with how VPL works.
Table of contents
1. Prepare VPL assignment
2. Create test cases
3. Set execution options
4. Create required files
5. Test automatic evaluation with sample solution
1. Prepare VPL assignment
First, create a VPL assignment with the following settings:
- Name - Python: Hello World
- Short description - Hello World in Python 3
- Full description - Write a Python program that outputs "Hello World!" to the console. Only the text (without the quotation marks) should appear, followed by an empty line.
- Submission period - Available from today's date. Deadline is a date in the future.
- Maximum number of files - 1
- Grade - Grading type "Points", Maximum grade 100.
Leave all other settings at their defaults. Click "Save and display" to open the VPL assignment.
2. Create test cases
Once the assignment has been created, create the test cases that will automatically check the correctness of the submissions.
- Open the VPL assignment.
- In the VPL assignment, click the gear wheel inside the white box to open the VPL Administration.
- Select "Test cases" here. An editor opens.
- Enter the following test case in the editor:
Case = Test 1
output = "Hello World!
"
The first line gives the test case a name; you can create several test cases for one assignment. The second and third lines check the console output: placing the closing quotation mark on a line of its own ensures that an empty line is output to the console after the text (a local way to check this expected output is sketched after this list).
- Click on the disk icon to finish creating this test case.
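To get a feel for what this test case expects, you can check a solution locally before testing it in VPL. The following Python sketch is only a rough local approximation of the comparison, not how VPL itself evaluates submissions; the file name check_hello.py is just an example, and it assumes hello.py lies in the same directory:

# check_hello.py - rough local approximation of the "Test 1" case:
# the program's console output must be exactly "Hello World!"
# followed by an empty line (a trailing newline).
import subprocess
import sys

# Run the submission with the same Python interpreter as this script.
result = subprocess.run(
    [sys.executable, "hello.py"],
    capture_output=True,
    text=True,
)

expected = "Hello World!\n"
if result.stdout == expected:
    print("1 test run / 1 test passed")
else:
    print("1 test run / 0 tests passed")
    print("Expected:", repr(expected))
    print("Received:", repr(result.stdout))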
Return to the VPL assignment by clicking on the title of the VPL assignment in the breadcrumb navigation (= "Python: Hello World").
3. Set execution options
In the next step, you define how the students' submissions should be executed and evaluated on the jail server.
- In the opened VPL assignment, click again on the gear wheel inside the white box to open the VPL administration.
- Select "Execution options" and configure the following settings:
- Based on - Optionally select another VPL assignment on which this one should be based.
- Run script - Leave this option set to "Autodetect" or select a specific script to be executed for the submission.
- Debug script - Leave this option set to "Autodetect" or select a specific debug script to run for the submission.
- Run - Enables execution on the VPL execution server. Set this option to "yes" for the practice example.
- Debug - Enables the built-in debugger. Set this option to "no" for the practice example.
- Evaluate - Enables the automatic evaluation of the submission. Set this option to "yes" for the practice example.
- Evaluate just on submission - Restricts the evaluation to the moment the program is submitted.
- Automatic evaluation - Settings for the automatic evaluation of tasks.
4. Create required files
In the final setup step, you define which files you expect students to submit:
- In the opened VPL assignment, click again on the gear wheel inside the white box to open the VPL Administration.
- Select "Requested files". An editor page opens together with a "Create a new file" window.
- Enter "hello.py" as file name and then click "OK".
- In the editor, you can provide your students with a code template that will be saved under the file name you just assigned (a possible skeleton is sketched after this list). For this tutorial, leave the file empty and save it by clicking on the disk icon.
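If you would rather hand out a starting point than an empty file, a minimal template could look like the sketch below; its content is only an illustration and not part of the original setup, which leaves hello.py empty:

# hello.py - starter template handed out to students
# Task: print the text Hello World! to the console.
# Hint: print() already adds the trailing newline (the empty line).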
Return to the overview page of the activity by clicking on the assignment title (= "Python: Hello World") in the breadcrumb navigation. The automatic evaluation is now set up.
5. Test automatic evaluation with a sample solution
Finally, you can check the automatic evaluation using a sample solution.
- In the opened VPL assignment, click on the "Test activity" tab and then on "Edit".
- Now enter print("Hello World!") into the editor.
- To save, first click on the disk symbol.
- Then click on the rocket symbol to run the program.
- By clicking on the check mark with the adjacent 0, you execute the automatic evaluation. You should now see the following output on the right side:
1 test run / 1 test passed
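As an optional extra check (not part of the tutorial itself), you can temporarily replace the sample solution with a deliberately wrong program and run the evaluation again; the test should then be reported as failed, which confirms that the output really is being compared:

# Deliberately wrong output ("Hello" instead of "Hello World!"),
# so the automatic evaluation should mark the test as failed.
print("Hello")

Afterwards, restore print("Hello World!") so the sample solution passes again.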