assignment-codeval 0.0.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,241 @@
+ Metadata-Version: 2.4
+ Name: assignment-codeval
+ Version: 0.0.1
+ Summary: CodEval for evaluating programming assignments
+ Requires-Python: >=3.7
+ Description-Content-Type: text/markdown
+ Requires-Dist: canvasapi==3.3.0
+ Requires-Dist: certifi==2021.10.8
+ Requires-Dist: charset-normalizer==2.0.9
+ Requires-Dist: click==8.2.1
+ Requires-Dist: configparser==5.2.0
+ Requires-Dist: idna==3.3
+ Requires-Dist: pytz==2021.3
+ Requires-Dist: requests==2.27.0
+ Requires-Dist: urllib3==1.26.7
+ Requires-Dist: pymongo==4.3.3
+ Requires-Dist: markdown==3.4.1
+
+ # CodEval
+
+ Currently CodEval has 4 main components:
+ ## 1. Test Simple I/O Programming Assignments on Canvas
+ ### codeval.ini contents
+ ```
+ [SERVER]
+ url=<canvas API>
+ token=<canvas token>
+ [RUN]
+ precommand=
+ command=
+ ```
+
+ Refer to a sample codeval.ini file [here](samples/codeval.ini)
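A filled-in `codeval.ini` might look like the following (the URL and token are placeholder values for illustration; `precommand`/`command` depend on your grading setup and are left as in the template):

```ini
[SERVER]
url=https://canvas.example.edu/
token=1234~your-canvas-access-token
[RUN]
precommand=
command=
```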
+
+ ### Command to run:
+ `python3 codeval.py grade-submissions <a unique part of course name> [FLAGS]`
+ Example:
+ If the course name on Canvas is CS 149 - Operating Systems, the command can be:
+ `python3 codeval.py grade-submissions CS\ 149`
+ or
+ `python3 codeval.py grade-submissions "Operating Systems"`
+ Use a part of the course name that can uniquely identify the course on Canvas.
+
+ ### Flags
+ - **--dry-run/--no-dry-run** (Optional)
+   - Default: --dry-run
+   - Do not update the results on Canvas. Print the results to the terminal instead.
+ - **--verbose/--no-verbose** (Optional)
+   - Default: --no-verbose
+   - Show detailed logs
+ - **--force/--no-force** (Optional)
+   - Default: --no-force
+   - Grade submissions even if already graded
+ - **--copytmpdir/--no-copytmpdir** (Optional)
+   - Default: --no-copytmpdir
+   - Copy the temporary directory content to the current directory for debugging
+
+ ### Specification Tags
+ Tags used in a spec file (\<course name>.codeval)
+
+ | Tag | Meaning | Function |
+ |---|---|---|
+ | C | Compile Code | Specifies the command to compile the submission code |
+ | CTO | Compile Timeout | Timeout in seconds for the compile command to run |
+ | RUN | Run Script | Specifies the script to use to evaluate the specification file. Defaults to evaluate.sh. |
+ | Z | Download Zip | Will be followed by zip files to download from Canvas to use when running the test cases. |
+ | CF | Check Function | Will be followed by a function name and a list of files to check to ensure that the function is used by one of those files. |
+ | CMD/TCMD | Run Command | Will be followed by a command to run. TCMD will cause the evaluation to fail if the command exits with an error. |
+ | CMP | Compare | Will be followed by two files to compare. |
+ | T/HT | Test Case | Will be followed by the command to run to test the submission. |
+ | I/IF | Supply Input | Specifies the input for a test case. The IF version will read the input from a file. |
+ | O/OF | Check Output | Specifies the expected output for a test case. The OF version will read from a file. |
+ | E | Check Error | Specifies the expected error output for a test case. |
+ | TO | Timeout | Specifies the time limit in seconds for a test case to run. Defaults to 20 seconds. |
+ | X | Exit Code | Specifies the expected exit code for a test case. Defaults to zero. |
+ | SS | Start Server | Command containing a timeout (wait until the server starts), a kill timeout (wait before killing the server), and the command to start a server |
+
+ Refer to a sample spec file [here](samples/assignment-name.codeval)
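Combining a few of these tags, a minimal spec file for a simple echo-style assignment might look like the following (the commands, filenames, and expected output are illustrative, not taken from the package):

```
C gcc -o echoer echoer.c
CTO 30
T ./echoer
I hello world
O hello world
TO 10
X 0
```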
+
+ ## 2. Test Distributed Programming Assignments
+ ### (or complex non I/O programs)
+ ### codeval.ini contents
+ ```
+ [SERVER]
+ url=<canvas API>
+ token=<canvas token>
+ [RUN]
+ precommand=
+ command=
+ dist_command=
+ host_ip=
+ [MONGO]
+ url=
+ db=
+ ```
+
+ Refer to a sample codeval.ini file [here](samples/codeval.ini)
+
+ ### Command to run
+ is the same as the [command in #1](#command-to-run):
+ `python3 codeval.py grade-submissions <a unique part of course name> [FLAGS]`
+
+ ### Distributed Specification Tags
+
+ | Tag | Meaning | Function |
+ |---|---|---|
+ | --DT-- | Distributed Tests Begin | Marks the beginning of distributed tests. Used to determine if the spec file has distributed tests. |
+ | GTO | Global Timeout | A total timeout for all distributed tests, applied separately to homogeneous and heterogeneous tests. Homogeneous tests = GTO value. Heterogeneous tests = 2 * GTO value. |
+ | PORTS | Exposed Ports Count | Maximum number of ports needed to expose per Docker container |
+ | ECMD/ECMDT SYNC/ASYNC | External Command | Command that runs in a controller container, emulating a host machine. ECMDT: the evaluation fails if the command returns an error. SYNC: CodEval waits for the command to complete or fail. ASYNC: CodEval doesn't wait for the command to complete; failure is checked only if ECMDT. |
+ | DTC $int [HOM] [HET] | Distributed Test Config Group | Signifies the start of a new group of distributed tests. Replace $int with the number of containers that need to be started for the test group. HOM denotes homogeneous tests, i.e., the user's own submissions will be executed in the containers. HET denotes heterogeneous tests, i.e., a combination of $int - 1 other users' and the current user's submissions will be executed in the containers. Either HOM, HET, or both may be given. |
+ | ICMD/ICMDT SYNC/ASYNC */n1,n2,n3... | Internal Command | Command that runs in each of the containers. ICMDT: the evaluation fails if the command returns an error. SYNC: wait for the command to complete or fail. ASYNC: don't wait for the command to complete; failure is checked only if ICMDT. *: run the command in all containers. n1,n2..nx: run the command only in the containers indexed n1,n2..nx. Containers follow zero-based indexing. |
+ | TESTCMD | Test Command | Command run on the host machine to validate the submission(s) |
+ | --DTCLEAN-- | Cleanup Commands | Commands to execute after the tests have completed or failed. Can contain only ECMD or ECMDT. |
+
+ ### Special placeholders in commands
+ | Placeholder | Usage |
+ | --- | --- |
+ | TEMP_DIR | Used in ECMD/ECMDT; replaced by the temporary directory generated by CodEval during execution |
+ | HOST_IP | Used in ECMD/ECMDT/ICMD/ICMDT; replaced by the host's IP specified in codeval.ini |
+ | USERNAME | Used in ICMD/ICMDT; replaced by the username of the user whose submission is being evaluated |
+ | PORT_$int | Used in ICMD/ICMDT; replaced by a port number assigned to the running Docker container. $int needs to be < the PORTS value in the specification |
+
+ Refer to a sample spec file [here](samples/assignment-name.codeval)
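As a sketch of how the distributed tags might combine, consider the following (the commands and tag ordering are illustrative guesses based on the table above, not taken from the package; see the sample spec file for the actual format):

```
--DT--
GTO 120
PORTS 2
DTC 3 HOM HET
ICMD SYNC * ./node --user USERNAME --port PORT_0 --host HOST_IP
TESTCMD ./check_cluster.sh
--DTCLEAN--
ECMD SYNC ./cleanup.sh TEMP_DIR
```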
+
+ ### Notes
+ - The config file `codeval.ini` needs to contain the extra entries only if the tag `--DT--` exists in the specification file
+ - Distributed tests need a running MongoDB service to persist the progress of students running heterogeneous tests
+
+
+ ## 3. Test SQL Assignments
+ ### codeval.ini contents
+ ```
+ [SERVER]
+ url=<canvas API>
+ token=<canvas token>
+ [RUN]
+ precommand=
+ command=
+ dist_command=
+ host_ip=
+ sql_command=
+ ```
+
+ Refer to a sample codeval.ini file [here](SQL/samples/codeval.ini)
+
+ ### Command to run
+ is the same as the [command in #1](#command-to-run):
+ `python3 codeval.py grade-submissions <a unique part of course name> [FLAGS]`
+
+ ### SQL Specification Tags
+
+ | Tag | Meaning | Function |
+ |------------------|-------------------------|----------------------------------------------------------------------------------------------|
+ | --SQL-- | SQL Tests Begin | Marks the beginning of SQL tests. Used to determine if the spec file has SQL-based tests |
+ | INSERT | Insert Rows in DB | Inserts rows in the SQL database using files or individual insert queries. |
+ | CONDITIONPRESENT | Check Condition in File | Validates that a required condition is present in the submission files. |
+ | SCHEMACHECK | Schema Check | Validates submission files for database-related checks such as constraints. |
+ | TSQL | SQL Test | Marks an SQL test; takes a file or an individual query as input and runs it on the submission files. |
+
+ Refer to a sample spec file [here](SQL/samples/ASSIGNMENT:CREATE.codeval)
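A hypothetical sketch of the SQL tags in use follows (the argument syntax for each tag is a guess for illustration only; refer to the sample spec file for the actual format):

```
--SQL--
INSERT seed_rows.sql
CONDITIONPRESENT PRIMARY KEY
SCHEMACHECK
TSQL query1.sql
```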
+
+ ### Notes
+ - The config file `codeval.ini` needs to contain the extra entries only if the tag `--SQL--` exists in the specification file
+ - SQL tests need a separate container image to run them in MySQL.
+
+
+
+ ## 4. Create an assignment on Canvas
+
+ ### Command to create the assignment:
+ **Syntax:** `python3 codeval.py create-assignment <course_name> <specification_file> [ --dry-run/--no-dry-run ] [ --verbose/--no-verbose ] [ --group_name ]`
+ **Example:** `python3 codeval.py create-assignment "Practice1" 'a_big_bag_of_strings.txt' --no-dry-run --verbose --group_name "exam 2"`
+
+ ### Command to grade the assignment:
+ **Syntax:** `python3 codeval.py grade-submissions <course_name> [ --dry-run/--no-dry-run ] [ --verbose/--no-verbose ] [ --force/--no-force ] [ --copytmpdir/--no-copytmpdir ]`
+ **Example:** `python3 codeval.py grade-submissions "Practice1" --no-dry-run --force --verbose`
+
+ **New tags introduced:**
+
+ - `CRT_HW START <Assignment_name>`
+ - `CRT_HW END`
+ - `DISCSN_URL`
+ - `EXMPLS <no_of_test_cases>`
+ - `URL_OF_HW "file_name"`
+
+ ### MODIFICATIONS REQUIRED IN THE SPECIFICATION FILE.
+ 1) Start the specification file with the tag CRT_HW START, followed by a space, followed by the name of the assignment.
+    For example: `CRT_HW START Hello World`
+ 2) The lines after the first line contain the description of the assignment in Markdown format.
+ 3) The description ends with a final line containing just the tag CRT_HW END.
+    For example: `CRT_HW END`
+ 4) After this tag, the content for grading the submission begins.
+
+ #### Addition of the Discussion Topic in the assignment description.
+ 1) Insert the tag DISCSN_URL wherever you want the corresponding discussion topic's link to appear.
+    For example: `To access the discussion topic for this assignment, go here: DISCSN_URL`
+
+ #### Addition of sample examples in the assignment description.
+ 1) Insert the tag EXMPLS followed by a single space and a value, where the value is the number of test cases to display as sample examples. At most, all non-hidden test cases will be printed.
+    For example: `EXMPLS 5`
+ #### Addition of links to the files uploaded in the Codeval folder in the assignment description.
+ 1) To add a hyperlink to a file, the Markdown format is `[file_name_to_be_displayed](Url_of_the_file)`. In the parentheses where the URL is required, insert the tag URL_OF_HW followed by a space and the file name of the file to be linked, in double quotes.
+    For example: `URL_OF_HW "file name.extension"`
+    Note: the file should be present in the Codeval folder.
+
+ ### UPLOAD THE REQUIRED FILES IN CODEVAL FOLDER IN FILES SECTION.
+ 1) Create a folder called `assignmentFiles` which should contain all the necessary files, including the specification file.
+
+ ### EXAMPLE OF THE SPECIFICATION FILE.
+
+ ```
+ CRT_HW START Bag Of Strings
+ # Description
+ ## Problem Statement
+ - This Is An Example For The Description Of The Assignment In Markdown.
+ - To Download The File [Hello_World](URL_OF_HW "Helloworld.Txt")
+
+ ## Sample Examples
+ EXMPLS 3
+
+ ## Discussion Topic
+ Here Is The Link To The Discussion Topic: DISCSN_URL
+
+ ### Rubric
+ | Cases | Points |
+ | ----- | ------ |
+ | Base Points | 50 |
+
+ CRT_HW END
+
+ C cc -o bigbag --std=gnu11 bigbag.c
+ ```
+
@@ -0,0 +1,26 @@
+ [build-system]
+ requires = ["setuptools>=69", "wheel>=0.40"]
+ build-backend = "setuptools.build_meta"
+
+ [project]
+ name = "assignment-codeval"
+ version = "0.0.1"
+ description = "CodEval for evaluating programming assignments"
+ readme = "README.md"
+ requires-python = ">=3.7"
+ dependencies = [
+     "canvasapi==3.3.0",
+     "certifi==2021.10.8",
+     "charset-normalizer==2.0.9",
+     "click==8.2.1",
+     "configparser==5.2.0",
+     "idna==3.3",
+     "pytz==2021.3",
+     "requests==2.27.0",
+     "urllib3==1.26.7",
+     "pymongo==4.3.3",
+     "markdown==3.4.1",
+ ]
+
+ [project.scripts]
+ assignment-codeval = "assignment_codeval.cli:cli"
@@ -0,0 +1,4 @@
+ [egg_info]
+ tag_build =
+ tag_date = 0
+
@@ -0,0 +1,75 @@
+ import datetime
+ import sys
+ from functools import cache
+ from typing import NamedTuple
+ from configparser import ConfigParser
+
+ import click
+ from canvasapi import Canvas
+ from canvasapi.current_user import CurrentUser
+
+ from assignment_codeval.commons import error, info, errorWithException
+
+ CanvasConnection = NamedTuple('CanvasConnection', [('canvas', Canvas), ('user', CurrentUser)])
+
+
+ def _check_config(parser, section, key):
+     if section not in parser:
+         error(f"did not find [{section}] section in {parser.config_file}.")
+         sys.exit(1)
+     if key not in parser[section]:
+         error(f"did not find {key} in [{section}] in {parser.config_file}.")
+         sys.exit(1)
+
+
+ def connect_to_canvas():
+     parser = ConfigParser()
+     config_file = click.get_app_dir("codeval.ini")
+     parser.read(config_file)
+     parser.config_file = config_file
+
+     for key in ['url', 'token']:
+         _check_config(parser, 'SERVER', key)
+     try:
+         canvas = Canvas(parser['SERVER']['url'], parser['SERVER']['token'])
+         user = canvas.get_current_user()
+         info(f"connected to canvas as {user.name} ({user.id})")
+         return CanvasConnection(canvas, user)
+     except Exception:
+         errorWithException("there was a problem accessing canvas.")
+
+
+ @cache
+ def get_course(canvas, name, is_active=True):
+     ''' find one course based on partial match '''
+     course_list = get_courses(canvas, name, is_active)
+     if len(course_list) == 0:
+         error(f'no courses found that contain {name}. options are:')
+         for c in get_courses(canvas, "", is_active):
+             error(f"  {c.name}")
+         sys.exit(2)
+     elif len(course_list) > 1:
+         error(f"multiple courses found for {name}:")
+         for c in course_list:
+             error(f"  {c.name}")
+         sys.exit(2)
+     return course_list[0]
+
+
+ def get_courses(canvas, name: str, is_active=True, is_finished=False):
+     ''' find the courses based on partial match '''
+     courses = canvas.get_courses(enrollment_type="teacher")
+     now = datetime.datetime.now(datetime.timezone.utc)
+     course_list = []
+     for c in courses:
+         # courses without start/end dates are treated as currently running
+         start = c.start_at_date if hasattr(c, "start_at_date") else now
+         end = c.end_at_date if hasattr(c, "end_at_date") else now
+         if is_active and (start > now or end < now):
+             continue
+         if is_finished and end < now:
+             continue
+         if name in c.name:
+             c.start = start
+             c.end = end
+             course_list.append(c)
+     return course_list
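The course lookup above keeps active courses whose name contains the given substring. That filtering logic can be sketched in isolation, without touching the Canvas API (the course names and dates below are hypothetical stand-ins for `canvasapi` Course objects):

```python
from datetime import datetime, timedelta, timezone
from types import SimpleNamespace

now = datetime.now(timezone.utc)

# Hypothetical stand-ins for canvasapi Course objects.
courses = [
    SimpleNamespace(name="CS 149 - Operating Systems",
                    start_at_date=now - timedelta(days=30),
                    end_at_date=now + timedelta(days=60)),
    SimpleNamespace(name="CS 46B - Data Structures",
                    start_at_date=now - timedelta(days=400),
                    end_at_date=now - timedelta(days=300)),  # already finished
]

def find_active(courses, name):
    """Keep courses that are currently running and whose name contains `name`."""
    return [c for c in courses
            if c.start_at_date <= now <= c.end_at_date and name in c.name]

print([c.name for c in find_active(courses, "CS 149")])
# -> ['CS 149 - Operating Systems']
```

This is why the README asks for a "unique part of the course name": `get_course` exits with an error if the substring matches zero or more than one active course.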
@@ -0,0 +1,17 @@
+ import click
+ from assignment_codeval.evaluate import run_evaluation
+ from assignment_codeval.github_connect import github_setup_repo
+ from assignment_codeval.submissions import download_submissions, upload_submission_comments
+
+
+ @click.group()
+ def cli():
+     pass
+
+ cli.add_command(run_evaluation)
+ cli.add_command(download_submissions)
+ cli.add_command(upload_submission_comments)
+ cli.add_command(github_setup_repo)
+
+ if __name__ == "__main__":
+     cli()
@@ -0,0 +1,51 @@
+ from typing import ClassVar, NoReturn
+
+ import click
+ import time
+ import dataclasses
+
+
+ @dataclasses.dataclass(init=True, repr=True, frozen=True)
+ class _Config:
+     """Global configuration object for the CLI"""
+     show_debug: bool
+     dry_run: bool
+     force: bool
+     copy_tmpdir: bool
+
+     # static global config instance; ClassVar keeps it out of the dataclass fields
+     _instance: ClassVar['_Config'] = None
+
+
+ def get_config():
+     if _Config._instance is None:
+         _Config._instance = _Config(False, True, False, False)
+     return _Config._instance
+
+
+ def set_config(show_debug, dry_run, force, copy_tmpdir):
+     _Config._instance = _Config(show_debug, dry_run, force, copy_tmpdir)
+     return _Config._instance
+
+
+ def _now():
+     return time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime())
+
+
+ def debug(message):
+     if get_config().show_debug:
+         click.echo(click.style(f"{_now()} D {message}", fg='magenta'))
+
+ def error(message):
+     click.echo(click.style(f"{_now()} E {message}", fg='red'))
+
+ def errorWithException(message) -> NoReturn:
+     error(message)
+     raise EnvironmentError(message)
+
+ def info(message):
+     click.echo(click.style(f"{_now()} I {message}", fg='blue'))
+
+
+ def warn(message):
+     click.echo(click.style(f"{_now()} W {message}", fg='yellow'))
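The frozen-dataclass singleton used for configuration can be exercised in isolation with a trimmed-down sketch (two flags instead of four, no click dependency; the names here are illustrative, not the package's API):

```python
import dataclasses
from typing import ClassVar, Optional

@dataclasses.dataclass(frozen=True)
class Config:
    show_debug: bool
    dry_run: bool
    # ClassVar keeps the singleton slot out of the generated dataclass fields.
    _instance: ClassVar[Optional["Config"]] = None

def get_config() -> "Config":
    if Config._instance is None:
        # Defaults mirror the module above: debug off, dry-run on.
        Config._instance = Config(show_debug=False, dry_run=True)
    return Config._instance

def set_config(show_debug: bool, dry_run: bool) -> "Config":
    Config._instance = Config(show_debug, dry_run)
    return Config._instance

print(get_config().dry_run)   # -> True (first call creates the defaults)
set_config(show_debug=True, dry_run=False)
print(get_config().dry_run)   # -> False
```

Because the dataclass is frozen, the flags cannot be mutated mid-run; the only way to change configuration is to replace the whole instance via `set_config`, which keeps the global state easy to reason about.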