Staffeli NT Technology
With Staffeli, we work with local course clones. We aim to keep these clones compatible with git.
We recommend that you create a local directory, canvas, absalon, or similar, for all of your Canvas-related local course clones.
Staffeli needs some initial help to be able to log in with your credentials. You need to generate a Canvas access token for Staffeli to use and save it in your home directory in a file named .canvas.token.
Alternatively, you can specify a custom token file location using the --token flag (see General Usage below).
NB! This is your personal token, so do not share it with others; otherwise they can easily impersonate you using a tool like Staffeli. Unfortunately, to the best of our knowledge, Canvas has no means to segregate or specialize tokens, so this is really "all or nothing".
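If you want to script this step, the following is a minimal Python sketch (not part of Staffeli itself; the placeholder token value is hypothetical) that writes the token to the default location and keeps the file private:

from pathlib import Path

# Hypothetical placeholder: paste the access token you generated on the Canvas website.
token = "YOUR-CANVAS-ACCESS-TOKEN"

token_file = Path.home() / ".canvas.token"  # the default location Staffeli looks in
token_file.write_text(token.strip())
token_file.chmod(0o600)  # readable by you only, since the token must not be shared
print(f"Wrote token to {token_file}")

You can of course also just create the file with your editor of choice.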
To install Staffeli using uv:
- Install uv if you haven't already:
  curl -LsSf https://astral.sh/uv/install.sh | sh
  Or on Windows:
  powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
- Clone the repository and navigate to it:
  git clone https://github.com/kfl/staffeli_nt.git
  cd staffeli_nt
- Sync dependencies (creates a virtual environment and installs packages):
  uv sync
- Install the staffeli tool for your user:
  uv tool install .
  This makes the staffeli tool available from anywhere on your system.
- Use the staffeli tool:
  staffeli download <course_id> <template.yaml> <assignment-dir>
- Update after pulling new changes:
  uv tool install . --force --reinstall
- Uninstall the staffeli tool:
  uv tool uninstall staffeli-nt
If you prefer using pip, you can install the package:
- Create a virtual environment called env.
  On macOS and Linux:
  $ python3 -m venv env
  On Windows:
  $ py -m venv env
- Activate env.
  On macOS and Linux:
  $ source env/bin/activate
  On Windows:
  $ .\env\Scripts\activate
- Now install staffeli_nt in editable mode in env:
  $ pip install -e .
  This also makes the staffeli tool available (only within the activated virtual environment).
Alternatively, if you need a requirements.txt file:
$ pip install pip-tools
$ pip-compile pyproject.toml -o requirements.txt
$ pip install -r requirements.txt
General Usage
To see all available commands and options:
$ staffeli --help
usage: staffeli [-h] [--version] [--token PATH]
{scan,download,info,upload,upload-single} ...
Staffeli NT - Canvas LMS command-line tool (version 0.3.0)
options:
-h, --help show this help message and exit
--version show program's version number and exit
--token PATH path to Canvas token file (default: ~/.canvas.token)
subcommands:
{scan,download,info,upload,upload-single}
scan check if grading is fully done
download fetch submissions
info fetch infomation related to a course
upload upload feedback for submissions
upload-single upload feedback for a single submission
For more information, visit https://github.com/kfl/staffeli_nt
To get help for a specific subcommand, use staffeli <subcommand> --help, for example:
$ staffeli download --help
By default, staffeli looks for your Canvas token in ~/.canvas.token. If you need to use a different token file, you can specify it with the --token flag:
$ staffeli --token /path/to/my/token download 12345 template.yml ass1dir
This is useful if you:
- Have multiple Canvas accounts with different tokens
- Want to store your token in a non-default location
- Are testing with different credentials
There are multiple options for fetching submissions.
The general command is staffeli download <course_id> <template.yaml> <assignment-dir> [flags], where
- <course_id> is the Canvas course_id for the course.
- <template.yaml> is the template file to use when generating the grade.yml file for each submission.
- <assignment-dir> is a non-existing directory that staffeli will create and store the submissions in.
Fetching all submissions:
To fetch all submissions from the course with id 12345, using the template file ass1-template.yml and storing them in a new directory ass1dir:
$ staffeli download 12345 ass1-template.yml ass1dir
This will present you with a list of assignments for the course, where you will interactively choose which assignment to fetch.
For each submission, a directory will be created in <assignment-dir>, in which the handed-in files of the submission will be stored, alongside a file grade.yml generated from <template.yaml>.
Submission comments, if any, will be downloaded as well, and stored alongside grade.yml and the files of the hand-in.
If a student hands in a file called grade.yml, it will be overwritten by staffeli. Likewise, if a student hands in a file called submission_comments.txt and has also written submission comments on the Canvas website, the downloaded comments will overwrite the handed-in file.
What we call "Hold", Canvas/Absalon calls sections.
To fetch all submissions for an assignment from students belonging to a given section, where the <course_id> is 12345:
$ staffeli download 12345 ass1-template.yml ass1dir --select-section
This will present you with a list of assignments for the course, where you will interactively choose which assignment to fetch, followed by a list of sections for you to choose from.
It is possible to fetch specific submissions based on a list of kuids. To do this, create a YAML file with the following format:
TA1:
- kuid1
- kuid2
- kuid3
TA2:
- kuid4
- kuid5
To then fetch all submissions for an assignment for a given TA:
$ staffeli download <course_id> ass1-template.yml ass1dir --select-ta ta_list.yml
where ta_list.yml is a YAML file following the format above.
This will present you with a list of assignments for the course, where you will interactively choose which assignment to fetch, followed by the list of TAs from your ta_list.yml file.
Selecting a TA will fetch the submissions for each kuid associated with that TA in the file, i.e. selecting TA1 will fetch the submissions from kuid1, kuid2, and kuid3.
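Since ta_list.yml is just a mapping from TA names to lists of kuids, you can sanity-check it before downloading with a small Python sketch like the following (this is not part of Staffeli; it only mirrors the format shown above and requires PyYAML):

import yaml  # requires PyYAML (pip install pyyaml)

with open("ta_list.yml") as f:
    ta_list = yaml.safe_load(f)

# Each top-level key is a TA name, mapped to the kuids whose submissions that TA should grade.
for ta, kuids in ta_list.items():
    print(f"{ta}: {len(kuids)} kuids ({', '.join(kuids)})")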
In the template.yml file, you can add a field:
onlineTA: https://address.of.onlineTA.dk/grade/assignmentName
This will (attempt to) run onlineTA for each downloaded submission.
It is possible to only fetch submissions that are either ungraded or have a score < 1.0.
Currently, this is implemented specifically for the PoP course and might not be available in its current form in later releases.
This can be achieved by appending the --resub flag to any use of the download subcommand.
Use staffeli upload <template.yaml> <assignment-dir> [--live] [--step].
By default, staffeli does a dry run, that is, nothing is uploaded unless the --live flag is given.
For instance, to review all feedback for submissions in the directory
ass1 before uploading:
$ staffeli upload ass1-template.yml ass1 --step
To upload all feedback for submissions in the directory
ass1:
$ staffeli upload ass1-template.yml ass1 --live
To upload feedback for a single submission:
$ staffeli upload-single <POINTS> <meta.yml> <grade.yml> <feedback.txt> [--live]
To generate feedback.txt locally for submissions in the directory
ass1:
$ staffeli upload ass1-template.yml ass1 --write-local
A (minimal) template could look like:
name: Mini assignment
tasks:
  - overall:
      title: Overall
      points: 6
      rubric: |
        Some default feedback.
        Your code and report are unreadable.
        Wow, that's really clever.
The template files support a few optional fields:
- passing-points: N. Adding this field has the effect that the grade posted is 1 if the total sum of points is greater than or equal to passing-points, and 0 otherwise (see the sketch after this list).
- show-points: BOOL. Setting show-points to false will exclude the points/grade from the generated feedback.txt files. Use this if you do not want the students to see the points per task, but only receive an overall grade.
- onlineTA: ADDR. Include this field to (attempt to) run onlineTA at address ADDR for each submission when downloading submissions.
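To make the passing-points rule concrete, here is a small Python sketch of the computation described above; it is not Staffeli's actual code, and the task names and awarded points are made up:

# Passing-points rule: the posted grade is 1 if the summed points reach the threshold, else 0.
passing_points = 42
awarded = {"task1": 20, "task2": 15, "task3": 10}  # hypothetical points given in a grade.yml

total = sum(awarded.values())                # 45
grade = 1 if total >= passing_points else 0  # 45 >= 42, so the posted grade is 1
print(f"total={total}, grade={grade}")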
A template using these optional fields could look like:
name: Mega assignment
passing-points: 42
show-points: false
onlineTA: https://yeah-this-is-not-a-real-address.dk/grade/megaassignment
tasks:
  - megaAssignmentGeneral:
      title: Mega assignment - General comments and adherence to hand-in format requirements
      points: 100
      rubric: |
        [*] You should spell check your assignments before handing them in
        [-] You are using the charset iso-8859-1. Please move to the modern age.
        [-] Your zip-file contains a lot of junk. Please be aware of what you hand in.
  - megaAssignmentTask1:
      title: Task 1
      points: 2
      rubric: |
        [+] Your implementation follows the API
        [-] Your implementation does not follow the API
        [+] Your tests are brilliant
        [-] Your tests are not tests, just print-statements.
            This is equivalent to an exam without an examiner, where you shout
            in a room for half an hour and give yourself the grade 12.
  - megaAssignmentTask2:
      title: Task 2
      points: 2
      rubric: |
        [+] Very good points.
        [+] Very good points. However, I disagree with ...
        [-] I fail to comprehend your answer to this task.
  - megaAssignmentBonusTask:
      title: Bonus tasks that do not give points, or another option for general comments
      rubric: |
        [*] You did extra work! It won't help you though.
If you want to contribute to staffeli_nt or run type checking locally:
Clone the repository and sync with dev dependencies:
git clone https://github.com/kfl/staffeli_nt.git
cd staffeli_nt
uv sync --extra dev
This installs the package along with development tools like mypy, pyright, ruff, and type stubs.
To run mypy type checking:
uv run mypy -p staffeli_nt
To run pyright type checking:
uv run pyright staffeli_nt
To format code with ruff:
uv run ruff format staffeli_nt/
To check for linting issues:
uv run ruff check staffeli_nt/
To auto-fix linting issues:
uv run ruff check --fix staffeli_nt/
After making code changes, you can test them locally by reinstalling:
uv tool install . --force --reinstall
Or, for development, you can use uv run to run the tool directly from the repository without installing:
uv run staffeli download <course_id> <template.yaml> <assignment-dir>