Grading Exams

2024-01-23

This vignette describes how to grade an exam using the script found in the exec/ folder of the package. You can find where that folder is located by typing the following in an R terminal:

    system.file("exec", package = "TexExamRandomizer")
## [1] "/private/var/folders/f9/gx8nkt0j6kgcwslqj681v8jw0000gp/T/Rtmp3qkaKd/Rinst10a993d21e2d0/TexExamRandomizer/exec"
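
The exec/ folder holds the command-line scripts shipped with the package; the grading script used throughout this vignette is called gradeexamrandomizer. A minimal sketch to check which scripts are available in your installation:

    ## List the scripts shipped with the package;
    ## "gradeexamrandomizer" (used below) should be among them.
    list.files(system.file("exec", package = "TexExamRandomizer"))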
Format of the student’s answers

When you collect the responses from the students, the responses file should follow these prescriptions (a short R sketch of a compliant file follows the list):

  1. The actual answer to each question should be an integer number (1 being the first choice). The answers should be written in the order in which the questions appear in the student’s exam, and the columns should be called “Question 1”, “Question 2”, etc. (or “Q 1”, “Q2”, etc.).
  2. If you want to give extra points, add a column called “Extra Points” with the added points.
  3. A column called “Version” is required, where the version number of each exam is written.
  4. All other columns will be placed in the output “_Graded.csv” table.
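
As an illustration of these prescriptions, the following sketch builds a tiny responses table in R and writes it to a CSV file. The class code, student names, answers and file name are made up for the example; only the column naming matters.

    ## Sketch only: two students, two questions, following the rules above.
    responses <- data.frame(
      Class = c("M601", "M601"),
      Name = c("Titar", "Suwannapoe"),
      Version = c(1, 3),
      "Question 1" = c(3, 2),
      "Question 2" = c(4, 3),
      "Extra Points" = c(5, 9),
      check.names = FALSE  # keep the spaces in the column names
    )
    write.csv(responses, "responses_M601.csv", row.names = FALSE)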
Example responses format:

M601 21 Titar 1 3 4 4 1 1 1 1 3 4 1 3 4 1 2 4 5
M601 1 Suwannapoe 3 2 3 1 2 3 3 2 4 4 2 3 3 2 2 3 9
M601 5 Kan 3 3 3 4 2 1 3 4 2 1 1 3 4 2 3 3 6
M601 16 Mei 16 1 1 2 4 1 2 3 2 1 3 1 3 2 3 1 9
M601 17 Offside 6 2 4 4 1 3 2 1 2 3 3 2 1 1 2 3 3

Format of the answer sheet

When you create the exams using the examrandomizer script, it generates a fullanswersheet.csv file. When grading, the script assumes that the “wrongTag” and the “correctTag” are “choice” and “CorrectChoice”, respectively.

If that is not the case in your exam structure, simply rename those columns to “choice” and “CorrectChoice”.
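
If you need to do that renaming, here is a quick sketch from R, assuming your answer sheet file is called fullanswersheet.csv and your original tag names were, say, “item” and “CorrectItem” (both hypothetical):

    ## Sketch: rename the tag columns to the names the grading script expects.
    sheet <- read.csv("fullanswersheet.csv", check.names = FALSE)
    names(sheet)[names(sheet) == "item"] <- "choice"                # hypothetical wrong-answer tag
    names(sheet)[names(sheet) == "CorrectItem"] <- "CorrectChoice"  # hypothetical correct-answer tag
    write.csv(sheet, "fullanswersheet.csv", row.names = FALSE)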

Answer sheet, the last two columns must be named “choice” and “CorrectChoice”:

0 1 1 1 1 1 1 1 1 1 NA
0 2 1 1 1 1 1 2 1 2 NA
0 3 1 1 1 1 1 3 1 3 NA
0 4 1 1 1 1 1 4 1 NA 4
0 5 1 2 1 2 1 1 1 1 NA

(But if you really want to change that assumption, you can open the script and edit ASHEET_COLCORRECT and ASHEET_COLINCORRECT.)

What if I wrote some questions wrong and I noticed too late? I already printed the exams

If you realize that a question is incorrectly written but it is too late to rewrite the exam, find the original answer sheet in fullanswersheet.csv (the rows with version number 0). Then remove the lines from the answer sheet that refer to the question you want to ignore (keep a backup of the answer sheet, just in case).

When the students take the exam, tell them to write any answer for that question; it will be ignored by the program.
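
As a sketch of that edit done from R rather than by hand (the “Question” column name is hypothetical; use whatever column identifies the question in your fullanswersheet.csv):

    ## Sketch: keep a backup, then drop every line that refers to the
    ## question you want to ignore (here, hypothetically, question 4).
    sheet <- read.csv("fullanswersheet.csv", check.names = FALSE)
    write.csv(sheet, "fullanswersheet_backup.csv", row.names = FALSE)
    sheet <- sheet[sheet[["Question"]] != 4, ]
    write.csv(sheet, "fullanswersheet.csv", row.names = FALSE)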

Grading the exam

You can directly specify the file with the students’ responses (--resp) and the answer sheet (--answer):

gradeexamrandomizer --resp <student responses csv> --answer <fullanswersheet csv>

If both files are in the same directory and you make sure their names are similar to “responses*.csv” and “answer*.csv”, you can use the shorthand version:

gradeexamrandomizer --dir <dirname>

The output will be two CSV files, called *_Graded.csv and *_Stats.csv, placed in the same directory as the students’ responses file.
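
If you prefer to stay inside R, here is a sketch of the equivalent call via Rscript, assuming the script file is named gradeexamrandomizer (as listed in the exec/ folder above) and reusing the hypothetical file names from earlier:

    ## Sketch: run the grading script from R instead of a shell.
    script <- file.path(system.file("exec", package = "TexExamRandomizer"),
                        "gradeexamrandomizer")
    system2("Rscript", c(script,
                         "--resp", "responses_M601.csv",
                         "--answer", "fullanswersheet.csv"))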

Output graded file:

M601 21 Titar 1 0 0 14 14 100.00000
M601 1 Suwannapoe 3 0 0 14 10 71.42857
M601 5 Kan 3 0 0 15 5 33.33333
M601 16 Mei 16 0 0 15 8 53.33333
M601 17 Offside 6 0 0 14 6 42.85714

Output stats (it simply counts how many times each choice was chosen across all students):

0 1 1 1 1 1 1 1 1 1 NA 1
0 2 1 1 1 1 1 2 1 2 NA 1
0 3 1 1 1 1 1 3 1 3 NA 5
0 4 1 1 1 1 1 4 1 NA 4 18
0 5 1 2 1 2 1 1 1 1 NA 1

How is the grade calculated

In the output *_Graded.csv table, as you can see in the example above, there will be 5 columns added to the output.
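
From the example output, the last three of those added columns behave like the maximum attainable grade, the grade obtained, and the corresponding percentage. A quick sanity check against the Suwannapoe row (10 points out of a maximum of 14):

    round(10 / 14 * 100, 5)
    ## [1] 71.42857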

