Sakai / SAK-40565

samigo: feature request: IF-AT questions (new scoring method with immediate feedback for MC)



    • Test Plan:

      Please add a Test Plan here.



      An instructor would like to create an IF-AT multiple choice assessment in Tests & Quizzes. A description of IF-AT assessments can be found here: http://www.epsteineducation.com/home/about/

      In IF-AT quizzes, students get immediate feedback on whether their answer is correct or incorrect, and they can change their answer until they get the question right. Each incorrect attempt reduces the score for that question, so students receive partial credit for questions they initially answered incorrectly. The quiz must also prevent students from changing their answer to a question once they have entered the correct answer.

      A potential workaround would be to develop SAK-34754, which would allow students to answer only the questions they got wrong during a second quiz attempt. The instructor could use Feedback on Submission to indicate which questions were wrong, or provide hints to help students arrive at the correct answers, and the students could then make a new attempt at the remaining questions. Taking the average score of all attempts would approximate the desired partial-credit scoring.
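The averaging workaround above can be illustrated with a small sketch (hypothetical helper, not existing Samigo code): a question missed on the first attempt scores 0, and a correct answer on the retake scores full credit, so the average works out to half credit, comparable to what IF-AT would award for a second-try correct answer.

```java
public class AttemptAveraging {
    // Average the per-attempt scores for a single question.
    // attemptScores: the score earned on that question in each quiz attempt.
    static double averagedScore(double... attemptScores) {
        double sum = 0;
        for (double s : attemptScores) {
            sum += s;
        }
        return sum / attemptScores.length;
    }

    public static void main(String[] args) {
        // Missed on the first attempt (0.0), correct on the retake (1.0):
        // the average is 0.5, approximating IF-AT half credit.
        System.out.println(averagedScore(0.0, 1.0));
    }
}
```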

      The instructor provided the following additional information regarding the IF-AT assessment format:

      I would like to recreate the process of taking these tests by paper as
      closely as possible. Students are given a handout with all the
      questions on it along with a "scratchers" sheet. These sheets are
      available from the website you reference in your email. They function
      similarly to lottery tickets where you scratch off what you think is
      the correct answer and there is a star randomly placed in the
      rectangle if you guessed correctly, and it's blank if not.

      If this were computerized it would be MUCH more straightforward than
      having to purchase these specialized scratchers and then match up the
      questions with the pre-defined order on the scratchers. That said, I
      definitely would want students to be able to see the entire test and
      not be forced to make an answer to move on to the next problem. I
      would say the most natural way to do this is to have to manually
      register an answer choice for each problem, receiving immediate
      feedback as to whether it was correct, and keeping a tally of how many
      attempts they made. You would want to make it so they couldn't
      continue making guesses once they got the answer correct. It would
      also be nice to keep a tally of how many questions they have yet to
      complete (i.e., having found the correct answer). So, you would
      decouple moving around from question-to-question from actually
      registering a response. The other cool thing is that they leave
      knowing the score they got on the exam (which could obviously change
      if the teacher decides to drop a question or curve the final values).

      The scoring would need to be customizable. For example, in my classes
      I typically use questions with 4 answers. If they get the answer
      correct on one try, they get full credit; if they get the answer
      correct on two tries, they get half credit; if they get the answer
      correct on three tries, they get one quarter credit. Finally, if they
      only revealed the answer on their 4th try, they get zero points, but
      they at least learn what the correct answer is. Thus, the scoring is
      not a linear function of the number of tries.

      As someone who studies memory, I know there is a significant
      improvement in long-term retention when students learn during the test
      itself, so from an educational standpoint this is a major improvement
      over almost all standard testing approaches.
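The nonlinear scoring scheme the instructor describes (full credit on the first try, half on the second, one quarter on the third, zero when the answer is only revealed on the final try) could be sketched as follows. This is a hypothetical illustration, not existing Samigo code; the method and class names are invented for the example.

```java
public class IfatScoring {
    // Fraction of full credit earned for an IF-AT question.
    // attempts: the try on which the student found the correct answer (1-based).
    // choices:  the number of answer options on the question.
    static double creditFraction(int attempts, int choices) {
        if (attempts < 1 || attempts > choices) {
            throw new IllegalArgumentException(
                "attempts must be between 1 and the number of choices");
        }
        if (attempts == choices) {
            return 0.0; // answer only revealed on the final try: no credit
        }
        return Math.pow(0.5, attempts - 1); // 1, 1/2, 1/4, ...
    }

    public static void main(String[] args) {
        // For a 4-choice question: 1.00, 0.50, 0.25, 0.00
        for (int tries = 1; tries <= 4; tries++) {
            System.out.printf("%d tries -> %.2f credit%n",
                tries, creditFraction(tries, 4));
        }
    }
}
```

Since the instructor asks for customizable scoring, a real implementation would likely store the per-attempt credit fractions as an instructor-editable list rather than hard-coding the halving rule.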

                  maintenanceteam Core Team
                  rainribbon Tiffany Stull