
A Homework on the Web System

February 4, 2011

In the early 2000s, frustrated with the behavior of most computer-based homework systems on the market, my advisor, Bradley Lucier, decided to take matters into his own hands and, with the help of a couple of students, developed an amazing tool. It generated a wide variety of problems in Algebra and Trigonometry: a single problem model had enough variations that no two students would encounter the same exercise in their sessions. It allowed students to input exact answers, rather than mere calculator approximations, and it accepted an answer written in any legal form. In case of an error, the system would occasionally indicate where the mistake was made.

It was solid, elegant, fast; working on this project was sheer delight. The most amazing part of it all: it took only one graduate student at a time to write the code for the problems and to check the validity of answers. Only two graduate students worked on the coding of this project, with the assistance of several instructors and Brad himself. He wrote a fun article explaining how the project came to life, enumerating the details that made it so solid, and showing statistical evidence that students working in this environment benefited more than with traditional methods of evaluation and grading. You can access that article [here], or continue reading below.

SAGE: a Homework on the Web System*

by Brad Lucier

*SAGE = Student Assignments Graded Electronically

A desire to help students succeed in their math courses resulted in the development of a unique system for doing
homework on the web.

Each year, the Department of Mathematics teaches precalculus (algebra and trigonometry) to thousands of students. While most of these students have studied these subjects in high school, very few can claim mastery of the material upon entering Purdue. Their backgrounds vary: for some, high school algebra was their terminal math course, while for others, trigonometry was studied as a prerequisite for calculus.

In many cases, an adverse experience in a previous math class, and/or a general lack of motivation to “re-learn” material previously studied in high school, hinders our precalculus students. As a result, the fraction who either withdraw from algebra and trig or who earn a grade of D or F—a number called the W/D/F rate—is frequently high. For example, prior to the year 2000, the W/D/F rate for MA 154 (trigonometry) was at times over 60%.

In fall 2000, the Mathematics Department decided to investigate whether homework-on-the-web systems could improve the experience of our precalculus students. An examination of the offerings of several textbook publishers and independent software developers revealed that none of the systems available at the time was suitable for general use. Realizing the department would have to develop its own homework system, we decided to begin with trigonometry course material, and I undertook the development of a system based on the following principles:

  1. the system should be correct: it should accept all correct answers and reject all incorrect answers.
  2. the system should be flexible: all possible questions should be posable and checkable within the system.

To anyone not familiar with the field, the two principles seem self-evident—how could any system be developed that did not satisfy these principles? Nevertheless, here are some problems found in commercial systems in 2000.

  1. A problem in one publisher’s system was posed as follows:
    The value of \sin \tfrac{\pi}{4} is:
    (A)~.707107\qquad (B)~1.414214\qquad (C)~.5\qquad (D)~2

    Here the choices do not include the true answer \sqrt{2}/2, which can be input as sqrt(2)/2, although (A) is an approximation to the answer. This can be corrected by simply changing the problem to

    “The value of \sin \tfrac{\pi}{4} to six decimal places is…”

  2. Often systems will accept correct answers in one form but not in an equivalent form. For example, one might find that sqrt(2)/2 is accepted as the value of \sin \tfrac{\pi}{4}, but sqrt(1/2) is not. This particular example is often caught, but sometimes there are infinitely many correct answers. For example, one may ask:

    For which values of x is \sin^2 x = \tfrac{1}{2}? Use n to indicate an integer.

    While the natural answer to this problem is \tfrac{\pi}{4}+\tfrac{\pi}{2}n, infinitely many other answers, including, e.g., \tfrac{\pi}{4}-\tfrac{\pi}{2}n and \tfrac{3\pi}{4}+\tfrac{n\pi}{2}, are also correct. So pattern-matching against a few expected answers is not enough; some level of symbolic manipulation is necessary in the system.

  3. One commercial system we tested accepted \sqrt{1-\cos^2 x} as an answer to a question where the true answer was \sin x. A moment’s reflection shows that this answer could not possibly be correct—\sin x is positive half the time and negative the other half, yet \sqrt{1-\cos^2 x} is always positive.

    How could this system say that these two functions were equal? It tests two functions for equality by evaluating them at a number of random points in an interval, and the default interval is [0,1]. For x \in [0,1], both \sin x and \sqrt{1-\cos^2 x} take the same values. This makes the mistake understandable, perhaps, but it is still an error.
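The remedy the article points toward is a symbolic equivalence test over the whole domain rather than sampling on a default interval. A minimal sketch of that idea in Python with sympy (an illustration only; the actual system used Scheme with Maple as a back end, and the function name here is made up):

```python
import sympy as sp

x = sp.symbols('x', real=True)

def equivalent(expr_a, expr_b):
    """Accept an answer iff it is symbolically equal to the truth,
    over the whole real line rather than a sampled interval."""
    return sp.simplify(expr_a - expr_b) == 0

# Equivalent exact forms of the same value are both accepted:
print(equivalent(sp.sqrt(2)/2, sp.sqrt(sp.Rational(1, 2))))   # True

# sqrt(1 - cos^2 x) is NOT sin x (the former is always non-negative),
# and the symbolic check rejects it where sampling on [0, 1] would not:
print(equivalent(sp.sqrt(1 - sp.cos(x)**2), sp.sin(x)))       # False

# A single sample outside [0, 1] already exposes the difference:
print(float(sp.sin(x).subs(x, -1)))   # about -0.841, negative
```

Sampling-based checkers can be salvaged by widening and randomizing the test interval, but only a symbolic comparison settles the question for all x.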

Thus, a successful system needs to use floating-point arithmetic to test whether approximate answers are within a certain tolerance, rational arithmetic for exact answers (\tan^2 \tfrac{\pi}{6} = \tfrac{1}{3}, not .333333333), and symbolic processing for the rest. Furthermore, students are used to treating decimal numbers as exact; in most programming languages, for example, .1 is automatically transformed into the floating-point number nearest to .1, which is

3602879701896397/36028797018963968 = .100000000000000005551115123125\dotsc

Students expect it to be treated as \tfrac{1}{10} exactly.
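This exact-decimal behavior can be sketched in Python with the standard-library `fractions` module (an illustration only; the actual system implemented this in Scheme, which has exact rationals built in):

```python
from fractions import Fraction

# Parsing the student's ".1" as a decimal *string* keeps it exact:
exact = Fraction(".1")
print(exact)                          # 1/10

# Going through a float first silently substitutes the nearest binary
# floating-point number, which is not 1/10:
inexact = Fraction(0.1)
print(inexact)                        # 3602879701896397/36028797018963968
print(inexact == Fraction(1, 10))     # False
```

The practical lesson is the same in any language: student input must be parsed as text into an exact rational, never routed through the machine's floating-point representation first.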

So much for correctness. To maintain flexibility, I decided to use a general-purpose programming language both to describe the problems and to check the correctness of the answers. This would give the maximum power and flexibility in designing problems, at the cost of extra complexity. In particular, it would not be reasonable to expect the lecturers teaching the courses to write code to define or check the answers to problems.

A further decision was to use the system mainly as an aid to understanding rather than as a way to assign grades. Students are offered unlimited attempts to answer each question. A student’s score on a question is recorded on the due date for that question, but a student can go back later in the semester and attempt questions that were not completed earlier. The parameters of each problem are chosen randomly, sometimes from a small list (there are only so many “nice” multiples of \pi), sometimes from a larger set. So each form of a specific problem is given to a number of students, but each student has a unique homework assignment for the course as a whole.
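The per-student randomization described above can be sketched as follows. This is a hypothetical reconstruction in Python (the real system was written in Scheme, and the names and angle list here are assumptions for illustration):

```python
import random
from fractions import Fraction

# A small list of "nice" multiples of pi to draw angles from (an assumed
# stand-in for one problem template's parameter set):
NICE_MULTIPLES = [Fraction(1, 6), Fraction(1, 4), Fraction(1, 3), Fraction(1, 2)]

def make_problem(student_id, problem_id):
    """Deterministically pick a variant per (student, problem), so a student
    always sees the same exercise while assignments differ across students."""
    rng = random.Random(hash((student_id, problem_id)))
    k = rng.choice(NICE_MULTIPLES)
    return f"Compute sin({k} * pi) exactly."

# The same student always gets the same variant on every visit:
print(make_problem(1001, 7) == make_problem(1001, 7))   # True
```

Seeding a generator from the (student, problem) pair makes each variant reproducible across sessions without storing it, while the assignment as a whole still differs from student to student.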

These design considerations led to the decision to use the programming language Scheme, a dialect of Lisp, to implement the main part of the system, and to use Maple as a back end to provide the symbolic manipulation capabilities that were needed. Scheme has exact rational arithmetic as well as floating-point arithmetic, and the system can be programmed so that, e.g., .1 is interpreted as \tfrac{1}{10} exactly.

Of course, building the system was a bigger task than first anticipated. I first collaborated with graduate student Jeremy TerBush for several years to build the main part of the system and later worked with another graduate student, Francisco Blanco-Silva, to finish it. Others involved in the project included Rita Saerens, supervisor of all teaching assistants; Fabio Milner, chair of the elementary services committee; Devi Nichols, coordinator of elementary services courses; and Tim Delworth, coordinator and main lecturer for MA 154.

Delworth showed an incredible amount of enthusiasm, support, and patience while the system was being developed. Each year a new group of students has to be taught that, yes, the answer the system expects is the right answer, since any deviation from correctness is treated as a design flaw and is fixed as it is uncovered. At first Delworth answered all questions about the homework system himself, a tremendous undertaking when 800 students are using it each spring semester. Later, a bulletin board system was established and an undergraduate “answer person” hired, so that common questions could be answered by other students or by the undergraduate “expert”.

The results have been good, even surprisingly good. The immediate effect was to halve the W/D/F rate to about 30%. Students get immediate feedback on the correctness of their homework answers, and most students who make any attempt at all to do the homework answer nearly all problems correctly in the end.

After two semesters of such success, the W/D/F rate started climbing again, while the number of students in the class stayed roughly the same. Normally, a higher success rate would lead to fewer students repeating the class, which would lead to lower enrollment. Since the enrollment has stayed the same, however, it appears that more students are being placed in the course by academic advisors. One suspects, therefore, that the greater success rate has led advisors to assign more marginally prepared students to the course. Even so, the W/D/F rate has remained significantly below the pre-2000 rate of over 60%.

Homework-on-the-web systems offered by publishers have improved since 2000. Perhaps a commercial system now exists that would satisfy our requirements. Recent attempts to extend the system to high school algebra have not met with success—we have not found a rigorous definition of the requirement to “simplify” an answer in algebra. But some attempts are still being made in this area, and perhaps a commercial system or a locally developed system will eventually be adopted for algebra courses.
