SAT Coaching Found to Boost Scores — Barely
Study Results Run Counter to Test-Prep Course Claims; How Colleges Fuel Industry
May 20, 2009
(Please see corrections and amplifications item below.)
Families can spend thousands of dollars on coaching to help college-bound students boost their SAT scores. But a new report finds that these test-preparation courses aren’t as beneficial as consumers are led to believe.
The report, to be released Wednesday by the National Association for College Admission Counseling, criticizes common test-prep-industry marketing practices, including promises of big score gains with no hard data to back up such claims. The report also finds fault with the frequent use of mock SAT tests because they can be devised to inflate score gains when students take the actual SAT. The association represents 11,000 college admissions officers, high-school guidance counselors and private advisors.
“It breaks my heart to see families who can’t afford it spending money they desperately need on test prep when no evidence would indicate that this is money well-spent,” says William Fitzsimmons, Harvard University’s dean of undergraduate admissions, who led a group at the college admissions association that prompted the report.
Jonah Varon, a straight-A student at Lowell High School in San Francisco, took a mock SAT from a test-prep company last year and scored 2060 out of a possible 2400. A few weeks later, with no tutoring, he took the real test. His score: a perfect 2400, or 340 points higher.
Mr. Varon, who is headed to Harvard in the fall, was suspicious. The coaching company, Revolution Prep, of Santa Monica, Calif., says its mock tests are calibrated to be at the same difficulty level as the real SAT. So why had its mock test seemed so much harder to him?
After gathering test scores from 15 classmates who had had similar experiences, Mr. Varon and classmate Lydia O’Connor wrote an article for their school newspaper claiming that the mock test was far more difficult — or was scored more harshly — than the actual exam to make Revolution Prep appear to be raising test scores more than it actually does.
“It seems like dishonest advertising,” Mr. Varon says.
Revolution Prep says that the experiences of Mr. Varon and several of his classmates were “outliers,” and that surveys of students at Lowell High School generally show high satisfaction with the test-coaching company’s results.
Scores of coaching companies, including Washington Post Co.’s Kaplan unit and Princeton Review Inc., the two largest players, help prepare students each year to take the SAT, used by many colleges to help make admission decisions. Companies typically charge $1,100 for a class and $100 to $200 an hour for individual tutoring, the college admissions counselors’ report says. In total, about two million students spend $2.5 billion a year on test preparation and tutoring, including the SAT, according to Eduventures Inc., a Boston research and consulting firm.
Examining Test Prep
A new report says claims by SAT-prep firms may be inflated, raising questions about costly coaching.
- Studies find test prep boosts average SAT score by just 30 points.
- Critics say firms’ mock tests may be harder than actual exams, inflating score gains.
- At some colleges, even small score gains can help with admission.
The college counselors’ report concludes that, on average, prep courses yield only a modest benefit, “contrary to the claims made by many test-preparation providers.” It found that SAT coaching resulted in about 30 points in score improvement on the SAT, out of a possible 1600, and less than one point out of a possible 36 on the ACT, the other main college-entrance exam, says Derek Briggs, chairman of the research and methodology department at the University of Colorado in Boulder and author of the admissions counselors’ report.
The report was prepared by reviewing numerous academic studies from past years that examined the impact of test preparation on SAT scores. The studies predated the addition of the writing section of the SAT in 2005, which increased the possible score total to 2400 from 1600.
The report also noted that some college-admissions officers indirectly encourage applicants to sign up for SAT-prep courses by setting score cutoffs. A survey included in the report found that more than a third of schools with tight selection criteria said that an increase of just 20 points in the math section of the SAT, and of 10 points in the critical reading section, would “significantly improve students’ likelihood of admission.”
The nonprofit College Board, which oversees the SAT, is critical of colleges that select applicants based on small score differences that aren’t statistically significant. Laurence Bunin, a College Board senior vice president, says the board’s own research shows limited benefit from test-prep courses. He says familiarity with the SAT tends to provide the biggest short-term gains for students. He recommends free and low-cost College Board materials, including a $20 study guide.
Test-prep companies say that some students see substantial gains in their SAT scores as a result of coaching, even if studies show that average test-score improvements are limited. For example, Kaplan cites two of its former students, Lily and Emma Shepard, twin sisters who are seniors at Montclair Kimberley Academy in New Jersey. Kaplan says Emma increased her SAT score, compared with an initial diagnostic test, by 450 points to 2210, while Lily’s score rose 330 points, to 2190. The family paid $4,000 to Kaplan for a tutor to come to their home. “I learned new material as well as test-taking tricks,” says Lily, who will be attending Duke University next year. Emma is going to Georgetown University.
The sisters’ gains were smaller when compared with their scores on the Preliminary SAT, or PSAT, which the College Board says is a good predictor of SAT scores. In that comparison, Lily’s score improved 110 points, and Emma’s rose 300 points.
Kaplan officials say they take pains to make their diagnostic test similar to the real SAT. Seppy Basili, senior vice president at Kaplan, says that the PSAT doesn’t include higher-level algebra, while the SAT does, so some students score lower on the real test. In addition, he said, Lily and Emma skipped many questions on the diagnostic test, which could explain the different scores.
Some test-prep companies acknowledge there is nothing to hold them accountable for score-gain promises. “The industry is not regulated,” says Paul Kanarek, a senior vice president with Princeton Review. “It is sort of the wild, wild West.”
Kaplan and Princeton Review say they make no claims about any specific average point increases, calling that practice inherently misleading because it is difficult to collect accurate data.
Revolution Prep offers a “score improvement guarantee” of 200 points for students taking its coaching courses. But co-founders Ramit Varma and Jake Neuberg say the guarantee doesn’t mean that all students will increase their scores by that much. If students don’t achieve a 200-point gain, they are entitled to a free repeat of the course, they say.
Revolution and other test-prep companies say they use their own diagnostic tests for baseline comparisons because the College Board publishes only eight practice tests — also simulations — in its official SAT guide, and many students have already taken them. In the past, the board published actual SATs from previous administrations of the exam, but discontinued that practice in 2005 when the writing section was added. The College Board says it will begin including three actual tests this summer in the new edition of its SAT guide, along with seven simulated tests.
In Newton, Mass., Summit Educational Group Inc. says its “proven score increases on the SAT are 180 to 400 points.” Chief Executive Charles O’Hearn says those figures are based on improvement only from real PSATs or SATs, not diagnostic tests. Still, he says the figures are based on surveys to which fewer than half of students respond. “I wouldn’t say there isn’t an element of marketing in this,” he says.
On its Web site, Elite Educational Institute Inc., of Irvine, Calif., advertises a 240-point average increase in SAT scores, calculating it in comparison with its own diagnostic exam. After an inquiry from a reporter, the company says it plans to take the claim, which it says was based on the SAT before the addition of the writing section, off its Web site. “Any test-prep company that gives you their own test with their own score scale could be accused of fudging the numbers to make students think they improved more than they really had,” Kevin Sung, Elite’s chief operating officer, said through a spokeswoman.
Write to John Hechinger at email@example.com
Corrections & Amplifications:
Lydia O’Connor, a senior at Lowell High School in San Francisco, co-wrote an article in the school newspaper that questioned the practices of an SAT test-preparation firm. A previous version of this story named only Ms. O’Connor’s collaborator as author of the article.