The Indian Express reports that new Canadian immigration rules will likely mean the end of the so-called “IELTS Wedding.”

When I summarized the changes a few days ago I didn’t even mention that spouses of most students will no longer be eligible for an open work permit in Canada. I guess that change is more meaningful than I initially thought.

As regular readers know, a play about studying for the TOEFL won the Pulitzer Prize for drama in 2023. Perhaps we can expect a romantic comedy about the IELTS in the years ahead.

Great article this week in “Educational Measurement” about remote proctoring.

The abstract says:

“The main role of remote proctors is to make judgments about test takers’ behaviors and decide whether these behaviors constitute rule violations. Variability in proctor decision making, or the degree to which humans/proctors make different decisions about the same test-taking behaviors, can be problematic for both test takers and test users.”

No kidding.  This has driven me bonkers since 2020.

And also:

“Our results show that (1) proctors systematically differ in their decision making and (2) these differences are trait-like (i.e., ranging from lenient to strict), but (3) systematic variability in decisions can be reduced.”

The reports I’ve received from test-takers over the years suggest that proctoring done at a wide scale is sometimes inconsistent. That inconsistency has negatively affected many of the test-takers I’ve communicated with.

Says the article:

“Taken together, a lack of proctor training and incomplete information about test-taking behaviors provide the foundation upon which more extraneous factors can influence proctors’ decisions. In fact, a considerable amount of research has demonstrated that human decision making is highly variable and due to a variety of idiosyncratic factors.”

It concludes that:

“Reducing variability in proctor decision making not only improves test takers’ outcomes but also strengthens test security: honest, well-intentioned test takers are more likely to receive certified test scores, and dishonest test takers are less likely to receive certified scores. Test users (e.g., university admissions) can also have greater confidence that certified test results are attributable to test takers’ abilities (e.g., ability to use the English language in an academic setting) when proctor decision making is more consistent.”

Alas, after four years of what I view as sub-par remote proctoring, I fear that many large testing firms don’t feel any urgency to improve this aspect of test-taker experience.

ETS has acquired PSI.  Check out the announcement on the ETS website. This is really great news for several reasons.

The most critical is that PSI operates the “Skills for English: SELT” test, which is accepted by the UK Home Office for visas. This sort of acceptance is critically important for language testing firms nowadays.  Getting the TOEFL back into the good graces of the Home Office is (I think) one of ETS’s top priorities, and this acquisition will certainly help.  A few other reasons:


  1. PSI has live online proctoring capabilities. In the short term, ETS really needs to bring all proctoring in-house, and this purchase may facilitate that.  I don’t think ETS can keep paying an arm and a leg for outsourced proctoring services that are… not fantastic.
  2. PSI has asynchronous proctoring capabilities. In the medium term, ETS really needs to phase out live online proctoring. This purchase may facilitate that.
  3. PSI does a lot of tests across a variety of industries. This is what ETS used to do, and ought to do more of in the future.

The College Board’s 990 form for the year ending December 2022 is now available via Propublica. Though this duplicates much of the stuff in the audit released in September, a few fun facts are worth mentioning:

  1. Revenue for the year was about 1.04 billion dollars (including 126 million of investment income).  That’s up from 983 million the year before.
  2. Expenses were about 894 million dollars.
  3. The College Board’s profit was a cool 145 million dollars.
  4. Total assets are valued at 2.01 billion dollars (up from 1.9 billion).

As far as I can tell, this was the first year in recent memory where the College Board has both higher profits and larger assets than ETS.  That’s significant when you consider how ETS came into existence and the traditional relationship between the two firms.

The top earner for the year was CEO David Coleman, who took home about 2.1 million dollars.  President James Singer earned 1.8 million dollars.

The College Board paid ETS 316 million dollars for work on the Board’s behalf.  They paid Pearson 25 million.

The AP program brought in revenues of 493 million dollars (up from 466 million).

The SAT program brought in revenues of 289 million dollars (up from 280 million).

The College & Career Opportunities & Enrollment scheme brought in revenues of 106 million dollars (down from 120 million).

For the year ending 2019, the SAT program brought in revenues of more than 400 million dollars.  This seems to be a test that is in decline.  And the AP program in 2019?  490 million.  Go figure.

Investments valued at 217 million dollars are located in “Central America and the Caribbean.”

The government of Canada has lowered the boom on international students.  A few things are worth mentioning:

  1. The number of new study permits issued will be reduced by 35% and capped at that level for two years.
  2. This reduction will be achieved, in part, by reducing the number of study permits issued for Ontario-bound students by 50%.
  3. Students attending so-called “public-private partnership” schools will no longer be eligible for post-graduation work permits.  This means the schools will lose their whole raison d’être.  It also means Canada will be a much less attractive destination for international students in general.  The significance of this change cannot be overstated.  To me, it is more significant than the aforementioned cap.

Many students are wondering which schools are public-private partnerships.  Here’s a list of the ones I am aware of:

  • Pures College, Toronto
  • Alpha College, Toronto
  • Canadore@Stanford, Toronto
  • Lambton College, Mississauga
  • Cestar College, Toronto
  • triOS College, Various
  • Toronto Business College, Mississauga
  • Toronto School of Management, Toronto
  • Hanson College, Various
  • Ace Acumen Academy, Various

As for how this will affect the business of language testing?  By my math, testing companies (mostly IDP and Pearson) will lose out on about $27 million USD in test registration fees (130 thousand tests @ $210 per test).  Not to mention all the extra stuff they profit from.  That’s significant.  IDP’s share price is down 9% in early trading (as I write this).
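My math here is simple enough to check. A quick sketch, using my own rough figures from above (both inputs are estimates, not official numbers):

```python
# Back-of-envelope estimate of lost test-registration revenue after the
# study-permit cap. Both inputs are rough estimates from this post.
TESTS_LOST = 130_000   # approximate number of forgone test sittings
FEE_USD = 210          # approximate registration fee per test, in USD

lost_revenue = TESTS_LOST * FEE_USD
print(f"Estimated lost fees: ${lost_revenue / 1e6:.1f} million USD")
# → Estimated lost fees: $27.3 million USD
```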

A bigger loser over the long term might be Canada’s favorite unicorn, ApplyBoard, which has achieved a valuation of about $4 billion thanks to its expertise in student recruitment.  Note that ETS is heavily invested in that firm through its private equity arm, ETS Strategic Capital.

According to this report, the IELTS test was taken “more than 4 million” times in 2023.

Not sure how that figure is broken down between IDP and the British Council, but IDP’s most recent fiscal report says that they delivered 1.9 million IELTS tests for the year ending June 2023. British Council’s most recent corporate report says that they delivered 1.6 million tests for the 2021-22 fiscal year, but they have since ceded their Indian operations to IDP.

How does this figure compare to the past? I found a press release from 2018 saying that the IELTS was taken 3.5 million times that year.

How does this figure compare to other tests? ETS boss Amit Sevak recently noted on a podcast that the TOEFL is taken about a million times a year. Meanwhile, the PTE was taken about 600,000 times in the first six months of 2023, according to Pearson’s most recent figures. Finally, I’ve estimated that the Duolingo English Test was taken about 650,000 times for the year ending Q3 of 2023… but that’s just an educated guess.

Here are possible first steps toward a more customer-friendly and justice-oriented approach to remote proctoring.

First: An individual in a leadership position outside of the office responsible for testing integrity might draw up a list of the most common reasons for test and score cancellation.  Off the top of my head, they might be:

  • Direct violation of rules by the test-taker (looking away from the camera, speaking out loud during the test, etc.).
  • Detection of unauthorized software running on the test-taker’s computer.
  • Suspected plagiarism.
  • Statistical anomalies (score increased too much since the previous attempt, large score differences between skills, etc.).
  • Inability to pass a pre-test system check.

Second:  Request files for tests which were canceled for each of the above reasons.  Start with ten instances of each.  Ensure that the files are selected at random from all tests taken within the past year.
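The random selection in this step could be sketched in code. A minimal sketch, assuming a hypothetical record format for cancellation files (real case data would come from the testing firm’s own systems):

```python
# Draw ten randomly selected cancellation files per cancellation reason.
import random
from collections import defaultdict

REASONS = [
    "rule_violation",         # direct violation of rules by the test-taker
    "unauthorized_software",  # flagged software on the test-taker's computer
    "suspected_plagiarism",
    "statistical_anomaly",
    "failed_system_check",
]

def sample_cancellations(cases, per_reason=10, seed=42):
    """Group canceled tests by reason, then randomly sample each group."""
    rng = random.Random(seed)  # fixed seed so the review set is reproducible
    by_reason = defaultdict(list)
    for case in cases:
        by_reason[case["reason"]].append(case)
    return {
        reason: rng.sample(group, min(per_reason, len(group)))
        for reason, group in by_reason.items()
    }

# Toy data: 100 hypothetical cancellations spread across the five reasons.
cases = [{"id": i, "reason": REASONS[i % 5]} for i in range(100)]
samples = sample_cancellations(cases)
```

In practice the pool would be restricted to tests taken within the past year before sampling, per the step above.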

Third:  Examine the evidence that justified the cancellation of each test.  Look for tangible evidence rather than statistical likelihoods of malfeasance.  This means that if a test was canceled due to suspected plagiarism, look at what was plagiarized.  Likewise, if unauthorized software was detected, look at which software was detected.

Fourth:  After examining each case, ask some questions.  Perhaps:

  • “Was the test-taker action that caused the cancellation malicious, or an accident? Should this make a difference in how we treat the test-taker?”
  • “Does the test-taker action warrant a cancellation with no refund, or does it warrant a cancellation with an option for a free retake?”
  • “Is it safe to share this evidence with the test-taker so that they can avoid committing the same violation in the future?”
  • “Could a reasonable person find grounds for an appeal of our decision?”
  • “Was our decision to cancel this score without an opportunity for a free retake compatible with our organization’s conception of justice and fairness?”

After reading Stanley Kaplan’s autobiography “Test Pilot” I became interested in discovering the exact location of the first Stanley H. Kaplan Educational Center. 

A few days of sleuthing revealed that Kaplan opened his first educational center at 1701 Quentin Road in Brooklyn. I found the address in a Brooklyn phone directory from 1965:

There it is – “KAPLAN STANLEY H tutorg.” The scan is a bit blurry, but thankfully Stanley paid extra for some added visibility!  The second address, 3931 Bedford Avenue, was Kaplan’s residence where he also taught classes.

Here’s how Kaplan describes his decision to expand beyond his Bedford Avenue basement and open his first center:

In the book, Kaplan refers to the center being on East Seventeenth Street rather than on Quentin Road.  That’s technically correct, as the building was on the corner of Quentin Road and East Seventeenth.  Here’s a look at the building in 2022, from Google Maps:


I don’t know precisely when Kaplan opened the center, but it was operating as early as 1961, when he placed this message in the yearbook of the Yeshiva University High School for Boys:


For what it is worth, the second Kaplan Educational Center was opened one block away at 1675 East 16th Street in Brooklyn.  Below is an advertisement placed in The Fordham Ram in 1969 which mentions that address.  Note that before this time Kaplan “eschewed direct advertising” and “found print advertising distasteful” (source).


The East 16th street location is described in “Test Pilot” thusly:

The text also indicates that he opened this location in 1967.  The location is likewise mentioned in “The Big Test” by Nicholas Lemann.

Here is a picture of the location in 2022.  At some point, the center moved two doors down to the old bank building at 1602 Kings Highway and operated from there until at least 2015, according to some Yelp Reviews.

So there ya have it.  The locations of the first two Stanley H. Kaplan Educational Centers.

As a special bonus, here’s a listing from a 1959 phone directory for Stanley Kaplan’s father, Julius Kaplan:

In “Test Pilot” Stanley Kaplan refers both to growing up on Avenue K and to his father’s work as a plumber. He also mentions that as a young man he taught his very first classes in the basement of that family home.


The independent audit of ETS for the year ending September 30, 2023 is now available from the Federal Audit Clearinghouse.  If you can’t figure out how to use that site just send me a note and I’ll pass along a PDF of the audit.  A few things are worth highlighting here:

  1. ETS had operating revenues of $1.024 billion for the year.  That’s lower than during the pandemic years.  I believe it is the lowest number reported since 2010.
  2. Operating expenses for the year were $1.109 billion.  That means the operating loss for the year was about $84.5 million.
  3. Total assets are valued at $1.811 billion.
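The operating loss follows directly from the first two figures. A quick arithmetic check using the rounded numbers above (the audit’s precise totals will differ slightly):

```python
# Rounded figures from the audit, in billions of USD.
revenue_b = 1.024
expenses_b = 1.109

loss_m = (expenses_b - revenue_b) * 1000  # convert billions to millions
print(f"Operating loss: about ${loss_m:.0f} million")
# The rounded inputs give roughly $85 million, consistent with the
# ~$84.5 million figure implied by the audit's more precise totals.
```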

As the audit notes, 30% of ETS’s revenue (about 300 million bucks) comes from a contract with one client. That client is the College Board.  According to the audit, the contract expires on June 30, 2024.  If anyone knows of any press coverage about the renewal of this contract (or lack thereof) I will be happy to read it.  It looks to me like both parties are cutting it pretty close.

ETS spent $8.4 million on media advertising in the year (up from 6.4 million the year before) and spent an additional $17.2 million on ad agencies (up from 16.4 million).

ETS received $69 million in federal expenditures for the year.  The projects are listed in the audit.

ETS paid $12.2 million to buy something.  I’m quite certain it was Wheebox.

ETS sold something for $10.0 million in a deal that closed in January.  I have no idea what it was. 

I want to draw attention to an article about remote proctoring in “Language Testing” by Daniel Isbell, Benjamin Kremmel and Jieun Kim. It’s the best article on this topic I’ve ever seen. Everyone involved in the development of standardized tests that use remote proctoring ought to read it.

I’ve written about remote proctoring issues here ad nauseam. I’ve tracked test-taker complaints about their experiences with remote proctors since 2020, and have shared many of them here. I’ve focused on complaints about tests being terminated at inopportune moments and scores being canceled for reasons that aren’t explained properly. Test-takers often complain that they aren’t provided with evidence of malfeasance and that they aren’t given opportunities to appeal these decisions. I’ve focused mostly on TOEFL test-takers, but people have complaints about other tests too.

I’ve thrown around words like “dignity” and “respect,” but the authors of this article have found an even better one:  “justice.” Here’s a relevant quote:

“When some test takers are accused of cheating, they might maintain that the accusation is false. False allegations of cheating could lead to more serious consequences than disregarding malpractice. To avoid such controversy, test providers should collect sufficient evidence of malpractice. Video and audio recordings of the test takers, computer screen, and testing environment can be recorded during remote proctoring. If malpractice did occur, test providers should be able to produce sufficient evidence. If malpractice cannot be established, or a test taker unintentionally committed a minor violation of test-taking rules, some leniency and a reattempt seem justified.”

This is a wonderful suggestion. Here’s another quote:

“The second warrant is that test takers and users have recourse to the test provider. When taking action against those suspected of malpractice, test providers should present test takers with solid evidence or reasons for score cancellation. Not knowing specifically what one has been accused of puts test takers at a disadvantage when making an appeal. Proactively providing test takers with reasonable criteria for score cancellation and comprehensible instruction for what is considered to be cheating would support both the fairness and justice of the test, for instance, specify the test takers’ right to voice their concerns about the testing process or the test results. For a test to be just, all test takers and test users should have easy access to clearly defined procedures for levying complaints or appealing malpractice-related decisions. Any test taker with low target language proficiency or disabilities, for example, should be able to understand the appeal procedure and make complaints to the test provider.”

I agree.

I’m not optimistic that testing firms will develop a justice-oriented mindset any time soon. But we’ll see.

The new issue of “Language Assessment Quarterly” is the best thing I’ve read this year. I’m certain that anyone who is interested in the stuff I write about in this space will enjoy it a lot. Best of all… the articles are all free for a few months, so click the “download” button while you can.

I’ll link here to the introductory editorial by Xiaoming Xi, which is absolutely fantastic work (in later posts I’ll link to some specific articles from the issue). Xi’s editorial is a detailed survey of the topics explored in the issue, including:

  1. The use of automated scoring in assessment. Xi argues that there is a need for more validity research in this area given the increased use of AI scoring in a variety of contexts. She also touches on when it is appropriate to use a “black box approach” to AI scoring. I’ve written here about my concern that advocates of AI scoring sometimes use a “just trust us, bro” approach instead of publishing detailed documentation of how the scores are actually generated. Is this appropriate?
  2. Automated feedback. This is a really key issue for people interested in test prep. As I’ve mentioned many times, test prep is a more self-guided journey than ever before. Students are increasingly able to prep for tests on their own instead of relying on costly tutoring, thanks to new tools that make use of AI. Xi lays out some key validity issues that need to be considered before individuals rely on automated feedback. My pals at ETS will like this part, as they have recently begun providing automated feedback to all test-takers rather than just the select few who sign up for specific products.
  3. Remote proctoring technology. This topic keeps me up at night. Really. Remote proctoring can be very good. It can be very bad. It can also be frigging horrific. That isn’t really the topic of the article, but Xi notes that there are issues surrounding remote proctoring that “may challenge the fairness and justice of tests.” She also raises potential validity issues surrounding the use of remote proctoring. It isn’t mentioned in the article, but at least one major test has a much higher average score when taken at home. Is that something that is worth studying? Should all test-makers release that information?

There is much more, but I will leave it at that. Let me know what you think.