The Home Office (UK) has published its “Review of English Language Assessment Methods,” a survey of higher-education providers asking how they use and think about English language tests. It was completed in the spring of 2025.

The results include some really useful data about which tests are accepted by higher-ed providers (HEDs). We can see that IELTS is accepted by 97% of institutions, TOEFL (regular) by 89%, TOEFL (home) by 65%, Kaplan by 33%, and Duolingo by 25%.

The data on test selection criteria is also useful, but maddeningly vague. It is great to know that 97% of institutions really care about test quality and that 95% care about security. But how do they actually evaluate those things? Here’s a quote from one respondent:

“We investigate whether the test covers all 4 language components and also consider assessment rigour (for example, length of the written test). We also look at how the test provider has benchmarked their test results against CEFR. We also ask for information about how the test provider deals with test taker breaches (and potential breaches). All tests are examined by the Head of Centre for International English.”

Checking that tests cover all four skills is a good start, at least.

One of the things I’ve been stressing in recent posts is that people working for HEDs must work hard to stay informed about some of the finer points of security and validity. Doing so ensures that they stay on an even keel with test providers, who have a financial interest in getting their tests accepted.

For instance, when approached by a prospective test provider, they might ask for specific details about how a second camera works during at-home test delivery. Or about how the provider’s secure browser works. Or about how many test takers each online proctor supervises at the same time. Or, when it comes to validity, about how the test is actually scored, and how many people (if any people are involved at all) score each response.

This is not to say that the individuals surveyed here are underinformed. But I think you know what I’m getting at. Testing is a very complicated business nowadays. I would love a survey that asks pointed questions about whether respondents possess the background knowledge to make the best possible decisions.

The last thing you want, as a HED, is to be bowled over by a smooth-talking salesperson from a non-SELT provider (think: Springfield and the monorail). Or, worse yet, by one from a SELT provider who wants the industry to stay frozen in 1989 forever and ever (think: Han Solo at the end of “Empire”).

I saw that ETS Capital has announced the existence of a new advisory board, which features some high profile names.  The board will play a role in efforts to “scale ETS Capital’s investment and M&A platform,” according to director Emal Dusst.

Longtime ETSologists know that ETS Capital is the in-house investment arm of the Educational Testing Service. It was founded using part of the proceeds of ETS’s one-billion-dollar sale of Prometric in 2018. It has been somewhat quiet of late, perhaps because much of that money has been spent on investments (530 million USD to buy PSI, an estimated 12 million to buy Wheebox, an estimated 22 million to buy Kira Talent, alongside minority stakes in ApplyBoard, MPOWER Finance, CollegeDekho, LeverageEDU, etc.).

If ETS Capital is scaling up, perhaps management is confident that more funds are forthcoming.  We may be quite close to the sale of the TOEFL and GRE tests, as indicated a few days ago in the Wall Street Journal.

If time permits, I’ll write a “how we got here” post about the potential sale of these products. The modern-day history (from the late 90s to the present) of ETS is fascinating and hasn’t been covered as much as the early history of the organization.  Given the tenor of my posts here (and elsewhere) over the past six years one might view such a sale as inevitable.  That’s a fair perspective, but I don’t think things necessarily had to go this way.  In any case, feel free to share your thoughts in the comments below.

Peculiar news from Japan this morning. IDP Education Ltd has announced that it will no longer manage the IELTS in that country. Those duties will instead be handled by the Japan Study Abroad Foundation.

Here’s the English version of IDP’s announcement:

“As part of a global alignment of IELTS operations, delivery of IELTS in Japan will continue under the management of IDP Education’s long-standing partner, the Japan Study Abroad Foundation (JSAF), from February 2026.”

This is a big deal, as Japan is a major market for the IELTS (and for English tests in general).

I assume that the British Council will continue administering IELTS tests in Japan as it always has.

While we are on the topic of American testing non-profits and their finances, it is worth taking a moment to glance at the latest 990 form from GMAC. It covers the year ending December 2024 (I think it was published earlier this month). The organization lost 6.8 million USD on the year.

Program service revenues for the year (which come from the GMAT, NMAT, and “other services” for schools and test takers) were 56.4 million USD. Compare that to 89.9 million USD in the final pre-pandemic year.

We see it again and again:  people are just not taking the tests sold by these organizations like they used to.

Meanwhile, GMAC has net assets of 164.2 million USD. That’s down from 174.8 million before the pandemic. The good news for GMAC is that those assets are mostly in cash and publicly traded securities. These can easily be used to cover losses, so the organization could keep losing money at the current rate for a couple of decades before running into real trouble.
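To put “a couple of decades” in perspective, here is a back-of-the-envelope runway calculation using the 990 figures above (and the obviously unrealistic assumption that losses stay flat at the 2024 level):

```python
# Rough runway estimate from GMAC's 2024 990 figures (discussed above).
# Assumes losses continue at the 2024 rate, which of course they won't.
net_assets = 164.2e6   # USD, net assets at end of 2024
annual_loss = 6.8e6    # USD, loss reported for 2024

years_of_runway = net_assets / annual_loss
print(round(years_of_runway, 1))  # roughly 24 years
```

Not an existential crisis, in other words, but not a trend any board wants to watch for 24 years either.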

While on the road in Egypt, I’ve been making use of “The Rough Guide to Egypt.” Traditional travel guidebooks have been out of fashion for quite some time, but I still use them when I can.  I suppose “Rough Guide” is the best of the bunch these days, especially since Lonely Planet revamped their popular line of books into something more closely resembling a series of coffee table books on countries of the world.

Rough Guide is good, but I terribly miss the “directory” format of the old Lonely Planets from their golden age in the late 90s and early 2000s. They included detailed route-planning and public transit information, while current guides mostly assume that travelers (even budget ones) will hire a private car to get between cities. I also loved how each city listing in those books began with the essentials: postal services, money services, communications services, tourist information offices, etc. They sometimes mentioned a particular restaurant owner who could arrange cheap taxis, or a hotel that was most convenient for late-night bus arrivals. I know that sort of stuff is less urgent now… but it was comforting for the long-term budget traveler. I still recall the heft of the guide I used in China around 2006, and how its tissue-paper-thin pages included detailed information about even the least-visited cities in that country.

The book I’m using now has really (really, really) detailed descriptions of the many tombs and temples found in Egypt, but is somewhat vague in terms of how to get around.  Coverage of minor cities is minimal.

It dawns on me now that travel guides might be good sources of short “reading in daily life” passages like those that appear on the new TOEFL.  I’ll be leaving my current guide behind when I move on to the next country, but I’ll hunt around my bookshelf for some materials to paste into future columns.

I really like this new “Scoring Information for Teachers and Partners” document from Pearson.  It contains a ton of wonderful details about how the PTE test is scored.  Included are detailed tables explaining how much weight each item has for each section, and for the overall score.  This includes integrated items, which affect more than one section score.

The document also explains why the overall PTE score is not an average of the four section scores (which is weird, but I think I’m starting to get it). 
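For intuition, here is a minimal sketch of how item-level weighting produces that effect. Every item, weight, and the 90-point-style scale below is invented for illustration (nothing here comes from Pearson’s actual tables): integrated items feed more than one section, and the overall score is computed from item-level weights, so it need not equal a simple average of the section scores.

```python
# Hypothetical weighted scoring. Each item is:
#   (score out of 1.0, {section: weight within that section}, overall weight)
# Integrated items list more than one section. All values are invented.
items = [
    (0.8, {"reading": 2.0}, 1.0),
    (0.6, {"listening": 3.0}, 2.0),
    (0.9, {"reading": 1.0, "writing": 2.0}, 2.0),    # integrated item
    (0.7, {"speaking": 2.0, "listening": 1.0}, 1.5), # integrated item
]

def section_scores(items):
    """Weighted average per section, scaled to a 90-point-style range."""
    totals, weights = {}, {}
    for score, sec_weights, _ in items:
        for sec, w in sec_weights.items():
            totals[sec] = totals.get(sec, 0.0) + score * w
            weights[sec] = weights.get(sec, 0.0) + w
    return {sec: round(90 * totals[sec] / weights[sec]) for sec in totals}

def overall_score(items):
    """Weighted average over ALL items, using the overall weights."""
    total = sum(score * w for score, _, w in items)
    weight = sum(w for _, _, w in items)
    return round(90 * total / weight)

sections = section_scores(items)
print(sections)              # {'reading': 75, 'listening': 56, 'writing': 81, 'speaking': 63}
print(overall_score(items))  # 67
print(round(sum(sections.values()) / len(sections)))  # 69 -- the naive average differs
```

Because an item’s overall weight is independent of its section weights, the overall score (67 here) lands a couple of points away from the naive average of the sections (69). Presumably something in that spirit is what Pearson’s tables are describing.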

There is also a good explanation of how score reviews work.

I would love to see more of this from test makers.  It would be especially useful to learn how scores are determined on the new TOEFL, and how the test adapts to test takers as they work through it.

The Wall Street Journal is reporting that ETS has held talks to sell both the TOEFL and GRE tests.  The non-profit is reportedly seeking a purchase price of about 500 million USD, or a for-profit partner who might invest (but not purchase outright) the tests.

According to the WSJ’s report, ETS “has narrowed the talks to several firms that could potentially buy the exams or act as a strategic investor to expand ETS’s reach into the Middle East and India, said people familiar with the matter.”

As for potential buyers, “[i]nterested parties include the Singapore-based investment firm Hillhouse, the private-equity firms Nexus Capital and Veritas Capital and the education entrepreneur Martin Basiri, some of the people said.”

ETS CEO Amit Sevak has declined to comment.

Regular readers know that Nexus Capital recently purchased the ACT test from its non-profit owner for about 240 million dollars.  Veritas Capital, meanwhile, purchased the assets of non-profit testing giant NWEA for about 890 million dollars. Martin Basiri, of course, is the co-founder of the fairly successful Canadian firm ApplyBoard (which everyone reading this is quite familiar with)  and more recently an admissions-related company called Passage (which we are all much less familiar with).

Most astute test watchers are well aware that we’ve been on the path to such a move for a very long time.  The dominance of both tests has been eroded since the pandemic years, while ETS itself has experienced financial woes that have necessitated several rounds of layoffs and buyouts. Several much smaller testing products have already been eliminated from the organization’s portfolio.

I’ll have more to say on this topic in the days ahead, I’m sure.

Jia Peng, 24, has been sentenced to three months’ imprisonment in Hong Kong for cheating on the TOEFL. Jia was the “other dude” I mentioned in a post a few days ago about another case of cheating in Hong Kong.

According to court records, Jia hired a convincing lookalike to take the TOEFL for him at a test center in Malaysia.  It appears, as in the earlier case, that the scheme was uncovered by the university sometime after a score report was submitted.

I spoke yesterday with a source who has experience being locked up in Hong Kong. They described conditions in the city’s jails as “pretty spartan,” with inmates being kept two to a cell.  The same source described the food served to prisoners as “not bad.”

A point I’ve been trying to make here is that some test makers might insist that their on-site tests are far more secure than at-home tests offered by their competitors… all while not being fully aware of exactly how many people are cheating on those on-site administrations.

ELT Journal has published Duolingo’s response to last year’s article by Bruce et al. about new online English tests and their use in UK admissions. Regular readers will surely recall the article, as I have referred to it here several times. It has also been mentioned in the higher-ed press many times. In short, the study suggested that the use of these tests has resulted in the admission of students with poor English abilities.

The authors of the response suggest that the study contains “several substantial flaws that potentially invalidate [its] conclusions.”

To break it down, their main concerns are:

  1. The study draws conclusions based solely on the opinions of university staff.  But it does not include performance data like student marks and progression rates to contextualize these perceptions.
  2. EAP professionals make up the largest respondent subgroup, “yet they typically work only with students requiring language support, potentially excluding perspectives on higher-proficiency matriculants.”
  3. While criticisms were leveled against the Duolingo English Test and other new tests, such tests were used by only a handful of the schools represented in the study.  This suggests that the students with poor English skills referred to by respondents were actually admitted with IELTS scores.
  4. I’ll just leave a quote for this one: “Low-quality research, including studies in which the methods and results do not support the conclusions, is potentially detrimental to both future research efforts and society.”
  5. Several of the authors of the study are affiliated with the owners of the test (IELTS) that was most praised in the study. The response suggests that this was not adequately disclosed.

We are in the era of the new TOEFL.  A few notes:

  1. As expected, the old TOEFL will live on as “TOEFL iBT Australia” for those headed down under.  I understand that scores can only be sent to institutions in that country.  It could take some time for the new version to be approved by the Australian Department of Home Affairs (DHA).
  2. TOEFL TestReady lives on as a means of selling prep only for “TOEFL iBT Australia.”  As announced earlier, it is no longer used to supply prep for the new TOEFL.
  3. Accordingly, there is far less free prep available than before.  Someone can correct me if I’m wrong, but as I understand it there are no longer any free graded tests available for the new version. There are plenty of paid options, though.
  4. ETS is no longer advertising an updated “Official Guide to the TOEFL.”  Instead, they are advertising an “Official Guide to the TOEFL iBT Test: Pocket edition” (coming soon).  This product is described as a “time-limited digital release.”  I’m not sure what that means.  Amazon still has a full official guide listed for a late-May release.  No sign of updated “Official Tests” books, though.

I’ll take the new test when I return home in a few weeks. Probably at a test center.