I read an article from Cambridge published by the Higher Education Policy Institute.  It’s a curious read.  The salient part is this:

“The proposed introduction of a Home Office English Language Test (HOELT) raises the stakes still further. The Home Office has indicated an interest in at-home invigilation. While innovation of this kind may appear to offer greater convenience, it also risks undermining quality, fairness and security. The HOELT process must be grounded in evidence, setting high minimum standards and ensuring robust protections against misuse. High-stakes decisions such as the creation of HOELT should not be driven by cost or convenience alone. They should be driven, instead, by whether the system enables talented students to succeed in the UK’s competitive academic environment, while safeguarding the country’s immigration processes.”

This is a fair and principled stance.  And sincerely held, I’m sure.  But telling the Home Office that the test it desires (a fully remote at-home test) is the wrong test is a curious move.  It could backfire.  I’m imagining a restaurant-goer who orders a salad only to be told by the chef that he ought to get a steak instead.

What’s more curious is that Cambridge has been offering an at-home IELTS test for more than five years. And with nary a peep about security concerns.  Indeed, to my eye it seems to have more reliable security than the traditional test center administration.  Hundreds of people take this version of the IELTS every day. So why not lean into that?

Duolingo’s bid is completely beatable.  Even by the standards of at-home testing Duolingo’s approach is new and scary.  I can picture the HOELT tender being won by a testing firm that openly states its love for at-home testing but presents the sort of conservative game plan that the Home Office is probably looking for.

A result where a smaller testing firm emerges to snag the tender might be Cambridge’s worst nightmare.  And a source of great amusement for the Duolingo team.

The Daily Sun (Bangladesh) has published a long article on IELTS exam leaks and organized cheating. I’ll post a link in the comments.

According to their report, two suspects have been arrested in Dhaka for their part in a cheating ring.  The Sun reports that the suspects used leaked IELTS tests to prepare students for the test.

Says the article:

“Students who followed the racket’s instructions reportedly achieved their expected band scores, raising serious concerns over the integrity of the IELTS examinations in the country.”

This comes after a lengthy Daily Sun investigation of a separate IELTS cheating ring operating on the same principle.  Basically, it is alleged that ringleaders acquired test forms from a leaker operating within the IELTS partnership the night before specific administrations of the IELTS test.  After that, customers were (allegedly) secreted away to local hotels where they had 4.5 hours to memorize the correct answers.  The next morning they (allegedly) took the test at a variety of test centers across the city.

Says the article:

“On 25 April, a correspondent stayed at Hotel Afford Inn in Uttara, where around 100 students were lodged overnight. Leaked questions were reportedly handed out at mid-night, after students were separated into teams according to their exam centres. The next morning, the students were transported by minibuses, microbuses, and CNGs to [test] centres including Patronas at Panthapath, Compass at Banani, IALC at Dhanmondi, and Penstone at Uttara. A similar operation was observed on 23 May at Hotel Central Inn in Motijheel, where 120-130 students were being prepared for the following day’s exam.”

Beneficiaries of this scheme confirmed (allegedly) that the leaked questions were identical to those on the real IELTS.

The Daily Sun report suggests that the aforementioned cheating rings “highlight serious flaws in exam security.”

The security of the paper-delivered IELTS has come to the forefront in recent months. Note that the IELTS Partnership pulled the plug on paper delivery of IELTS for UKVI in Bangladesh at the end of July.

The paper-delivered IELTS has also been eliminated or restricted in key markets of Vietnam, Pakistan, Sri Lanka, Malaysia and Nigeria. Meanwhile, test takers globally have recently been forbidden from accessing the paper-delivered test outside of their country of residence (or country of citizenship).

Some have speculated that these moves are connected to longstanding security concerns about the paper-delivered IELTS.

Earlier this year the UK Home Office raised concerns about possible cheating on IELTS tests, citing comments from HEIs that IELTS scores “don’t necessarily replicate the reality of the applicant’s English language ability.”

If you are still reading, do take a moment to peruse the whole Sun article.  It contains the most succinct description I’ve ever read of the ol’ paper-delivered IELTS cheating technique (allegedly).

The UK Home Office has published a fifth request for information regarding the Home Office English Language Test (HOELT). This one is a shocker.  It notes that “the Home Office is exploring a ‘Digital by Default’ service, with remote proctoring as the primary mode of delivery and physical test centres available where remote solutions are not feasible.”

This could explain the curiously low number of test centers mentioned in the fourth RFI – again listed as just 268 across 142 countries.

A Home Office choice to go with remote proctoring by default might favor a smaller test provider – like LANGUAGECERT, Duolingo or ETS – heretofore considered an underdog in the race to win the tender.  All three of those providers are well known for offering robust remote-testing options to test takers around the world.

On the other hand, the IELTS partnership (widely considered a front-runner to win the HOELT tender) currently offers remote tests only in select markets, while Pearson (another favorite) pulled the plug on its remote options back in 2024 shortly after stories broke about widespread cheating on the at-home PTE Test.

Of course this doesn’t mean remote testing is a sure thing. But it is worth paying careful attention to the possibility.

I want to cycle back to that pre-print from Maggie McGehee and Daniel Isbell which I wrote about a few weeks ago. Take a quick look at the data in Table 1. In the period under review (Fall ‘22 to Fall ‘23) 100 undergraduate students submitted ELP scores to the University of Hawaiʻi at Mānoa. Here are the tests they used:

  • DET:  61
  • TOEFL: 20
  • IELTS: 19

And limiting ourselves to Fall ‘23:

  • DET: 30
  • TOEFL: 5
  • IELTS: 8

So 61% of students submitting a score in the whole period opted to send a Duolingo English Test score. DET’s share reached 70% in the final semester under review, while the TOEFL had just a 12% share at that time.
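For anyone who wants to reproduce those percentages, they fall straight out of the Table 1 counts quoted above. A quick sketch (the rounding to 70% and 12% is mine):

```python
# Test-submission counts quoted above from Table 1 of McGehee & Isbell
full_period = {"DET": 61, "TOEFL": 20, "IELTS": 19}  # Fall '22 - Fall '23
fall_23 = {"DET": 30, "TOEFL": 5, "IELTS": 8}        # Fall '23 only

def share(counts, test):
    """Percentage share of one test among all submitted scores."""
    return 100 * counts[test] / sum(counts.values())

print(f"DET share, full period: {share(full_period, 'DET'):.0f}%")  # 61%
print(f"DET share, Fall '23:    {share(fall_23, 'DET'):.0f}%")      # 70%
print(f"TOEFL share, Fall '23:  {share(fall_23, 'TOEFL'):.0f}%")    # 12%
```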

What I wouldn’t give for numbers from 2019.

This is just a single less-selective school in the second most beautiful state in the union, but it speaks to the challenge that the TOEFL team faces as it implements a major relaunch of its flagship product.

That challenge being how to respond to a young person who knows that all his friends were admitted to good schools using a DET score, and who also knows that the DET is just a fraction of the price.

I wouldn’t know what to say.

Regardless, one can almost suss out a potential (bright) future for the TOEFL program. Some things come to mind:

  1. Maybe US admissions are something of a lost cause, but there are a ton of jurisdictions that are reluctant to accept DET scores right now. Places like the big new receiving countries in Europe and Asia.  Perhaps TOEFL could become the go-to “easy” test for these markets. I can envision it becoming something like the DET of the EU. I would mention Australia here, but the revisions have come too late to be accepted by the DHA.
  2. On the other hand, maybe the domestic market can be recaptured. The new TOEFL is hella streamlined. Development, administration and scoring will all be much cheaper than for the old TOEFL. Why not turn TOEFL into a $75 at-home test and instantly recapture much of the market share lost to DET? Heck, even knocking the price down to a cool $99.99 would probably do the trick. A test center version could be maintained at a hefty price tag for folks who prefer it. To put it in somewhat nefarious terms, ETS could charge test takers in the lucrative Chinese market $300 apiece and let everyone else test for $100.
  3. On the vestigial third hand, there are big players who greatly desire a bigger slice of the American pie.  Pearson has gone to all the trouble of developing their own DET competitor (the PEET Test) which will launch next month. That test looks fantastic, but it will take an enormous amount of time, money and effort to gain widespread acceptance at institutions. It took Duolingo six years and a pandemic to gain a stable foothold. Why not just… well… I won’t say it. But you know what I mean.  Everybody wins.

The cool thing is that all three of these avenues suggest a bright future for the TOEFL program. They just require boldness and fortitude.

The Oxford Test of English will soon be offered at test centers in China and Thailand. Congratulations to the team at OUP! That’s one step closer to my backyard!

China is an interesting case. Regular readers know that the two big English tests in that country – IELTS and TOEFL – are administered in partnership with the NEEA, a public agency associated with the Chinese Ministry of Education.  Test registration is done through an NEEA portal. The NEEA collects registration fees and later passes them along to ETS and the British Council… after pocketing an unspecified amount.

This is probably good for test takers, as the NEEA is known for blocking price hikes and upselling, and for mandating that registration fees be charged in RMB. For exactly the same reasons, it probably isn’t great for testing firms.

Not all language tests are required to operate in partnership with the NEEA. This is because there are two types of language tests in the eyes of the Chinese regime – we might translate them as “educational tests” and “commercial language proficiency exams.” The former must be administered in partnership with the NEEA, while the latter can be administered with any local partner.

The biggest English test to operate without NEEA partnership is the PTE. Pearson partners with an on-shore company called Beijing Ensi (d/b/a Pearson VUE China).

In the case of Oxford, that partner will be a group called GEC.

A funny case is that of Prometric. They run the CELPIP with local partners, but run the CAEL with the NEEA. Which makes sense, as the CAEL is used for educational purposes, while the CELPIP is not.

There is a point to all of this. Anyone who cares enough to still be reading already knows that the IELTS test is run in China through a partnership between the British Council and the NEEA. IDP Education is not involved, but traditionally the British Council paid them a per-test royalty as a sort of booby prize in recognition of their partial ownership of IELTS.

Last year, IDP gave up that royalty and began administering the IELTS themselves. It is my understanding that their strategy was to brand themselves as a provider of a commercial language proficiency exam. This was done with an unnamed local partner referred to by CEO Tennealle O’Shannessy as “a respected professional examination service provider.” Someone once told me the name of that partner, but I’ve long forgotten. I could look it up, if anyone cares.

That went well for a few months. But by December, testing had ceased. IDP is now in negotiations to resume testing. I don’t know how those negotiations are going.

I’ve always wondered how Pearson managed to skirt the NEEA requirement. I know that it is used for both educational and non-educational purposes.  Maybe that’s enough. But so are the IELTS and TOEFL tests, to some extent.  And, surely, they would like to be free of the NEEA.

Here’s a video from the Australian Broadcasting Corporation (ABC) about English tests in Australia and recent changes. Among other things, it touches on the fact that many visa holders in Australia test and re-test and re-test and re-test. Indeed, the recent financial results from IDP Education Ltd touched on how this kind of on-shore testing is a growth area for the IELTS.

My most recent “picture of the week” over on Substack was a shelf of test prep books at the Alderney Gate branch of the Halifax Public Library.  A fun way to do some boots-on-the-ground research into testing trends is to see what test prep books are checked out of your local library.

For instance, right now all six copies of IELTS 18 (General Training) are checked out of Halifax’s library system, with a waiting list of 15 holds.  Four out of five copies of IELTS 18 (Academic) are checked out.

Meanwhile, over on the left coast all 22 copies of IELTS 18 (General Training) are checked out of the Vancouver Public Library system, with a waiting list 7 holds deep.  Twenty-two out of twenty-four copies of the Academic version are checked out.

As IDP Education has indicated a few times in recent weeks, on-shore testing in Canada is a pretty big deal nowadays.  That’s partially because of new testing requirements for PGWP applicants that came into force back in November.

Interestingly, there seems to be less demand for IELTS 19.  The new cover design might have some people thinking it is a third-party product.

IDP Education Ltd has released Full Year Financial results for FY 2025.

IELTS volumes are down 18% for the year, to a total of 1,293,800 tests.  Last year’s volume was about 1.5 million tests.  The peak for IDP came in 2022, when 1.9 million tests were administered.

IELTS volumes are down 50% in India for the year.  This comes after a 42% drop in India back in 2024.
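Those two drops compound, which is easy to underappreciate. A back-of-envelope sketch of what a 42% decline followed by a 50% decline implies for India relative to the 2023 baseline (the cumulative figure is my arithmetic, not IDP’s):

```python
# Successive year-over-year declines in Indian IELTS volume, per IDP's results
drops = [0.42, 0.50]  # FY2024: -42%, FY2025: -50%

remaining = 1.0
for d in drops:
    remaining *= (1 - d)  # fraction of volume left after each year's decline

print(f"Volume remaining vs. FY2023: {remaining:.0%}")      # 29%
print(f"Cumulative two-year decline: {1 - remaining:.0%}")  # 71%
```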

Outside of India, IELTS volumes are up 2%. This is attributed to onshore testing in Canada and Australia.

IDP failed in its goal to launch IDP IELTS in China in the fiscal year.

Shares are up 31% on the news.

A new article from Vietnam.vn touches on the issue of “foreign currency loss” stemming from widespread use of international English tests for domestic university admissions in Vietnam.  It estimates that in 2025 over 300,000 IELTS tests will be taken in that country.  Each administration of the IELTS costs VND4,664,000 (about $177 USD), the majority of which is said to flow outside of the country.  The author also touches on how the cost of test preparation exacerbates this issue.
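The article quotes a per-test fee and a volume estimate but, as far as I can tell, doesn’t multiply them out. Doing so, under the simplifying assumption that the whole fee leaves the country, gives a rough ceiling for the annual outflow (my arithmetic, not the article’s):

```python
# Figures quoted in the Vietnam.vn article
tests_per_year = 300_000  # estimated IELTS administrations in Vietnam, 2025
fee_vnd = 4_664_000       # fee per administration, in VND
fee_usd = 177             # the article's approximate USD conversion

total_vnd = tests_per_year * fee_vnd
total_usd = tests_per_year * fee_usd

# Roughly VND 1.4 trillion, or about $53M USD per year
print(f"Total fees: {total_vnd:,} VND (~${total_usd / 1e6:.0f}M USD)")
```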

A domestically developed alternative to the IELTS, known as the VSTEP, is available but not widely accepted or used.  Vietnam is one of a few markets where IELTS testing volumes are still increasing year after year.

In earlier posts I’ve written about how Seoul National University in Korea developed the TEPS Test partially due to concerns that overuse of the TOEFL and TOEIC tests for domestic admissions was leading to currency outflows at a particularly sensitive time (the late 1990s IMF Crisis).  By all accounts, the TEPS Test successfully reduced the use of foreign tests for domestic purposes.  Eventually, universities were blocked from using tests in this way and this all became a non-issue.**

Interestingly, ETS has long been blocked from selling its “TPO” practice tests for the TOEFL direct to consumers in Korea.  Instead, they must be purchased via a local partner.  This was a big deal in the earliest days of the TOEFL iBT test before the development of a robust network of domestic prep providers.

**Let this be a warning for test providers.  One day the Vietnamese gravy train might come to a halt.

Some readers might be interested in an article by Maggie McGehee and Daniel Isbell which will appear in a future installment of “Studies in Language Testing.”  It examines the relationship between English test (DET, TOEFL, IELTS) scores and academic outcomes at the University of Hawai‘i at Mānoa (UHM). 

According to the article, “mean GPAs and proportions of student withdrawal/probation were similar for the DET and other tests (or no test), and no differences among them were statistically significant.”

I was happy to see the authors note that “[f]or students admitted unconditionally with higher ELP scores, there were no statistically significant correlations between ELP scores and first year GPA.”

As has been discussed in this space, there are many things which impact student outcomes.  The authors note the existence of students with really high ELP scores, but surprisingly low GPAs.  Go figure.

Interestingly, though, it is mentioned that “[f]or those admitted conditionally, with ELP scores reflecting a lower range of English proficiency, stronger, positive, and statistically significant correlations emerged for students submitting IELTS or TOEFL scores, with low or negative (but not statistically significant) correlations for DET takers.”

It is pointed out a few times that the sample size was pretty small, so further study is needed.

There is also some good stuff in here about the business of English testing, but I’ll touch on that in another post.  Check out Table 1 for a preview, though.  You’ll spot which tests are frequently used… and which are not.

Interesting article in Times Higher Education this morning by Sabrina Y. Wang, who is involved in test preparation in China.  She notes:

“June and July used to be the beginning of the busiest time of the year, leading up to the IELTS and TOEFL exam season from September to December, as students submit their applications to study abroad. However, in stark contrast to previous years, my education company has this summer received only one inquiry about IELTS preparation.

Nor is this an isolated case. Competitors from other training centres and university admission agencies are also complaining of having very few enquiries compared with the same period last year.”

Wang talks about how Chinese students – across various categories – are considering options other than studying in the USA and UK.  She points out that the reasons are often economic. Basically, the middle class is feeling the pinch.

That’s all somewhat outside my wheelhouse, but the stuff about tests is interesting.  A few things are worth pointing out:

  1. China is way less competitive than other markets when it comes to tests. IELTS Official and TOEFL still dominate.  If insiders are reporting a lack of interest in those tests, the whole English testing thing might be in trouble.  It isn’t like everyone is piling into the PTE and DET.
  2. Interestingly, the big TOEFL relaunch appears to include a renewed focus on the Chinese market.  ETS from, say, 2023 to early 2025 seemed hyper-focused on the Indian market.  Those efforts may have produced negligible (or possibly negative) returns.  It almost feels like that work has been put on the backburner in order to rebuild things in China.  If this article is any indication, that looks like a very smart idea.
  3. As far as I can tell, IDP Education Ltd is still plugging away at gaining regulatory approval to begin administering the IELTS in China (alongside the British Council).

Writing in the Financial Times, Cambridge University Press & Assessment’s chief product officer Pamela Baxter notes that “international students, who are expected to demonstrate only minimum standards of English for their admission to UK universities, often struggle to keep up with coursework and to contribute usefully to classroom discussions.”

This, she points out, “is a disservice to those students and their peers.”

Baxter ascribes this problem, in part, to the acceptance of English tests that rely too heavily on AI.  As proof, she cites a study (published in the ELT Journal) that the IELTS partners have promoted quite a lot since it was published at the beginning of last month. We’ve heard this message ad nauseam from the IELTS partners in recent weeks.

But one is left wondering about the connection that has been made between AI-based tests and poor student performance.  When the study was carried out, the Duolingo English Test was accepted at just 6 out of 40 surveyed higher-ed institutions.  Another test singled out by the authors of the report, the Oxford ELLT, was accepted by so few institutions that it wasn’t even listed among the results of the survey.

These tests have increased in popularity a bit since the study was completed, but not by much – the Duolingo Test was accepted at 5 out of 20 institutions I surveyed earlier this year.  The Oxford ELLT Test was accepted at 4 out of 20.

So why are we blaming tests that very few schools actually accept?

Traditional human-centric tests like the IELTS, TOEFL and Cambridge C1, meanwhile, were (and are) accepted at almost all major HE institutions in the UK.  Accordingly, such tests are still used by the vast majority of international students heading to the UK who are required to submit scores in the first place.  Odds are high that many or most of the struggling students referred to by Baxter used traditional human-centric tests during the admissions process.

This is not to say that traditional tests are doing a poor job of measuring the language skills of students heading to the UK.  Nor would I suggest that such tests are failing to contribute to positive washback among the test-taking population.  The tests are doing their job.  They are measuring English fluency. My point here is that if students are struggling to use English in the classroom, there are problems that go far (far, far, far) beyond one’s choice of English test.