I read today that there are now 238,905 international students in Korea (plus 75,033 people studying the Korean language).  That’s kind of interesting.

The top sending countries are China and Vietnam (76k and 63k students, respectively).

(raw data is here)

Some students are on a Korean-language track and might submit a TOPIK score during the admissions process.  Others are on an English-language track and might submit some kind of English test score during the process.

I bet that most of the Vietnamese students on the English track submit an IELTS score, while the Chinese students make use of either TOEFL or IELTS scores, depending on their preference (and I think they are more likely to be on the Korean-language track).

Most of the schools here seem to favor TOEFL and IELTS (and, god bless ‘em, TEPS). There may be an opportunity for smaller tests to make inroads in Korea if they engage with the universities a little more.

If anyone from Duolingo English Test is reading this, you could add Sungkyunkwan University to the tracker on your site.  People say that it’s a pretty good school.  Many years ago I lived near their Suwon campus – I never went inside, but would use it as a starting point for runs along the Suwon Dulle Gil.

The IELTS partners have published an excellent article by Nicola Latimer, Chihiro Inoue, Samantha Chan and Daniel MK Lam about how learners process lectures that involve both auditory and textual content… and how that sort of stuff could be included in assessments.

The authors note in their introduction that:

“Currently, the listening component in most major international language tests used for university admissions (e.g. IELTS, TOEFL iBT, PTE Academic, CAE) involves the presentation of audio material as the primary/sole source of information for comprehension.”

And in the conclusion:

“We argue that tests of lecture listening should present audio and slide text simultaneously as input, as real-life academic lectures would. The input would then need to cover a range of degrees of integration between slide text and the lecturer’s speech.”

On stage at PIE Live 26, I spoke about how English tests should iterate and evolve to keep up with research and with the test taker journey. I spoke even more about it on the convention floor.  I mentioned how test companies themselves often produce research that seems to wither on the vine, never managing to influence the assessments themselves.

One hopes that the research at hand eventually has an influence on the IELTS test.

That said, I’m reminded of an earlier IELTS report about the usefulness of integrated writing assessment, which ended with:

“The question is not whether an integrated writing task can be implemented in an official IELTS test, but rather, the question is when and how it can be best implemented.”

Those words were written several years ago. We don’t seem to be any closer to seeing an integrated writing task in the IELTS.

I was quoted in this PIE News article about the IELTS consortium pulling out of the HOELT tender.  I said:

“If the IELTS consortium has determined that they are unable to deliver a suitably secure remote test, their decision to withdraw from the process makes a lot of sense. I haven’t read the letter they sent to the Home Office in full, but it almost seems like they have suggested that no one is capable of delivering a secure [remote] test. I’m not sure I agree with that assessment, as every potential bidder has their own strengths, expertise and technology.”

It could be true that remote testing can never be fully secure. It wouldn’t surprise me if the whole tender gets derailed as a result of the IELTS withdrawal.  But it may be more complicated than some coverage is suggesting.

Consider that the IELTS itself can be taken fully remotely.  The remote IELTS is used and trusted by universities around the world and has been used to admit tens of thousands of students into higher education.  What strikes me when I look at that product, though, is how it lacks some of the security features currently used by other testing firms.

For instance, it does not require a secondary camera.  This is something that’s been a standard part of remote tests from LANGUAGECERT, ETS and Duolingo for quite some time.  In case you aren’t familiar with it, the secondary camera (a phone or tablet) points at the test taker’s keyboard and mouse for the duration of the test.  Among other things, it can help determine if the person taking the test is actually answering the questions or if someone else did that via remote access. Basically, you can compare the keys pushed by the test taker with the keystrokes actually recorded by the test.  Why is this not included in the remote IELTS?  I’m not sure.

While I’m sure that the remote IELTS is completely secure, I can’t shake the feeling that other testing firms have spent much more time working on this issue. They may be capable of stuff that the IELTS partners haven’t yet thought of.

I was quoted in this great article by Abhishek Nair about the coming end of the paper-delivered IELTS. I think it is really important that we keep an eye on how widely available the “writing on paper” option becomes. If it is offered widely, it could allay many of the concerns people have with computer-based testing, and the change may not be such a big deal.

The IELTS partners have now confirmed via a post on their website that they have withdrawn from the HOELT tender.

The post indicates that “[d]ecisions as significant as granting visas to live in the UK require the highest trust in assessment outcomes. Evidence from research and regulatory practice shows that fully remote testing presents challenges in meeting these standards consistently – especially in the highest stakes environments where security is paramount.”

I’m interested to learn more about this decision, as the above statement is somewhat at variance with the fact that the IELTS consortium is currently one of the largest providers of fully remote tests for high-stakes purposes, including for admissions to universities in the UK and elsewhere. As regular readers know, students who take the IELTS (academic) have the option to forgo their local test center and instead take the test from the comfort of their own homes.

It is wise of the IELTS consortium to step away from the tender if they feel incapable of creating a sufficiently secure testing environment. That said, there may be other organizations which are more capable.

When I talk to clients about this sort of thing, I often explain it by referring to “generations” of remote testing technology. Generation 1 was characterized by tests that were delivered in regular Internet browsers with pretty basic security (like proprietary plugins and pop-up warnings when the test taker’s cursor strayed too close to the side of the window). Webcam access let a proctor on the other side of the world peer at a dozen test takers simultaneously as they took the tests. Generation 2 saw the introduction of custom-made secure browsers. Key features of succeeding generations included things like secondary cameras, asynchronous proctoring, rudimentary deepfake detection… and more. The lines between generations are sometimes blurred, but I think you get my point.

I’m not sure how to number the current generation (5? 6?) and I don’t know exactly what’s coming up, but I’m excited about the possibilities – like how psychometrics and technology can intersect to produce new kinds of test items that serve to mitigate the impact of test taker malpractice.

It is important to note that not every provider reaches the same generation at the same time.  In 2026, there are still some tests (including ones that are widely accepted) that run on gen-1 technology. That’s the best the companies behind them can do, and score users often lack the assessment literacy necessary to tell the difference.

I fear that I’ve meandered far off-topic. But the point I am trying to make here is that testing firms don’t all have the same level of expertise and technology at their disposal when it comes to remote testing. The IELTS partners might not be able to securely deliver a remote test. But it would be wrong to automatically assume that the requirements are, therefore, impossible for anyone to meet.

I’ve mentioned here that the best way to learn about an English test is to rock up to a test center and take the darn thing. This often leads to more useful insights than, say, looking at an official practice test.

This approach is particularly useful if one wants to opine about tests in a professional or semi-professional context. I know that there are some pretty big gaps in my own testing experience, but I do my best to take the tests I talk about.

Consider the case of the new TOEFL. After glancing at the reading questions in practice tests provided by ETS, a common response was “Egad! The articles are much shorter than in the old TOEFL! This doesn’t require the same skills as the old test!”

Here’s what the folks at IELTS said:

“Brief texts generally require less integration of information across multiple sections, involve fewer shifts in argument or perspective, and offer fewer opportunities to track cohesion or follow how ideas develop over time. Shorter texts also place lower demands on sustained attention and on maintaining coherence across larger stretches of discourse.

Longer texts naturally prompt readers to manage these additional layers of processing, such as resolving references across paragraphs, synthesizing information as it accumulates, and interpreting how earlier sections shape meaning later on.”

This is all true. But those of us who have taken the old TOEFL understand the need for nuance.

Yes, the old TOEFL contained academic articles of about 650 to 700 words.  And the new TOEFL contains academic articles of about 200 words.

But remember that 18 out of 20 questions in the old TOEFL’s reading section were prefaced by something like “According to paragraph 2…”.  And there was absolutely no reason to look outside of that paragraph for the answer. Which means that in each question the test taker engaged with only about 100 to 120 words. They weren’t really getting into all those layers of processing.

That preface is not found in the new TOEFL. As a result, test takers are forced to engage with all 200 words when answering each of the 15 academic reading questions (10 if they get cast down into the easy module).

It’s a meaningful distinction.
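To put rough numbers on that distinction, here’s a quick back-of-envelope sketch. The word counts come from the paragraphs above; treating the midpoint of each range as representative is my own simplification.

```python
# Rough per-question reading load, old TOEFL vs. new TOEFL.
# Word counts are the estimates given above; midpoints are assumptions.

OLD_PASSAGE_WORDS = 675      # midpoint of the 650-700 word range
OLD_PARAGRAPH_WORDS = 110    # midpoint of 100-120 words per paragraph
OLD_QUESTIONS = 20
OLD_PARAGRAPH_SCOPED = 18    # questions prefaced by "According to paragraph N..."

NEW_PASSAGE_WORDS = 200
NEW_QUESTIONS = 15

# Old test: 18 questions touch one paragraph; the other 2 touch the whole passage.
old_words = (OLD_PARAGRAPH_SCOPED * OLD_PARAGRAPH_WORDS
             + (OLD_QUESTIONS - OLD_PARAGRAPH_SCOPED) * OLD_PASSAGE_WORDS)

# New test: every question can draw on the full (short) passage.
new_words = NEW_QUESTIONS * NEW_PASSAGE_WORDS

old_per_question = old_words / OLD_QUESTIONS   # 166.5 words per question
new_per_question = new_words / NEW_QUESTIONS   # 200.0 words per question
```

On these assumptions, the typical question on the new test actually asks the reader to hold more text in mind than the typical question on the old test, despite the much shorter passage.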

It becomes more meaningful when we realize that the first piece of advice mentioned by every TOEFL teacher and by every TOEFL YouTuber was to NOT READ THE ARTICLE. I suppose in the early days of the test people read the articles in their entirety… but few were doing so by the end of the test’s life.

I’m not saying that one reading section is better than the other. I’m just highlighting the value of engaging with tests on a deeper level – and perhaps walking the same road the test taker traverses as they discover, prepare for, and take a test.

In this case, when we go a bit deeper we can see that with smart design a shorter item can require more cognitive work than a longer item, and that the design process can include a thoughtful analysis of popular test taking strategies.

Reporting from the BBC suggests that the IELTS partners are no longer bidding for the HOELT tender. Apparently, the IELTS consortium have sent a letter to the Home Office stating that a fully digital at-home HOELT could undermine efforts to secure the UK’s borders and create “new and significant security vulnerabilities for the country.”

The letter includes the following declaration:

“Given the importance of secure English language testing for the UK’s immigration system and the protection of our borders, we cannot endorse the proposed approach by bidding for this tender while retaining our commitment to responsible, trusted and secure assessment.”

It seems that the IELTS team will not move forward with a bid for the HOELT tender. Could this move be enough to derail the whole process? I don’t know.

Meanwhile… if the consortium has given up entering the tender as a partnership, there is always the possibility that one of the partners could move forward with a solo bid.

Interesting times.

On April 1, the cost of taking the IELTS test in India will increase by ₹1,000, bringing the total cost to ₹19,000.

This comes after an increase of ₹1,000 in March of 2025, an increase of ₹750 in January of 2024, and an increase of ₹700 in April of 2023.

IDP Education is in a tight spot here.  I was set to make a dry comment about the wisdom of continually raising prices at a time when competing testmakers are drawing away their customers by offering more attractive products in terms of content and customer experience.

But perhaps price hikes are the only way to keep their shareholders happy in the face of a declining Indian Rupee.  A year ago, that ₹18,000 price tag converted to $342 AUD.  Now it converts to $270 AUD.
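For the curious, here’s the arithmetic behind those figures as a quick sketch. The exchange rates are simply the ones implied by the conversions above, not market quotes.

```python
# Fee trajectory implied by the hikes listed above (most recent first).
CURRENT_FEE_INR = 19_000
hikes = [1_000, 1_000, 750, 700]   # upcoming Apr 1, Mar 2025, Jan 2024, Apr 2023

fee = CURRENT_FEE_INR
history = [fee]
for hike in hikes:
    fee -= hike
    history.append(fee)
# history -> [19000, 18000, 17000, 16250, 15550]

# INR -> AUD rates implied by the two conversions of the Rs 18,000 fee:
rate_then = 342 / 18_000   # about 0.0190 AUD per rupee a year ago
rate_now = 270 / 18_000    # about 0.0150 AUD per rupee today

# Even after the hike, the new fee converts to fewer AUD than last year's fee did:
new_fee_aud = 19_000 * rate_now   # about 285 AUD, versus 342 AUD a year ago
```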

An additional challenge is the per-test royalty that IDP kicks back to Cambridge University Press & Assessment, which I assume is fixed in GBP.

The paper-delivered IELTS will be eliminated worldwide in mid-2026. The IELTS Official team has announced that “…after careful review, from mid-2026, we will no longer offer IELTS as a paper-based test. All IELTS tests will be delivered on computer. Exact timelines will vary by market.”

This will not come as a surprise for regular readers of this space. Announcements about the elimination of the paper option in individual markets have trickled in one at a time over the past year.

Why are they doing this? Well, according to the announcement this is due to “higher satisfaction” among people who take the test on computer. But savvy test watchers will already be aware that this will significantly improve the overall security of the IELTS test.

As a concession to people who like writing on paper, a special “writing on paper” option will be offered in certain markets. Says IELTS:

“[w]e are aware that some test takers like handwriting answers, so we are introducing a new option. In selected markets, we will introduce ‘Writing on Paper’. This update will allow test takers to personalise their test experience by handwriting their answers to the ‘Writing’ component on paper if they choose.”

Which markets will get the “writing on paper” option? That isn’t stated. But India will probably get it. Most readers already know that paper-delivered testing is still pretty popular there. I suppose China might get it as well. Many readers probably aren’t aware of how popular paper-based testing still is in that country.

Considering the ongoing debate about at-home testing, I’ve been asked a few times to estimate the total number of high-stakes at-home tests being administered for the sake of higher-ed admissions and immigration. This requires a whole lot of guesswork, but it is fun to speculate.

First up, I estimate that the Duolingo English Test is administered about 650,000 times per year. This estimate is based on revenue figures included in Duolingo’s annual reports. Those are all at-home administrations.

Next, I estimate that the TOEFL is taken about 750,000 times per year. This is based on comments from Amit Sevak that the test was taken “almost a million times” circa 2021/22 and my assumption that volumes have declined since then (as they have across the whole industry). Anecdotally, it seems like the at-home TOEFL is extremely popular in some key South Asian markets, where I estimate that more than 40% of TOEFL tests are taken at home. But in the giant Chinese market, the at-home TOEFL appears to be quite unpopular. In fact, test takers there are required to use a workaround via a voucher from Hong Kong just to access it in the first place. The key Korean and Japanese markets seem somewhere in between. Consequently, I estimate that about 25% of all TOEFL tests are now taken at home, or about 188,000 tests per year.

Next, we have the IELTS Official. The most recent numbers from IDP and the BC indicate that they do about 3.5 million IELTS tests per year. Assuming the usual 75/25 split remains accurate, about 2.6 million of those are probably IELTS Academic tests (the only version available online). While the IELTS partners don’t seem very enthusiastic about at-home testing, it is still offered in 86 countries and in the occupied Palestinian territories. The list of where it is available is dominated by EU countries, but it also includes a few spots where access to test centers might be limited. It must be mentioned that the at-home test is NOT available in any of the mega markets that we are all familiar with (India, China, Vietnam and Nigeria). Given all of the above, I estimate that the at-home IELTS accounts for only 5% of all administrations, or about 160,000 tests per year.

Finally, we have the long tail of moderately popular tests that do at least a few at-home administrations (LANGUAGECERT, Michigan, Oxford ELLT, OET, etc) and those which are done exclusively online (Kaplan, Password Plus, PTE Express, etc). I’m not familiar enough with their operations to guess the totals, but surely they all add up.

Based on these estimates, it looks like the big three tests do a combined total of about 998,000 at-home tests per year.  If we include the smaller tests in our total, we can assume that well over a million at-home tests are being taken each year for admissions and immigration purposes.
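For anyone who wants to check my math, the estimates above can be assembled in a few lines. Every figure is a guess from the preceding paragraphs, not an official number.

```python
# At-home test volume estimates (per year), as guessed above.

det_at_home = 650_000                      # Duolingo: every administration is at home

toefl_total = 750_000
toefl_at_home = round(toefl_total * 0.25)  # ~25% at-home share -> 187,500

ielts_at_home = 160_000                    # roughly 5% of IELTS administrations

big_three = det_at_home + toefl_at_home + ielts_at_home
# big_three -> 997,500, i.e. the "about 998,000" figure above
```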

A caveat to keep in mind, of course, is that some people take these tests for professional purposes, certification purposes and for their own amusement.

There is an article in the Financial Times which rehashes some of the discussions that folks in the UK have been having over the past few weeks.  The opening sentence is “[a] growing number of UK universities are using a test developed by language-learning app Duolingo to judge whether foreign students speak good enough English to enrol on their courses.”

The article includes some sample items from the Duolingo English Test and the IELTS Test.  Specifically, a sample of Duolingo’s “Is this a real word” task and a multiple choice academic reading task from the IELTS.  I’m not sure this is fair.  Perhaps it would have been better to compare the IELTS academic reading task to the Duolingo interactive reading task. Below is a sample of that item type.


IDP Education Ltd has shared results for the first half of the fiscal year.  IELTS volumes are down 7% for the half year.  That includes a 27% drop in India.  The good news is that IELTS volumes are up 1% outside of India.

IDP administered a total of 638,000 tests in the half year.

Regarding the entry of IDP IELTS into the Chinese market, according to the investors’ call, IDP is “working with a well-established third party testing provider.”  The provider’s name wasn’t given, but IDP’s entrance was described as a “3P model”, which is a comment worth examining.

It was also mentioned on that call that IDP’s opening of five test centers in the Yangtze River Delta area (Shanghai and its environs) gives them access to 60% of international students from China.  This may be true, but capacity could be a concern in the short term – note that the British Council runs 13 test centers in Shanghai alone.

The annual report notes that IDP is “modernising paper-based testing (IOC+P).”  It doesn’t define “IOC+P” but I assume that refers to an implementation of the IELTS where the questions are wholly delivered via computer, but some portion of the test (perhaps the writing section) is completed on a piece of paper.  I suspect this may be rolled out in markets that are particularly attached to paper-delivered tests.

Nothing meaningful about the HOELT tender was mentioned.

The IELTS volume drops in India following IDP’s purchase of the British Council’s operations there are really interesting.  Here they are:

  • FY 2023: 9% drop
  • FY 2024: 42% drop
  • FY 2025: 50% drop
  • H1 2026: 27% drop

Quite a thing.
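Compounding those drops gives a sense of the cumulative decline. One caveat: the final figure covers only half a year, so treating it like a full period overstates things a bit. A rough sketch:

```python
# Cumulative effect of the year-over-year drops listed above.
drops = [0.09, 0.42, 0.50, 0.27]   # FY23, FY24, FY25, H1 FY26

remaining = 1.0
for d in drops:
    remaining *= (1 - d)

# remaining -> about 0.19: volumes down roughly 80% from the pre-FY23 base
```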

The paper-delivered IELTS will be eliminated in Lebanon and Laos on March 1, in Thailand and Mongolia on March 22, in Algeria and Tunisia on April 30, in Togo on May 24, and in Ghana on June 21.

Since my last post on the subject, the test has been eliminated in Poland and Palestine.

My master list of “no paper” countries is now: Algeria, Bahrain, Bangladesh, Brunei, Cambodia, Ghana, Iran, Jordan, Korea (South), Laos, Lebanon, Libya, Malaysia, Mauritius, Mongolia, Nigeria, Pakistan, Palestine, Poland, Sri Lanka, Taiwan, Thailand, Togo, Tunisia, Uzbekistan, and Vietnam.

There are probably more. With no centralized list, I just make notes of country-level announcements I happen to see.