I read in The Asia Business Daily that people who take the IELTS at certain British Council test centers in Korea can get a 21,000 KRW discount on their registration fees.

The discount is available at three test centers in Seoul. One of them is just a couple of blocks from my apartment, so if I can find some free time before my holiday in January, I’ll take the test. Since the IELTS hasn’t really changed in some time, I haven’t felt a strong desire to register in recent years… but I can’t pass up a deal.

Note that your test date must fall on or before January 31 if you want to get the deal. If anyone reading this tests in my neighborhood, do let me know. I’ll buy you a drink.

IELTS on paper will no longer be offered in Bangladesh after January 31, 2026. After that date, only the computer-delivered version of the test will be available.

The paper-based test will also be eliminated in Taiwan, though no date has been announced.

IELTS on paper has recently been eliminated in a bunch of key markets.  I suspect it will be eliminated entirely sometime in the near future.

The official FAQ page for the IELTS scoring difficulties has been updated.  It now includes answers to at least a few of the questions which have been posed by various observers.

For instance, we know that a total of 63,216 tests were affected.  That’s at the high end of my estimated range.

And more specifically, we now know that:

  • 93% of corrected tests (58,867) had an upwards correction to an individual test component
  • 7% of corrected tests (4,344) had a downwards correction to an individual test component
  • Five corrected tests had one downwards test component correction and one upwards test component correction.
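As a quick sanity check, the three figures hang together arithmetically. Here’s a throwaway sketch — the counts are from the FAQ, but reading the five “both” cases as a separate, non-overlapping category is my assumption:

```python
# Figures reported in the IELTS FAQ
total_affected = 63_216
upward = 58_867    # tests with an upwards component correction
downward = 4_344   # tests with a downwards component correction
both = 5           # tests with one upwards AND one downwards correction

# Reading the three groups as disjoint, they should sum to the total.
assert upward + downward + both == total_affected

# And the shares round to the FAQ's 93% / 7% figures.
print(round(upward / total_affected * 100))    # 93
print(round(downward / total_affected * 100))  # 7
```

The fact that the categories sum exactly to 63,216 suggests the disjoint reading is the right one.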

It is also noted in the FAQ that no tests taken before August 2023 were affected.

More details at the link.

There is an excellent new article on the IELTS website which notes how “[r]esearch shows that the average minimum entry score is an IELTS band 6.6 across countries, below our expert’s recommended levels.” It says “[w]hen minimum English language scores are set too low, the consequences can be far-reaching.” 

We sometimes hear university staff complaining about how new international students don’t have the linguistic competency needed to excel in their new environment. But what do they expect when their institution requires only an IELTS score of 6.5? Or even a score of 7.5?

The people setting cut scores at universities aren’t wholly to blame when those scores are poorly set. It’s a tough thing to do, and they may not have enough information to work with. The aforementioned article links to the IELTS writing band descriptors. But while those are detailed, their usefulness is somewhat limited. For instance, here’s what they say about the grammar in a band seven essay:

“A variety of complex structures is used with some flexibility and accuracy. Grammar and punctuation are generally well controlled, and error-free sentences are frequent. A few errors in grammar may persist, but these do not impede communication.”

It’s a start, I guess.

If readers of the article dig two links deeper, they’ll find some sample essays with band scores. But there are only a dozen of them.  And, curiously, they are all from the paper edition of the test, and barely legible.

As scrutiny of tests increases, it is more imperative than ever that test makers provide access to what test takers actually produce during the test.  Technology has caught up with the needs of decision makers, and it isn’t terribly hard to let university staff read all of the IELTS essays produced by all of their applicants.  They probably won’t use this information to make admissions decisions, but it will give them crucial information about how to set the best possible cut scores.  They’ll finally know, perhaps, what an applicant who has submitted a score of 6.5 is capable of.

A handful of ancient samples from the test maker can only go so far.  Scores can sometimes drift, inter-rater agreement isn’t always fantastic, the scoring criteria interact in curious ways that only become apparent when you’ve got a bunch of responses to look at… and there is always the niggling fear that the samples have been graded by item writers or test developers rather than actual raters.  So… the more the better.  And instead of talking about providing “more” essays, we should probably be talking about providing all the essays.

And we haven’t even talked about the speaking section.

It is worth mentioning that the TOEFL and Duolingo tests have provided sample responses along with every score report for ages. Other tests have as well.

Just when you thought this IELTS screw-up would blow over, the right wing press has decided to sink its teeth into the matter.

Says the Telegraph:

“Thousands of migrants may have been given visas despite failing mandatory English language tests following a blunder over marking, The Telegraph has learnt.

Up to 80,000 people sitting a language test run by the British Council were given the wrong results, meaning many of them were given pass marks even though they had failed.

Separately, evidence of cheating has been discovered in China, Bangladesh and Vietnam, where criminals sell leaked test papers to migrants so that they know the answers in advance.

It means students, NHS workers and other migrants with a poor grasp of English have been given study or work visas to which they were not entitled.”

And also:

“Because it took so long to discover the problem, many people who were wrongly told they had passed would have been able to obtain visas and come to Britain legally.”

That’s all sort of true, though it goes unmentioned that a majority of the people who got incorrect scores likely used them to head to Canada, Australia and other places. Or that many IELTS tests are used for domestic purposes. Or that the British Council runs the test in partnership with a few other parties.

In any case, it isn’t a great situation.  And the IELTS partners could do better in terms of transparency.  For the record, here (once again) are the questions I would ask the partners if I were a real journalist:

  1. What exactly was the problem?
  2. Exactly how many tests were affected by this problem? “Less than 1%” could mean anywhere from 1 to 70,000 tests.
  3. Were any administrations from before August 2023 impacted by the problem? IELTS has contacted test takers who must be given new score reports. But what of test takers whose results have expired since they took the test? Did any of them receive incorrect scores?
  4. Are there any instances where a test was incorrectly scored, but a band score change was not necessary (and, by extension, the test taker was not contacted)? If so, how many? And if so, were these included in the “less than 1%” figure given to score users?
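To illustrate why question 2 matters: the ceiling implied by “less than 1%” scales directly with total test volume, and the partners never tied the percentage to a denominator. A quick sketch — the annual volumes here are my illustrative assumptions, not published figures:

```python
# "Less than 1%" of what? The ceiling depends entirely on the
# denominator. These test-volume figures are illustrative
# assumptions, not numbers published by the IELTS partners.
for assumed_total_tests in (1_000_000, 4_000_000, 7_000_000):
    ceiling = int(assumed_total_tests * 0.01)
    print(f"{assumed_total_tests:>9,} tests -> 'less than 1%' allows up to {ceiling:,}")
```

At roughly seven million tests, “less than 1%” stretches all the way to 70,000 — which is why the phrase, on its own, told score users almost nothing.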

According to friends in Tehran, IDP Education Ltd will stop offering the IELTS in Iran at the end of January. This means that it will no longer be possible to take the IELTS in Iran at all, as the British Council pulled out of the country in 2009.

As a result of a special license issued to ETS by the Office of Foreign Assets Control, the TOEFL remains an option for Iranian test takers.

If anyone here has the ear of the IELTS partners, it is worth asking if they still plan to bid on the HOELT as a group.  It’s likely that they will do so – a bid that brings a sort of “white label IELTS” to the table would definitely have a lot of gravitas and psychometric heft.  That said, given that test center delivery doesn’t seem to be a factor anymore, the partners suddenly have less need for each other.  Any one of them could easily fulfill both the delivery and development requirements on their own.  In the early days of the HOELT we talked a lot about how the massive test center networks of IDP Education and the British Council were huge advantages, but obviously they don’t matter any more.

If you are interested in the possibilities, do take a moment to explore some of the tests that the partners currently develop and deliver on their own. Cambridge University Press & Assessment, for instance, has a very nice test called Linguaskill.  I’d say that it leans “traditional” (if you will forgive the term).  It is delivered digitally at test centers but could probably be adapted to a remote format.  The British Council, meanwhile, runs the Aptis test, which is also pretty traditional.  That product is mostly taken at test centers, but a remote version has been available for some time. IDP Education Ltd has a test called Envoy. It is mostly unknown at the moment but has a lot of modern features (adaptive content, a 90-minute duration, fully AI-based scoring, scores in two hours, etc.) that could help it stand out from the crowd. I think proctoring of the Envoy test is asynchronous, but I couldn’t confirm that by looking at the website today.

Now, regular readers know that the IELTS partnership has endured for 36 years without any hint of disagreement or conflict between the partners.  So I’m pretty sure that a joint bid will take advantage of that history of positive cooperation.  But, as I said, it is worth asking for clarification.

It should also be mentioned that the winning test is unlikely to be (strictly speaking) the IELTS, the Duolingo English Test, the PTE, or whatever.  It will build on the research and designs of one of those tests, but will be adapted to meet the requirements of the Home Office.

One more Google Trends chart.  This one shows interest in the search terms “IELTS” and “TOEFL” since 2004.  As you can see, back in 2004 there was a lot of interest in TOEFL.  Back then, IELTS was something of an afterthought.  Today, though, IELTS is utterly dominant.


There is a lesson to be learned here.

Which is that it didn’t necessarily have to be this way.  You could say that IELTS is at the top of the heap because of the increased importance of new receiving markets over the past twenty years.  But I don’t think it was a given that a test from the UK would become the de facto English test for people headed to Australia.  And, needless to say, it wasn’t a given that a test from the UK would be the thing taken by people headed to Canada. TOEFL probably shoulda cornered that market.

So what happened?  Well, the TOEFL originally launched at some point before the beginning of recorded history and for several decades it was fairly dominant in its category.  Meanwhile, the IELTS was handed down on stone tablets in 1989 and quickly grew in popularity.  The data suggests that its popularity surpassed that of TOEFL sometime around 2009.

Note that the peak of TOEFL’s popularity, according to the data, was around 2006.  This was the year that the old paper-based TOEFL was largely replaced with the TOEFL iBT (note that a few markets got the iBT in 2005).  A key aspect of this transition is that the TOEFL jumped from being a roughly two-hour test to a test of roughly four hours and five minutes.  Four hours!  Old timers will remember that the iBT included unscored content in either the reading or listening section, but might not know that the reading section could contain TWO unscored passages.

(Why so long?  Well, remember that the original TOEFL tested neither writing nor speaking.)

To me, this is the key reason why the test declined in relative popularity.  The 2:44 length of the IELTS suddenly became a lot more attractive, and test takers started piling into that particular test.

The TOEFL was quietly shortened to just under four hours sometime in the early 2010s with the removal of the second unscored reading passage.  In 2019 it hit about 3:20 with the removal of certain reading and speaking questions.  In 2023 it was reduced to just over two hours with the removal of all unscored questions and the longer writing task.  In each case the numbers were massaged to make the test look even shorter than it actually was.  In 2026 the test will clock in at just under 90 minutes, according to information that is currently available.  And that will include plenty of unscored material, so further reductions are entirely possible.

Test duration remains a big deal. People like short tests. Go figure.  The PTE has eaten away at the IELTS in part because it has always been a roughly two-hour test.  And the Duolingo English Test’s one-hour duration has helped it capture a lot of market share in recent years.

I wonder, though, if we have reached a test-duration floor.  The length of the PTE was quietly increased a few months back when certain revisions were introduced.  Even the Duolingo English Test seems to be a sliver longer than it used to be due to the introduction of new integrated tasks a few months ago (not to mention the time it takes for test takers to carry out new security features).  And the IELTS team seems to be in no rush to shorten their test. Meanwhile, to my jaundiced eyes it almost seems like the TOEFL team is actually a little embarrassed about the potential length of the new version of their test.  And the so-called “long tail” of minor tests trailing the big four in popularity seems to be distinguishing itself in ways that don’t relate to test length.

It will be interesting to watch how this goes in the years ahead.  Maybe the big tests will all become longer than they currently are.  Stranger things have happened.

The cost of taking the IELTS in China will drop by 180 RMB in a few weeks.  That’s about $25 USD.  After the drop it will cost 1990 RMB to take the test (about $280 USD).  It also seems that the UKVI version will have the same price as the standard version (a drop of 230 RMB).  Cancellation and change fees will also be lowered.

It is unclear if this is an effort by the British Council to make their product a bit more attractive as the TOEFL test kicks off a high-profile renewal process, an effort by the NEEA to make the test more consumer friendly, an impending IDP launch, or something else entirely.

Let me know if you see any price changes in your market. Here in Korea, the price was hiked in the summer.

(FYI: It currently costs 2100 RMB to take the TOEFL in China, which is about $295 USD)
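For what it’s worth, the RMB-to-USD conversions above are mutually consistent at an exchange rate of roughly 7.1 RMB per dollar — my assumed rate, since neither announcement specifies one:

```python
RMB_PER_USD = 7.1  # assumed exchange rate, not from any announcement

# Convert each quoted RMB figure to USD at the assumed rate.
for label, rmb in [("IELTS price cut", 180),
                   ("New IELTS fee", 1990),
                   ("TOEFL fee", 2100)]:
    print(f"{label}: {rmb} RMB is about ${rmb / RMB_PER_USD:.0f} USD")
```

Each conversion lands within a dollar or two of the figures quoted above.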

I’ve attached here a Google Trends chart showing search interest in Canada over the past four years for the terms “IELTS” and “CELPIP.”  I share this because it demonstrates how CELPIP seems to be steadily closing the interest gap.

(The chart begins in November 2021 as including the giant one-day spike from earlier that year makes it unviewable, but obviously the gap was even wider in the past)

I haven’t actually taken CELPIP, but it appears to be a fairly attractive product for some test takers.  Indeed, since CELPIP is not used for university admissions, interest in it within its own category (immigration) might already have surpassed interest in IELTS.

This is something that people who follow the business of English testing will want to keep an eye on.  We talk a lot about how IELTS faces challenges from Duolingo or Pearson but there are many other products which might grab some of their market share in the years ahead.

As regular readers know, CELPIP was recently approved for immigration to Australia.  It may become popular there as test takers notice some of the same features that are currently attractive to would-be Canadians.

Anyway, below the chart I will share a couple of images showing the long wait times for CELPIP books at some favorite public libraries in Canadaland (the first is in Halifax, the second in Toronto).

 

 

I was quoted in this PIE News article by Kimberley Martin about the ongoing IELTS hullabaloo.  Here’s my quotation, without the scaffolding of the article:

“This story highlights how important it is for test makers to identify problems as quickly as possible so that test takers have sufficient time to protect their interests.  In this case, the IELTS partners have indicated that some test takers received scores that were lower than they deserved.  As a result, many people around the world may have missed out on life-changing academic and professional opportunities for which they needed a particular IELTS score.  Others may have been left in a position where they were unable to meet requirements necessary for immigration or residence purposes. It may be too late for some of these individuals to get back on track. I feel for those people.

Conversely, those who were given scores higher than they deserved may have quickly found themselves struggling in academic environments which they were not prepared for.

Testing companies serve as gatekeepers for academia and for immigration.  When they mess up, the consequences can be far-reaching and profound.”

I think this is an important story, and I would like for it to be covered by more people (like Kim) who are better at journalism than I am.

Here’s what I would ask the IELTS partners regarding the ongoing issues with incorrect IELTS scores:

  1. What exactly was the problem?
  2. Exactly how many tests were affected by this problem?  “Less than 1%” sounds nice on paper, but that could mean anywhere from 1 to 70,000 tests.
  3. Were any administrations from before August 2023 impacted by the problem?  IELTS has contacted test takers who must be given new score reports.  But what of test takers whose results have expired since they took the test? Did any of them receive incorrect scores?
  4. Are there any instances where a test was incorrectly scored, but a band score change was not necessary (and, by extension, the test taker was not contacted)?  If so, how many?  And if so, were these included in the “less than 1%” figure given to score users?
  5. Were any IELTS UKVI administrations (SELT) impacted by this problem?  If so, has the Home Office been informed of cases where received scores are inaccurate?