The “Listen and Repeat” task on the new TOEFL attracted much attention soon after ETS published sample test forms. Some commentators pointed to it as evidence that the test would be “easier” than the old TOEFL. I suppose ETS didn’t do themselves any favors by selecting a list of sentences about visiting a zoo for inclusion in the first sample form. I certainly snickered when I first saw it.

In light of this, it is interesting to note that chatter on social media by actual test takers now paints this task as one of the most challenging parts of the new test. While working through practice sets on my own, I’ve enjoyed examining the mental processes involved in repeating longer sentences (~25 syllables) with two clauses. I find myself mentally replaying the first clause while the second is still being played aloud. And mentally repeating the whole thing once again before speaking. A lot goes on in my brain in the 15 seconds it takes to complete all this. Imagine doing it in an L2.

One could therefore argue that this sort of task, while of limited utility overall, can provide insights that we don’t get from a traditional constructed response item or an in-person interview. And since it only takes two minutes to administer, perhaps it can be included without taking time away from more traditional tasks. We could say that this is using the technology of 2026 to our advantage.

Notably, the PTE test also includes a “listen and repeat” task… but the PTE Express does not. I think the Duolingo English Test included one until about 2023, though alcohol consumption has made my memory of that period somewhat foggy. To me, this suggests that if a testmaker has 90 or 120 minutes to play with, they can go ahead and include some shorter items that poke and prod at narrow aspects of language use… even if they don’t exactly resemble real life encounters.

On the other hand, the TOEFL deep dive recently published by the IELTS partners examined the task and noted that:

“…the ‘Listen and Repeat’ task type appears to tap only minimally into higher-level cognitive processing and is weakly aligned with meaning-oriented, authentic oral communication.”

This mirrors comments in a review of the Versant English Speaking and Listening Test (from Pearson) published in Language Assessment Quarterly last month. Regarding the L & R task in that test, it notes that:

“[a]lthough the ‘Repeat the sentence’ task appears to measure some elements of the operational Listening and Speaking sub-constructs, it appears to have minimal relevance to the widely accepted oral communication construct.”

In a response, Pearson’s Bill Bonk and Jooyoung Lee said that:

“…the ability to accurately comprehend and reproduce sentence-level utterances is a foundational prerequisite for communication. Without reliable sentence-level processing – encompassing phonological decoding, lexical access, syntactic parsing, and short-term retention – higher-level discourse processing cannot occur.”

Food for thought.

There is an article in the Financial Times which rehashes some of the discussions that folks in the UK have been having over the past few weeks.  The opening sentence is “[a] growing number of UK universities are using a test developed by language-learning app Duolingo to judge whether foreign students speak good enough English to enrol on their courses.”

The article includes some sample items from the Duolingo English Test and the IELTS Test.  Specifically, a sample of Duolingo’s “Is this a real word” task and a multiple choice academic reading task from the IELTS.  I’m not sure this is fair.  Perhaps it would have been better to compare the IELTS academic reading task to the Duolingo interactive reading task. Below is a sample of that item type.


IDP Education Ltd has shared results for the first half of the fiscal year.  IELTS volumes are down 7% for the half year.  That includes a 27% drop in India.  The good news is that IELTS volumes are up 1% outside of India.

IDP administered a total of 638,000 tests in the half year.

Regarding the entry of IDP IELTS into the Chinese market, according to the investor’s call, IDP is “working with a well-established third party testing provider.”  The provider’s name wasn’t given, but IDP’s entrance was described as a “3P model” which is a comment worth examining.

It was also mentioned on that call that IDP’s opening of five test centers in the Yangtze River Delta area (Shanghai and its environs) gives them access to 60% of international students from China.  This may be true, but capacity could be a concern in the short term – note that the British Council runs 13 test centers in Shanghai alone.

The annual report notes that IDP is “modernising paper-based testing (IOC+P).”  It doesn’t define “IOC+P” but I assume that refers to an implementation of the IELTS where the questions are wholly delivered via computer, but some portion of the test (perhaps the writing section) is completed on a piece of paper.  I suspect this may be rolled out in markets that are particularly attached to paper-delivered tests.

Nothing meaningful about the HOELT tender was mentioned.

The year-over-year IELTS volume drops in India following IDP’s purchase of the British Council’s operations there are really interesting.  Here they are:

  • FY 2023: 9% drop
  • FY 2024: 42% drop
  • FY 2025: 50% drop
  • H1 2026: 27% drop

Quite a thing.

The paper-delivered IELTS will be eliminated in Lebanon and Laos on March 1, in Thailand and Mongolia on March 22, in Algeria and Tunisia on April 30, in Togo on May 24, and in Ghana on June 21.

Since my last post on the subject, the test has been eliminated in Poland and Palestine.

My master list of “no paper” countries is now: Algeria, Bahrain, Bangladesh, Brunei, Cambodia, Ghana, Iran, Jordan, Korea (South), Laos, Lebanon, Libya, Malaysia, Mauritius, Mongolia, Nigeria, Pakistan, Palestine, Poland, Sri Lanka, Taiwan, Thailand, Togo, Tunisia, Uzbekistan, and Vietnam.

There are probably more. With no centralized list, I just make notes of country-level announcements I happen to see by chance.

I will take the new TOEFL this coming weekend at the Fulbright Korea building.  I picked that spot since it seems to be the biggest and best test center here in Seoul, and the place where I am most likely to find some literature about all the benefits of studying in America.  I also want to try the custom headsets that have been distributed to TOEFL test centers, and I think Fulbright probably has ‘em.

For what it’s worth, I count nine TOEFL test centers in Seoul. There are also test centers in Incheon, Goyang and Yongin (one each).  Plus a few more in the southern part of the country.  Tests are administered three days per week, with both morning and afternoon times.

Regarding registration for the new test, a few things are worth noting:

  1. The process takes about five minutes.  That’s wonderful.  In the pre-pandemic era, registration was a 15-minute slog.
  2. It appears that paper score reports are still provided.  I clicked a button to have one mailed to me after the test.  That’s also wonderful.
  3. Coupons are a big part of the marketing of TOEFL now.  I wonder how many people pay full price.  I didn’t.
  4. The upselling that goes on during the registration might be considered “a bit much.”  I was prompted twice about my willingness to buy prep materials.  I will post screenshots.
  5. Following my registration I was sent an email with important information and told to “Save this email so that you can refer to it later.”  But the critical links it contains are all broken.  This includes a link that is supposed to direct test takers to TOEFL’s identification requirements, and another which is supposed to tell me what to expect at the test center.  I asked a fellow who registered for the test last week and he got broken links as well.  That’s not good.
  6. The same email told me the time of my test in EST.  But that’s not my time zone.  To avoid confusion, I think students should be told the time the test is administered in their own time zone.

A stark decline in the number of international students heading to Canada is one reason why test volumes are decreasing. The Toronto Star reports that 115,470 students entered the country last year. That’s 61% less than the year before. I’ll post a link in the comments.

In recent years, a major part of Canada’s attractiveness as a study destination has been its immigration proposition: in just a handful of years an individual can progress from a short course of study to a post-graduation work permit and then permanent residence. But keep in mind that due to regulations that came into force back in November of 2024, it is usually necessary to take an English test at all three steps in this journey. As you can imagine, this has made Canada pretty important to testing firms.

Students heading to Canada have traditionally favored the IELTS. But Canadian study visa applications don’t come with a list of mandated English tests, so students are free to choose from a wide variety of options. My alma mater (a middling school on the east coast) accepts scores from eight different tests, for instance. Anecdotally, it seems like Pearson’s PTE and the Duolingo English Test have gained popularity among Canada-bound students in recent years. A recent IDP Education financial report highlighted on-shore testing in Canada as a bright spot in challenging times.

One fun wrinkle is that while university applications require scores from the IELTS Academic test (or some other academic-ish English test), the PGWP requires scores from the IELTS General test (or the CELPIP General test or the PTE Core). So even if your course of study was short enough that your original test score is still valid following your graduation, you will have to retest. Savvy timing could possibly enable people to use the same scores for PGWP and PR, though.

The IELTS partners have published a research report comparing the new TOEFL to the old TOEFL and current IELTS.  The report says that:

“Given the new TOEFL iBT’s substantially reduced text length and complexity, its narrowing of task types, the potentially increased susceptibility to ‘coaching’ and its new scoring system, among other considerations, this report concludes that the revised TOEFL iBT represents a substantial construct shift, undermining assumptions of score equivalence with earlier TOEFL versions. The report warns about the risk for score misinterpretation that can arise from using legacy concordance tables.”

Furthermore, the authors note that:

“…these changes warrant careful caution with respect to test comparability and the interpretability of scores, which raise important questions about the continued suitability of the test for the academic domain, a high-stakes context.”

Strong words.

This is a long report, but a summary of concerns raised in the article might include the following:

  1. The new c-test item represents “a relatively narrow construct that does not fully align with broader communicative testing principles.”
  2. The inclusion of daily life reading items “warrants caution when interpreting New TOEFL 2026 reading scores as directly comparable to IELTS or Former TOEFL 2023.”
  3. The use of shorter reading passages has “potential consequences for the representativeness of the construct sampled and the defensibility of academic-readiness interpretations.”
  4. The use of shorter items means that the listening section “no longer fully reflects the B2 descriptors requiring the ability to process extended, complex oral input.”
  5. The new email task “is likely to capture only rushed sentence-level construction rather than the planning and audience-oriented discourse management that characterise effective email writing” and could have negative washback effects.
  6. The test’s technical manual “does not clearly state whether every response is double-scored or whether some are scored solely by automation.”
  7. The listen and repeat task “may contribute only narrow evidence about oral ability and should be interpreted cautiously if used to support broad claims about communicative speaking proficiency.”
  8. It is unclear whether the stuff mentioned in the scoring rubrics for speaking is actually captured by the automated scoring engine.

But that’s just a quick list. There is more in the article. Good lord, there is more.

It is worth noting that quite a few good schools in the UK haven’t yet posted cut scores for the new TOEFL (I’m working on a list, but it is slow going).

A few weeks ago, the Wall Street Journal reported that ETS has explored the possibility of selling both the GRE and TOEFL tests.  In the days ahead, it may be worth exploring how things got to this point.  After all, here we have a major non-profit testing organization allegedly considering the sale of its two biggest testing products.

As I’ve indicated here before, my interest in the business practices of testing organizations is rooted in the fact that they have an enormous impact on the lives of tens of millions of young people around the world.  When it comes to organizations providing language testing, that impact is mostly felt by people living in the Global South who are asked to pay test fees that can sometimes equal a week or even a month’s wages.  Accordingly, I think it is incumbent on everyone to keep a close eye on the balance sheets of such organizations.

With that in mind, I present part one of my 435-part series, “Better Know a Testing Company.”

This installment explores some of the challenges ETS may have faced in recent years when it comes to program service revenues. It covers the state of the GRE and TOEFL, as well as a few other tests provided by ETS.  To summarize my point: revenues related to testing appear to be down.

It is often the case that when revenues are down things have to change.

Don’t take this the wrong way, though.  The piece is not a judgement of ETS’s business practices.  Indeed, revenues are down mostly because of factors outside of that organization’s control.

GRE

GRE volumes peaked in 2015/16 when the test was taken 584,677 times.  A year later Science published an article titled “GREs Don’t Predict Grad School Success.  What Does?”  And the Atlantic published “The Problem with the GRE.” This was the beginning of #greexit. Note that this was a pre-pandemic phenomenon.

By 2023/24, the number of test takers had dropped to 256,215.

A volume drop of 328,462 tests at $220 a test means a loss in revenue of about 72 million dollars. But it also means a loss of revenue from high-margin extras like score sending fees ($40 per school), the GRE search service ($1.25 per lead) and PowerPrep practice tests ($45 each).  Heck, test takers can still pay $20 to get their practice essays scored by the e-rater in just a few seconds.
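For anyone who wants to check my back-of-the-envelope math, here it is as a quick sketch in Python. The volume figures and the $220 base fee come straight from the text; this only covers base-fee revenue, not the high-margin extras:

```python
# GRE revenue-loss estimate, using the volumes and base fee cited above.
peak_volume = 584_677    # tests administered in 2015/16
recent_volume = 256_215  # tests administered in 2023/24
base_fee = 220           # USD per test

drop = peak_volume - recent_volume
base_loss = drop * base_fee

print(f"Volume drop: {drop:,} tests")
print(f"Lost base-fee revenue: ${base_loss / 1e6:.1f} million")
```

That lands at roughly $72 million, before counting a single score report or practice test.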

That’s a lot of money.  And it is worth mentioning that half of the GRE subject tests (remember those?) have been completely discontinued.

Was this avoidable?  I’m not sure.  ETS shortened the test from four hours to two hours in 2023.  That reduced their costs a bit and made it more attractive to students… but if schools are communicating that the GRE doesn’t play a big role in admissions decisions, there isn’t much that ETS can do to convince people to take the test no matter its length. Perhaps ETS could have built a test that admissions people found more convincing and useful, but that’s an incredibly tall order.

TOEFL

As for the TOEFL, volume numbers aren’t publicly disclosed.  According to comments made by ETS head Amit Sevak to the PIE shortly after he joined the firm, the test was taken about a million times circa 2021/22.  By all accounts, it has lost some market share to Duolingo and other providers since then.  Take my bloviations with a grain of salt, as always, but I peg it at about 700,000 administrations per year nowadays.  At an average price-tag of about $250 (the cost ranges from $180 to $475 depending on the country), that’s another $75 million in lost revenue.  Plus all the high-margin extras.
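The same back-of-the-envelope math, with my guessed volume clearly labelled as a guess (the ~1 million figure is from Sevak's comments; the 700,000 and the $250 average fee are my estimates):

```python
# Rough TOEFL revenue-loss estimate. These are my guesses, not ETS figures.
peak_volume = 1_000_000     # circa 2021/22, per comments to the PIE
current_volume = 700_000    # my estimate of current annual volume
avg_fee = 250               # USD; actual fees range from $180 to $475 by country

loss = (peak_volume - current_volume) * avg_fee
print(f"Estimated lost revenue: ${loss / 1e6:.0f} million per year")
```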

Could this have been prevented?

To some extent, perhaps.  The TOEFL is still useful. People still need to take English tests.  The problem here isn’t that the test has lost its perceived usefulness; it is still seen as very useful.  Rather, TOEFL has lost market share as students have piled into tests that they find more attractive.  Accordingly, the inclusion of more attractive features could have helped TOEFL maintain market share and volumes.

Speaking of attractive products, the TOEFL was relaunched a few weeks ago.  The new test is described as “fair, agile, smart, tailored.”  ETS has received much praise for the relaunch… with some claiming that it is, in fact, a more attractive test than the old TOEFL.

Mission accomplished!

Just remember that the new test is awfully similar to the “TOEFL Essentials” product that was announced back in May of 2021.  There is little that is totally new here. A c-test has been grafted on to the reading section, and the number of adaptive levels has been reduced from 3 to 2. The newness seems to mostly come from the fact that this stuff is now part of the mainline TOEFL iBT product, rather than being offered as a separate test.

What I’m saying here is that perhaps the old TOEFL could have been replaced five years ago.

It is also worth mentioning that many (many, many) of the people who built (and continue to build) the Duolingo English Test are former ETSers.  Had things gone a little bit differently they could have built a similar test at ETS and had it ready to go even before the pandemic hit.

TOEIC

The TOEIC was taken 4.8 million times in the final pre-pandemic year. It was taken 3.2 million times in 2024.  The volume drop is most pronounced in Japan (the test’s biggest market), where stats are publicly available.  Anecdotally, it seems to have dropped a lot in Korea (its second biggest market) as well.  ETS collects a per-head fee from local partners who run the TOEIC in each country, but I don’t know the exact amount.

Other Tests

The portion of ETS’s state-level testing business done via Questar was sold to the NWEA in October of 2021.

The HiSET test was sold to PSI in November of 2021.

The ETS Proficiency Profile was moved to Territorium in July of 2023.

ETS stopped administering and developing the SAT on behalf of the College Board in 2024.  These are duties ETS had carried out since its founding in 1947.  In its final year, ETS’s contract with the College Board accounted for 30% of its operating revenues, or about $300 million.  ETS continues to be involved in some College Board programs, including the AP tests, but the value of the contract for these services is unknown.

The Test de Francais International was discontinued in February of 2025.

From across the street, Mrs. Goodine and I spied the British Council building in Amman.  Students can sit for the IELTS there. There is also an Amideast/Jordan office somewhere in the city where they can take the TOEFL, but I didn’t see it.

I am home now (finally). If you want to chat about tests, feel free to send a note. I’ve already jumped into one project, but that will occupy only about 15 hours a week, I think. So I am also happy to work on anything interesting that needs doing.