Duolingo’s Q2 results for 2025 were published a few days ago. The Duolingo English Test had revenues of $10,088,000 for the quarter. Since the test costs $70, we might estimate that it was taken about 144,114 times in the quarter. But this is a very rough estimate because not everyone pays $70. Some pay extra for faster results, some pay less by purchasing a bundle or buying through a partner organization. Others pay nothing thanks to the access program (which gave away about 25,000 free tests in 2024).

In Q2 of 2024 the test had revenues of $10,698,000 at $65 a pop, representing something in the ballpark of 164,584 total tests.

This suggests about a 12% decline in volume from last year.
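The back-of-the-envelope arithmetic above is easy to reproduce. Here's a minimal sketch in Python, assuming (as the estimate does) that every test was sold at the flat list price:

```python
def implied_tests(revenue: int, price: int) -> int:
    """Crude volume estimate: assumes every test sold at list price."""
    return revenue // price  # floor division; partial tests don't exist

q2_2025 = implied_tests(10_088_000, 70)  # 144,114 tests at $70
q2_2024 = implied_tests(10_698_000, 65)  # 164,584 tests at $65

decline = 1 - q2_2025 / q2_2024          # about 0.124, i.e. a ~12% drop
```

As the caveats above note, bundles, upgrades, and free tests mean the real figure could differ meaningfully in either direction.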

Here’s a nice post on the Duolingo English Test blog about graduate admissions at the University of Missouri.  It points out that students admitted with TOEFL, IELTS and DET scores all perform about the same when it comes to overall GPA.  In 2023/24 DET admits earned the highest overall GPA.  And in 2024/25 IELTS admits earned that honor.

That’s mildly interesting and unsurprising.  Academic success depends on a lot of factors.

More interesting are the total students admitted using each test in the period covered (23/24 and 24/25). Specifically:

  • DET: 124 students
  • TOEFL: 122 students
  • IELTS: 106 students

These figures speak to the urgency with which ETS is overhauling the TOEFL, I think.

Conventional wisdom has it that the TOEFL is trailing the DET badly when it comes to undergraduate admissions,  but that TOEFL is still well ahead when it comes to graduate admissions.  This is just one school (and DET’s acceptance for graduate programs remains spotty)… but maybe conventional wisdom requires a slight adjustment.

I read a report today from a test taker who has alleged that the proctor of her at-home test used her personal information to find her Instagram account. He then, allegedly, sent inappropriate DMs.  This sort of allegation is more common than you might think.

I’ve never liked live human proctoring, but I have begrudgingly tolerated it. In cases where it absolutely must be maintained I’ve advocated for an in-house system.  One of the many benefits of this approach is that it gives test makers more control over who can access the personal information of test takers.

That said, if your formula for “in-house” is (proprietary tech) + (a whole bunch of dudes brought in by an outsourcing company) you might as well just stick with one of the big proctoring firms. They’ll probably do a better job.

In any case, the days of live proctoring (in-house or otherwise) are likely numbered. It seems obvious to me that asynchronous proctoring (AI-based security during the test combined with a human review after the test) is the future of at-home testing.  This is Duolingo’s current approach.  ITEP uses it too.  It appears that Pearson will follow suit when they launch their new “Pearson English Express” Test later this year.  I assume that many other test makers are currently trying to figure out how to implement their own async systems.  A decade from now, I don’t think any at-home English tests will utilize live proctors.

How much security is being provided by someone who is simultaneously watching a dozen tests?  Could the same level of security be provided by someone who checks after the test?

Some test makers might protest that their customers prefer to have help with the check-in process.  That may be true if your check-in process is clunky and cumbersome.  But those rough edges can be ironed out.

I’ve taken quite a lot of at-home tests over the years.  I’m always happy when I can take a test without a live proctor, as the process is much more comfortable than with one.  Indeed, I took the ITEP today and the process was pure bliss.

Test makers who eat their own dogfood and, uh, other people’s dogfood know this already.

The incident I mentioned above is bad enough, but a few other stories I’ve heard over the past few years come to mind.  Like:

  1. Proctors insisting that test takers click the “cancel test” button after they have finished the test, causing the results to be cancelled.
  2. Proctors working from public transit.
  3. Proctors extending a room scan to the hallways outside of a test taker’s flat.  And even insisting on a peek inside their building’s elevator.
  4. Proctors mistakenly allowing rules to be broken, resulting in score cancellations.
  5. Proctors forgetting to turn their microphones off. To hilarious effect.

I could continue, but I think you get my point.

The long, long, long awaited Cambridge study about test use in the UK is now available.

For all the heavy lifting the preliminary results have been doing in IELTS marketing over the past year, there is surprisingly little in here about the value (or lack thereof) of specific new tests. That’s not meant to be a criticism, of course. The authors of the study seem to have had a higher purpose.

That’s not to say it is totally bereft of that sort of thing. The study contains statements like this:

“There is a notable divergence in the perceived value of various tests among different groups within institutions. While tests like Cambridge Qualifications are praised for their ability to prepare students for academic study, others, such as the Oxford International Education Group’s (OIEG) ELLT and Duolingo, are viewed with scepticism. Specific concerns were raised about the validity, security, and overall suitability of these newer, more efficient or less established tests. One survey respondent expressed dissatisfaction with ‘the recent decision to accept OEIG’s online ELLT for the China market only (in order to boost recruitment)’ due to its lack of credibility and associated security concerns.”

And this:

“For instance, one respondent noted that ‘students who came with the Duolingo award were not in practice equipped to deal with HE life and study’, echoing concerns found in studies about the adequacy of such tests.”

And this:

“One of the most consistent findings is that IELTS is widely regarded as the international standard or ‘common currency’.”

One imagines that the IELTS partners will continue to lean on this research study when crafting marketing materials in the years ahead. Score users might be wise to keep in mind that the criticisms mentioned in the study are anecdotal and not presently supported by comparative data about actual student outcomes. Often, the statements seem to be based on the perspectives of very small numbers of individuals.

Apart from the above, most of the study highlights concerns about English fluency on campus (quite separate from the use of particular tests) and provides recommendations for how to properly assess the worth of new tests.

I would be remiss if I didn’t link to Nicholas Cuthbert’s video from DETcon London, since it generated quite a lot of spirited discussion.  Notably, Nicholas describes Duolingo as “ahead of the game.”  And also says:  “Duolingo are winning.”

Hyperbole?  Perhaps.  Duolingo is certainly on a winning trajectory in terms of acceptance in key receiving markets, market share, brand awareness, test taker engagement, use of technology, and baffling social media campaigns.  But it is important to note that IELTS still does more test administrations than everyone else combined.  There are still people paying $530 to take an IELTS test.  IELTS will be the market leader for many years to come.  Accordingly, there is still plenty of time for the IELTS partnership to develop a “next-gen” IELTS that eats Duolingo’s lunch.

Heck, Pearson is hoping to do just that in about four months.

Recent messaging from the British Council and Cambridge University Press & Assessment suggests that the IELTS partners plan to double down on their more traditional approach to assessment.  It seems they don’t plan to change the way they assess students, but will instead encourage score users to consider more carefully which tests they choose to accept.

Regardless, I’m convinced that Cambridge has got top people working on… something.  Check out yesterday’s article in TESOL Quarterly, for instance.

A well-informed industry watcher recently expressed some incredulity that LLMs haven’t totally disrupted the high-stakes language testing sector.  He was shocked that people still pay hundreds of dollars to take a test.  My response was that this stuff takes time.  Everyone knows that university governance is a slow process.  Immigration regulations are even slower.  But things might finally be coming to a head – coincidentally both ETS and Pearson announced the existence of their “next-gen” (my term) tests at NAFSA a few weeks ago.  Things are moving a tiny bit faster now.

By the way, you must get to one of these DETcon events if you get the chance.  They are a charming combination of research presentations, community building and Duolingo’s trademark irreverence.  I understand that at the most recent Pittsburgh-based event, Duolingo CEO Luis von Ahn was subjected to an unannounced Yinzer Test.  Not sure what that is, but I suspect it is similar to a Voight-Kampff Test. In any case, the results have not been shared publicly.

Dan Isbell has written a guest post about washback in English test preparation for the Duolingo English Test blog.  It discusses preparation for the DET in particular, and for English tests in general.  Isbell divides test prep into three types:  activities that improve your English in general, activities that help you perform better on a particular test (test familiarization), and activities that help you game a particular test (templates and guessing strategies, for instance).

I understand that a full report on this topic is forthcoming.  I will add a link when it is available.

It’s an interesting thing to explore.  Test preparation always includes at least some good washback.  No matter what test they are preparing for, most test takers complete at least a few practice tests.  As a result, they will spend time consuming stuff in English and producing stuff in English.  This is good. But does it have a really meaningful impact on their fluency in the language?  I don’t know.

The TOEFL iBT contains two 800-word articles which are excerpted from actual textbooks.  Students here in Korea take their preparation pretty seriously and might complete 20 or 30 practice tests before test day (or between several test days).  That means they spend a lot of time reading some pretty dense material in English.  Does that improve their fluency?  Of course.  Does it improve their fluency a lot?  I don’t know.

My test prep niche is writing.*  It gives me great joy to know that my students walk away from their lessons with a noticeably stronger command of English grammar and language use conventions.  But, needless to say, there are faster and more economical ways to learn about sentence fragments and collocations.

Does all of this test prep mean that students spend less time on more useful and effective language acquisition approaches?  Maybe.

Is it the job of a test maker to give a darn?  Or is their only job to accurately measure language fluency? I don’t know.

A few stray thoughts come to mind:

  1. I’m interested to know how the age of a test impacts the way that students prepare for it.  As a test ages, people working in test prep become more and more familiar with its design and can use that knowledge to develop better, more granular type 2 strategies.  Elderly readers might recall that in the early years of the TOEFL iBT we had just one official book (badly written) and a handful of books from third-party publishers (even worse) to go by.  We didn’t know very much about the design specifications of test items, nor about how speaking and writing items were scored.  Things are much different now: we know almost everything there is to know, so much that it might be malpractice not to spend quite a lot of time on test familiarization strategies.  Should tests be meaningfully refreshed on a regular basis to mitigate the impact of this factor?
  2. I love reading about the early history of the Princeton Review.  That firm emerged in the early 1980s when the SAT was long in the tooth and probably at its peak terribleness.  Princeton Review taught students how to eliminate answer choices without actually reading questions.  They also taught students how to recognize unscored sections so they could enjoy a refreshing nap part way through the test.
  3. In 2019 Malcolm Gladwell and his assistant both took the LSAT for an episode of his “Revisionist History” podcast.  They got coaching from the one and only John Katzman beforehand.  The point of the episode is that time management (type 2) is the most important thing when it comes to getting a good score on this test. It made LSAT tutors really cranky.

*See also:  “The Whale,” 2022.

Here’s a new video that summarizes all of the coming adjustments to the Duolingo English Test. It describes the “interactive speaking” question I wrote about on Monday and also notes that:

  1. A “listen and complete” task will be added to the beginning of the “interactive listening” task.
  2. The “Read aloud” and “listen then speak” tasks will be removed.
  3. Minimum speaking and writing times will be removed from some of the longer speaking and writing tasks. Test takers can move along whenever they feel they’ve completed the tasks.

UPDATE:  More complete descriptions have been added to the DET Help Center.

 

Starting in July, the Duolingo English Test will include a new speaking task that seems like an attempt to simulate a back-and-forth conversation.  In this task, test takers will receive a series of short questions on one topic, followed by another series of short questions on a second topic.  They must respond quickly to each question.

Interested parties can demo the new task by taking the free practice test via the DET website. When I took the test, I received four questions about staying focused at work and four about family and growing up.   I was given six seconds to prepare each response and 35 seconds to speak each time, so quick thinking was mandatory.  I suppose this task is a good measure of one’s ability to produce spoken English fairly spontaneously. Note that a few people have reported getting just three questions about each topic.

I suppose there is a discussion to be had about how regular test revisions reduce the impact of cramming and test prep strategies on scores in general. Perhaps this will be a feature of more tests, as we move forward. I would love to see it as part of the HOELT, for one.

Below are a few screenshots from the practice test.  A new article from The Koala News references this task. The article includes a link to an information session you can sign up for to learn more.

The Duolingo English Test got a big brand refresh this week.  Everything is a whole lot more green than before, and the test’s branding leans more heavily into its connection to the main Duolingo app. Also note that the test’s Facebook page now sports a big ol’ “Powered by Duolingo” banner.

Of course, the connection to the app has always been emphasised (except for a curious rise in the use of “DET” to refer to the test in recent months).  Now it’s just turned up to eleven.

Here’s the new page:

Here’s the old:

Duolingo has reported earnings for Q1 of 2025.  Revenue from the Duolingo English Test was $11,986,000.  That’s down from $12,755,000 in Q1 of 2024.

The test cost $65 from January 1 to February 5, and $70 from February 5 to March 31. If we split the difference and assume a $67.50 price tag, we might estimate that the test was taken 177,570 times in the quarter. That’s roughly an 18% decrease from Q1 of 2024, when we might assume the test was taken 216,186 times (at $59 per attempt).

Note, as always, that these are crude estimates – some people pay extra to get faster results, while others get a discount by purchasing bundles of attempts.  Others pay nothing at all thanks to Duolingo’s Access Program.  It is also worth considering Duolingo’s affiliate program which shares revenue with certain partners.
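For those following along, here is how that estimate shakes out in Python, with the mid-quarter price change handled by splitting the difference. (A day-weighted average — roughly 35 days at $65 and 55 days at $70 — would land closer to $68 and nudge the estimate down slightly.)

```python
# Q1 2025 straddled a price change: $65 until February 5, $70 thereafter.
blended_price = (65 + 70) / 2              # "split the difference": $67.50

q1_2025 = int(11_986_000 / blended_price)  # about 177,570 tests
q1_2024 = 12_755_000 // 59                 # 216,186 tests at $59 each

drop = 1 - q1_2025 / q1_2024               # about 0.179, a ~18% decrease
```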

Anyways. For those following along at home, here are quarterly revenues for 2024, followed by my crude estimates of how often the test was taken:

  • Q4: $11,415,000  (175,615)
  • Q3: $10,772,000  (165,723)
  • Q2: $10,698,000  (164,584)
  • Q1: $12,755,000  (216,186)

Total Tests: 722,108

And for 2023:

  • Q4: $10,819,000  (183,372)
  • Q3: $10,600,000  (179,661)
  • Q2: $9,800,000  (166,101)
  • Q1: $9,970,000  (203,469)

Total Tests: 732,603

And for 2022:

  • Q4: $8,410,000  (171,632)
  • Q3: $8,192,000  (167,183)
  • Q2: $8,036,000  (164,000)
  • Q1: $8,080,000  (164,897)

Total Tests: 667,712

And for 2021:

  • Q4: $8,095,000  (165,204)
  • Q3: $6,695,000  (136,000)
  • Q2: $4,833,000  (98,632)
  • Q1: $5,035,000  (102,755)

Total Tests: 502,591

And, what the heck, for 2020:

  • Q4: $4,197,000  (85,653)
  • Q3: $5,607,000  (114,428)
  • Q2: $4,598,000  (93,836)
  • Q1: $753,000  (15,367)

Total Tests: 309,284

There is an interesting experiment going on at the end of the DET practice test.  Duolingo seems to be trying out “describe-the-picture” tasks based on images connected to the test taker’s locality.

I’m in Korea and received a picture of a scene containing elements that would be immediately familiar to most people here.  After describing the scene I was asked how familiar I was with the image, how well I thought I did on the task, and if I would like to see similar items on the real test.

Following that, I received an image with elements that would be less familiar to people here and was asked similar questions.

I wrote a few days ago about the quick growth of some tests due to how “nimble” certain test makers are.  Duolingo certainly benefits from their ability to quickly run large experiments like this via their online practice test.

I was researching historic test volumes for a client using the Wayback Machine. I spotted an announcement from the IELTS partnership that the IELTS test was taken 3.5 million times in 2018.

And today? If we combine figures from the most recent annual reports of the British Council and IDP Education we get a total of about 3.6 million tests. So not much of a change from 2018.

In the same time frame, annual PTE test volumes have increased from about half a million to about 1.1 million. Annual DET test volumes have increased from about zero to an estimated 700,000. And, of course, the long tail of smaller tests we see today wasn’t so long back in 2018.

Some of these numbers require guesswork, but they get to the point of what I’ve been nattering on about for the past decade – the market is getting bigger, and newer, more nimble firms are taking advantage of that. Much more so than older, less nimble firms. I realize that I’m being Captain Obvious here… but I guess it is worth stating now and then.