Duolingo’s quarterly financial report was published a few days ago.  It indicates that for the three months ending September 30, revenue from the DET was about $10,772,000.  If we divide that number by the cost of the DET ($65), we can estimate that the test was taken about 165,723 times in the quarter.  That’s down about 8% from the same period last year, when the same math suggests the test was taken about 179,864 times.

Keep in mind that this is an imprecise calculation.  Some people pay less than $65 by purchasing a bundle of tests.  Others pay more than $65 to get fast results.  Some people pay less (or nothing) because they get vouchers from a third party partner or through Duolingo’s Access program.

With that said, the math suggests that the test was taken 547,223 times for the nine months ending September 30.  That’s basically unchanged from the same period last year, when the test was taken about 549,231 times.  The lack of a year-on-year decline is due to a very strong Q1 in 2024.  Indeed, I think Q1 of 2024 is actually the high water mark for the test.
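If you want to reproduce the back-of-the-envelope math, here is a minimal sketch in Python.  It assumes a flat $65 list price for every test (which, per the caveats above, isn’t strictly true), and the prior-year revenue figure is simply back-computed from the ~179,864 estimate rather than taken from a filing.

```python
# Back-of-the-envelope estimate of DET volumes from reported revenue.
# Assumes every test was sold at the $65 list price -- bundles, fast results,
# vouchers and the Access program all shift the real average price.

DET_LIST_PRICE = 65  # USD

def estimated_volume(revenue_usd: float) -> int:
    """Approximate number of tests taken, given reported DET revenue."""
    return round(revenue_usd / DET_LIST_PRICE)

q3_this_year = estimated_volume(10_772_000)  # ~165,723 tests (reported revenue)
q3_last_year = estimated_volume(11_691_160)  # ~179,864 tests (revenue back-computed
                                             # from the estimate above, not a filing)

change = (q3_this_year - q3_last_year) / q3_last_year
print(f"{q3_this_year:,} vs. {q3_last_year:,} ({change:+.1%} year on year)")  # about -8%
```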

These numbers track what’s going on in this industry as a whole.  Pearson has suggested that PTE test volumes will likely be flat for the year.  And IDP Education reported an 18% decline in IELTS volumes for the year ending June 30, 2024.

Note that if you are looking for these sorts of numbers, you have to check the 10-Q forms that Duolingo files each quarter.  Nowadays, they aren’t reported anywhere else.

I spotted a couple of interesting comments about the PTE in the transcript of Pearson’s Nine Month Trading Update conference call (which happened at the end of October).  They suggest that PTE test volumes may be flat or slightly up on the year.  They also suggest that the PTE may be taking market share from competitors in the English testing market.

During the call, a fellow from Goldman Sachs asked: “could you give some colour on PTE revenues and volumes in Q3?”

The response from Pearson was:

“On PTE, you will remember that volumes were down when we disclosed those at the half year. Revenue was down very slightly. In Q3, it did return to growth and so that indication I had given that it was likely to be down for the full year, it could be flatter, maybe even slightly up.”

More interestingly, a representative of Citigroup asked: “I would love some extra detail on whether you think you are taking share within the PTE, specifically talking about PTE, but whether you are taking share within the English language proficiency testing market.”

The response from Pearson was:

“…on the PTE topic that you asked about, Tom, again, without naming names, if you go and look at some of the competitive players in the space they will talk to you about meaningful late teens, 20% percentage reductions in testing volumes. That is what they’re seeing in the market overall. So the fact that actually Gio and the team delivered growth in PTE in the quarter in that kind of a market context, of course implies that we are taking share, and of course implies that we are executing in a very intense and focused way, which you can expect us to continue to do in the context of a market that is indeed subject to some of these policies in different countries. And we understand that market very well.”

I ventured into Seoul a few days ago to visit the Appenzeller/Noble Memorial Museum.  The museum is housed in a building, constructed in 1916, that was once home to Korea’s first “western style” school.  It’s a pleasant hidden gem in the city, and admission is free.

Across the lawn, I was happy to spot the Korean offices of the British Council!  It seems fitting that the British Council occupies space in what used to be the grounds of this historic school (which still exists and is the BC’s landlord, I believe).

You can go to these offices to take your IELTS test.  They also run programs for children, and distribute literature that extols the virtues of a nice British education.  I was also very happy to spot a Poppy Appeal donation box, perhaps the only one in Seoul.

It’s worth checking out this story of a test cheating ring in Texas if you have an interest in test security.

Basically, test takers are alleged to have paid an imposter to take a teaching certification exam on their behalf.  The imposter then bribed a test center operator to look the other way while he took the tests.  The cheating was so brazen that the imposter allegedly took more than one test at a time.

CNN’s reporting highlights some of the weaknesses inherent in the test center model.  It’s a good reminder that test centers are not necessarily more secure than at-home testing; this particular approach to cheating simply isn’t possible at home.

Canada has ended the Student Direct Stream (SDS).  This program was introduced by IRCC in 2018 to provide expedited study permit processing for applicants from select countries, including the key sending countries of India and China.

Applicants opting for the SDS route were subject to requirements beyond those in the regular non-SDS study permit application.  These requirements included a language test result.

Moving forward, all students must apply through the regular study permit stream.

Notably, the regular study permit stream does not require a language test score.  It merely requires a letter of acceptance (LOA) from an institution.  Institutions set their own language test requirements and issue the LOA once applicants have fulfilled them.

In terms of language testing, one imagines that this change will generate business for Duolingo’s DET, which is widely accepted by schools across the country but was not accepted for use in the SDS stream.  It may reduce volumes of more expensive tests like the TOEFL, PTE and IELTS, whose scores were accepted under SDS.

I suppose, though, that one should keep an eye on the specific requirements of the regular study permit stream.  Perhaps a language test requirement will be added in the future.

Note that the Nigeria Student Express program has also been eliminated.  It was similar to the SDS.

The official TOEFL website now includes an FAQ!  It’s a really wonderful thing to see.  It contains a whole bunch of questions spread across seven categories.  And it is searchable! 

If anyone from ETS is reading this, I’d be happy to check my server logs to see what answers people search for most commonly on this website.

You know, I spent much of early 2021 suggesting how the TOEFL could be improved.  At that time I wrote that ETS should:

“Provide a beautiful FAQ page that quickly answers the questions that are asked every day. This will not only improve test taker experience, but will reduce calls to your support number by a huge amount.”

As I’ve expressed to a few people in private conversation, I’m feeling pretty optimistic about the TOEFL nowadays.  I felt little optimism back in 2021 when I published a handful of articles about how to improve the experience of people taking the TOEFL. It gives me some joy to see that two thirds of the things I suggested in 2021 have now been implemented.  They include:

  1. Fix the Official Guide to the TOEFL.
  2. Implement a modern support system and FAQ.
  3. Provide more practice tests.
  4. Implement automated scoring in free practice tests.
  5. Rephrase word count recommendations.
  6. Be more transparent about automated scoring.
  7. Modernize the voucher system.
  8. Use website UX practices from 2021, not 2008.
  9. Make it possible to register for the test (and pay for it) in less than five minutes.
  10. Eliminate the TOEFL Search Service*.

The suggestions not currently implemented are the really challenging ones:

  1. Provide a free practice test that is different (almost) every time it is taken.
  2. Stop charging a fee to send score reports.
  3. Charge the same price for the home edition in every country.
  4. Get rid of the ‘cancel scores’ button at the end of the test.
  5. “Active.  Noise.  Canceling.  Headphones.”


*I think the search service has been eliminated, but I could be wrong.

The folks at Duolingo have published an article (and matching blog post) about how much time test takers should be given to complete writing tasks. Their research suggests that shorter tasks are just as useful as longer tasks in terms of reliability and validity.

This is a controversial topic among people who take the time to look at the sorts of tasks included on tests of English proficiency, and it has generated a fair amount of discussion.

Nowadays, both test makers and test takers seem to favor test forms that are shorter (in duration) than those used in the past. Since long essay tasks require a significant amount of time to complete, they are less popular than they used to be. Recall that last year the 300(ish) word “Independent Essay” task was dropped from the TOEFL, in favor of a 100(ish) word “Academic Discussion” response, meant to simulate a message board interaction. Research provided by ETS indicates that the shorter task is just as useful as the longer one it replaced.

A separate (but related) controversy relates to how closely test items should resemble real-world tasks carried out by students in the course of their future studies. The move to shorter writing tasks means that newer tests often include items that do not simulate real academic work. Some people find this problematic. Some do not. Yet others argue that the tasks on more traditional tests never actually simulated this sort of thing in the first place.

I read that the PTE test turned 15 years old last week. The folks at Pearson have accomplished quite a lot since the launch of the test. They’ve played a part in breaking up old testing monopolies and have catapulted the PTE from an annual test volume of zero to about 1.2 million, according to the firm’s most recent annual report. By my math, that makes it the second biggest test in its category.

I’m probably the last person who should be writing about the success of the PTE, since I’ve been paying attention only for the past five years. But a few things come to mind when seeking to account for what has happened. The following list is mostly for the benefit of other test makers who are trying to catch up.

Said things are:

  1. Pearson is really good at government relations. Over the past 15 years, the company has enjoyed an enormous amount of success getting its tests accepted by the Canadian, Australian and UK governments. None of the other IELTS competitors come close. ETS’s TOEFL has actually lost governments in the same time frame.

  2. The PTE benefits from so-called “perceived easiness.” When I ask students why they took the PTE or some other non-legacy test, their response usually includes some variation of “it’s easier than XYZ.” Are the non-legacy tests actually easier? Probably not. Do these students have any idea what items on legacy tests actually look like? Definitely not. Why do they think the tests are easier? Good marketing, I suppose. It probably has something to do with positive user experience and test taker stress levels as well. This is controversial, but it is a big, big factor.

  3. The PTE is not a fee-generating machine. Unlike the legacy test makers, Pearson doesn’t charge onerous fees for sending scores to institutions. It also offers generous rescheduling and cancellation policies (both can be done free of charge more than two weeks before test day). Late booking fees are reasonable.

  4. PTE results come very quickly, usually within 48 hours. When I started teaching 15 years ago, students didn’t really care about getting results quickly. They took responsibility for their deadlines, and scheduled language tests well in advance. But today’s students are different.

  5. Pearson is not a student placement agency. This makes it easier for the company to build relationships with agents in key markets like India.

There are more factors (a whole post could be written about differing approaches to at-home testing) but I will leave it at that.