The British Council’s 2022-23 annual report is now available. I believe it covers the twelve months ending March 2023.

The British Council delivered 1.8 million IELTS tests in this period, a 12% increase on the year ending March 2022. Compare that to IDP’s most recent annual report, which noted just a 1% increase in the volume of IELTS tests delivered (and, notably, a 5% decrease in testing revenue in India).

Remember that BC doesn’t deliver IELTS in India nowadays (it ceded that market after selling its Indian IELTS business to IDP in 2021). It looks like the rest of the world is a growth market for IELTS, while India is somewhat flat.

A new preprint from Daniel Isbell and Nicholas Coney examines how English language proficiency tests are used at 146 research-intensive universities in the USA. It looks at which tests are accepted for admissions, what cut scores are used (with comparisons across tests), and how subscores figure into admissions decisions.

The authors found that the TOEFL iBT, IELTS, Duolingo, and PTE-A are the most widely accepted tests, in that order.

A few fun bits:

  • The TOEFL iBT is accepted for unconditional undergraduate admission at 135 schools, the IELTS at 133 schools, the Duolingo at 110 schools, and the PTE-A at 61 schools.  I suppose this will be a priority for the folks at Pearson in the years ahead.  Though the test has (I think) moved into the #2 spot worldwide in terms of test-taker volume, it still has plenty of room for growth in this area.
  • For unconditional graduate general admissions, the numbers are a bit different. The TOEFL is accepted at 117 schools, the IELTS at 116, the DET at 62, and the PTE-A at 54.  Obviously both the Pearson and Duo folks may wish to prioritize this area.
  • I was very pleased to see that the TOEFL CBT, which ceased to exist in 2006, is still accepted for unconditional admission to 11 undergraduate programs and 10 graduate general programs.  The TOEFL PBT, which was discontinued in 2017, is even more popular.  I suppose ETS ought to prioritize communications with score users in the years ahead.
  • As I have noted in my “score requirement tracker” posts, Duolingo cut scores have not always kept pace with revisions to their score concordance tables.

That’s just the tip of the iceberg. There is some really wonderful data here, so do check it out.

Note, of course, that the above figures may have changed since the time the data was gathered.

How closely should an English test result reflect a test taker’s ability if they go into the test mostly blind?

If we know that the test taker is at a C1 level, is it proper to expect them to get an equivalent test score (95 on the TOEFL iBT, 7-8 on the IELTS, 76-84 on the PTE Academic, etc.) if they take the test with only a cursory amount of preparation? Or is that expectation inappropriate?
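As a quick reference, here’s that concordance as a minimal lookup-table sketch. The bands are the approximate figures cited above; the upper bounds (and the idea of treating them as hard edges) are my own assumptions, since published concordance tables vary by test maker and revision.

```python
# Approximate C1 score bands, per the figures cited above. Upper bounds
# are assumptions for illustration; real concordance tables vary by
# test maker and revision.
C1_BANDS = {
    "TOEFL iBT": (95, 120),   # 120 is simply the scale ceiling
    "IELTS": (7.0, 8.0),
    "PTE Academic": (76, 84),
}

def looks_like_c1(test: str, score: float) -> bool:
    """Check whether a score falls inside the (approximate) C1 band."""
    low, high = C1_BANDS[test]
    return low <= score <= high

print(looks_like_c1("TOEFL iBT", 98))  # True
print(looks_like_c1("IELTS", 6.5))     # False
```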

I think this is probably too much to ask for, but how close should the hypothetical test-taker’s result be to their actual fluency in the language?

A wise test watcher recently noted that students pick their tests based on “perceived easiness.” I think that part of “perceived easiness” is the perception that the test result will reflect their actual English ability even if they don’t spend hundreds (thousands?) of dollars on test prep products.

Perhaps this accounts for the skyrocketing popularity of certain tests in recent years. Perhaps test takers feel those tests provide a better opportunity to certify their English language skills without investing too much in supplementary products.

Thoughts?

You know, if I were a wealthy man I would pay for my mother to take every English test with a CEFR concordance available. I’d see which results most closely reflect the fact that she’s a C2 user of the English language. I guess it doesn’t have to be my mom taking the tests, but she does seem to have a lot of time on her hands nowadays.

The folks behind the IELTS recently published a head-to-head comparison of the IELTS General and the new PTE Core, encouraging individuals on the journey to residence in Canada to opt for the former product.

The comparison is a very eloquent defense of their product, but it highlights some of the challenges that the so-called “legacy” test makers face when dealing with competition from newer tests.  Specifically, a lot of the purported benefits of the legacy tests may be considered somewhat antiquated by test-takers in 2024.

For instance, the article notes that the IELTS can be taken on paper if one prefers.  I’m really not sure that the paper option is a big selling point in 2024.  It later suggests that the IELTS is better because it doesn’t use any AI.  I’m not sure that is a big selling point in 2024 either, as people really like AI nowadays.  There is also some stuff about the decades-long legacy of the IELTS, which test-takers probably don’t care about one bit.

The article concedes that the IELTS is 55 minutes longer than the PTE Core, noting that the IELTS is “a bit longer, but we promise, we’re worth it! – we test the skills you need to succeed so you can feel confident starting your new life in Canada.”  Maybe in the distant past people thought about the positive washback of their test prep, but I’m not sure they view tests through that particular lens nowadays.

On the other hand, there is some very valid stuff about how distracting it can be to complete a speaking test in a room with many other test-takers present.  That really is something people worry a lot about.

Anyway.  Competition is very good for consumers.  I really do hope that work began on the next-gen IELTS and TOEFL tests at least a few years ago. I want them to appeal to young test-takers.  Despite my sometimes dismissive tone, I really don’t want those products to lose TOO MUCH market share to newer tests. That would be bad for consumers in the long run.

The Duolingo English Test blog has a new feature on “jagged profiles” in language assessment.  This refers to a language user who is quite strong at one or more aspects of the language and quite weak at some other aspect(s).

I was quite happy to read the following:

“Are jagged profiles common? In a word, yes! Because language assessments, and especially high stakes tests like the DET, usually evaluate multiple language skills at the same time including reading, listening, speaking, and writing, jagged profiles are often detected as a result of such tests. For example, a test taker might score high in reading comprehension but struggle with writing or speaking tasks, and consequently earn lower scores for those skills. This is a common scenario with test takers who have jagged profiles, because we have long known that production-based skills develop later than perception-based skills in L2 learning.”
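To make that concrete, here’s a toy sketch of how one might quantify a jagged profile as the spread between a test taker’s strongest and weakest subscores. The 0-30 scale resembles TOEFL iBT section scores, but the threshold is my own invention; no test maker publishes the exact criteria it uses.

```python
# Toy illustration of a "jagged profile" on a 0-30 subscore scale
# (similar to TOEFL iBT section scores). The spread threshold below is
# an arbitrary choice for illustration, not any test maker's real rule.
def jaggedness(subscores: dict[str, int]) -> int:
    """Spread between the strongest and weakest skill subscores."""
    return max(subscores.values()) - min(subscores.values())

# Strong perception-based skills, weaker production-based skills --
# the classic pattern described in the quote above.
profile = {"reading": 29, "listening": 27, "speaking": 17, "writing": 20}

print(jaggedness(profile))        # 12
print(jaggedness(profile) >= 10)  # True: "jagged" under this toy threshold
```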

Regular readers know that jagged profiles are one of the criteria used to justify the cancellation of scores on the TOEFL iBT Test.  ETS might argue that jagged profiles do not result in cancellation all by themselves, but only in combination with other factors.  That said, I’ve long called for jagged profiles to be removed from the equation entirely.  I’ve voiced that opinion more loudly since the shortening of the TOEFL iBT a few months ago, as the removal of variable (unscored) questions reduced the amount of data available to the officials in the Office of Testing Integrity who make these sorts of decisions.

Before the formal appeals process for such cancellations was removed, affected test-takers often reached out to me for assistance in planning their appeals.  Usually, that involved helping those test-takers explain the reason for their jagged profile.  I was occasionally successful in having cancellations overturned.  But usually not. You may recall the story I related last month about an autistic test-taker having his score cancelled (without a refund) in part because of his low speaking score. I reached out to him this week for an update, and was sad to hear that his scores have not been reinstated.

Like most of you, I’m a big fan of the “Tried and Tested Podcast” from PSI Services.  In an episode from October, PSI president Janet Garcia said:

“The only thing we’re trying to test is knowledge and competence.  We’re not trying to test their digital literacy skills.  We’re not trying to test their patience.”

I like that last sentence a lot.  When trying to account for why certain language tests are gaining popularity, it’s important to remember how desirable it is to provide a smooth and seamless testing experience.  It’s been about four years since the jump to at-home testing, and while everyone is getting better at it, many testing firms still struggle to deliver that kind of experience. Indeed, test-taker patience is often tested.

I still hear stories about proctors forgetting to turn off their microphones, forcing test-takers to listen to their conversations.

I still hear about room scans being requested at the most inopportune times (and without the test being paused).

People still fail pre-test checks because one major proctoring service still doesn’t account for how modern Apple computers utilize RAM (see the sketch below).

Note-taking is often clunky and awkward.

The list goes on.  I’ve talked about all this before, so I won’t repeat myself.  But, yeah, I love that line about not testing the test-taker’s patience. It’s so incredibly relevant.  If you want people to take your test, make the test experience as pleasant as you can.
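About that Apple RAM issue: modern Macs deliberately keep most physical memory busy as reclaimable cache, so a pre-test check that counts only “free” pages can report a shortage that doesn’t really exist. Here’s a minimal sketch of the difference on macOS. The 2 GiB requirement and the comparison logic are my own assumptions; no proctoring vendor publishes its actual check.

```python
import re
import subprocess

def memory_check(required_gib: float = 2.0) -> None:
    """Contrast a naive 'free RAM' check with an 'available RAM' check.

    Purely illustrative (macOS only): the threshold and comparisons are
    assumptions, not any proctoring vendor's published logic.
    """
    out = subprocess.check_output(["vm_stat"], text=True)
    page_size = int(re.search(r"page size of (\d+) bytes", out).group(1))
    pages = {
        m.group(1).strip(): int(m.group(2))
        for m in re.finditer(r"(.+?):\s+(\d+)\.", out)
    }
    free = pages.get("Pages free", 0)
    # Inactive, purgeable, and speculative pages can be reclaimed on
    # demand, so they are effectively available to a new application.
    reclaimable = (
        pages.get("Pages inactive", 0)
        + pages.get("Pages purgeable", 0)
        + pages.get("Pages speculative", 0)
    )
    required = required_gib * 2**30 / page_size  # requirement, in pages
    print("naive free-RAM check:", "pass" if free >= required else "FAIL")
    print("available-RAM check: ", "pass" if free + reclaimable >= required else "FAIL")

if __name__ == "__main__":
    memory_check()
```

On a busy Mac the first check can fail while the second passes, which is (I suspect) roughly what keeps biting these test-takers.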

Some may be interested to know that publication of the three official TOEFL books has been pushed back again. Per Amazon, they will be published on April 24. The bundles still have an April 22 street date, but I suppose that will be changed.

In other book news:

  1. The 18th edition of Barron’s TOEFL will be released on April 2.
  2. The print publication of “English,” Sanaz Toossi’s Pulitzer Prize-winning depiction of a TOEFL classroom, has been pushed back to June 25.
  3. When I reduced the price of the Kindle version of my TOEFL writing book to a buck, sales increased tenfold. But surprisingly, about half of the sales were of the print version, which kept its same old price. Maybe this is a good tip for self-publishers: if you get some attention via a cheap ebook version, people who like the look of the book will go ahead and get the paper version.
  4. I was hoping that some of the sales would result in reviews on Amazon. Sadly, only one more person wrote a review. So, if you like the book, it would be cool if you could take a moment to write a few words on Amazon.

Password English Language Testing’s at-home “Password Skills Plus” test is now available around the world. This comes after a long pilot project in the UK and Cyprus. The test is intended to be used for university admissions. I haven’t paid much attention to this one, but it looks pretty solid. I hope to take it in the near future.

A few key details are worth mentioning:

  1. The test takes about three hours to complete.
  2. Live proctoring is carried out by Examity (part of Meazure Learning).
  3. The fee is 110 GBP (about 140 USD), which is less than most tests in this category.
  4. I can’t find confirmation, but it looks like scoring is done by humans, given the 7-day waiting time for results and the ability to request that writing and speaking responses be rescored.
  5. A handful of accepting institutions are listed.

At first glance, there are a few things I really like:

  1. There is a 30-minute essay that looks a lot like the old TOEFL Independent Writing Task. I miss that task a lot. I’m not alone.
  2. There are some long and challenging reading and listening questions. The test designers have really taken advantage of the test’s 3-hour duration.
  3. In case of a test cancellation (for any reason), the test-taker can pay a 50 GBP fee to have the cancellation manually reviewed. The fee is refunded if the cancellation is overturned. I think more testing companies should offer this kind of service. Listen: doing this (and keeping it in-house) would both please your customers and give you valuable insights/data.

The language testing market is getting pretty crowded, but we’ve all seen over the past few years how much students benefit from competition. For that reason, I’m always happy to learn about a new English test.