The IELTS partners have published a new guide to assessment literacy and choosing English tests. It’s pretty good.

They suggest asking five questions:

  1. Is there research to support test validity?
  2. How are speaking and writing assessed?
  3. How do your tasks align with academic demands?
  4. What security measures are in place?
  5. Is it possible to review how test taker performance is assessed against specific criteria?

These are all excellent questions. Score users should certainly ask them.  One might also read Goodine’s Guidelines.

From a business perspective, these questions seem to highlight how IELTS Official continues to find itself in a tight spot. They are dealing with competition from more contemporary tests like the PTE Academic Test and Duolingo English Test, which are successfully putting forth the argument that shorter items (what IELTS calls “limited-response items”) have a role to play in snapshotting the language proficiency of an applicant (even if only in combination with longer items). On the other hand, they also face competition from fairly traditional tests like LANGUAGECERT which, as recent events have suggested, may be supported by much more stringent security measures than those of IELTS.

You can read a transcript of IDP Education Ltd’s Annual General Meeting right here. It’s mostly quite dull, but I suppose it is interesting to see how IDP’s future depends on governments becoming more open to immigration in the near future. That’s a dicey proposition in a world where voters around the globe are becoming more nativist and more nationalist with every passing day.

While I was searching the warrens of the College Board’s website, I stumbled upon something I didn’t know I needed: this technical manual for the AP International English Language Test!  Developed for the College Board by ETS and offered from 1997 to 2002, the APIEL Test was used by some schools in the United States alongside TOEFL and IELTS to confirm English fluency when admitting international students.

Testing enthusiasts might find the content of the manual somewhat interesting.  This was a three-hour test (plus time for instructions) and if you squint at it long enough you can see how it may have been influenced by research being done in support of the TOEFL iBT.  For instance, this was the first ETS test with both a speaking section and a writing section.  Meanwhile, in true AP fashion, the reading and listening items are sourced from real-world publications rather than the somewhat stilted constructions we see in tests today.  I even spotted an uncredited excerpt from “The Spy Who Came in From the Cold.”

Perhaps some readers of this space contributed to the development of the test and have some memories to share.

This test actually came up in a conversation I had at DETcon 2024; another attendee reminisced about how ETS held similar events in the 90s to promote the product. Recall that back then ETS ran most of the College Board’s testing programs.

This seems to have been the College Board’s only foray into the world of high stakes English testing (aside from their early management of the TOEFL).  I’ve often wondered if they have ever considered getting into this market.  There is still some potential for profit, I think.  Just off the top of my head I can count TEN firms trying to make money on high stakes testing for college admissions in the UK, but only four trying to profit from the much larger US market.

City & Guilds Institute has sold its assessment business to PeopleCert (they own LanguageCert) for an undisclosed amount.  According to FE Week, CEO Kirstie Donnelly and almost all of City & Guilds’ 1,400 staff will move over to the PeopleCert organization. They will leave behind a very wealthy charity which will, one assumes, be seeking a new role to play in the world. PeopleCert has a press release out.

Recall that PeopleCert earlier acquired the City & Guilds English assessment business and from that formed the LanguageCert group of tests.

I’ve written quite a few times about non-profit assessment organizations selling their assets to buyers from the for-profit world.  There will be more sales to come, I’m quite certain.

There is now a Technical Manual for the revised TOEFL iBT Test.  You can find it via its home on the website of the ETS Research Institute.

Many readers will be most interested in its detailed descriptions of the items included on the new test.

A few frequent questions have been answered:

  1. The reading and listening sections will each contain 35 scored questions in total.  The reading section might contain 15 unscored questions, while the listening section might contain 12 unscored questions.
  2. The exact number of questions of each type is listed.
  3. Everyone gets the same mix of items in the routing module, of course. The easy reading module doesn’t contain an academic reading passage and the easy listening module doesn’t contain an academic listening passage. The hard reading module contains one academic reading passage and the hard listening module contains two academic listening passages.
  4. Human raters will not score every speaking and writing response.  Just some of them.  The manual notes that “[f]or responses where the automated scoring lacks confidence or encounters difficulty, human raters step in to provide scores, ensuring reliability across all responses. In addition, a random sample of responses is regularly reviewed by certified human raters to ensure quality and inform model updates.”

Those changes to English requirements mentioned in the White Paper are set to come into effect January 8. Requirements for applicants in the Skilled Worker, High Potential Individual and Scale-up routes will increase from B1 to B2.

The change is part of a collection of changes which the Home Office says will replace “Britain’s failed immigration system” with something better.

The Home Office (and press) refers to this as requiring an “A-Level standard” of English. Presumably they mean a comparison to earning an A-Level in a foreign language, not that immigrants will have to be able to understand Wuthering Heights or Sense and Sensibility.

I took the new “English Express Test” from Pearson. That’s their new product intended for students applying to American schools. It basically exists in the “contemporary affordable” category that Duolingo has dominated in recent years.

I liked the test. Practice materials are free and seem to be plentiful. The UX is pretty smooth and I was happy to see some integrated tasks.

Read on for a detailed look at the test.

(I suppose it is worth mentioning that I write the same sorts of reports for clients, but at a much greater level of detail. You can call me if you are interested.)

Pre-Test

  1. Free practice tests are delivered within the secure app used on test day.  I think I wrote positively about this feature back in 2023 when this test was a Versant product.  And I stand by that – but I realize that some test takers won’t like it.  It means that you have to follow most of the test day rules just to take a practice test.  You must disable your second monitor, you must unplug or switch off extra microphones and headphones, etc.  And your webcam will be activated during the whole practice test (so I felt obligated to put a shirt on).  This places some amount of cruft between the test taker and the practice tests they want to take.  It likely limits the variety of operating systems and devices the practice test can be accessed on.
  2. The practice tests are scored!  Even the writing and speaking. It takes 15 minutes, though.
  3. The practice tests are just 30 minutes long.  I suppose that Pearson, like Duolingo, will eventually realize that it is best to just deliver full-length practice tests.
  4. I took two free practice tests and they were different!  I don’t know how many variations there are, but this is the future.  Test companies that are leaning heavily into making money from prep should realize that the days of selling practice tests for $55 a pop are coming to a close.
  5. The test costs $70.
  6. The user account is blissfully clean and uncluttered. I’m in love.

The Test

  1. This test is not dissimilar to the TOEFL iBT.  It has some similar question types and some similar design philosophies.  Many of the tasks are similar, and in general both seem to favor short-form items.
  2. There is not much of what one might call “academic English” on this test.  Less than the Duolingo English Test.  And less than the new TOEFL.  I counted about two such questions. They made up about 4 minutes of the 60-minute test.  There is plenty of “campus life” English, but that’s not the same thing.  This is not a complaint, but it might be a point of controversy, as some people argue that “academic English” is a thing of nebulous definition. Remember that this test hits the market just a few months before the TOEFL test relaunches in a format that drastically de-emphasizes traditional “academic English.”
  3. Note-taking is not permitted.
  4. The test UI is clean and pretty.  Prettier than average.  But it should include a button to increase the text size.  Or they should just make everything frigging huge, like on the Duolingo test.
  5. I was surprised by the number of British and Australian accents I heard, considering that this is a test only for those seeking to attend American schools.  I imagine they would be better off replacing these with non-native accents, to be honest.
  6. One of the speaking tasks requires test takers to speak for 60 seconds, but there is no timer on the screen. 🙁
  7. There are some tasks which we might call “integrated.”  I’m happy to see that Pearson is doing its best to include integrated tasks, despite the short length of the test.  The most notable is the “summarize a conversation” task where the test taker must listen to a fairly long conversation between three people and then summarize it orally.

The Security

  1. The test uses both a secure browser and a second camera.  I like that the second camera is implemented via a web-based application instead of a whole app that must be installed.
  2. No room scan, though.

Post Test

  1. Unofficial results arrived in 15 minutes! This feature could set the test apart from Duolingo’s product.  My official scores were reported in about 48 hours, as promised.  I’ll try to append a copy of my score report to this article.
  2. The test doesn’t send a “test successfully completed” e-mail at the end.  Test makers should realize that students feel anxiety if they don’t get such a message (even if the same information is available in their user account).
  3. I can’t find any research supporting the design and/or validity of the test.

 

I was happy to discover that Pearson has published updated print guides to the PTE Academic test (one for students, one for teachers). They cost AUD $86 apiece and only seem to be sold from Pearson’s Australian web store. But they do exist. Apparently.

I am still a big believer in print books because they can be stocked by libraries around the world. Libraries remain a major source of free test prep – the long reservation queues for prep books at many libraries speak to that.

The ETS Global event yesterday included updated timing details for the reading and listening sections of the revised TOEFL iBT. It is mildly interesting to see how the timing has evolved since the revisions were first announced back in July.

After three years of test makers portraying their tests as significantly shorter than competing products (and sometimes fudging the details to do so) it is cute to see how test makers are now pulling back from that strategy a bit.