I watched this long webinar from Times Higher Education and Oxford University Press about “rethinking language assessment for a changing world.”  It includes a wonderful exchange at around 23:00 about the use of “proxy tasks” to predict success in academic programs.  Around that point, Hannah Jones talks about the need for clear and accessible peer-reviewed research into the predictive validity of these sorts of tasks.

To me, that’s a really fair point, as we sort of take it for granted that certain tasks are more predictive than others.  But are they really?  I’ve written here about how tasks on certain tests LOOK better for various reasons (perhaps because they are quite long)… but a closer examination suggests that surface-level observations aren’t particularly useful.

Tony Green, meanwhile, talked about how English tests in general don’t have much predictive validity at all and that “looking to predictive validity isn’t a very productive route to go down.”  He mentioned that other measures, like A-Levels, have even less predictive power than these tests.

Again, this makes a lot of sense because success in university depends on a whole lot of things which are not measured on an English test.  And, of course, these things are often acquired and honed sometime after our admission.  I, for one, earned my highest GPA only in my final year of studies. Needless to say, it took me some time to get the hang of university-level work and, for that matter, the academic register.

I also liked this bit from Tony:

“…a 250-word essay is not actually what people have to write when they get to university.  And I think we could do a good deal more in helping [them] to understand that it’s not what they need to do when they get to university.  And there’s a big gap between those things.”

That’s a welcome perspective from someone working on a test.  I say this because in recent years some traditional test makers have, perhaps, been overselling their products in response to competition from newer providers.

So what does this all mean when it comes to the development and use of tests?  That’s the “rethinking” part and you should probably watch the webinar to learn more about it.

I made it home from London! I had a lovely time, and I do recommend the PIE events to anyone seeking to expand their knowledge of the international education sector and to connect with some fairly interesting folks. And if you are just interested in tests? Good lord, do the test companies have a presence. There will be an event in Australia in July… or you can wait ’til April of ’27 for the next London gathering.

A few stray thoughts:

  1. I gave a talk about testing in 2026, and I think it went pretty well. There was a big crowd, I said everything I wanted to say, and I got a decent amount of participation from the audience. I signed up for this in part to get out of my comfort zone, so I’m happy to have done a good job. Regular readers might argue that talking about English tests is precisely my comfort zone, but you know what I mean…
  2. I can’t mention all the excellent presentations, but I will highlight “Kathmandu or Kathmandon’t? Is Nepal a viable recruitment market?” That’s a market I should know more about. I’m always happy to learn about the scene in specific countries. There were a few interesting comments and questions about proof of language proficiency.
  3. Walking around and around the exhibition space, I was able to chat with kind folks from just about every testing company.  And a handful of off-site meetings took me all over London (and beyond). I learned an incredible amount about these assessments, and collected enough literature to keep me learning for some time to come.
  4. I learned a bunch about the new ISE Digital Test from Trinity College London (a blind spot because of its newness), about the new PTE Express, about the new TOEFL… about all the tests.
  5. Notwithstanding recent changes to the TOEFL, I’m starting to get the impression that integrated questions are on the rise.
  6. I went to Oxford to gain a bit of knowledge about the Oxford Test of English. That’s been a real blind spot for me since it is only taken on-site and they don’t have any test centers in Korea. And, as a newish member of the “long tail” of upstart tests, it hasn’t been covered in the press as much as other products. But it is backed by an impressive depth of talent that few testing companies could ever hope to match. That may set it apart from the pack. We should all keep an eye on it.
  7. Oxford is a pretty town. The realization that I would not be able to remain there for the remainder of my days filled me with some wistfulness.
  8. I actually recommended a couple of test centers in Korea that the Oxford test might make use of. But a later meeting with some folks who are actually in that business showed me that there is a lot more to the industry than I am aware of.
  9. If you go to the British Library you can peer at the handwritten manuscript for Mrs. Dalloway. How about that?