Speaking of paper score reports, I took the LANGUAGECERT academic test while working on a project last month.  This week I got my score report in the mail. As you can see, it arrived in a nice cardboard mailer.  Inside the mailer is a sealed paper envelope containing the score report itself, along with another sheet of cardboard to provide a little more protection.  The score report is a single page, with the score on one side and explanations on the reverse.

This is provided at no extra charge.  Test takers are not required to opt in.  How nice.

The delivery originates in Germany, which regular readers know has competitive rates for overseas shipping.

Might just be my pack-rat sensibilities kicking in, but maybe they should toss in an inspirational bookmark or collectable postcard.  Or stickers.  Kids love stickers.

The IELTS partners have published a new guide to assessment literacy and picking English tests.  It’s pretty good.

They suggest asking five questions:

  1. Is there research to support test validity?
  2. How are speaking and writing assessed?
  3. How do your tasks align with academic demands?
  4. What security measures are in place?
  5. Is it possible to review how test taker performance is assessed against specific criteria?

These are all excellent questions. Score users should certainly ask them.  One might also read Goodine’s Guidelines.

From a business perspective, these questions highlight the tight spot IELTS Official continues to find itself in.  On one side, it faces competition from more contemporary tests like the PTE Academic and the Duolingo English Test, which are successfully arguing that shorter items (what IELTS calls “limited-response items”) have a role to play in snapshotting an applicant’s language proficiency (even if only in combination with longer items).  On the other side, it faces competition from fairly traditional tests like LANGUAGECERT which, as recent events have suggested, may be supported by much more stringent security measures than the IELTS.

City & Guilds Institute has sold its assessment business to PeopleCert (they own LanguageCert) for an undisclosed amount.  According to FE Week, CEO Kirstie Donnelly and almost all of City & Guilds’ 1,400 staff will move over to the PeopleCert organization. They will leave behind a very wealthy charity which will, one assumes, be seeking a new role to play in the world. PeopleCert has a press release out.

Recall that PeopleCert earlier acquired the City & Guilds English assessment business and from that formed the LanguageCert group of tests.

I’ve written quite a few times about non-profit assessment organizations selling their assets to buyers from the for-profit world.  There will be more sales to come, I’m quite certain.

The UK Home Office has published a fifth request for information regarding the Home Office English Language Test (HOELT). This one is a shocker.  It notes that “the Home Office is exploring a ‘Digital by Default’ service, with remote proctoring as the primary mode of delivery and physical test centres available where remote solutions are not feasible.”

This could explain the curiously low number of test centers mentioned in the fourth RFI, again listed as just 268 centers across 142 countries.

A Home Office choice to go with remote proctoring by default might favor a smaller test provider – like LANGUAGECERT, Duolingo or ETS – heretofore considered an underdog in the race to win the tender.  All three of those providers are well known for offering robust remote-testing options to test takers around the world.

On the other hand, the IELTS partnership (widely considered a front-runner to win the HOELT tender) currently offers remote tests only in select markets, while Pearson (another favorite) pulled the plug on its remote options back in 2024 shortly after stories broke about widespread cheating on the at-home PTE Test.

Of course this doesn’t mean remote testing is a sure thing. But it is worth paying careful attention to the possibility.

The Australian Department of Home Affairs has finally released new English test score requirements for visa applicants.  As most readers know already, CELPIP General, MET and LanguageCert have been added to the list of acceptable tests.

Additionally, many of the section score requirements have been adjusted.  I won’t list them all here, but a few examples might be useful.

For instance, “proficient English” was formerly achieved by earning 65 points in each of the four sections of the PTE test.  Now, that requires the following scores:  listening 58, reading 59, writing 69, speaking 76.
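To make the change concrete, here is a minimal sketch of a threshold check against the new PTE section minimums for “proficient English” quoted above (listening 58, reading 59, writing 69, speaking 76). The function name and score dictionaries are my own invention for illustration; minimums for other levels aren’t shown.

```python
# New "proficient English" section minimums for PTE, per the updated
# Australian Department of Home Affairs requirements quoted above.
PROFICIENT_MIN = {"listening": 58, "reading": 59, "writing": 69, "speaking": 76}

def meets_proficient(scores: dict) -> bool:
    """True if every section meets or exceeds its minimum."""
    return all(scores.get(sec, 0) >= req for sec, req in PROFICIENT_MIN.items())

# Under the old rule, a flat 65 in every section qualified.
# Under the new rule, that same score set fails on writing and speaking.
print(meets_proficient({"listening": 65, "reading": 65, "writing": 65, "speaking": 65}))  # False
print(meets_proficient({"listening": 58, "reading": 59, "writing": 69, "speaking": 76}))  # True
```

The check makes the shift obvious: the overall bar hasn’t simply moved up or down, it has been rebalanced per section, with speaking taking the biggest jump.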

Indeed, applicants submitting PTE scores will need higher speaking results across the board (with one exception, see below).  Some have speculated that this might make the test less attractive moving forward.  As has been discussed in this space many times, perceived easiness is always top of mind when students pick a test.

Applicants submitting TOEFL scores will also need better speaking results than before, though the increase isn’t as dramatic as for Pearson’s test.

Interestingly, requirements for subclass 485 visas (Temporary Graduate Visa) have been lowered slightly (but not for IELTS).  Also, TOEFL and PTE requirements for “functional” English have been lowered slightly.

Changes apply to tests taken on or after August 7.  I’m not sure how long old test results can be used.

I can’t find any word about what will become of TOEFL iBT when that test changes in January of next year.

As predicted in this space, both LANGUAGECERT and Michigan Language Assessment have announced that their tests have been approved by the Australian Department of Home Affairs. They will be added to the list of tests that can be used to prove one’s fluency in English when applying for a visa.

Still waiting for something from the team at CELPIP. Also waiting for the Department to announce specific score requirements.

It dawns on me that new requirements from the Australian Department of Home Affairs will probably kick off around August 7.  That’s the date that the big PTE changes will take effect… changes which were partially (largely?) mandated by the DHA.  It will be nice to finally stop checking the DHA website for updates after making my morning coffee.

A few things come to mind:

  1. LANGUAGECERT, CELPIP and MET have all been through the arduous DHA acceptance process. As part of that process, they have all published concordance studies linking their tests to IELTS.  I don’t see any reason why these three products won’t be added to the list of acceptable tests.  These tests currently have fairly small volumes, but they are backed by organizations with very deep pockets (PeopleCert, Prometric and Cambridge University Press & Assessment) and will grow over time.  They will draw customers away from PTE and IELTS.
  2. As I wrote yesterday, the DHA now has access to concordance tables for speaking, writing, reading and listening.  Accordingly, we might see adjustments to the required section scores for Australian visa applications.  Notably, we might see higher PTE speaking requirements, which could slow the use of the PTE for Australian visas.  As regular readers know, in recent years that test has become somewhat dominant among individuals going to Australia. That’s partly because the required PTE scores are perceived to be easier to meet than the required IELTS scores.
  3. Obviously the enhanced TOEFL (launching January 2026) has not been approved by the DHA.  Given the scope of the changes to that test it probably never will be.  One imagines that ETS will maintain a version of the classic TOEFL iBT solely for Australia-bound students (and for the handful of other use cases that are unlikely to accept the new test), but it is sometimes hard to gauge what the folks in New Jersey are thinking these days.

That’s all I can think of now.  Lemme know your thoughts.

The folks at LANGUAGECERT have published a new report that compares LanguageCert and IELTS scores.  Note that the report is dated October 2024, but was only made available in the past couple of weeks (before that a preliminary report was available).

The report says:  “The current study found a very high overall correlation between LANGUAGECERT Academic and IELTS Academic (r = .87). This strong correlation is important as it suggests a substantial similarity in the constructs measured by the two tests. It implies consistency in how the two tests rank test takers according to their language abilities and that both assess similar aspects of academic English proficiency. Scores on one test can be reasonably indicative of performance on the other.”
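For readers curious what an “overall correlation of r = .87” actually measures, here is a minimal sketch of the Pearson correlation coefficient, the standard statistic behind figures like this. The function is a plain-Python implementation of the textbook formula; the paired score lists are invented for illustration and have nothing to do with the study’s actual data.

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two paired lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired results: one test's scale scores vs another's bands.
# These numbers are made up purely to show the calculation.
test_a = [55, 60, 62, 70, 75, 80, 85]
test_b = [5.0, 5.5, 6.0, 6.5, 6.5, 7.5, 8.0]

print(round(pearson_r(test_a, test_b), 2))
```

A value near 1 means the two tests rank the same people in nearly the same order, which is exactly the claim the report is making with its .87 figure.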

It is worth mentioning here that the content of the LanguageCert test is really quite similar to the IELTS Academic test (a theme I explored last year after taking the test).  About half of this report compares the content of the two tests and while differences are highlighted, they aren’t quite as striking as the differences between, say, the IELTS and TOEFL tests.

Do take a moment to check page 44 of the report for some comments about how familiarity with a test can impact this sort of report.  That’s a topic I’ve been mulling over since reading the new TOEFL/IELTS concordance a few days ago so I’m really happy to see it directly addressed in this report.  In my neck of the woods a participant in this sort of study might be really, really, REALLY familiar with one test… and have just a passing familiarity with the other one.

Finally, there is some interesting data on page 40 that hints at the markets in which this test enjoys the most success.  The top three nationalities of participants were:  Chinese (47%), Indian (26%), Iraqi (9%).  Some may be surprised that the LanguageCert test has a large Iraqi customer base, but that does track with what I’ve been told in recent months.

Slightly interesting fact: when writing about the TOEFL/IELTS concordance I observed that none of the participants achieved an IELTS writing score of 8.5 or 9.0.  Of the 1008 participants in this study, one had a writing score of 8.5 and no one had a score of 9.0.

LANGUAGECERT has just kicked off a promotional campaign called “Lives Retold.”  It looks like the campaign will, through video vignettes, tell the stories of people who used the LanguageCert test to help start their lives abroad.  The first is about Zhenyi, who moved from Liaoning (China) to Manchester after taking the test.  I lived in Liaoning for several years (a long time ago), so I really dug this particular story.

Well-made content with good production values can put a human face to English tests.  Beyond promoting products and attracting “likes,” it can also build enthusiasm and community amongst the people that influence test taker decisions.  Community building is something test makers used to do really well, but sometimes struggle with nowadays.

A couple of weeks ago I received this keen certificate from LanguageCert in the mail.  It was printed on nice cardstock and came protected by a durable mailer.  Everyone who takes the LANGUAGECERT Academic test gets a certificate – it isn’t necessary to opt in and no extra fees are charged.

I appreciate how, at $165, the LanguageCert test represents a pretty decent value.  It seems to be part of a new category of at-home tests that has emerged in recent years.  We might dub the category “affordable-traditional.”  This category also includes the Kaplan Test of English which costs about $149, and the Password Skills Plus Test, which costs about $139.

The category has become particularly valuable for test takers as fees charged by more established companies have increased precipitously in recent years.  At-home tests from legacy firms cost more than $400 in some countries.*  Notably, the at-home TOEFL just hit $470 in its most expensive market.

Of course my friends at Duolingo will be quick to point out that their test costs just $70.  But I think you get my point – in an ideal world individuals aren’t charged an arm and a leg just because they opt for a more traditional testing format.

*Yes, some testing companies charge a different price depending on which country the test-taker is located in. Prices can differ by hundreds of dollars depending on the location of the test-taker.

So here’s my LANGUAGECERT statement of results along with a separate certificate I was issued.  As you can see, the statement of results provides the sorts of details one normally finds in such a document, while the certificate is more concise and better suited for sharing online.  The certificate is a nice touch as traditional score reports necessarily contain a lot of clutter, including certain personal information that some test takers would rather not share publicly.

The results took nine days to reach me, including the January 1 holiday.  That’s a bit longer than the category average.

If my reading of the LANGUAGECERT website is correct, a paper version of the statement of results is sent automatically to all test takers.  That’s nice too.

You can also read part one of my experience taking this test, and part two.

A few more stray notes about the LANGUAGECERT Academic test before I move on.

(you can read my initial notes over here)

  1. The test really does feel like a somewhat modernized take on the IELTS Test.  A frequent complaint about the IELTS is that it hasn’t changed a lot since the 1990s. One gets the impression that the people behind the LanguageCert product set out to design something very much like the IELTS… but contemporary. Many of the items on this test are broadly similar to those on the IELTS, with small tweaks. Which is fascinating, as most test makers seeking to compete with the IELTS have gone in a wholly different direction. I wouldn’t be surprised to learn that LanguageCert is staffed by a lot of ex-IELTS folk.

  2. Again, I want to emphasize that being able to do a room scan with my phone was pure bliss. It sometimes seems that test makers are not aware of how frigging terrible room scans are. Most people take tests using a built-in laptop camera and must carry their machine around the room to complete a scan of the walls, of the ceiling, of under their desk, under the seat of their chair, inside their desk drawers, etc.  And this experience has only gotten more burdensome. I recently related a humorous story of how one test taker was required to open the door to their flat to show the public hallway and elevator to their proctor. I was once required to hoist my heavy office chair to chest level to give my proctor a peek at the bottom.  Since I am middle-aged and out of shape, that request left me with a sore lower back for the rest of the afternoon. Needless to say, all of this is easier to do with a phone than with a bulky laptop with peripherals dangling off of it.  I urge all test makers to consider implementing a “scan by phone” option. Even if you don’t think it is necessary to require the phone as a secondary camera while the test is in progress, you should consider implementing it as an option for room scans.

  3. There is so much Australian-accented English on this test that one would think it is a made-in-Australia test. Perhaps the folks at PeopleCert are making a play for visa acceptance in that country. Most tests include a few dudes that sound like Geoffrey Rush and leave it at that. That’s not the case here. The speakers are immediately identifiable as Australian.

  4. I’ve attached a screenshot to highlight the “NVIDIA Issue” I mentioned in my previous post.  Note the three NVIDIA applications that the test software detected. I hit “okay” multiple times with no effect. The software was not able to shut them down and no error message was displayed. I worked around this by shutting down the test software and manually disabling the NVIDIA junk… but by the end of the test, it was running again. I’ve been writing about this potential security issue since 2022. I am convinced that it is the cause of many “unauthorized software detected” cancellations of various test products.

  5. The second screenshot is from my account following the test. The UI is very nice.