A few quick notes about changes to the ALP Test at Columbia in 2025.

First up, newly accepted students (including transfer students) will still take the regular ALP Test that has been used in the past.  You can read about it in my guide, or watch my latest YouTube video about it.  If you need tutoring as you prepare for the test, you can email me at mgoodine@gmail.com

There is also some information about it on the ALP website.  Specifically, it notes:

The ALP Essay Exam is a 120 minute writing test taken by students who have been admitted into a degree program at Columbia University, and whose first language is not English. The purpose of the test is to confirm English language ability for the purpose of admission to a degree program, or for satisfaction of graduation requirements, as determined by individual schools and departments of Columbia University. The exam measures a student’s ability to apply their knowledge of English grammar and vocabulary in the context of an academic essay.

The CHANGE is that students who are currently enrolled in an ALP class will now be assessed through a portfolio of work at the end of the class.  This portfolio will include:

  • A summary
  • The third draft of an argumentative essay
  • An in-class argumentative essay on a known topic
  • A reflection 

This looks challenging, and the in-class essay is probably the most demanding part.  Previously, this assessment was referred to as the “qualifying exam,” but the ALP program doesn’t use that term anymore.

This is still pretty new, but it was piloted at the end of 2024.  At that time, the in-class essay differed from the regular ALP essay in a few ways.  Most notably:

  • Students were given the topic one week in advance.
  • Students were given the two articles in advance.  These were MUCH LONGER than the articles used in the regular ALP Test.  Students had to use the articles when writing their essay, just like in the regular ALP Test.
  • Students were given one more source ON TEST DAY, which they also had to use.  The source could be a short article, a graph, an image, or something else.

If you need some help preparing for the essay, feel free to contact me at mgoodine@gmail.com

More details are available on the ALP page.

The CGFNS (Commission on Graduates of Foreign Nursing Schools) has rescinded its decision to raise its PTE speaking score requirement for foreign-trained nurses seeking work visas for the United States.  The requirement will once again be a score of 50 (down from 63).  This comes after other issuers of nursing certifications declined to follow the CGFNS’s lead on the issue.

This isn’t an industry I follow closely, but it appears the score requirement was increased in December due to concerns that nurses were entering the USA with limited speaking skills.  According to the CGFNS, there was “an alarming increase in healthcare workers holding a lower speaking score over the past three years.”

Discussions with the US Department of Health and Human Services are ongoing, however. Those may soon result in a mandate that all issuers of nursing certificates require higher scores.

There is some interesting data buried in the CGFNS announcement. It suggests that Pearson has captured much of this lucrative market.

From 2022 to 2024, the percentage of nurses submitting PTE scores to the CGFNS increased from 7% to 50%.  The percentage submitting IELTS scores decreased from 84% to 35%.

The required IELTS speaking score is currently 7.0. Some other tests are also accepted.

Note that the CGFNS performs what is known as visa screening based on score requirements set by the federal government. State regulatory bodies set their own requirements governing proof of language skills, which nurses must also meet after arriving in the USA.

Morningstar just covered IDP Education in its regular “ask the analyst” column.  The analyst’s perspective is rosier than my own. 

The analyst feels that IDP’s present woes are mostly connected to student placement, and he isn’t too concerned about English testing.  The article notes that:

“IDP’s recent share price troubles – the stock is down more than 50% in the past twelve months – can largely be put down to concerns over the outlook for its placements business.”

This is followed by a great discussion of that business.  Do check it out.

Testing only comes up as a “bonus question” at the end.  The columnist asks:

“Could Duolingo’s offshoot language testing business, which seems to be growing like a weed, impact the value of IDP’s core IELTS language testing asset?”

The analyst doesn’t think so:

“In short, [the analyst] doesn’t appear to be too concerned at the moment. While he recognises the immense growth and success of the Duolingo English Test’s low-price offering, this business is overwhelmingly skewed to the US market. With just 10% of IELTS volumes coming from the US, this is very much a secondary playing field for IDP’s language testing business. In its core market of academic institutions in Commonwealth countries, IELTS remains by far the preferred option and Duolingo hasn’t gained as much ground.”

And:

“[he] still sees the IELTS test as enjoying a strong advantage over most other tests. With over 11,000 schools, employers and migration authorities accepting IELTS globally, students are drawn to this test due to its wide acceptance. Meanwhile, this large and growing pool of IELTS certified students and visa applicants makes it more attractive for institutions to accept, and so on.”

A few developments on the NAEP front over the past few weeks:

  • The Department of Education cancelled the national test of 17-year-olds.
  • NCES commissioner Peggy Carr was placed on administrative leave.
  • Mark Schneider, who ran the Institute of Education Sciences in the previous Trump administration, was asked about this situation by The 74.  They reported that “he’d prefer the next commissioner to have state-level experience and to be more ‘critical of these big research houses’ like ETS, which has held NAEP contracts for roughly 40 years and just won another competition in January.”
  • Liam Knox just reported that the NCES has “closer to five than 10” employees following this week’s mass layoffs at the Department of Education.
  • Update: Politico reports that the Institute of Education Sciences is down to a single employee.

It almost seems like irrelevant small potatoes at this point, but one imagines that ETS may lose some of its NAEP-related revenues despite its decision to scrub references to DEI from its public-facing materials.

The College Board messed up a recent administration of the SAT. On March 8, tests were automatically submitted at 11:00 sharp, whether or not test takers had actually finished.  This was possible because the SAT is delivered digitally nowadays.  Affected test takers were given a refund and a voucher for a free re-test.  The incident attracted some media attention: Scott White wrote an article about it for Forbes (which has been viewed 53,000 times in three days), and a few local outlets also ran items.  But the College Board’s screw-up isn’t really the point of this post.

I just want to mention that this story highlights one reason why I write so much about standardized tests of English.

When the College Board messes up, it makes headlines. Media outlets report on it, people discuss the screw-up, and there’s some amount of accountability. There is often some consideration of how certain groups are disproportionately affected when a test administration is bungled.

But when an English proficiency test is badly run, it gets very little attention. Sometimes no attention at all. I might roll out of bed and write something on LinkedIn while waiting for my morning coffee.  But not always.

I’ll give you an example.

In April of 2023, ETS stopped accepting the Aadhaar Card for admission to TOEFL test centers in India.  But for 13 months, the front page of the official TOEFL website for India still proclaimed that “ETS is temporarily accepting the Aadhaar Card as primary ID until further notice.”  The page also linked to a copy of the TOEFL Bulletin, which repeated this incorrect statement.

For 13 months, students registered for the test, paid the hefty fee, showed up at their local test center… and were turned away for not having proper ID.  ETS kept their registration fees.

My friends at ETS might argue that the correct information was listed elsewhere, or that it was included in emails sent before test day.  And, yes, a majority of students probably got the correct instructions eventually.  But the incorrect information was very prominently displayed.  For a very long time.

Had something like that happened in the USA during an administration of the SAT, it might have made the New York Times.  But since it involved an English test, it attracted no attention whatsoever.  For 13 months!

That’s today’s point.

There’s a good reason why this sort of thing doesn’t get any attention.  Needless to say, writing about English tests doesn’t pay the mortgage.

Since problems with English testing will never be closely examined, it is incumbent upon testing firms to strive to do the best they can for their communities. No one is looking over your shoulder… but you STILL have to try as hard as you can to deliver for your test takers.

Supplementary GRE fees were just hiked. It now costs $40 to send a score report after the test (a $5 increase). Rescheduling a test date now costs $55 (also a $5 increase).

I’m glad to see that the GRE website has already been updated to reflect these changes. Remember that while supplementary TOEFL fees were hiked about a month ago, the TOEFL website still lists the old prices. As a result, many test takers only learn the actual cost of essential services after they have financially committed to taking the TOEFL. That’s not good; indeed, some* have argued that it raises ethical concerns related to ETS’s commitment to fairness and transparency.

*Just me, really.

Last week, the Sunday Times published a story about the Prince of Wales delivering an address in Welsh, with proficiency in the language that the headline writer said was “thanks to Duolingo.”

The article prompted a letter to the editor from Pamela Baxter, who is Chief Product Officer, English, over at Cambridge University Press & Assessment.  While Baxter expresses her belief that apps are helpful, the letter does note that “human expertise is indispensable when it comes to language learning.”

The folks at Duolingo would probably agree with that.

Amusingly, the letter includes a somewhat clunky reference to the IELTS test.  I’ll quote the whole paragraph:

“Human expertise is indispensable when it comes to language learning. Our company has been providing language learning and testing for more than a century. Via the International English Language Testing System we are clear that people must play a critical role at every step of teaching, assessment and qualification, even as we use technology to improve the learning experience. To learn a language properly the quality and integrity of teaching and assessment matters. That means human expertise, interaction and challenge. Free apps can be a good start, but they are no alternative to the human side of learning.”

Why shoehorn the IELTS test into an otherwise well-written letter?

Well, the future King of Canada isn’t the only stuffy Brit using Duolingo’s app.  Here’s a snippet from the Duolingo English Test blog, taken from a post made in December:

“Starting on January 1, 2025, MP and Peers will be able to compete in the Westminster Language Challenge, running until the end of March. The stakes are high: the top three performers will win a share of £20,000 to donate to a charity of their choice, with the overall winner crowned Duolingo’s Westminster Language Champion at an event in Parliament in April.”

One might argue that this challenge is part of a charm offensive to gain wider acceptance of the DET in the United Kingdom. Zoom in on the second photo on the blog if you want a little proof of that.

Anyway. The people at Duolingo are quite good at what they do.  As I’ve mentioned here ad nauseam, there is more to the success of the DET than cost and perceived easiness.

In recent weeks, some people have been prompted to enter their ID number in order to download their TOEFL score report PDF.  They get a screen that looks like this:

The user is prompted to “Enter your ID information to complete your request and ensure your data stays secure.”  I can’t explain it.  Apparently, entering the information does not make it possible to download the PDF right away.  Instead, users must enter the information and then wait a few hours.  After that, the download button will appear and work as it is supposed to.

Anyway, this is really weird.  Leave a comment if you are experiencing it, and maybe let me know what country you are in.

IDP Education shares are currently trading at $8.86, reflecting a 53% decrease over the past year and a 78% decline since November 2021.

These declines come after a 1% increase in IELTS volumes in FY2023, an 18% drop in FY2024, and a 24% drop over the half-year reported on last month.
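
To put those figures in perspective, here is a rough back-of-the-envelope compounding of the reported changes (my own arithmetic, not IDP’s), which leaves volumes at roughly 63% of their FY2022 level. Take it with a grain of salt, since the final figure covers only a half-year comparison.

    # Back-of-the-envelope compounding of the reported IELTS volume changes.
    # Illustrative only: the last figure is a half-year comparison rather than
    # a full fiscal year, so the true cumulative decline may differ.
    changes = [0.01, -0.18, -0.24]  # FY2023, FY2024, latest half-year

    index = 100.0  # hypothetical index of FY2022 volumes
    for change in changes:
        index *= 1 + change

    print(f"Volume index relative to FY2022: {index:.0f}")  # prints roughly 63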

IELTS has always been very important to IDP’s bottom line. In 2019, it was responsible for more than 60% of the firm’s total revenues. Clearly then, IDP faces challenging times.

While IELTS volumes will certainly increase when regulatory and political conditions become rosier, declines in market share may be more difficult to make up. I’m not certain that future generations of test takers will rediscover the IELTS test and find a new appreciation of its virtues. Test takers aren’t like the kids today turning their backs on CDs and streaming in favor of vinyl records. Once test takers are gone, they may be gone for good.

Some may find this situation eerily familiar.

Consider the TOEFL test circa 1988. Taken about a half million times per year, it dominated its segment back then. At that time the Cambridge exams were taken about 130,000 times a year. The IELTS test was taken zero times a year because it didn’t yet exist. There were a few regional exams from Canada and Australia, but their test volumes were negligible. In 1989, things changed with the arrival of the IELTS.  In the years that followed, that test grew and grew and grew, mostly at the expense of the TOEFL. Despite fairly hard resets in 1998, 2005, and 2019, the TOEFL has been unable to recapture much of that lost market share. TOEFL’s customers departed, and they never really came back. Test volumes today are about the same as they were 25 years ago even though the number of students studying abroad has increased dramatically since then. (source: “Cambridge English Exams: The First Hundred Years”)

So how might IDP salvage the situation?

New testing products are an option. I’m happy to see that the British Council has once again become a test developer. Revenues from new in-house products like the EnglishScore and Aptis tests may help them make up for lower revenues from the IELTS. Their pals at Cambridge now offer the Linguaskill test, which might do the same. Even ETS has diversified in recent years, acquiring existing (and profitable) products from Pipplet and PSI.

But IDP? I don’t see any new testing products on the horizon.

EDIT: A commenter mentions that IDP has a product called “Envoy.”

Are things totally hopeless? Of course not. Eventually IDP will be given the green light to resume language testing in China. That’ll help a lot. And the results of the HOELT tender are due any day now. Those could help as well (unless they make things worse).

Those last two points are worth a post of their own, so I’ll leave it at that. But I would love to hear an opposing point of view if you’ve got one.

I read that results for the paper version of the OET are now available in five days or less, when the test is taken in India.

It is evident that test takers today really value quick score reporting. In the past, test takers usually gave themselves plenty of breathing room between their test date and whatever date the scores were needed. Today, they seem less likely to do that.

The Educational Testing Service (ETS) just published a document called “Reimagining Educational Assessments: AI Innovations for Enhancing Test Taker Experience.”

The document says, seemingly in reference to scoring of the TOEFL iBT, that:

“ETS combines the efficiency of AI with essential human oversight. While AI manages most of the scoring, human raters review a sample of the machine-scored responses.”

That appears to be a departure from how the TOEFL has traditionally been scored.  Until now, every response (not just “a sample”) has been graded by both a human rater and AI, and it has never been accurate to say that AI “manages most of the scoring.”

That said, the phrasing used in the document is somewhat vague.  Maybe I’ve misunderstood it.  Perhaps someone from ETS can confirm what it means.

UPDATE: I have been informed by reliable sources that there has been no change to the scoring process.

This comes a few months after ETS began the process of offshoring human scoring of TOEFL test taker responses to facilities in India.

Here’s a copy of my Duolingo English Test score report.  The scores arrived exactly 48 hours after I finished the test. You can also check out the link that Duolingo provides for easy sharing.  I’m happy to see that my user account includes a “make private” toggle, which I can use when I no longer want that link to work.

Note how the reports now contain individual and integrated subscores.

My account also includes proper integration with the “add license or certification” function of LinkedIn.  That’s something more test makers should figure out how to do.  I’m surprised it isn’t more common.
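
For what it’s worth, LinkedIn publishes an “Add to Profile” deep link that test makers could wire their score pages into. The sketch below is only my illustration of how such a link might be assembled; the base URL and parameter names (startTask, organizationName, certUrl, and so on) reflect my reading of LinkedIn’s public documentation and should be checked against the current version, and the example values are entirely hypothetical.

    # Illustrative sketch of a LinkedIn "Add to Profile" certification link.
    # Parameter names follow LinkedIn's publicly documented deep link format as I
    # understand it; verify against LinkedIn's current docs before relying on this.
    from urllib.parse import urlencode

    def add_to_profile_url(cert_name, org_name, issue_year, issue_month, cert_url, cert_id):
        params = {
            "startTask": "CERTIFICATION_NAME",
            "name": cert_name,              # e.g. "Duolingo English Test"
            "organizationName": org_name,   # the issuing organization
            "issueYear": issue_year,
            "issueMonth": issue_month,
            "certUrl": cert_url,            # hypothetical public results link
            "certId": cert_id,              # hypothetical certificate number
        }
        return "https://www.linkedin.com/profile/add?" + urlencode(params)

    # Hypothetical example values, not real score data:
    print(add_to_profile_url("Duolingo English Test", "Duolingo", 2025, 3,
                             "https://example.com/results/abc123", "abc123"))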

From my account, I’m able to access the video interview and writing sample that are sent to score recipients, but I don’t think I can share them publicly.  Perhaps that’s something that could be implemented in the future.

I understand that some test makers still don’t send speaking and writing samples to score recipients.  That’s something else they should figure out how to do.