And there are state-of-the-art weather-prediction transformers.
The reason I want to harp on this specifically is that a lot of people are selling a narrative where "AI is becoming superintelligent" or whatever by blending a bunch of separate advances that happen to use machine learning techniques into one amorphous blob. ML in science has been happening for a while, it's a great thing, and it's clear that machine learning methods are here to stay. I'm a machine learning researcher. I've understood, celebrated, and tried to help with this as best I can over the last nine years of my life. And it's been going on for a lot longer than the general public has been in this AI hype wave: the entire modern field of bioinformatics is arguably built on a backbone of machine learning, and has been since before I went to grad school.
This is really different from "we fed everything into a language model and now it's superintelligent and is making scientific advances all by itself," or even "scientists just ask ChatGPT shit and it figures it out for them." The breathless tech press makes it sound like everything that happens in AI research, which increasingly means any use of ML toolkits in the sciences (which is pervasive, and expectedly so! ML is an extension of statistics, and statistics has been the basis of science for like a century), is some amorphous force called "AI" that has suddenly gained this aggregate body of competency. Imagine if we anthropomorphized statistics that way. Or math, for that matter. This kind of narrative gives me the overall impression that this is not being talked about honestly, and it's clear that the dishonesty is profitable. I don't have to use charged words like "con" or "fraud" to think this deceptive framing is not a great thing.