Aside from that being false, novelty is not what I'm buying.
Word2vec was created, patented, and published in 2013 - by Googlers
which begat
Attention Is All You Need was published in 2017 - by Googlers
which begat
GPT-1 - 2018
GPT-2 was pre-trained on a dataset of 8 million web pages - 2019
None of it is more than 20 years old, let alone half a century. It is all new.
And even if it were 100 years old, you would still need the infrastructure and data to implement it on a computer. You can run it on paper tape if you want; how many tokens per second do you think that will produce?
In this case we aren't paying for the algorithm, the Pure Mathematics. We are paying for the results of applying the algorithm to real things - Applied Mathematics.
Now, some of what some people in the industry are doing might be a scam: overselling it, taking massive investment, and bankrupting Oracle! I won't comment any further on that.