The cost of indexing via a third-party API is extremely high, however. Might this work out well with an open-source model and a cluster of Raspberry Pis for indexing a large library?
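To make the local-indexing idea concrete, here's a toy sketch in pure Python that links passages by lexical (TF-IDF) overlap instead of a paid API. This is only a stand-in for real semantic similarity: an actual pipeline would swap in a local open-source embedding model, and the function names and sample passages below are invented for illustration.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute toy TF-IDF vectors (as sparse dicts) for tokenized documents."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # document frequency per term
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: (c / len(doc)) * idf[t] for t, c in tf.items()})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented sample passages: two about whaling, one about economics.
passages = [
    "the whale pursued the ship across the sea".split(),
    "a ship sailed the sea hunting the great whale".split(),
    "taxes and tariffs shaped the colonial economy".split(),
]
vecs = tfidf_vectors(passages)
# The two whale passages should score as the closer pair.
print(cosine(vecs[0], vecs[1]), cosine(vecs[0], vecs[2]))
```

Even this crude version runs comfortably on a Raspberry Pi; the open question is whether a small local embedding model gets close enough to the API's quality to make the cluster worthwhile.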
EDIT: Whoops, I found more details at the very end of the article.
I'd just reiterate two general points of critique:
1. Establishing connections between texts is a semantic exercise, and terms can have vastly different meanings depending on the sphere of discourse in which they occur. Because of the way LLMs work, the really novel connections probably won't be found by one: they function, quite literally, by surfacing what isn't novel.
2. Part of the point of making these connections is the effect the process has on the human being making them. Handing it all off to an LLM is no better than blindly trusting authority figures. If you want to use LLMs to generate possible starting points, things to look at and then verify and research yourself, that seems totally fine.
"Don't be curmudgeonly. Thoughtful criticism is fine, but please don't be rigidly or generically negative."
But this thing isn't (so far as I can tell) even slightly proposing that we feed books into an LLM instead of reading them. It looks to me more like a discovery mechanism: you run this thing, it shows you some possible links between books, and maybe you think "hmm, that little snippet seems well written" or "well, I enjoyed book X, let's give book Y a try" or whatever.
I don't think it would work particularly well for me; I'd want longer excerpts to get a sense of whether a book is interesting, and "contains a fragment that has some semantic connection with a fragment of a book I liked" doesn't feel like enough of a recommendation. Maybe it is indeed a huge waste of time. But if it is, it isn't because it's encouraging people to substitute LLM use for reading.
I'm a Literature major and avid reader, but projects like this are still incredibly exciting to me. I salivate at the thought of new kinds of literary analysis that AI is going to open up.