So what does "AGI" actually mean? It depends on who you ask.
To many, at this point it appears to mean "A Great IPO" or "A Gigantic IPO" rather than "Artificial General Intelligence", a term that has clearly been hijacked to mean something else.
No worries, some startup will create "AGI Bench", where scoring >=80% means you're AGI, and it will be valued at $50B.
AGI - Automatically Generating Income
I mean, the goalposts have shifted. The game of Go used to be considered to require true AI. So did passing the Turing test. Scanning, analyzing, and improving complex codebases largely on their own would have counted as some sort of AGI for me six years ago.
Now sure, we all know they lack true understanding. But at times it gets blurry what that actually means.
But I don't buy that there will be a magic point where self-improving AGI explodes toward singularity. The current approach is very, very energy- and compute-intensive, and that is unlikely to change.
https://www.scmp.com/news/china/science/article/3351721/chin...
But in general I do believe AI has the potential to be a great positive for humanity on its own, if open models stay strong and control doesn't end up concentrated in the hands of a few.
And yes, humans as a whole aren't even ready for cars or nuclear weapons. We built and used them anyway.
But my brain is still pretty busy, and I don't think the younger generation is getting dumber because of LLMs; it's more from mindlessly consuming TikTok and co.
LLMs are also a great learning tool, and anyone using them should learn their limits quickly. Not all do, though. That is obvious.
A stake of > $30 billion seems like a more reasonable and realistic criterion to me.
One key thing I've heard about AGI, and what I think would be the most decisive factor for me, is a model that learns on the fly. That could be done one way or another, but when you consider that LLMs basically run like "ROM" files, with weights frozen at inference time, it gets a little complicated.
I think we need to re-imagine how LLMs are built, trained, and run. But also, figure out how to drastically lower the cost of running them.
1. since AI has captured the imagination of capitalists and they think this is the next industrial revolution, they gotta be in it to win it. combined with the fact that i believe most people here are wealthy, or at least aspirationally so, that explains half of it.
2. the other half is that AI as a tech is interesting from a mathematical and compsci point of view, though certainly not interesting enough to justify the proportion of topics about it here.
i guess i should add a 3rd reason.
3. ycomb has a financial stake in spreading the news about how wonderful this tech is!
lolol
Also: nothing gets sustained attention on HN unless good hackers find it interesting. Our entire objective is to be the website that attracts the best hackers, serves them the most interesting content and facilitates the most interesting discussions. That can’t happen if we’re nefariously pushing a commercial agenda.
Jessica Livingston's personal stake in OpenAI is at most 0.1%, and Paul Graham's, afaik, is 0.
So the bias doesn't seem as large as OP thinks.
*https://xcancel.com/paulg/status/2041366050693173393
And "toughness, adaptability, and determination" >>> "ambition", frankly
"Less" doesn't mean "not at all", of course—that would be too big a loophole. But it does mean strictly less, and we stick to that, despite its various downsides, because the upside is bigger.
In the present case, it means we haven't applied any moderation downweights to this post, even though it's obviously the sort of thing we would downweight under other circumstances, since it's neither particularly substantive nor intellectually interesting (though it could be some other kind of interesting, at least to some readers).
It’s a sobering reminder and worthy of being on the front page on that basis alone, but I don’t see much of a discussion to be had. “Unusually quiet for a front page post” is probably where this post is meant to be.
As far as I know this is the first time anyone has publicly claimed to know, quoting insider sources, what YC's actual stake in OpenAI is.
I'd go as far as to say that it's impossible at this point to form an AI company without Y Combinator investing in it.
Paul Graham of Y Combinator tweeted some positive things about Altman in response, emphasising that they didn't fire him as CEO of YC (though not going as far as declaring him trustworthy).
Now John Gruber of Daring Fireball (an Apple blog) has added context by claiming that YC owns a 0.6% stake in OpenAI, worth around $5bn, which might colour Graham's judgement.