
Two years ago Google famously observed that neither it nor OpenAI had a moat when it came to AI — meaning neither had a protected business model it could monopolize to build revenue streams in the tens or hundreds of billions of dollars.
As 2025 starts, this is even more true. Open-source AI is now mere weeks behind the leading cutting-edge efforts into which investors have poured hundreds of billions, hoping for 'unicorns' and big returns. The trend is largely driven by companies trying to 'poison pill' each other's efforts to pull ahead. The logic being: if I don't get to be the unicorn earning hundreds of billions, at least I can stop anyone else from being one.
It's worth asking: how much longer will this trend last? Will it last all the way up to the development of AGI?
If it does, it has some profound implications. It means that when the power of AGI arrives, it won't be in the hands of the few; it will be in the hands of the many. The arrival of AGI was always going to be a profoundly disruptive event, and now it seems how it plays out may be even more unpredictable.
What if Open-Source AI continues to equal investor-funded AI all the way to AGI?
by u/lughnasadh in r/Futurology

3 Comments
>If it does it has some profound implications. It means when the power of AGI arrives it won’t be in the hands of the few, it will be in the hands of the many.
The "*leading cutting-edge efforts investors have poured hundreds of billions into*" you mentioned are already in the hands of many. Just pay the monthly subscription and you've got it.
I suspect open source is far more than "mere weeks behind" o3 — if the hype is real. But we will see; I very much could be wrong.
>If it does it has some profound implications. It means when the power of AGI arrives it won’t be in the hands of the few, it will be in the hands of the many. The arrival of AGI was always going to be a profoundly disruptive event, now it seems how it will play out may be even more unpredictable.
Just because a model is open source doesn't mean just anyone can use it however they want. There are still gatekeepers. Llama 3.1 405B is open source, but do you have the eight A100 80GB cards, at $17,000 a pop, needed to run 405B locally? If you don't, you are still at the mercy of a cloud service provider and handing your data to someone else.
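To see why the hardware is a gatekeeper, here's a rough back-of-envelope sketch of the memory math (my own illustration, not from the thread): it counts only parameter weights at a few precisions and ignores KV cache, activations, and framework overhead, which all add more on top.

```python
import math

def weights_gb(n_params_billion: float, bytes_per_param: float) -> float:
    """GB needed just to hold the model weights (no KV cache or overhead)."""
    return n_params_billion * bytes_per_param  # 1e9 params * bytes / 1e9 bytes-per-GB

def gpus_needed(total_gb: float, gb_per_gpu: float = 80.0) -> int:
    """Minimum number of 80GB cards to fit the weights alone."""
    return math.ceil(total_gb / gb_per_gpu)

# 405B parameters at common serving precisions
for precision, nbytes in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    total = weights_gb(405, nbytes)
    print(f"{precision}: ~{total:.0f} GB of weights -> {gpus_needed(total)}x A100 80GB minimum")
```

At fp16 the weights alone are ~810 GB, which is why eight 80GB cards (640 GB) only become workable once you quantize to 8-bit or below — and even then the real requirement is higher once you add KV cache and overhead.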
Running this stuff at scale, with a wrapper around it offering additional functionality, is worth paying for — and people will pay.