Working at the cutting edge of AI is, unfortunately, expensive. Google, for example, has not only DeepMind but also Google Brain, Research, and Cloud, along with TensorFlow and TPUs; they account for roughly a third of all research in the field and even hold their own AI conferences.
I strongly believe that compute horsepower will be necessary, and maybe even sufficient, to achieve AGI. Progress in AI has historically been driven by systems: compute power, data, and infrastructure. The fundamental algorithms in use today have remained largely unchanged since the 90s. Algorithmic advances can be quickly implemented and integrated, but without the scale to exploit them, they have little impact.
OpenAI seems to be burning through cash, and its funding model may not allow it to compete seriously with Google, a company worth roughly 800 billion dollars. Continuing to publish research openly could inadvertently benefit Google, which can easily incorporate any advances at scale.
A for-profit pivot could provide a more sustainable revenue stream and attract significant investment. However, shifting focus to building a product from scratch would detract from AI research and take a long time. It is uncertain whether such a company could ever catch up to Google's scale, and investors might push it in the wrong direction.
Attaching OpenAI to Tesla as its cash cow could be a promising option. Tesla's existing infrastructure and technology could be leveraged to accelerate development of a full self-driving solution, with Tesla benefiting from OpenAI's expertise. Success there could significantly increase Tesla's market cap and fund AI research at a much larger scale.
I see no other option with the potential to reach sustainable Google-scale capital within the next decade.