AI capabilities have surged over the past two years, with generative tools such as ChatGPT, DALL-E, and Midjourney becoming ubiquitous. Generative AI programs are now integral to a range of applications, from drafting email responses to creating marketing copy and images. The rapid adoption of AI by individuals and companies marks a transformative shift in how technology is used.
Reports have long documented a steady rise in the computational requirements of AI development, underscoring the need for stronger infrastructure. Earlier surveys recorded uneven adoption, but recent figures point to a sharp surge. Those trends feed directly into current concerns about the sustainability of AI progress as costs and resource demands climb.
Rapid AI Adoption
A recent McKinsey survey found that 65% of companies have integrated generative AI into at least one business function, a notable rise from 33% at the start of 2023. Despite this enthusiasm, the resource-intensive nature of AI training and operations presents significant challenges. As it stands, big tech firms dominate the sector, raising concerns about centralisation.
Rising Computational Costs
The World Economic Forum reports that demand for AI computational power is growing by 26% to 36% annually. Epoch AI's research points in the same direction, projecting that the cost of training the largest models could reach billions of dollars within a few years.
“The cost of the largest AI training runs is growing by a factor of two to three per year since 2016, and that puts billion-dollar price tags on the horizon by 2027, maybe sooner,” noted Epoch AI staff researcher Ben Cottier.
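To make that trajectory concrete, the snippet below runs a back-of-the-envelope compound-growth projection. The roughly $100 million baseline for a 2024 frontier training run is an assumption made here for illustration, not a figure from the article; the 2x-3x annual growth rates are the ones cited by Epoch AI.

```python
# Back-of-the-envelope projection of frontier training-run costs.
# Assumptions (not from the article): a ~$100M flagship training run in 2024,
# growing 2x-3x per year, as described by Epoch AI.

BASE_YEAR = 2024
BASE_COST_USD = 100e6  # hypothetical baseline; actual figures vary by model

for growth in (2.0, 2.5, 3.0):
    cost = BASE_COST_USD
    year = BASE_YEAR
    while cost < 1e9:          # project forward until costs pass $1 billion
        year += 1
        cost *= growth
    print(f"At {growth:.1f}x/year growth, costs pass $1B around {year} "
          f"(~${cost / 1e9:.1f}B)")
```

Under these assumptions, 2.5x or 3x annual growth crosses the billion-dollar mark in 2027, and even the slower 2x rate gets there by 2028, which is broadly consistent with Cottier's framing.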
Companies like Microsoft are investing heavily in AI infrastructure, as evidenced by their recently reported $100 billion data center project. Other tech giants, including Alphabet (Google's parent company) and Nvidia, are channelling similarly large sums into AI research and development.
Decentralisation: A Potential Solution
The current concentration of AI computing resources in the hands of big tech companies may shift with the arrival of decentralised computing infrastructures. The Qubic Layer 1 blockchain, with its Useful Proof-of-Work (uPoW) mechanism, offers a decentralised alternative by directing its network of miners to supply computational power for AI tasks. This approach could reduce reliance on big tech and spread AI development more broadly.
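As a rough illustration of the useful-proof-of-work idea, the sketch below models a coordinator that hands AI workloads to miners and credits them for returned results, rather than rewarding arbitrary hashing. The class and function names are hypothetical and do not reflect Qubic's actual protocol or APIs; this is a conceptual outline only.

```python
# Conceptual sketch of a "useful proof-of-work" flow, in which miners spend
# compute on AI workloads rather than arbitrary hashing. All names here are
# hypothetical illustrations, not Qubic's actual protocol or APIs.

from collections import deque

class Coordinator:
    def __init__(self):
        self.pending = deque()   # AI tasks waiting for a miner
        self.results = {}        # task_id -> computed result
        self.credits = {}        # miner_id -> count of completed tasks

    def submit(self, task_id, payload):
        self.pending.append((task_id, payload))

    def next_task(self):
        return self.pending.popleft() if self.pending else None

    def accept(self, miner_id, task_id, result):
        # A real network would verify the result (redundant computation,
        # spot checks, etc.) before crediting the miner.
        self.results[task_id] = result
        self.credits[miner_id] = self.credits.get(miner_id, 0) + 1

def mine(miner_id, coordinator):
    """Miner loop: pull AI tasks, compute, and return results for credit."""
    while (task := coordinator.next_task()) is not None:
        task_id, payload = task
        coordinator.accept(miner_id, task_id, f"processed({payload})")

coordinator = Coordinator()
for i in range(4):
    coordinator.submit(i, f"batch-{i}")
mine("miner-A", coordinator)
print(coordinator.credits)   # {'miner-A': 4}
```

The hard problem in any such scheme is verifying that miners genuinely performed the useful computation, which is why the accept step above flags verification as the place where real designs add redundancy or spot checks.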
AI innovation is still in its early stages, but access to computational power remains a significant hurdle. Big tech's control over these resources complicates the innovation landscape and raises concerns about data privacy and fairness. Decentralised infrastructures like Qubic could democratise AI development, lowering costs and fostering a more inclusive technological environment, while also mitigating the risks of centralised control over such a critical technology.