Join senior executives in San Francisco on July 11-12 to learn how leaders are integrating and optimizing AI investments for success. Learn more
Barely a week goes by without another dramatic report on humanity and the planet reaching a climate change tipping point. The latest were a jaw-dropping analysis from the World Meteorological Organization and stark criticism from the UN Secretary-General, both released in the final days of April.
Artificial intelligence will determine whether we blow through the tipping point or come back from the brink.
AI is one of the most important tools we have left in the fight against climate change. It is already being applied to risk prediction, to mitigating damaging weather events such as wildfires, and to carbon offsets, and it has been described as key to ensuring that companies meet their ESG objectives.
However, it is also an accelerant. AI requires enormous computing power, which consumes vast amounts of energy as algorithms are designed and models are trained. And just as software ate the world, AI is poised to do the same.
AI will contribute up to $15.7 trillion to the global economy by 2030, more than the current GDP of Japan, Germany, India and the UK. That’s a lot of people using AI as pervasively as they use the internet, from drafting emails and writing code with ChatGPT to creating art with text-to-image platforms.
The power consumed by AI has been climbing for years. The compute needed to train the largest AI models, for example, doubled roughly every 3.4 months between 2012 and 2018, a 300,000-fold increase.
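As a quick sanity check on those figures (a back-of-the-envelope sketch, not part of the original analysis), the 3.4-month doubling time and the 300,000x overall growth are mutually consistent:

```python
import math

# Reported figures, as cited above.
doubling_months = 3.4     # doubling time of training compute
growth_factor = 300_000   # overall growth in the 2012-2018 window

doublings = math.log2(growth_factor)           # ~18.2 doublings
span_years = doublings * doubling_months / 12  # ~5.2 years

print(f"{doublings:.1f} doublings over {span_years:.1f} years")
```

Roughly 18 doublings at 3.4 months each spans about five years, which matches the 2012–2018 window over which the growth was measured.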
This expansion offers opportunities to solve major real-world problems in everything from security and medicine to hunger and agriculture. But it will also exact a punishing toll on the climate.
The high cost of energy
Computing at this scale goes hand in hand with high energy consumption and a larger carbon footprint, pressing down the accelerator pedal of global climate change.
This is especially true for AI. The vast numbers of GPUs running machine learning algorithms get hot and need to be cooled; otherwise, they melt. Training a single large language model (LLM) requires a mind-boggling amount of energy and leaves a large carbon footprint.
As we move into the GPT-4 era and models keep growing, the energy needed to train them increases. GPT-3 was 100 times larger than its predecessor GPT-2, and GPT-4 is reportedly ten times larger than GPT-3. All the while, bigger models are arriving faster: GPT-4 launched in March 2023, barely four months after ChatGPT (powered by GPT-3.5) was released in late November 2022.
In fairness, we shouldn’t assume that as new models and companies enter the space, AI’s carbon footprint will inevitably keep growing. Geeta Chauhan, an artificial intelligence engineer at Meta, uses open-source software to reduce the operational carbon footprint of LLMs. Her latest work shows a 24x reduction in carbon emissions compared to GPT-3.
Even so, the popularity of AI and its exponentially growing power demands are undermining much of the climate action underway today and calling into question AI’s potential to be part of the solution.
We need a solution that allows AI to flourish while limiting its carbon footprint. So what do we do?
Tempering carbon dependence
As always, technology will get us out of this predicament.
For the explosion of AI to be sustainable, advanced computing must come to the fore and do the heavy lifting for many of the tasks currently performed by AI. The good news is that we already have advanced computing technologies ready to perform these tasks more efficiently and faster than conventional AI hardware, with the added benefit of using much, much less energy.
In short, advanced computing is the most effective tool we have to temper the carbon addiction of AI. With it, we can slow down the progression of climate change.
There are a number of different technologies in advanced computing that can solve some of the problems AI is currently tackling.
For example, quantum computing outperforms classical AI in drug discovery. As humans live longer, they face an ever-growing number of new, complex and incurable diseases. Meanwhile, drug developers face the “better than the Beatles” problem: new drugs offer only modest improvements over already effective therapies, much as a new song must compete with the Beatles’ entire back catalog.
Until now, drug development has relied on spotting rare events in a dataset and making educated guesses about how to design drugs that target and bind to disease-causing proteins. LLMs can be used effectively to help with this task.
LLMs are remarkably good at predicting which words in our vocabulary can best match a sentence to accurately convey meaning. Drug discovery is not much different, as the problem is to identify the best fit, or configuration, of molecules in a compound to achieve a therapeutic outcome.
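As a toy illustration of this kind of next-token prediction (a deliberately simplified sketch with a made-up corpus, nothing like a real LLM), a bigram model picks the most frequent continuation of each word:

```python
from collections import Counter, defaultdict

# Tiny illustrative corpus; real models train on trillions of words.
corpus = "the drug binds the protein and the drug blocks a protein".split()

# Count which word follows each word.
follow = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow[prev][nxt] += 1

def predict(prev: str) -> str:
    # Most frequent continuation; LLMs learn these probabilities at scale.
    return follow[prev].most_common(1)[0][0]

print(predict("the"))  # "drug": it follows "the" twice, "protein" once
```

Swap counts for learned probabilities over a vast vocabulary and you have the core of what an LLM does; the drug-discovery analogy is that finding the best-fitting molecule resembles finding the best-fitting next word.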
However, molecules are quantum objects, so quantum computing is far better suited to this problem. Quantum computers can rapidly simulate a large number of binding sites in candidate drugs to find the right configuration for treating currently incurable diseases.
Advanced Computing: Quantum and Beyond
Quantum computing’s capabilities mean these problems can be solved much faster and with much less power consumption.
Another development with real potential for improving AI is photonics, or so-called optical computing, which uses laser-generated light instead of electricity to transmit information.
Some companies are building computers that use this technology, which is much more energy efficient than most other computing technologies and is increasingly recognized as a way to achieve Net Zero.
Elsewhere, there are neuromorphic computers, a branch of computer engineering in which the elements of a computing system are modeled on the human brain and nervous system, performing calculations that replicate the analog nature of our neural circuitry. Trials of this technology include projects from Mythic and Semron. Neuromorphic computing is another greener option that requires further investment: its hardware has the potential to run large deep learning networks far more energy-efficiently than comparable conventional computer systems.
The human brain, for example, processes information through its hundred billion neurons while consuming only 20 watts, about as much as an energy-saving light bulb.
The development and application of these innovations are imperative if we are to curb climate change.
Leaders in Advanced Computing
There are many startups (and investors) around the world obsessed with advanced computing, but only a handful of companies focus on so-called impact areas like healthcare, the environment and climate change.
Within quantum computing, the most interesting companies developing use cases for energy and drug discovery are Pasqal (whose co-founder was awarded the 2022 Nobel Prize in Physics), Qubit Pharmaceuticals and IBM. In photonics, we see leaders with global impact such as Lightmatter and Luminous, while in neuromorphic computing, we follow the progress of Groq, Semron and Intel.
Advanced computing is key to achieving the energy efficiency we need to fight climate change. It simply takes too long and consumes too much power to run artificial neural networks on GPUs.
By adopting advanced computing methods as alternatives to conventional AI hardware, companies can significantly reduce AI’s environmental impact while preserving its vast power to blunt some of the effects of climate change, such as anticipating wildfires or extreme weather.
The existential end point is approaching for our environment. But the situation is not hopeless.
The deployment of advanced computing is a credible and powerful resource to counter the problem. We must invest in these technologies now to solve the greatest challenge facing humanity.
Francesco Riciuti is a VC at Runa Capital.