How AI and advanced computing can pull us back from the brink of accelerated climate change

Barely a week passes without another dramatic report about humanity and the planet reaching a climate change tipping point. The latest reports were a heart-stopping analysis from the World Meteorological Organization and arresting criticism from the UN Secretary-General. Both were shared in the final days of April.

Artificial Intelligence will determine whether we blow through the tipping point or row back from the brink.

AI is one of the most significant tools left in the fight against climate change. It has turned its hand to risk prediction, the mitigation of damaging weather events such as wildfires, and carbon offsets. It has been described as vital to ensuring that companies meet their ESG targets.

Yet it’s also an accelerant. AI requires vast computing power, churning through energy as algorithms are designed and models are trained. And just as software ate the world, AI is set to follow.

AI will contribute as much as $15.7 trillion to the global economy by 2030, which is greater than the GDP of Japan, Germany, India and the UK. That’s a lot of people using AI as ubiquitously as the internet, from drafting emails and writing code with ChatGPT to making art with text-to-image platforms.

The amount of computing power AI consumes has been rising for years. For example, the compute required to train the largest AI models doubled roughly every 3.4 months, increasing some 300,000-fold between 2012 and 2018.
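
To see how quickly that compounds, here is a minimal back-of-the-envelope sketch in Python. It uses only the two figures quoted above (the 3.4-month doubling time and the 300,000-fold increase) and shows that such growth corresponds to roughly 18 doublings sustained over a little more than five years:

```python
import math

# Back-of-the-envelope check of the figures above (illustrative only).
# The only inputs are the 3.4-month doubling time and the 300,000-fold
# growth cited in the article; nothing else is assumed.
DOUBLING_MONTHS = 3.4
TOTAL_GROWTH = 300_000

doublings = math.log2(TOTAL_GROWTH)           # ~18.2 doublings
months_needed = doublings * DOUBLING_MONTHS   # ~62 months

print(f"{doublings:.1f} doublings at one every {DOUBLING_MONTHS} months "
      f"is about {months_needed / 12:.1f} years of sustained growth")
```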

This expansion brings opportunities to solve major real-world problems in everything from security and medicine to hunger and farming. But it will also exact a heavy toll on the climate.

The cost of high energy

Computing goes hand-in-hand with high energy costs and a large carbon footprint, both of which press down on the accelerator pedal of climate change.

This is especially true for AI. The huge numbers of GPUs running machine learning workloads run hot and need constant cooling; otherwise, they melt. Training even one large language model (LLM) requires an eye-watering amount of energy and leaves a large carbon footprint.

As we move into the GPT-4 era and models keep getting larger, the energy needed to train them grows. GPT-3 was roughly 100 times larger than its predecessor GPT-2, and GPT-4 is reported to be about ten times the size of GPT-3. All the while, larger models are being released more quickly: GPT-4 arrived in March 2023, nearly four months after ChatGPT (powered by GPT-3.5) launched at the end of November 2022.
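
Taken at face value, those scale factors compound quickly. The short Python sketch below simply multiplies them out; GPT-2’s parameter count is a published figure, while the GPT-4 multiplier is only a reported estimate, so treat the result as illustrative:

```python
# Illustrative arithmetic using the scale factors cited above.
# GPT-2's parameter count is public (~1.5 billion); the GPT-4
# multiplier is a reported estimate, not an official figure.
GPT2_PARAMS = 1.5e9   # ~1.5 billion parameters
GPT3_FACTOR = 100     # "roughly 100 times larger" than GPT-2
GPT4_FACTOR = 10      # "about ten times the size of GPT-3" (reported)

gpt3_est = GPT2_PARAMS * GPT3_FACTOR   # ~150B; GPT-3's published size
                                       # is 175B, so the 100x is approximate
gpt4_est = gpt3_est * GPT4_FACTOR      # ~1.5T under the reported multiplier

print(f"GPT-3 estimate: ~{gpt3_est / 1e9:.0f}B parameters")
print(f"GPT-4 estimate: ~{gpt4_est / 1e12:.1f}T parameters")
```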

For balance, we shouldn’t assume that, as new models and companies emerge in the space, AI’s carbon footprint will keep growing. Geeta Chauhan, an AI engineer at Meta, is using open-source software to reduce the operational carbon footprint of LLMs. Her latest work shows a 24-fold reduction in carbon emissions compared with GPT-3.

However, AI’s popularity and its exponentially growing demand for power undermine much of the climate action in force today and call into question its potential to be part of the solution.

We need a solution that allows AI to flourish while arresting its carbon footprint. So, what do we do?

Tempering the carbon addiction

As always, technology will drag us out of this predicament.

For the explosion of AI to be sustainable, advanced computing must come to the fore and do the heavy lifting for many tasks that are currently performed by AI. The good news is that we already have advanced computing technologies that are primed to execute these tasks more efficiently and quickly than AI, with the added benefit of using much, much less energy.

In short, advanced computing is the most effective tool we have to temper AI’s carbon addiction. With it, we can slow the creep of climate change.

A number of emerging advanced computing technologies can solve some of the problems AI is currently tackling.

For example, quantum computing is superior to AI in drug discovery. As humans live longer, they are encountering, in ever greater numbers, new diseases that are complex and untreatable. This is the so-called “better than The Beatles” problem: every new drug must outperform a back catalogue of already successful therapeutics, so improvements tend to be modest.

So far, drug development has focused on finding rare events within a dataset and making educated guesses to design drugs that target and bind to the proteins that cause disease. LLMs can be used to help with this task.

LLMs are remarkably good at predicting which words in our vocabulary best fit a sentence to accurately convey meaning. Drug discovery isn’t wildly dissimilar: the problem is identifying the best fit, or configuration, of molecules in a compound to achieve a therapeutic result.

However, molecules are quantum systems, so quantum computing is far better suited to tackling this problem. Quantum computers have the capacity to quickly simulate vast numbers of binding sites in medicines and find the right configuration for treating currently incurable diseases.

Advanced computing: Quantum and beyond

Quantum’s capabilities mean these problems can be solved much faster and with far less energy.

Another development with real potential to enhance AI is photonics, or so-called optical computing, which uses laser light instead of electricity to transmit information.

Some companies are already building computers that use this technology, which is far more energy-efficient than most other computing technologies and is increasingly recognized as a route to achieving net zero.

Elsewhere, we have neuromorphic computers: a type of computer engineering in which elements of the system are modeled on the human brain and nervous system, performing computations that replicate the analog nature of our neural circuitry. Trials of this technology include projects by Mythic and Semron. Neuromorphic computing is another greener option that needs further investment: its hardware has the potential to run large deep learning networks far more energy-efficiently than comparable classical computing systems.

For example, the human brain processes information through its roughly hundred billion neurons while consuming only about 20 watts, similar to an energy-saving light bulb in a home.
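
For a sense of scale, here is a minimal comparison in Python. The 20-watt figure is the one cited above; the GPU power draw and cluster size are illustrative assumptions rather than measurements:

```python
# Rough comparison only: the 20 W brain figure comes from the article;
# the GPU wattage and cluster size below are assumptions chosen for a
# round-number illustration, not measured values.
BRAIN_WATTS = 20
GPU_WATTS = 400       # assumed draw of one modern data-center GPU
NUM_GPUS = 1_000      # assumed size of a modest training cluster

cluster_watts = GPU_WATTS * NUM_GPUS
print(f"Assumed cluster draw: {cluster_watts / 1_000:.0f} kW")
print(f"About {cluster_watts / BRAIN_WATTS:,.0f}x the brain's ~{BRAIN_WATTS} W budget")
```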

Developing and deploying these innovations is imperative if we are to put the brakes on climate change.

Advanced computing leaders

There are many startups (and investors) around the world obsessed with advanced computing, but only a handful of companies are focused on so-called impact areas like healthcare, the environment and climate change.

Within quantum computing, the most exciting companies developing use cases for energy and drug discovery are Pasqal (whose cofounder, Alain Aspect, was awarded the 2022 Nobel Prize in Physics), Qubit Pharmaceuticals and IBM. When it comes to photonics, we view Lightmatter and Luminous as the leaders with global impact, while in neuromorphic computing we are tracking the progress of Groq, Semron and Intel.

Advanced computing is vital for achieving the energy efficiency we need to fight climate change. It simply takes too long and is too energy-intensive to run artificial neural networks on a GPU.

By adopting advanced computing methods as alternatives to AI, businesses can greatly alleviate the impact that AI has on the environment while still ensuring its vast power can mitigate some of the impacts of climate change, like anticipating wildfires or extreme weather.

The existential endpoint is approaching for our environment. But the situation is not hopeless.

The deployment of advanced computing is one credible and powerful resource to counteract the problem. We need to invest in these technologies now to solve the greatest challenge facing humanity.

Francesco Ricciuti is a VC at Runa Capital.

Author: Francesco Ricciuti, Runa Capital
Source: VentureBeat
