
AI experts urge machine learning researchers to tackle climate change

At the Tackling Climate Change workshop at this year’s NeurIPS conference, some of the top minds in machine learning came together to discuss the effects of climate change on life on Earth, how AI can tackle the urgent problem, and why and how the machine learning community should join the fight.

The panel included Yoshua Bengio, MILA director and University of Montreal professor; Jeff Dean, Google’s AI chief; Andrew Ng, cofounder of Google Brain and founder of Landing.ai; and Cornell University professor and Institute for Computational Sustainability director Carla Gomes.

The Tackling Climate Change workshop explored a wide range of topics, from the use of deep reinforcement learning to improve performance for ride-hailing services like Uber and Lyft to the application of deep learning to predict wildfire risk, detect avalanche deposits, improve plane efficiency with better wind forecasts, and conduct a global census of solar farms.

The workshop was organized by Climate Change AI, a group that hosts workshops at AI research conferences and runs a forum for collaboration between machine learning practitioners and people from other fields.

Above: Left to right: Yoshua Bengio, Andrew Ng, Carla Gomes, Lester Mackey, and Jeff Dean

Valuing research

One essential step in better addressing the world’s pressing challenges, says Bengio, is changing the way AI research is valued.

Bengio, who talked about the development of what he calls “basic consciousness” earlier in the week, was the top-cited computer science researcher in 2018. He said the machine learning community needs to change its attitude toward the research submitted to major conferences like NeurIPS by evaluating the work’s genuine impact on the world.

“I think the sort of projects we’re talking about in this workshop can potentially be much more impactful than one more incremental improvement in GANs or something,” Bengio said.

NeurIPS organizers said Wednesday that they may make AI models’ carbon footprint part of future submission criteria for conference papers.

“The reason we count papers is because we just decided that this was the metric we wanted to optimize, but it’s the wrong metric. We should be thinking about what, why — ‘Why am I doing this work, and what do I contribute to society?’” Bengio said.

“The current psychological and cultural mood is so focused on publication, you know, being first author and putting more things on your CV to get a good job, that it’s not healthy. It’s not something where students and researchers feel good. We feel oppressed, and we feel like we have to work [an] incredible number of hours, and so on. Once we start stepping back from this and thinking about what we can bring to the world, the value of truthful long-term research, the value of doing projects that can impact the world — like climate change — we can feel better about ourselves and our work, less stressed, and at the end of the day even [create] better science,” he said.

Going small to get big  

The panel also discussed specific technical advances in machine learning that can most effectively combat climate change.

Andrew Ng, along with other panelists, called for progress on machine learning that works with small data sets, and for techniques like self-supervised learning used in tandem with transfer learning, so that training models requires less data.

“A lot of machine learning, modern deep learning, has grown up in the large consumer internet companies, which have billions or hundreds of millions of users and have large data sets,” he said. “In the climate change setting, when we look at their imagery, sometimes we have only hundreds or maybe thousands of pictures of wind turbines or whatever… [with] these very small data sets, I find that you need new techniques in order to address them, and [what] I see broadly is that for machine learning to break into other disciplines outside software [and] internet [companies], we need better techniques to deal with small-data or low-data regimes.”
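The small-data point lends itself to a concrete illustration. The sketch below is not from the panel; it assumes a hypothetical folder of a few hundred labeled wind turbine photos (the path, class count, and hyperparameters are placeholders) and shows the basic transfer learning recipe Ng describes: reuse a model pretrained on a large generic dataset and train only a small new head on the scarce domain data.

```python
# A minimal transfer learning sketch for a small labeled image set
# (e.g., a few hundred photos of wind turbines). Assumes PyTorch and
# torchvision >= 0.13 are installed; the data path is hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 2  # hypothetical: turbine present / absent

# Reuse ImageNet features so only a small head is trained from scratch.
model = models.resnet18(weights="IMAGENET1K_V1")
for param in model.parameters():
    param.requires_grad = False                       # freeze the pretrained backbone
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new trainable head

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("data/turbines/train", transform=preprocess)
loader = DataLoader(train_set, batch_size=16, shuffle=True)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                                # a few epochs is often enough
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

Freezing the backbone keeps the number of trainable parameters small, which is what makes learning from a few hundred images feasible at all.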

Gomes agreed, saying that working on machine learning challenges for climate change is a two-way street: advances in solving problems wrought by climate change can drive innovation in machine learning itself.

“I do think [for] the future of AI and ML, a great challenge is scientific discovery. Indeed, how to embed prior knowledge, scientific reasoning, and how to be able to deal with small data,” Gomes said.

At an earlier NeurIPS workshop, Facebook AI Research director Yann LeCun talked about how energy efficiency in machine learning is necessary to make new tech like AR glasses a reality.

During the panel discussion, Dean suggested that transfer learning and progress in multitask learning are both promising techniques that could have applications for climate change. The challenges of climate change could at least be an interesting test bed for those kinds of techniques, he said.
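As a rough sketch of what multitask learning looks like in practice (an illustration, not anything Dean described), a single shared encoder can feed several task-specific heads, so related climate prediction tasks with limited labels benefit from a common learned representation. The task names, feature dimensions, and loss weighting below are hypothetical.

```python
# A minimal multitask learning sketch: one shared encoder feeds two
# hypothetical climate-related heads (wildfire risk and crop-yield regression).
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, in_features=32, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(          # shared across both tasks
            nn.Linear(in_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.fire_head = nn.Linear(hidden, 1)   # binary risk logit
        self.yield_head = nn.Linear(hidden, 1)  # regression output

    def forward(self, x):
        z = self.encoder(x)
        return self.fire_head(z), self.yield_head(z)

model = MultiTaskNet()
x = torch.randn(8, 32)                          # toy batch of input features
fire_logit, yield_pred = model(x)
fire_target = torch.randint(0, 2, (8, 1)).float()
yield_target = torch.randn(8, 1)

# Combined loss: the weighting between tasks is a tunable assumption.
loss = (nn.BCEWithLogitsLoss()(fire_logit, fire_target)
        + 0.5 * nn.MSELoss()(yield_pred, yield_target))
loss.backward()
```

Because the encoder's gradients come from both heads, labels for either task improve the shared representation, which is the appeal Dean points to for data-scarce climate problems.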

The groundwork

The panelists were not speaking off the cuff about climate solutions. In fact, they’ve been working on these issues for some time.

In June, Bengio, Ng, and Gomes joined a cadre of more than 20 Climate Change AI steering committee members and advisors, including DeepMind cofounder Demis Hassabis, to publish a paper titled “Tackling Climate Change with Machine Learning,” which contains 650 references. A searchable summary and accompanying data sets are available on the group’s website.

The paper explores machine learning applications for climate change, such as forecasting supply and demand or extreme weather events, and predictive models that can make cities, transportation networks, and electricity systems more efficient.

The authors said the paper is intended not just for AI practitioners but for a range of people who want to participate meaningfully in climate change work, including entrepreneurs, investors, and business and government leaders.

On the subject of how machine learning practitioners can get started on climate change, Ng suggested starting small rather than dwelling on the scale of the problem: review related data sets, run experiments with friends, and eventually publish research or engage in conversations with climate scientists.

Gomes recommends working with people outside the machine learning research community.

“I do worry about computer science. We think we are good at everything — coming up with solutions that are completely unrealistic and don’t make sense in the particular domain, so it’s important to connect with the experts and create the network,” she said.

Bengio said that guarding against “reinventing the wheel badly” requires humility and collaboration with experts in fields where ML can be applied.

Lester Mackey works at Microsoft Research on subseasonal forecasting models that predict weather two to six weeks out to better anticipate floods, fires, and other changes that are already happening due to climate change.

He suggested entering a climate change competition as a good way to get started.

“There’s a lot of low-hanging fruit in the space, and it would be great for everyone to move in and fill that space,” Mackey said.

A code of ethics

Ng suggested the AI research community adopt a sharper code of ethics, along with legal protections to back up those ethical norms in the same way that doctors are beholden to patients. Any code of conduct, he said, should be written by the AI research community, for the AI research community.

Whether it’s a collectively agreed upon code of ethics or something else, Ng said a more explicit or actionable social agreement should be made within the AI research community. He added that erosion of trust in tech is something that needs to be addressed.

There are, of course, many AI codes of ethics out there already, but many are too vague to be actionable. For example, Ng said he read the OECD’s AI ethics principles to engineers and then asked them how they would change the way they do their jobs as a result. The answer was an almost unanimous “not at all.”

Regarding the existing AI codes of ethics, Ng said, “The Google one is well thought out, the Microsoft one is well thought out, I think the OECD is well thought out, but I think there is more we need to do.”

The panel conversation also touched on the importance of including people impacted by climate change in the creation of solutions meant to combat it.

In a paper on relational ethics that won the top paper award at the Black in AI workshop at NeurIPS last week, University College Dublin researcher Abeba Birhane called for machine learning practitioners to work closely with communities impacted by the systems they create.

In a keynote address to start the workshop, Dean called climate change the problem of the 21st century and talked about the potential of AI with no carbon footprint.

“We can make computation itself zero carbon so that it’s not contributing to the problem that is actually being used to apply and find solutions to some of these problems. Algorithms alone aren’t enough. You really need these algorithms integrated into systems that are then tied into applications that are going to have the most impact in climate-related problems, and attacking hard climate problems is an important part of what we should be doing,” Dean said.

He also talked about ways to bring about behavioral change — like helping people understand their personal carbon footprint. Following a question from the audience about sharing CO2 predictions in Google Maps, Dean said the company is thinking of including more information in Google search results to give users a predicted carbon output for choices they make, like ordering a certain product.

“I think careful observations and education of the public to incorporate language that makes clear this is a real imminent thing, not a made up thing — I think the scientific consensus is 100% on this — we just need to continue to try to push on educating everyone and really not only make them accept that this is happening, but also what they can do to make better decisions,” he said.

Dean also highlighted Google machine learning projects that have the potential for climate impact, such as one that’s intended to create fusion energy, the use of Bayesian inference for things like weather predictions, and Project Sunroof, which looks at a person’s roof and local weather patterns to predict total savings if they choose to install solar panels. Google also expanded its flood predictions for people living along the Ganga and Brahmaputra rivers in India earlier this year.

And a paper released by Google AI during a workshop poster session Saturday highlights how machine learning can be applied to radar imagery to predict rain.
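That kind of nowcasting work typically frames the problem as image-to-image prediction: a stack of recent radar frames goes in, and an estimate of the rainfall map a short time ahead comes out. The toy model below only illustrates that framing; it is not Google’s architecture, and the shapes and layer sizes are placeholders.

```python
# A rough illustration of precipitation nowcasting as image-to-image prediction:
# map a stack of recent radar frames to a near-future rainfall intensity map.
import torch
import torch.nn as nn

IN_FRAMES = 4        # last four radar snapshots stacked as channels
H = W = 64           # toy spatial resolution

nowcaster = nn.Sequential(
    nn.Conv2d(IN_FRAMES, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, kernel_size=1),      # predicted rainfall map
)

past_radar = torch.rand(2, IN_FRAMES, H, W)   # fake batch of radar history
future_radar = torch.rand(2, 1, H, W)         # fake "short time ahead" target

pred = nowcaster(past_radar)
loss = nn.MSELoss()(pred, future_radar)
loss.backward()
```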

In an interview with VentureBeat Thursday, Dean supported the kind of carbon per watt standard benchmark for AI hardware recommended by Intel AI general manager Naveen Rao and shared his thoughts on 2020 AI trends.

The next Tackling Climate Change with Machine Learning workshop will be held in April 2020 at the ICLR conference in Addis Ababa, Ethiopia.


Author: Khari Johnson
Source: VentureBeat
