
Google’s AI powers real-time orca tracking in Vancouver Bay

Google AI today shared that it has created a model for detecting an endangered population of orcas in the Salish Sea, a waterway between the United States and Canada. Underwater microphones placed at a dozen points across the sea, which stretches from Washington state to Vancouver Bay, are used to alert officials when a Southern Resident killer whale is detected.

Fewer than 100 of these whales are thought to still be alive, according to the Center for Whale Research.

The orca detection model is Google's latest acoustic AI effort. It follows earlier work detecting the sound of chainsaws in rainforests to combat illegal logging, as well as a collaboration last year with the U.S. National Oceanic and Atmospheric Administration (NOAA) to help protect humpback whales.

The orca model runs on a platform operated by the nonprofit Rainforest Connection. Detection alerts are sent to the smartphones of Department of Fisheries and Oceans (DFO) officials in Canada, who can dispatch the Canadian Coast Guard to clear boat traffic in Vancouver Bay.

The DFO provided 1,800 hours of underwater audio recordings and 68,000 labels to train the model. Notifications also include the audio flagged by the deep neural network so human experts can verify the prediction and assess the whale's current state of health.
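Google has not published the model's internals, but a minimal sketch of this kind of detect-and-alert pipeline might look like the following. Everything here is an assumption for illustration: the mel-spectrogram front end, the toy CallClassifier network, the score_clip helper, and the sample rate and alert threshold are hypothetical, not Google's actual system.

# Hypothetical sketch: score a short hydrophone clip and raise an alert
# when the orca-call probability crosses a threshold. Not Google's code.
import torch
import torch.nn as nn
import torchaudio

SAMPLE_RATE = 16_000          # assumed hydrophone sample rate
ALERT_THRESHOLD = 0.9         # assumed confidence cutoff for notifying officials

# Mel-spectrogram front end (assumed parameters)
to_mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=1024, hop_length=256, n_mels=64
)

class CallClassifier(nn.Module):
    """Toy binary classifier: orca call vs. background noise."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
            nn.Flatten(),
            nn.Linear(16 * 8 * 8, 1),
        )

    def forward(self, spectrogram):
        return torch.sigmoid(self.net(spectrogram))

def score_clip(waveform: torch.Tensor, model: nn.Module) -> float:
    """Return the model's orca-call probability for a mono audio clip."""
    spec = to_mel(waveform).unsqueeze(0)        # (1, 1, n_mels, time)
    with torch.no_grad():
        return model(spec).item()

if __name__ == "__main__":
    model = CallClassifier().eval()
    clip = torch.randn(1, SAMPLE_RATE * 5)      # stand-in for a 5-second hydrophone clip
    prob = score_clip(clip, model)
    if prob >= ALERT_THRESHOLD:
        print(f"Possible orca call (p={prob:.2f}) -- forward clip to human reviewers")

In the real system, a positive detection would be forwarded with its audio so the human experts described above can confirm it before any response is dispatched.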

Additional work is ongoing to better distinguish Southern Residents from other types of orcas and to understand when specific whale sounds are associated with health problems, Google AI engineer Matt Harvey told VentureBeat.

The company shared the news today at a Google AI event at its San Francisco office. In addition to real-time whale tracking, Google AI announced that it can now detect signs of anemia in people from retinal scans. This is the latest work from Google to use computer vision to find patterns in eye scans, following efforts to identify diabetic retinopathy and a range of other eye diseases.

Google AI also shared plans to bring transcription to Google Translate for long-form interpretation, akin to the speech-to-text transcription the Pixel 4's Recorder app now provides. No release date has been set, a company spokesperson told VentureBeat.


Author: Khari Johnson.
Source: VentureBeat
