
Google’s AI powers real-time orca tracking in Vancouver Bay

Google AI today shared that it has created a model for detecting an endangered species of orca in the Salish Sea, a waterway between the United States and Canada. Underwater microphones placed at a dozen points across the Salish Sea, which spans waters off the state of Washington and Vancouver Bay, are used to alert officials when a Southern Resident killer whale is detected.

Fewer than 100 of these whales are thought to still be alive, according to the Center for Whale Research.

The orca detection model is the latest acoustic AI work from Google, following earlier efforts to detect the sound of chainsaws in rainforests to stop illegal logging and a collaboration last year with the U.S. National Oceanic and Atmospheric Administration (NOAA) to help protect humpback whales.

The orca model runs on a platform operated by the nonprofit Rainforest Connection. Detection alerts are sent to the smartphones of officials at Canada's Department of Fisheries and Oceans (DFO), who can dispatch the Canadian Coast Guard to clear boat traffic in Vancouver Bay.

The DFO provided 1,800 hours of underwater audio recordings with 68,000 labels to train the model. Notifications also include an audio recording of the sounds detected by the deep neural network, so human experts can verify the prediction and make their own assessment of the whale's current state of health.
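Google has not published the details of its model, but the general workflow the article describes, classifying short hydrophone clips with a neural network trained on labeled audio, then forwarding high-confidence detections along with the audio for expert review, could be sketched roughly as below. This is an illustrative example only; the class name, architecture, and threshold are assumptions, not Google's implementation.

```python
# Illustrative sketch of a spectrogram-based call detector (not Google's model).
# Assumes short, labeled hydrophone clips ("orca call" vs. "no call").
import torch
import torch.nn as nn
import torchaudio


class OrcaCallDetector(nn.Module):  # hypothetical name for illustration
    def __init__(self, sample_rate=16000):
        super().__init__()
        # Turn the raw waveform into a log-mel spectrogram "image".
        self.melspec = torchaudio.transforms.MelSpectrogram(
            sample_rate=sample_rate, n_fft=1024, hop_length=256, n_mels=64)
        self.to_db = torchaudio.transforms.AmplitudeToDB()
        # Small CNN that scores how likely the clip contains an orca call.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1))

    def forward(self, waveform):
        # waveform: (batch, samples) -> spectrogram: (batch, 1, n_mels, frames)
        spec = self.to_db(self.melspec(waveform)).unsqueeze(1)
        return torch.sigmoid(self.cnn(spec)).squeeze(1)  # probability per clip


# Example: score a 3-second clip and flag likely calls for expert verification.
model = OrcaCallDetector()
clip = torch.randn(1, 16000 * 3)  # stand-in for a hydrophone recording
if model(clip).item() > 0.9:      # assumed alert threshold
    print("Possible Southern Resident call -- send clip for expert review")
```

In practice such a detector would be trained on the labeled recordings first; the human-in-the-loop step mirrors the article's description of experts listening to the flagged audio before acting.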

Additional work is ongoing to better distinguish the Southern Resident from other orca species and to understand when specific whale sounds are associated with health problems, Google AI engineer Matt Harvey told VentureBeat.

The company shared the news today at a Google AI event at its San Francisco office. In addition to real-time whale tracking, Google AI announced that it can now detect signs of anemia in people from retinal scans. This is the latest work from Google that uses computer vision to find patterns in eye scans, following efforts to identify diabetic retinopathy and a range of other eye diseases.

Google AI also shared plans to bring transcription to Google Translate for long-form interpretation, akin to the speech-to-text transcriptions the Pixel 4's Recorder app now provides. No release date has been set, a company spokesperson told VentureBeat.


Author: Khari Johnson
Source: VentureBeat
