
Google Assistant can now share photos, search podcasts, and make notes with third-party apps

Google Assistant on smartphones, smart displays, and smart speakers is getting a series of upgrades today, including the ability to create notes and lists with third-party apps and to search podcasts by topic.

Google Assistant on smartphones is also gaining the ability to share photos by voice, so you can say “Hey Google, show me photos from last weekend.” Once Google Photos results appear on your screen, you can choose the photos you want to share and tell Google Assistant to send them to one of your contacts. Last fall, Google Nest smart displays got the ability to share photos that appear on the device’s digital photo frame or in voice search results after users say things like “Hey Google, show me my photos from San Diego.”

Podcast search by topic can be carried out with voice commands like “Hey Google, show me podcasts about New Year’s resolutions” or “Hey Google, find a podcast about holiday cooking.” Podcast search by topic is available today for English speakers on Google Assistant devices worldwide. It’s worth noting that Google began to include podcasts in Google.com search results in August, and Google Assistant was previously able to recommend podcasts.

Notes and lists made with Google Assistant can now be added to apps like Google Keep, Any.do, AnyList, or Bring. About a year ago, Google launched the ability to create new lists with your voice and shared plans to integrate with apps like Any.do and Google Keep. Google Nest smart displays can display lists but are currently unable to create notes. Notes made with Google Assistant appear to be limited to 30 seconds of dictation, with speech-to-text AI transcribing the recording. Unfortunately for list lovers, Google Assistant is still unable to add multiple items to a list in a single utterance, like “Hey Google, add avocados, walnuts, and champagne to my shopping list.” Alexa gained the ability to add multiple items to a list in 2018.
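The single-item limitation is, at its core, a parsing problem on the fulfillment side. As a rough, hypothetical sketch (not Google’s or Amazon’s actual list API, which isn’t detailed here), the snippet below shows how a voice-app backend might split a multi-item utterance like the one above into separate list entries before writing them to an app such as Any.do or Bring. The function name and regular expressions are illustrative assumptions.

```python
import re

# Hypothetical illustration only: this is not Google's Notes & Lists
# integration, which (per the article) currently accepts one item per
# utterance. It simply shows the kind of utterance parsing involved.

# Split on ", " / ", and " / " and " between item names.
ITEM_SPLITTER = re.compile(r",\s*(?:and\s+)?|\s+and\s+")

def parse_add_request(utterance: str) -> tuple[list[str], str]:
    """Extract item names and the target list from an 'add ... to my ... list' command."""
    match = re.search(r"add (?P<items>.+) to my (?P<list>.+?) list", utterance, re.IGNORECASE)
    if not match:
        raise ValueError(f"Unrecognized command: {utterance!r}")
    items = [item.strip() for item in ITEM_SPLITTER.split(match.group("items")) if item.strip()]
    return items, match.group("list").strip()

if __name__ == "__main__":
    items, target = parse_add_request(
        "Hey Google, add avocados, walnuts, and champagne to my shopping list"
    )
    print(target, items)  # shopping ['avocados', 'walnuts', 'champagne']
```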

Today’s news follows a range of recent upgrades for Google Assistant, including the ability to assign reminders to members of your household, a faster Google Assistant for Pixel smartphones, and a deeper integration with G Suite for business users.

Google Nest smart displays also recently gained ultrasound sensing to better detect when a person is nearby. The displays change the information that appears on screen based on what they predict will be most important to the user at a specific time of day, and adjust how that information is presented based on the person’s distance from the device.

Google Assistant’s Ambient Mode began rolling out to Android devices this week. It lets users issue voice commands to their phone from the lock screen and, like ultrasound sensing, shows a display that offers quick control of smart home appliances or a look at their next calendar event.


Author: Khari Johnson
Source: VentureBeat
