
Google launches Android 11 Developer Preview 2 with foldable, call screening, and Neural Networks API improvements

Google today launched the second Android 11 developer preview with hinge angle detection support, call screening improvements, and new ops in the Neural Networks API. You can download Android 11 DP2 now from developer.android.com — if you have the previous preview, Google will also be pushing an over-the-air (OTA) update. The release includes a preview SDK with system images for the Pixel 2, Pixel 2 XL, Pixel 3, Pixel 3 XL, Pixel 3a, Pixel 3a XL, Pixel 4, and Pixel 4 XL, as well as the official Android Emulator.

Google launched Android 11 DP1 in February, the earliest Android developer preview it has ever released. Last year, Google used the Android Beta Program, which lets you get early Android builds via over-the-air updates on select devices. This year, however, Google is not making the first previews available as betas (you’ll need to manually flash your device). In other words, Android 11 is not yet ready for early adopters, just developers. Like DP1, Android 11 DP2 is only available on eight Pixel phones. That’s a tiny slice of the more than 2.5 billion monthly active Android devices, the install base that makes developers eager to see what’s new in the platform in the first place. Google will likely release Android 11 to more phones in DP3 or by the first beta. To help Google keep the previews coming, you can give feedback and report bugs.

Android 11 DP1 brought 5G experiences, people and conversations improvements, Neural Networks API 1.3, privacy and security features, Google Play System updates, app compatibility, connectivity, image and camera improvements, and low latency tweaks. DP2 builds on those with a few notable additions.

Android 11 DP2 features

Here’s the rundown of the new features added as part of Android 11 Developer Preview 2; minimal code sketches for several of them follow the list:

  • 5G state API: This API lets you check whether the user is currently on a 5G New Radio or Non-Standalone network. You can use this to highlight your app’s 5G experience or branding when the user is connected. You can use this API together with the 5G dynamic meteredness API and bandwidth estimator API, as well as existing connectivity APIs.
  • Hinge angle for foldables: This API lets you get the angle of the device screen surfaces via an available hinge angle sensor. Apps can query the sensor directly or through a new AndroidX API for the precise hinge angle.
  • Call screening service improvements: New APIs can now help users manage robocalls. In addition to verifying an incoming call’s STIR/SHAKEN status as part of its call details, call-screening apps can report a call rejection reason, and with permission they can see whether a call is to/from a number in the user’s contacts. Apps can also customize a system-provided post call screen to let users perform actions such as marking a call as spam or adding to contacts.
  • New ops and controls in Neural Networks API: Activation functions control the output of nodes within a neural network. Google AI researchers discovered the swish activation function, which delivers faster training and higher accuracy across a wide variety of tasks. A computationally efficient version of this function, the hard-swish op, is now built into Android 11. New control ops also enable more advanced machine learning models that support branching and loops. New execution controls help you minimize latency for common use cases: asynchronous command queue APIs reduce the overhead when running small chained models.
  • Foreground service types for camera and microphone: If your app wants to access camera or microphone data from a foreground service, you now need to add the foregroundServiceType value to your manifest.
  • Scoped storage updates: Improvements and changes to protect app and user data on external storage, such as support to migrate files from the legacy model to the new scoped storage model, and better management of cached files.
  • Synchronized IME transitions: A new set of APIs lets you synchronize your app’s content with the IME (input method editor) and system bars as they animate on and off screen. For frame-perfect transitions, a new insets animation listener notifies apps of per-frame changes to insets while the system bars or the IME animate. Additionally, apps can take control of the IME and system bar transitions through the WindowInsetsAnimationController API.
  • Variable refresh rate: Apps and games can now set a preferred frame rate for their windows. Most Android devices refresh the display at 60Hz, but some support multiple refresh rates, such as 90Hz as well as 60Hz, with runtime switching. On these devices, the system uses the app’s preferred frame rate to choose the best refresh rate for the app. The API is available in both the SDK and NDK.
  • Resume on reboot: Android 11 improves the experience of scheduled overnight over-the-air software updates. With resume on reboot, apps can access Credential Encrypted (CE) storage after the OTA reboot, without the user unlocking the device. This means apps can resume normal function and receive messages right away. Apps can still support Direct Boot to access Device Encrypted (DE) storage immediately after all types of reboot.
  • Camera support in Emulator: The Android emulator now supports front and back emulated camera devices. The back camera supports Camera2 API HW Level 3 (includes YUV reprocessing, RAW capture). It’s a fully CTS-compliant LEVEL_3 device that you can use to test advanced features like ZSL and RAW/DNG support. The front camera supports FULL level with logical camera support (one logical device with two underlying physical devices). This camera emphasizes logical camera support, and the physical camera devices include narrow and wide field of view cameras.
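
To make the 5G state API concrete, here is a minimal Kotlin sketch of listening for 5G NR/NSA state changes through TelephonyManager display-info callbacks. It is a sketch against the Android 11 preview SDK as described above: the class and constant names could still change before the final release, and it assumes the READ_PHONE_STATE permission has already been granted.

```kotlin
import android.telephony.PhoneStateListener
import android.telephony.TelephonyDisplayInfo
import android.telephony.TelephonyManager
import android.util.Log

// Assumes the READ_PHONE_STATE permission has been granted.
class FiveGStateMonitor(private val telephonyManager: TelephonyManager) {

    private val listener = object : PhoneStateListener() {
        // Invoked whenever the display info (including the 5G NR override state) changes.
        override fun onDisplayInfoChanged(displayInfo: TelephonyDisplayInfo) {
            val on5gNsa = when (displayInfo.overrideNetworkType) {
                TelephonyDisplayInfo.OVERRIDE_NETWORK_TYPE_NR_NSA,
                TelephonyDisplayInfo.OVERRIDE_NETWORK_TYPE_NR_NSA_MMWAVE -> true
                else -> false
            }
            Log.d("FiveGStateMonitor", "5G NR (NSA) active: $on5gNsa")
            // Toggle 5G-specific branding or higher-quality media here.
        }
    }

    fun start() {
        telephonyManager.listen(listener, PhoneStateListener.LISTEN_DISPLAY_INFO_CHANGED)
    }

    fun stop() {
        telephonyManager.listen(listener, PhoneStateListener.LISTEN_NONE)
    }
}
```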
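
For the hinge angle feature, a sketch of querying the new hinge-angle sensor directly through SensorManager might look like the following. It assumes the device actually exposes Sensor.TYPE_HINGE_ANGLE, and the interpretation in the comment (small angles roughly closed, larger angles roughly flat) is an assumption about typical foldables rather than a guarantee.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.util.Log

class HingeAngleMonitor(private val sensorManager: SensorManager) : SensorEventListener {

    // Null on devices without a hinge angle sensor (most non-foldables).
    private val hingeSensor: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_HINGE_ANGLE)

    fun start() {
        hingeSensor?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // values[0] reports the hinge angle in degrees.
        val angleDegrees = event.values[0]
        Log.d("HingeAngleMonitor", "Hinge angle: $angleDegrees degrees")
        // Switch between folded, tabletop, and flat layouts here.
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```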
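
For call screening, the sketch below shows a hypothetical CallScreeningService (the class name is illustrative) checking the STIR/SHAKEN verification status that Android 11 attaches to incoming call details and rejecting unverified callers. The CallResponse flags shown predate Android 11, the "block unverified callers" policy is purely an example, and the DP2-specific rejection-reason and contacts capabilities mentioned above are not shown here.

```kotlin
import android.telecom.Call
import android.telecom.CallScreeningService
import android.telecom.Connection

// Must be declared in the manifest with the android.telecom.CallScreeningService
// intent filter and chosen by the user as the device's call-screening app.
class SpamScreeningService : CallScreeningService() {

    override fun onScreenCall(callDetails: Call.Details) {
        // STIR/SHAKEN verification result that the platform attaches to the incoming call.
        val verified =
            callDetails.callerNumberVerificationStatus == Connection.VERIFICATION_STATUS_PASSED

        // Illustrative policy only: silently reject calls that fail verification.
        val response = CallScreeningService.CallResponse.Builder()
            .setDisallowCall(!verified)
            .setRejectCall(!verified)
            .setSkipNotification(!verified)
            .build()

        respondToCall(callDetails, response)
    }
}
```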
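
For the camera and microphone foreground service types, the sketch below shows a hypothetical VideoCallService declaring which types it uses when calling startForeground(); the matching android:foregroundServiceType="camera|microphone" manifest attribute is noted in a comment, and the notification helper is an app-specific placeholder.

```kotlin
import android.app.Notification
import android.app.Service
import android.content.Intent
import android.content.pm.ServiceInfo
import android.os.IBinder

// The manifest entry for this service must also declare
// android:foregroundServiceType="camera|microphone" on Android 11.
class VideoCallService : Service() { // hypothetical service name

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        val notification: Notification = buildCallNotification() // app-specific placeholder
        // Declare at runtime which of the manifest-declared types this session actually uses.
        startForeground(
            NOTIFICATION_ID,
            notification,
            ServiceInfo.FOREGROUND_SERVICE_TYPE_CAMERA or ServiceInfo.FOREGROUND_SERVICE_TYPE_MICROPHONE
        )
        return START_STICKY
    }

    override fun onBind(intent: Intent?): IBinder? = null

    private fun buildCallNotification(): Notification =
        TODO("Build an ongoing-call notification for the foreground service")

    companion object {
        private const val NOTIFICATION_ID = 42
    }
}
```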
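
For synchronized IME transitions, a minimal sketch of the per-frame insets animation listener could look like this. It keeps a hypothetical input bar pinned to the keyboard while it animates, and it leaves the WindowInsetsAnimationController-based takeover of the transition out of scope.

```kotlin
import android.view.View
import android.view.WindowInsets
import android.view.WindowInsetsAnimation

// Keeps a view (for example, a message input bar) pinned to the IME while it animates.
fun trackImeAnimation(inputBar: View) {
    inputBar.setWindowInsetsAnimationCallback(
        object : WindowInsetsAnimation.Callback(WindowInsetsAnimation.Callback.DISPATCH_MODE_STOP) {
            override fun onProgress(
                insets: WindowInsets,
                runningAnimations: MutableList<WindowInsetsAnimation>
            ): WindowInsets {
                // Called once per frame while the IME or system bars animate.
                val imeHeight = insets.getInsets(WindowInsets.Type.ime()).bottom
                inputBar.translationY = -imeHeight.toFloat()
                return insets
            }
        }
    )
}
```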
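
For variable refresh rate, a sketch of expressing a preferred frame rate on a window's surface might look like the following; the 90fps value is an arbitrary example, and the call is a hint that the system may ignore on fixed-60Hz displays.

```kotlin
import android.view.Surface
import android.view.SurfaceView

// Ask the system to drive this window's surface at roughly 90fps where the display supports it.
fun preferHighFrameRate(surfaceView: SurfaceView) {
    val surface: Surface = surfaceView.holder.surface
    if (surface.isValid) {
        // A hint, not a guarantee: the system picks the closest matching display refresh rate.
        surface.setFrameRate(90f, Surface.FRAME_RATE_COMPATIBILITY_DEFAULT)
    }
}
```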
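
Resume on reboot itself requires no new API calls from apps, but the Direct Boot path mentioned at the end of that item can be illustrated with the existing device-protected storage APIs; the preference file and key names below are placeholders.

```kotlin
import android.content.Context
import android.os.UserManager

// Reads a small flag that must be available even before the user unlocks the device.
// The preference file and key names are placeholders.
fun readBootFlag(context: Context): String? {
    val userManager = context.getSystemService(UserManager::class.java)
    val storageContext =
        if (userManager != null && userManager.isUserUnlocked) {
            context // Credential Encrypted (CE) storage is available
        } else {
            context.createDeviceProtectedStorageContext() // fall back to Device Encrypted (DE) storage
        }
    return storageContext
        .getSharedPreferences("boot_flags", Context.MODE_PRIVATE)
        .getString("last_sync_state", null)
}
```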

Preview/Beta schedule

After you’ve flashed Android 11 onto your device or fired up the Android Emulator, you’ll want to update your Android Studio environment with the Android 11 Preview SDK (see the setup guide). Then install your current production app and test all of the user flows. For a complete rundown on what’s new, check the API overview, API reference, and behavior changes.


The goal of the developer previews is to let early adopters and developers play with the build early so they can explore new features and APIs for apps, test for compatibility, and give feedback. Normally, more details would be shared during Google’s developer conference in May, but given that event has been cancelled, Google will likely adjust how it reveals more information. Either way, expect more new features and capabilities in subsequent previews and betas.

Android 11 beta timeline

Last year, there were six betas. This year, there will be three developer previews and three betas. Here’s the preview/beta schedule for Android 11:

  • February: Developer Preview 1 (Early baseline build focused on developer feedback, with new features, APIs, and behavior changes.)
  • March: Developer Preview 2 (Incremental update with additional features, APIs, and behavior changes.)
  • April: Developer Preview 3 (Incremental update for stability and performance.)
  • May: Beta 1 (Initial beta-quality release, over-the-air update to early adopters who enroll in Android Beta.)
  • June: Beta 2 (Platform Stability milestone. Final APIs and behaviors. Play publishing opens.)
  • Q3: Beta 3 (Release candidate build.)
  • Q3: Final release (Android 11 release to AOSP and ecosystem.)

Google is asking developers to make their apps compatible with Android 11 so that their users can expect a seamless transition when they upgrade. “We recommend doing the work early, so you can release a compatible update by Android 11 Beta 1,” Google VP of engineering Dave Burke wrote today. “This lets you get feedback from the larger group of Android 11 Beta users.”

Author: Emil Protalinski.
Source: VentureBeat
