
Android 11 – A first glance

Exciting news as Google has lifted the lid off the first developer preview of Android 11. After a major branding revamp for Android 10 and big changes around battery management, security and privacy, let’s take a look at what we can expect from the next version of Android.

Time has flown since we discussed Android 10 and its changes back in October last year, only six months or so ago, and here we are already with more updates and features.

With Android 11 Google are keeping their focus on helping users take advantage of the latest innovations, while continuing to keep privacy and security a top priority. They’ve added multiple new features to help users manage access to sensitive data and files and they’ve hardened critical areas of the platform to keep the OS resilient and secure. Android 11 brings enhancements for foldables and 5G, call-screening APIs, new media and camera capabilities, machine learning, and more.
This is just a first look; as in prior years, they’ll continue to share new features and updates over the coming months and into Google I/O as they work through developer feedback.

This release is an early baseline build for developers only and not intended for daily or consumer use, so it can only be tested by manually downloading and flashing it to a Google Pixel device. Read on for our take on what’s new in Android 11.

Helpful Innovations

  • 5G experiences – In Android 11 Google are enhancing and updating the existing connectivity APIs so you can take advantage of 5G’s improved speeds. 5G brings consistently faster speeds and lower latency, which you can use to extend your Wi-Fi app experiences, such as streaming 4K video or loading higher-res game assets, to mobile users, or to build new experiences designed specifically for 5G.
  • Dynamic meteredness API – with this API you can check whether the connection is unmetered and, if so, offer higher resolution or quality that may use more data. Google have extended the API to include cellular networks, so you can identify which users are on truly unmetered data while connected to a 5G network.
  • Bandwidth estimator API – Google have updated this API for 5G to make it easier to check the downstream/upstream bandwidth without needing to poll the network or compute your own estimate. If the modem doesn’t provide support, the OS makes a default estimate based on the current connection (a minimal sketch combining both checks follows this list).
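To make this concrete, here is a minimal Kotlin sketch of how the two checks above might be combined. The StreamQualityMonitor class and the quality thresholds are hypothetical, and it assumes the app already declares the ACCESS_NETWORK_STATE permission.

```kotlin
import android.content.Context
import android.net.ConnectivityManager
import android.net.Network
import android.net.NetworkCapabilities

// Hypothetical helper: listens for capability changes and picks a streaming
// quality based on meteredness and the OS-provided bandwidth estimate.
class StreamQualityMonitor(context: Context) {

    private val connectivityManager =
        context.getSystemService(ConnectivityManager::class.java)

    // Placeholder hook for the app to react to the selected quality.
    var onQualitySelected: (String) -> Unit = {}

    private val callback = object : ConnectivityManager.NetworkCallback() {
        override fun onCapabilitiesChanged(network: Network, caps: NetworkCapabilities) {
            // Unmetered, or temporarily unmetered (e.g. on some 5G plans):
            // safe to offer higher-resolution content.
            val unmetered =
                caps.hasCapability(NetworkCapabilities.NET_CAPABILITY_NOT_METERED) ||
                caps.hasCapability(NetworkCapabilities.NET_CAPABILITY_TEMPORARILY_NOT_METERED)

            // Bandwidth estimate in kbps; the OS falls back to a default
            // estimate when the modem does not report one.
            val downstreamKbps = caps.linkDownstreamBandwidthKbps

            val quality = when {
                unmetered && downstreamKbps > 25_000 -> "4K"
                downstreamKbps > 5_000 -> "HD"
                else -> "SD"
            }
            onQualitySelected(quality)
        }
    }

    fun start() = connectivityManager.registerDefaultNetworkCallback(callback)
    fun stop() = connectivityManager.unregisterNetworkCallback(callback)
}
```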

New screen types

Device makers are continuing to innovate by bringing exciting new form-factors and device screens to market. Google have extended support for these in the platform, with APIs to let you optimize your apps.

  • Pinhole and waterfall screens – Apps can manage pinhole screens and waterfall screens using the existing display cutout APIs. A new API lets your app use the entire waterfall screen including the edges, with insets to help you manage interaction near the edges.
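As a rough illustration, here is a hedged Kotlin sketch of one way an activity might draw edge to edge while keeping touch targets off the curved edges. The activity name and the activity_main layout are placeholders, and padding by the waterfall insets is just one possible strategy.

```kotlin
import android.app.Activity
import android.os.Bundle
import android.view.WindowManager

// Hypothetical activity: renders into the waterfall edges but pads
// interactive content away from them using the new waterfall insets.
class EdgeToEdgeActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Let content extend into cutout and waterfall areas.
        window.attributes = window.attributes.apply {
            layoutInDisplayCutoutMode =
                WindowManager.LayoutParams.LAYOUT_IN_DISPLAY_CUTOUT_MODE_ALWAYS
        }

        setContentView(R.layout.activity_main) // placeholder layout

        window.decorView.setOnApplyWindowInsetsListener { view, insets ->
            // Waterfall insets describe how far the display curves over each edge.
            insets.displayCutout?.waterfallInsets?.let { waterfall ->
                // Keep touch targets away from the curved edges.
                view.setPadding(
                    waterfall.left, waterfall.top,
                    waterfall.right, waterfall.bottom
                )
            }
            insets
        }
    }
}
```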

People and conversations

Communicating with friends and colleagues is the most important thing many people do on their phones. Google are introducing changes which will help developers create deeper conversational experiences.

  • Dedicated conversations section in the notification shade – users can instantly find their ongoing conversations with people in their favorite apps.
  • Bubbles – Bubbles are a way for users to keep conversations in view and accessible while multitasking on their phones. Messaging and chat apps should use the Bubbles API on notifications to enable this in Android 11.
  • Insert images into notification replies – If your app supports image copy/paste, you can now let users insert images directly into notification inline replies, as well as in the app itself, for richer communication. In DP1 you’ll see image copy support in Chrome and image paste support via the Gboard clipboard (see the sketch below).
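Here is a minimal, hedged sketch of what opting an inline reply into image content might look like. The reply key, label and icon are placeholders, and on the receiving side the attached URIs would be read back with RemoteInput.getDataResultsFromIntent.

```kotlin
import android.app.Notification
import android.app.PendingIntent
import android.app.RemoteInput
import android.graphics.drawable.Icon

// Hypothetical key the app's reply receiver looks up results under.
const val KEY_REPLY = "key_reply"

// Builds a reply action whose RemoteInput also accepts image/* data,
// so users can paste an image (e.g. from the Gboard clipboard) into the reply.
fun buildImageFriendlyReplyAction(icon: Icon, replyIntent: PendingIntent): Notification.Action {
    val remoteInput = RemoteInput.Builder(KEY_REPLY)
        .setLabel("Reply")
        .setAllowDataType("image/*", true) // accept image content as well as text
        .build()

    return Notification.Action.Builder(icon, "Reply", replyIntent)
        .addRemoteInput(remoteInput)
        .build()
}
```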

Real-time, two-way communication apps should use the sharing/conversation shortcuts API to provide People targets that Android will surface throughout the phone, and the Bubbles API so users can carry on conversations while using the device for other tasks; a sketch of how the two fit together follows below.
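Here is a hedged Kotlin sketch of how that might look for a chat app. BubbleActivity, the channel id and the shortcut id are placeholders; the sketch assumes a long-lived conversation shortcut with that id has already been published via ShortcutManager and that the bubble activity is resizable.

```kotlin
import android.app.Notification
import android.app.PendingIntent
import android.app.Person
import android.content.Context
import android.content.Intent
import android.graphics.drawable.Icon

// Hypothetical: builds a conversation notification that Android can surface
// in the conversations section and expand as a bubble.
fun buildConversationNotification(
    context: Context,
    channelId: String,
    shortcutId: String,
    icon: Icon
): Notification {
    val sender = Person.Builder().setName("Alex").build() // placeholder contact

    // Content shown inside the bubble when the user expands it.
    val bubbleIntent = PendingIntent.getActivity(
        context, 0,
        Intent(context, BubbleActivity::class.java), // placeholder activity
        PendingIntent.FLAG_UPDATE_CURRENT
    )

    val bubbleData = Notification.BubbleMetadata.Builder(bubbleIntent, icon)
        .setDesiredHeight(600)
        .build()

    return Notification.Builder(context, channelId)
        .setSmallIcon(icon)
        .setShortcutId(shortcutId) // ties the notification to the conversation shortcut
        .setBubbleMetadata(bubbleData)
        .setStyle(
            Notification.MessagingStyle(sender)
                .addMessage("Hi there!", System.currentTimeMillis(), sender)
        )
        .build()
}
```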

Neural Networks API 1.3

Google’s Neural Networks API (NNAPI) is designed for running computationally intensive machine learning operations on Android devices. In this update they have expanded the operations and execution controls available to developers to help optimize common use cases:

  • Quality of Service APIs support priority and timeout for model execution.
  • Memory Domain APIs reduce memory copying and transformation for consecutive model execution.
  • Expanded quantization support adds signed integer asymmetric quantization, where signed integers are used in place of floats to enable smaller models and faster inference.

See the NDK sample code for examples using these new APIs.

Google are now working with hardware vendors and popular machine learning frameworks such as TensorFlow to optimize and roll out support for NNAPI 1.3.
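As a rough, app-level illustration of that direction, here is a hedged Kotlin sketch that routes a TensorFlow Lite model through NNAPI using TFLite’s NnApiDelegate. The model file name and the presence of the org.tensorflow:tensorflow-lite dependency are assumptions, and the finer-grained NNAPI 1.3 controls (priority, timeouts, memory domains) are exercised through the native API covered by the NDK samples above.

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Hypothetical helper that runs a bundled TFLite model through NNAPI.
// Assumes the org.tensorflow:tensorflow-lite artifact is on the classpath
// and that "model.tflite" ships in the app's assets.
class NnapiClassifier(context: Context) {

    private val nnApiDelegate = NnApiDelegate()

    private val interpreter = Interpreter(
        loadModel(context, "model.tflite"),
        Interpreter.Options().addDelegate(nnApiDelegate)
    )

    fun run(input: FloatArray, output: Array<FloatArray>) {
        interpreter.run(input, output)
    }

    fun close() {
        interpreter.close()
        nnApiDelegate.close()
    }

    // Memory-maps a model from the app's assets.
    private fun loadModel(context: Context, assetName: String): MappedByteBuffer {
        context.assets.openFd(assetName).use { fd ->
            FileInputStream(fd.fileDescriptor).channel.use { channel ->
                return channel.map(
                    FileChannel.MapMode.READ_ONLY,
                    fd.startOffset,
                    fd.declaredLength
                )
            }
        }
    }
}
```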

With much more to discuss and share on the Android 11 developer preview, I will cover several more topics over the coming weeks. My next blog will look at privacy and security, an interesting area for Android, as Google move quickly to address growing concerns among their user base.