Google provided a look at its latest digital offerings, with a heavy focus on its efforts to extend artificial intelligence features into more of its apps and services.
CEO Sundar Pichai unveiled Google Lens, a set of vision-based computing capabilities that can understand what you are looking at. It will first be available in Google's voice-controlled digital assistant, which bears the straightforward name Google Assistant, and in its Photos app. In the real world, that means you could, for instance, point your phone camera at a restaurant and get reviews for it.
Pinterest has a similar tool. Also called Lens, it lets people point their cameras at real-world items and find out where to buy them, or find similar things online.
Another tool in Google Photos will prompt you to share photos you take with people you know. For instance, Photos will notice when you take a shot of a friend and nudge you to send it to her, so you don’t forget. Google will also let you share whole photo libraries with others. Facebook has its own version of this feature in its Moments app.
One potentially unsettling new feature in Photos will let you automatically share some or all of your photos with other people. Google says the feature will be smart enough to auto-share only specific photos — say, of your kids — with your partner or a friend.
Android changes coming
The company also gave the crowd a look at new twists in its Android software for mobile devices, which powers more than 80 per cent of the world's smartphones. The next version of Android, available to the mass market later this year, aims to gauge and limit how much battery life your apps are using. A feature called Google Play Protect, meanwhile, will scan all your apps for malicious software.
As part of a years-old tradition, Google will name the next Android version after a dessert or sweet-tasting snack beginning with the letter “O.” (The current version of…