Key takeaways from Google I/O 2021
Iva Filipović
The conference started with a performance from the amazing Tune-Yards alongside the AI-powered choir Blob Opera. The rest of Google I/O 2021 focused on AI, CSR, AR features and commerce. We’ll go through the key takeaways and the features arriving in the coming weeks and months.
Real-time AR navigation via Google Maps
A big part of the conference was dedicated to Google’s commitment towards becoming carbon-free by 2030 – a journey that started with Google becoming carbon-neutral in 2007.
CSR made its way into Google Maps, with eco-friendly, fuel-efficient routes soon to be visible. Streets will be more detailed, with pedestrian walkways and crosswalks becoming available in 50 countries by the end of the year.
AR Google Maps improvements are by far the most impressive and will ease navigation both outdoors and indoors. Missing your gate at the airport will be a thing of the past.
In your city, certain places will be highlighted based on the time of day. This means more bakeries will be visible in the morning, when you’re most likely to look for a place to grab a quick breakfast treat, whereas more dinner spots will be visible in the evening. If Google Maps detects you’re visiting another city, it’ll automatically show you the best local hangouts and tourist attractions to visit.
Great news ahead of Pride Month
The Cinematic Moments feature in Google Photos now uses AI to synthesise the in-between frames needed to animate a pair of similar photos, while the Memories feature has introduced what Google calls “little patterns.” Using machine learning, it looks for sets of three or more photos with visual similarities that it can single out as a pattern – for example, the same spot in your home photographed over the years, or an object that appears in many photos turned into a story.
They’re also working with transgender users – who noted how painful revisiting certain moments can be – to develop a feature that will group past memories of a user’s previous self and let them hide or delete those memories all at once.
Virtual workspace is here to stay
Covid has undoubtedly changed the way we work – and Google has identified how to make the best of it. This leads us to all the updates related to Google Workspace. One of those updates is Smart Canvas, which lets multiple people collaborate in the same document, including over audio and video. As the cherry on top: whenever someone is presenting within the document, they can enable live captions, which transcribe what the presenter is saying and can display the captions in each reader’s native language.
Google also demoed a new project, Project Starline, which showcased realistic video calls that will feel as if they’re conducted in person, enabling eye contact. This project will be closely followed by our DXD and Insights teams, as interviewing is one of the essential aspects of the research we conduct.
Humanising the search engine
It’s becoming increasingly difficult to distinguish between human- and AI-written articles. Google presented its research on language complexity, aimed at making the Google search assistant give more natural and personal answers. The assistant will be based on LaMDA (Language Model for Dialogue Applications), a conversational model designed to hold open-ended dialogue on any topic. Google is clearly positioning it as a step beyond models like GPT-3.
LaMDA currently works with text only. Google also previewed MUM (Multitask Unified Model), a multimodal model that combines text, image, audio and video inputs for the best output possible.
Another great update coming to Google Search is easier access to information about the sources behind your search results, along with coverage of the same news topic from other outlets.
Better connection between product suppliers and seekers
As Google’s Bill Ready said, shopping inspiration can strike at any moment, so Google wants users to carry a personal product showroom with them wherever they go. Very soon we’ll be able to take a picture of a random chair in a restaurant and get Google results showing that exact chair, or similar ones, available for purchase. If you’ve used AliExpress’s search-by-photo feature, this works in essentially the same way.
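Under the hood, visual product search of this kind is typically an embedding lookup: the photo is mapped to a feature vector and compared against precomputed vectors for catalogue items. Google’s actual pipeline isn’t public, so here is only a toy sketch of the idea – the three-dimensional vectors and the catalogue entries are invented for illustration:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical catalogue: product name -> precomputed image embedding.
catalogue = {
    "oak dining chair": [0.9, 0.1, 0.2],
    "metal bar stool":  [0.1, 0.8, 0.3],
    "velvet armchair":  [0.7, 0.2, 0.6],
}

def most_similar(query_embedding, catalogue, top_k=2):
    """Return the top_k catalogue items ranked by cosine similarity."""
    ranked = sorted(catalogue.items(),
                    key=lambda kv: cosine(query_embedding, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A photo of a wooden chair would embed close to the oak chair's vector.
print(most_similar([0.85, 0.15, 0.25], catalogue))
```

In a real system the embeddings would come from a trained vision model and the lookup would use an approximate nearest-neighbour index rather than a full scan, but the ranking principle is the same.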
Google struck a deal with Shopify that will surface products from Shopify’s 1.7 million merchants in search results and let users buy directly from the search results page, lowering friction. Shopify-powered commerce looks set to gain a real advantage over the competition.
In addition, upon opening Chrome you’ll see an overview of all your recent open carts, and Chrome will notify you when the price of a product in a cart has dropped or when a discount is offered by a loyalty programme you’ve signed up for. Though not the same thing, it brings to mind the Honey extension, which trawls the web for coupon codes in exchange for information on your shopping habits and patterns.
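The price-drop part is conceptually simple: remember the price at the time an item went into the cart and compare it against the current price. A minimal illustrative sketch – the cart structure and prices here are invented, not Chrome’s actual data model:

```python
def price_alerts(saved_cart, current_prices):
    """Return an alert message for each item whose price dropped since saving."""
    alerts = []
    for item, saved_price in saved_cart.items():
        # If the item is no longer listed, fall back to the saved price (no alert).
        current = current_prices.get(item, saved_price)
        if current < saved_price:
            alerts.append(f"{item}: price dropped from {saved_price} to {current}")
    return alerts

saved_cart = {"desk lamp": 39.99, "notebook": 4.50}
current_prices = {"desk lamp": 29.99, "notebook": 4.50}
print(price_alerts(saved_cart, current_prices))
# → ['desk lamp: price dropped from 39.99 to 29.99']
```

A production version would run the check on a schedule and also match loyalty-programme discounts, but the comparison at its core is this one.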
Android 12 also brings an impressive update for people with darker skin tones: cameras will work more fairly, detecting darker skin tones more accurately and adjusting exposure and lighting accordingly.
Another exciting feature is how Google will use AI to help people find answers to dermatology questions using their phone cameras. We’ll be able to upload three photos of a skin concern and answer a few basic questions, after which the AI will suggest the most likely matching skin conditions. The feature will cover 288 conditions, including around 90% of the most commonly searched ones. Mammography will be aided by AI as well: it will help detect and prioritise the patients who need care first by identifying how far the condition has progressed in the submitted scans.
Hopefully, you’re just as excited as us to test all the new features once they’re available. Until then, check out the moves of world-class athletes just added to the Google AR search feature.