Android 12's colorful design, a partnership with Samsung and the telepresence of the future



With Android 12, Google chose a design that matches the colors of the wallpaper. – GOOGLE

From our correspondent in California,

The Google conference is back. Canceled last year because of the Covid-19 pandemic, Google I/O, the annual high mass for developers of the Mountain View giant, opened with its traditional keynote – virtual but live – on Tuesday. Highly anticipated, Android 12 offers a new, colorful design that adapts harmoniously to the wallpaper. Google also unveiled a partnership with Samsung to wake its Wear OS system from its slumber and, perhaps, finally compete with the Apple Watch. The Californian company continues to rely on artificial intelligence to improve its products (Search, Maps and Photos). And we saw a stunning demo of a conversation with an artificial intelligence, then glimpsed the future of telepresence with Project Starline.

A multicolored Android 12

We don't know whether Google was influenced by the pandemic and isolation, but the company seems to have decided to put some cheer into its design. Called “Material You”, the new approach focuses on personalization, particularly through color. It starts with Android 12, first on Pixel smartphones, but Google will then apply this design to all of its products, including hardware, with a new pastel range for Google Home and Chromecast.

Android 12 offers color themes that automatically adapt to the wallpaper, such as shades of green to accompany a photo of a plant. The menus, notifications and widgets have also been redesigned, with a more cartoonish, playful feel. This cohesion in design across different products could help Google solidify an identity that sometimes lacks character. The lower density of information in Android’s menus, however, may not appeal to everyone.
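To give a concrete idea of how wallpaper-based theming can work, here is a minimal Kotlin sketch that reads the primary color the Android system already extracts from the current wallpaper (an API available since Android 8.1). It is only an illustration of the general principle, not the internals of Google’s Material You color engine, which have not been detailed; the fallback color used below is an arbitrary placeholder.

```kotlin
import android.app.WallpaperManager
import android.content.Context
import android.graphics.Color
import android.os.Handler
import android.os.Looper

// Illustrative only: derive an accent color from the current wallpaper.
// WallpaperManager.getWallpaperColors() exists since Android 8.1 (API 27);
// the fallback value is an arbitrary placeholder, not a Material You color.
fun wallpaperAccentColor(context: Context, fallback: Int = Color.rgb(103, 80, 164)): Int {
    val wallpaperManager = WallpaperManager.getInstance(context)
    val colors = wallpaperManager.getWallpaperColors(WallpaperManager.FLAG_SYSTEM)
    // primaryColor is the dominant color the system extracted from the wallpaper.
    return colors?.primaryColor?.toArgb() ?: fallback
}

// A theme engine would also re-theme when the user changes wallpaper.
fun watchWallpaperChanges(context: Context, onNewAccent: (Int) -> Unit) {
    val wallpaperManager = WallpaperManager.getInstance(context)
    wallpaperManager.addOnColorsChangedListener({ colors, which ->
        if (which and WallpaperManager.FLAG_SYSTEM != 0 && colors != null) {
            onNewAccent(colors.primaryColor.toArgb())
        }
    }, Handler(Looper.getMainLooper()))
}
```

In practice, the extracted color would then be mapped onto a full palette of tones for menus, widgets and notifications, which is where the playful look described above comes from.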

A partnership with Samsung to challenge the Apple Watch

After a long sleep, Wear OS finally wants to wake up. Google has announced an unexpected partnership with Samsung to better compete with the Apple Watch: Wear OS and Tizen will merge to unify a fragmented wearable platform. Simply called Wear, the new system’s priorities are to improve responsiveness and battery life, and to raise the quality of apps and notifications. The two companies have revealed nothing on the hardware side, but the OS could equip the future Galaxy Watch 4 and the hypothetical Pixel Watch.

Artificial intelligence continues to progress

Search, Photos, Assistant… Google relies more than ever on AI to improve its products. Google Photos, in particular, can now create animated photos from two or three still images, filling in the missing frames thanks to machine learning.

On the search side, the new model, called MUM (Multitask Unified Model), is announced as 1,000 times more powerful than its predecessor, which helps it understand complex queries. For example, given the question “I’ve hiked Mount Adams and would like to hike Mount Fuji in the fall. What different preparations do I need to make?”, the system compares the height and elevation gain of the two volcanoes, knows that it often rains in Japan in the fall, and will suggest hiking boots and waterproof clothing.

An almost natural conversation with the machine

Google had already given stunning demonstrations of conversation with an artificial intelligence with almost human speech, but they only worked in very scripted settings, such as reserving a table at a restaurant. Its latest language model, LaMDA, makes it possible for the first time to have a discussion with the machine on any subject – for example, asking a paper plane the secret to flying well (“define well,” the AI replies, “farthest, straightest, or longest in the air?”), or the worst place it has ever landed (“in a puddle”). If the demonstration is representative of the technology – it is always difficult to say when demos are pre-recorded – the progress is impressive.

Project Starline, the telepresence of the future

“I could almost touch her.” One of the few people chosen to test Project Starline seemed genuinely moved. Call it telepresence, 3D video conferencing, holographic video or a Star Trek holodeck – whatever the name, it is above all a technical feat.

The system uses a dozen cameras and depth sensors to create a photorealistic 3D model of a person, which is compressed, sent down the pipes of the Internet, then displayed in real time on a “magic glass panel,” says Google. That is, a 65-inch light-field display – which undoubtedly costs tens of thousands of dollars – that makes it possible to perceive depth without wearing special glasses (Google did not specify whether there is an eye-tracking system, as on a prototype light-field screen from Sony). Add a big dose of post-processing for lighting and shadow effects, plus spatial audio, and voilà: you have the impression that your interlocutor is really there, in front of you. Note that this is currently a prototype, and given how cumbersome the system is, it is primarily intended for businesses and the medical world. Google seems here to take the opposite approach to virtual reality, with a technology that fades into the background to better connect us.
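For readers who want a more concrete picture of the capture-compress-transmit-render loop described above, here is a deliberately simplified Kotlin sketch. All of the types and functions are hypothetical: Google has not published Starline’s architecture, and this only restates the pipeline in code form.

```kotlin
// Hypothetical sketch of a telepresence pipeline like the one described above.
// None of these types correspond to a published Starline API.
data class CapturedFrame(val colorImages: List<ByteArray>, val depthMaps: List<FloatArray>)
data class Mesh3D(val vertices: FloatArray, val triangles: IntArray, val texture: ByteArray)

interface CameraRig { fun capture(): CapturedFrame }               // ~a dozen cameras + depth sensors
interface Reconstructor { fun fuse(frame: CapturedFrame): Mesh3D } // photorealistic 3D model of the person
interface Codec {
    fun compress(mesh: Mesh3D): ByteArray                          // shrink the model for the network
    fun decompress(payload: ByteArray): Mesh3D
}
interface Network { fun send(payload: ByteArray); fun receive(): ByteArray }
interface LightFieldDisplay { fun render(mesh: Mesh3D) }           // perceived depth without special glasses

// One iteration of the real-time loop running on each side of the call.
fun telepresenceStep(
    rig: CameraRig, reconstructor: Reconstructor, codec: Codec,
    network: Network, display: LightFieldDisplay
) {
    // Outgoing: capture the local person, rebuild them in 3D, compress, send.
    network.send(codec.compress(reconstructor.fuse(rig.capture())))
    // Incoming: receive the remote person, decompress, show them life-size.
    display.render(codec.decompress(network.receive()))
}
```

The whole point of the design is that this loop runs many times per second on both ends, so neither participant ever sees anything but a live, three-dimensional image of the other.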
