
IKEA launches AI-powered design experience (no Swedish meatballs included)

For IKEA, the latest in digital transformation is all about home design driven by artificial intelligence (AI) – minus the home furnishing and decor retailer’s famous Swedish meatballs. 

Today, it launched IKEA Kreativ, a design experience meant to bridge the ecommerce and in-store customer journeys, powered by the latest AI developments in spatial computing, machine learning and 3D mixed-reality technologies. Available in-app and online, IKEA Kreativ is built on core technology developed by Geomagical Labs, an IKEA retail company that Ingka Group (the holding company that controls 367 of the 422 IKEA stores) acquired in April 2020.

IKEA Kreativ is the next step in IKEA’s long journey toward digital transformation. According to the company, it is the home retail industry’s first fully featured mixed-reality, self-service design experience for lifelike and accurate design of real spaces, deeply integrated into the digital shopping journey. Users can start from IKEA inspiration imagery in the digital showroom or from their own captured images to place furniture and furnishings, experiment with options, and more.

“The 3D and AI technology developed by Geomagical Labs will be used to bring out the uniqueness of IKEA digitally,” said Phil Guindi, head of products at Geomagical Labs. “This is a vital moment in the IKEA transformation journey as we continue to develop and innovate to meet the customer and their needs, where they are.”


How IKEA Kreativ offers AI-driven exploration

Home design can be hard, said Guindi, because people often buy without context, relying on their imagination. “In fact, 87% of our customers say that they want to feel good about their home, but only half of them know how to do it,” he said.

IKEA Kreativ turns room photographs into impressive, interactive “digital playrooms” that anyone can use to explore design ideas. “By understanding a photographed scene in great detail – including the 3D geometry of the scene, the objects in the scene, the lighting in the scene, and the materials in the scene – customers can interactively design room imagery, allowing them to add furniture and wall decor, and remove existing furniture digitally,” he said.

The core technology is bespoke and proprietary, he said. “Whenever possible, modern open standards and open source were used, rather than licensed or vendor services, to maximize freedom to innovate and minimize lock-in,” he said.

Both mobile and web applications connect to a scalable, containerized, cloud-based platform of microservices and AI pipelines hosted on Google Cloud Platform.

According to Guindi, when a customer scans a room, photography and sensor data from the phone are uploaded to the AI pipeline in the cloud, where they are processed on a heavily parallelized GPU compute cluster to create wide-angle imagery with spatial data, allowing client applications to interactively design and edit the scenes.

“By offloading complex calculations to the cloud, we allow less-expensive, low-powered mobile devices to run complex AI algorithms, and reach more of IKEA’s customers,” he said. 
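
Geomagical Labs has not published its interface, but the flow Guindi describes (upload a scan, let cloud GPUs process it, then pull back spatial data for interactive editing) follows a familiar job-submission pattern. The sketch below is a hypothetical illustration of that pattern in Python; the endpoint, payload fields and response shape are assumptions, not IKEA’s actual API.

```python
"""Hypothetical client-side flow for a cloud scene-reconstruction pipeline.
The endpoint, payload fields and response shape are illustrative assumptions,
not Geomagical Labs' or IKEA's actual API."""
import time

import requests

API_BASE = "https://scene-api.example.com"  # placeholder service


def submit_scan(photo_paths, sensor_log_path):
    """Upload room photos plus phone sensor data; return a processing job id."""
    files = [("photos", open(p, "rb")) for p in photo_paths]
    files.append(("sensors", open(sensor_log_path, "rb")))
    resp = requests.post(f"{API_BASE}/scans", files=files, timeout=60)
    resp.raise_for_status()
    return resp.json()["job_id"]


def wait_for_scene(job_id, poll_seconds=5):
    """Poll until the GPU pipeline finishes; return the scene assets."""
    while True:
        resp = requests.get(f"{API_BASE}/scans/{job_id}", timeout=30)
        resp.raise_for_status()
        job = resp.json()
        if job["status"] == "done":
            # e.g. {"panorama_url": ..., "depth_map_url": ..., "object_masks": [...]}
            return job["result"]
        if job["status"] == "failed":
            raise RuntimeError(job.get("error", "processing failed"))
        time.sleep(poll_seconds)


if __name__ == "__main__":
    job_id = submit_scan(["room_0.jpg", "room_1.jpg"], "imu_log.json")
    scene = wait_for_scene(job_id)
    print("Scene assets ready:", scene)
```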

Neural network technology similar to self-driving cars

The AI for IKEA Kreativ was developed using the same visual AI neural network technologies used in self-driving cars, applied to understanding home spaces at scale, according to the company’s press release.

Guindi explained the similarity to self-driving cars: to operate safely, robotic vehicles need to form an understanding of the space around them, known as ‘spatial perception’ or ‘scene perception.’

“Robotic vehicles use multiple technologies to understand the world around them, such as visual-inertial SLAM to estimate the vehicle’s position in space, stereo vision to estimate depth of pixels in a video image, neural depth estimation to estimate depth from video imagery, and semantic segmentation to recognize and outline important objects in the space,” he said.
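
Several of those building blocks exist as open, pretrained models, which makes it easy to get a feel for what scene perception produces. The sketch below runs off-the-shelf semantic segmentation (torchvision’s DeepLabV3) and monocular depth estimation (MiDaS) on a single room photo; these are public stand-ins chosen for illustration, not the proprietary networks behind IKEA Kreativ, and the file name is a placeholder.

```python
"""Toy scene-perception pass on one room photo: semantic segmentation plus
monocular depth estimation, using public pretrained models (not IKEA's)."""
import numpy as np
import torch
from PIL import Image
from torchvision.models.segmentation import (DeepLabV3_ResNet50_Weights,
                                              deeplabv3_resnet50)

image = Image.open("living_room.jpg").convert("RGB")  # placeholder file name

# Semantic segmentation: label each pixel (person, sofa, potted plant, ...).
weights = DeepLabV3_ResNet50_Weights.DEFAULT
seg_model = deeplabv3_resnet50(weights=weights).eval()
batch = weights.transforms()(image).unsqueeze(0)
with torch.no_grad():
    seg_logits = seg_model(batch)["out"][0]            # (num_classes, H, W)
class_map = seg_logits.argmax(0)
print("Detected classes:",
      [weights.meta["categories"][i] for i in class_map.unique().tolist()])

# Monocular depth estimation: relative distance from the camera per pixel.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small").eval()
midas_transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform
with torch.no_grad():
    depth = midas(midas_transform(np.array(image)))    # relative inverse depth
print("Depth map shape:", tuple(depth.shape))
```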

Understanding an indoor scene shares some similarities with a vehicle sensing its environment. The new IKEA digital experience draws inspiration from these methods – though indoor design applications face unique and difficult challenges. For example, neural network training for vehicular applications won’t work well indoors because rooms and cities look very different and contain different objects.

“You need to specially train networks on large volumes of indoor training data to provide usable indoor results,” Guindi said. 
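
In practice, that retraining is standard transfer learning: start from pretrained weights, swap the classification head for an indoor label set, and fine-tune on indoor imagery. A minimal sketch follows, assuming a dataset of (image, mask) pairs and an invented 32-class indoor label set; none of this reflects Geomagical Labs’ actual training setup.

```python
"""Sketch of adapting a pretrained segmentation network to indoor scenes.
The label count and dataset are placeholders, not IKEA's training setup."""
import torch
from torch import nn, optim
from torch.utils.data import DataLoader
from torchvision.models.segmentation import (DeepLabV3_ResNet50_Weights,
                                              deeplabv3_resnet50)

NUM_INDOOR_CLASSES = 32  # assumed label set: wall, floor, sofa, table, rug, ...

model = deeplabv3_resnet50(weights=DeepLabV3_ResNet50_Weights.DEFAULT)
# Replace the final 1x1 conv so the head predicts indoor categories instead.
model.classifier[4] = nn.Conv2d(256, NUM_INDOOR_CLASSES, kernel_size=1)


def finetune(indoor_dataset, epochs=5, lr=1e-4):
    """indoor_dataset must yield (image, mask) pairs: image (3, H, W) float,
    mask (H, W) long tensor of indoor class ids."""
    loader = DataLoader(indoor_dataset, batch_size=8, shuffle=True)
    optimizer = optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, masks in loader:
            optimizer.zero_grad()
            logits = model(images)["out"]   # (B, NUM_INDOOR_CLASSES, H, W)
            loss = loss_fn(logits, masks)
            loss.backward()
            optimizer.step()
    return model
```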

Indoor rooms and 3D computer vision

Indoor rooms are also notoriously hard for 3D computer vision algorithms to reconstruct geometrically: many surfaces are carefully painted to be blank (offering no trackable visual features), are shiny (mirror reflections confuse distance estimators) or carry repeating factory-made patterns (which confuse visual feature trackers), and rooms are much darker than outdoor environments.

“Because we implement a mixed-reality solution to harmonize virtual objects with real photography, we need to have a much better understanding of the lighting inside a room,” Guindi explained. “We want to ‘erase objects’ from home photos, so we need technology to be able to estimate the geometry and imagery hidden behind furniture.” And while robotic vehicles can afford expensive sensing hardware such as laser depth sensors, radar, and multi-camera arrays, IKEA values accessibility, he added: “We want everyone to use the applications on everyday smartphones, without requiring exotic or specialized hardware.” 
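
Erasing an object comes down to masking it out and filling in a plausible guess of what sits behind it. Geomagical’s approach is proprietary and scene-aware; purely as a rough illustration of the idea, classical inpainting in OpenCV fills a masked region from its surroundings (the file names below are placeholders):

```python
"""Rough illustration of erasing an object from a room photo via inpainting.
Classical inpainting only smears nearby pixels into the hole; a production
system also estimates the hidden geometry and lighting."""
import cv2
import numpy as np

room = cv2.imread("room.jpg")                            # BGR room photo
mask = cv2.imread("old_couch_mask.png", cv2.IMREAD_GRAYSCALE)
mask = (mask > 127).astype(np.uint8) * 255               # 255 where the couch is

# Fill the masked region from surrounding pixels (Telea's method).
erased = cv2.inpaint(room, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)
cv2.imwrite("room_without_couch.jpg", erased)
```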

Once a view is scanned and processed by the cloud AI pipeline, consumers can design the room from anywhere, on any device: a desktop or laptop web browser, or the IKEA mobile app.

“When they start a design experience, they will be shown an immersive, wide-angle, 3D photo of the scanned room,” Guindi said. Consumers can browse a catalog pane of IKEA products and pick any number of them to add to the room; the products appear inside the photo with realistic size, perspective, occlusion and lighting. Users can move and rotate products, stack decor accents on top of other products and hang wall art on the walls. Customers can also edit their original room by digitally removing items they no longer want, such as an old couch they plan to replace.
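
Under the hood, a design session like that boils down to bookkeeping: which reconstructed room is being edited, which products have been placed where, and which detected objects have been erased. A minimal sketch of such client-side state, with invented product ids and fields (the real app’s data model is not public):

```python
"""Minimal sketch of the design state implied by this flow; all ids and
fields are invented for illustration."""
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Placement:
    product_id: str                       # catalog item being placed
    position: Tuple[float, float, float]  # (x, y, z) in the room, metres
    rotation_deg: float = 0.0
    on_wall: bool = False                 # wall art and shelves vs. floor items


@dataclass
class RoomDesign:
    scene_id: str                         # the scanned, reconstructed room
    placements: List[Placement] = field(default_factory=list)
    erased_objects: List[str] = field(default_factory=list)


# Example: remove the detected couch, add a new sofa, hang a print above it.
design = RoomDesign(scene_id="scan-001")
design.erased_objects.append("detected-couch-0")
design.placements.append(Placement("SOFA-001", position=(1.2, 0.0, 2.5)))
design.placements.append(
    Placement("PRINT-002", position=(1.2, 1.4, 2.9), on_wall=True))
```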

IKEA Kreativ faced many challenges

IKEA Kreativ faced many challenges before being brought to market, said Guindi. For one thing, it was challenging to bring IKEA products to life in an inspiring and realistic way. 

“This requires detailed estimates of scene lighting and geometry, and requires using 3D graphics rendering to render the materials,” he said. “And this requires producing memory-efficient, visually-beautiful, 3D representations of the IKEA product range art.” 

In addition, IKEA needs to understand a photographed scene accurately enough for the customer to successfully design their space. “This means detecting the existence and 3D location of floors, walls and surfaces where one can place furniture, so they can see them at the right size and perspective,” Guindi explained. For example, foreground objects in the room must believably obscure products moved behind them. It also means estimating the light sources in the scene to cast shadows and reflections.
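
The depth estimates are what make the occlusion part tractable: a rendered product is only drawn at pixels where it is closer to the camera than the real scene. A simplified compositing rule follows, assuming a per-pixel scene depth map and a rendered product layer are already available (shadows and reflections are a separate, harder step):

```python
"""Simplified depth-test compositing: draw the virtual product only where it
is closer to the camera than the real surface, so real foreground furniture
believably occludes it. All inputs are assumed to be precomputed."""
import numpy as np


def composite(photo, scene_depth, product_rgba, product_depth):
    """photo: (H, W, 3) uint8; scene_depth, product_depth: (H, W) in metres;
    product_rgba: (H, W, 4) uint8 render of the virtual item (alpha 0 = empty)."""
    visible = (product_rgba[..., 3] > 0) & (product_depth < scene_depth)
    alpha = (product_rgba[..., 3:4] / 255.0) * visible[..., None]
    blended = (photo.astype(np.float32) * (1.0 - alpha)
               + product_rgba[..., :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)
```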

Improvements to come over time

AI scene perception technology is an actively researched topic and remains a very difficult challenge, Guindi continued. “Today’s product can feel magical, but is not flawless,” he said. “It can get confused, for example, if it does not see much floor, or if the walls are completely blank.” 

Still, IKEA Kreativ “is a big leap forward for IKEA and for our customers, it offers capabilities to customers not previously possible,” he said. “We’re seeing our customers achieve impressive results, and have evaluated the product and technology on hundreds of thousands of diverse rooms where we’ve seen the algorithms work well in the vast majority of cases.” 

Additional improvements will be released over time, he added, including the ability to change wall colors in your own space and add wall and ceiling-mounted furniture, as well as more collaborative ways of designing a home with others.



Author: Sharon Goldman
Source: VentureBeat

