How we made AR a reality at Furlenco
Purchasing furniture is like stepping into a relationship. You’re letting someone into your home, into your personal space. Sometimes, that someone is the reason a ‘house’ becomes a ‘home’. You want that someone to be as perfect as possible. Simply put, this is a big decision for a customer.
It was natural that people would spend a lot of time making that big decision. But there was one fundamental problem: visualising the furniture.
Unlike purchasing other products online like clothes (where you can trust the image gallery for colour or fitting) or electronics (where you can go with the brands and specifications), our products at Furlenco needed additional credibility. For something as significant as furniture, it was important for people to be able to see how the new items would fit in with their existing space in terms of size and design.
One solution would have been to open experience stores where people could visit, touch and feel the products, and then purchase online. There were a couple of reasons we never wanted to do this. One, it’s an expensive proposition. Two, it feels regressive to go back to brick-and-mortar stores when our users want the convenience of a click of a button from the comfort of their homes. We are an e-commerce company and we wanted to keep it that way. This was a situation where tech could build a scalable solution.
Enter Augmented Reality
Augmented Reality (AR) was touted as the next big thing in technology. It creates an illusion by placing a virtual object in physical space, viewed through a mobile phone’s camera.
But there were some visible challenges. AR as a technology had been around for a while, yet most people hadn’t been exposed to engaging, mainstream implementations of it. There were a bunch of games on the App Store (Pokémon Go was one that took off well), but there weren’t many that worked well, especially in our industry. Our target audience knew AR existed but weren’t actively using it. So we had to build something engaging enough for people to use regularly — something that would help them make better decisions when purchasing via their mobile phones.
To sum it up, we had to ensure the experience actually added value and didn’t end up as one of those gimmicky tech features that are more show-and-tell than usability. If executed correctly, this could change the way people transacted on mobile devices.
Now that we knew we needed to implement AR, where to begin?
For many years, we piloted projects on the various frameworks available in the market. We tried Unity/Vuforia, Facebook’s Spark AR, and the initial versions of Apple’s ARKit and Google’s ARCore. We even explored third-party companies that promised plug-and-play solutions. But the problems we faced were similar: the results never matched our expectations, the user experience wasn’t intuitive, and the rendering of objects wasn’t high-definition. We felt the pictures shot in our studio represented our furniture better than the AR experiences did.
Until Apple’s Worldwide Developers Conference (WWDC) in 2018, where Apple unveiled USDZ, a new file format developed together with Pixar that allowed developers to create 3D models for Augmented Reality. Unlike other popular file formats in the market (like glTF), this one is a single, compressed file. We didn’t have to deal with mesh data, materials or textures separately while rendering in AR. It made collaboration between teams and transferring files between devices a breeze, and the file size was considerably lower too. Apple also released a Python tool to help convert 3D models from traditional file formats to USDZ.
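Converting an existing model is essentially a one-line affair. The exact tool name and flags Apple shipped have changed across releases, so treat this invocation (and the file names) as illustrative rather than a verbatim command:

```shell
# Convert a traditional .obj model into a single, compressed .usdz file
usdzconvert ChestOfDrawers.obj ChestOfDrawers.usdz
```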
Our engineering team, along with our amazing designers, managed to create high-definition 3D models in compact sizes. Our goal was to carefully construct models that optimised for size without compromising on quality. Typically, the average size of our product models in traditional formats (like .obj) was around 25–30MB, and around 7–10MB as USDZ. But we managed to reduce the size of each model further, with one of them, the chest of drawers, coming down to as low as 495kB! This meant we could quickly download models on the fly without hurting the user experience, while also keeping the app less bulky.
We leveraged ARKit’s World Tracking and Plane Detection to make sure our users don’t take too long figuring out how to place the models in their environment.
To create a correspondence between real and virtual spaces, ARKit uses a technique called visual-inertial odometry. This process combines information from the iOS device’s motion sensing hardware with computer vision analysis of the scene visible to the device’s camera. ARKit recognizes notable features in the scene image, tracks differences in the positions of those features across video frames, and compares that information with motion sensing data. The result is a high-precision model of the device’s position and motion.
World Tracking analyzes and understands the contents of a scene. Hit-testing methods help find the real-world surfaces corresponding to a point in the camera image. With Plane Detection, ARKit detects flat surfaces in the camera image and reports their positions and sizes. Hit-test results or detected planes are then used to place and interact with virtual content in the scene. Light estimation helps set ambient intensity and colour temperature, resulting in realistic rendering of objects.
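For readers curious how these pieces fit together in code, here is a rough sketch assuming an `ARSCNView`-based setup (one common way to build this; class and method names here are illustrative, not our actual implementation):

```swift
import ARKit
import SceneKit

class FurniturePreviewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self
        // World tracking with horizontal plane detection and light estimation.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        configuration.isLightEstimationEnabled = true
        sceneView.session.run(configuration)
    }

    // ARKit reports each detected flat surface as an ARPlaneAnchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Detected plane at \(plane.center), extent \(plane.extent)")
    }

    // Hit-test a screen tap against detected planes to find a real-world point,
    // then anchor the model there.
    func placeModel(at screenPoint: CGPoint, model: SCNNode) {
        guard let hit = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent).first
        else { return }
        let t = hit.worldTransform.columns.3
        model.simdPosition = SIMD3(t.x, t.y, t.z)   // life-size: 1 unit = 1 metre
        sceneView.scene.rootNode.addChildNode(model)
    }
}
```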
Here’s a demo video:
With a combination of ARKit and USDZ, we launched models for around 60% of our products, now live in our iOS app on the App Store. It’s early days, but from the data we’ve collected so far, our app users seem to be enjoying the experience. The AR feature enables them to visualise our products in their homes and helps them make a more informed decision. As a result, we’ve seen our conversion numbers improve.
We have received favourable responses from everyone using our app. Are you ready to give it a shot? You can download our app from this link.
Pro tips to use it effectively:
- Scan the surface by tilting your phone at a slight acute angle to the horizontal plane (like a table or floor). A non-reflective, contrasting surface works best.
- Move the phone slightly from left to right to detect the area.
- A life size model of the object will get rendered on the surface.
- Once the object gets rendered, you can move it around to place it in the desired position.
- After placing it, you can move around it while it remains in place.
- You can pinch to zoom in and out, much like a picture in your gallery.
- Double-tapping it will bring it back to its original size.
- Once you get a grip on how to use it correctly, you can explore further, like going under the bed or inside an open wardrobe.
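The pinch and double-tap behaviours in the tips above can be wired up with standard gesture recognizers. A rough sketch, assuming the placed model’s node is available (names are illustrative):

```swift
import UIKit
import SceneKit

class ModelGestureHandler {
    let modelNode: SCNNode
    private let originalScale: SCNVector3

    init(modelNode: SCNNode) {
        self.modelNode = modelNode
        self.originalScale = modelNode.scale   // remember the life-size scale
    }

    // Pinch to zoom the placed model in and out.
    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        let s = Float(gesture.scale)
        modelNode.scale = SCNVector3(modelNode.scale.x * s,
                                     modelNode.scale.y * s,
                                     modelNode.scale.z * s)
        gesture.scale = 1   // reset so each update scales incrementally
    }

    // Double tap restores the original, life-size scale.
    @objc func handleDoubleTap(_ gesture: UITapGestureRecognizer) {
        modelNode.scale = originalScale
    }
}
```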
In addition to this, there is an object mode, activated by tapping the button at the top. In this mode, the object is rendered against a white background, helping the user get alternate perspectives of our products.
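As an aside, Apple’s AR Quick Look offers a similar pairing out of the box — a built-in toggle between an AR mode and an object mode on a neutral background. Whether an app uses it or a custom scene is a design choice, but presenting a USDZ file this way is a few lines (the bundled file name below is hypothetical):

```swift
import UIKit
import QuickLook

class ProductViewController: UIViewController, QLPreviewControllerDataSource {
    // Hypothetical bundled model; in practice, models are downloaded on the fly.
    let modelURL = Bundle.main.url(forResource: "ChestOfDrawers", withExtension: "usdz")!

    func showPreview() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        return modelURL as QLPreviewItem
    }
}
```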
Feel free to give feedback. Our next iterations are in progress and anything you provide us will be invaluable! 🙂