With the recent unveiling of ARKit for iOS and ARCore for Android, augmented reality (AR) has taken the mobile world by storm. AR apps have popped up across an array of verticals including retail, gaming, travel, education, and healthcare. In response, I recently experimented with numerous AR apps for iOS and Android, and I’ve had the pleasure of designing AR apps for two well-known clients while working at POSSIBLE Mobile. Through these experiences, I discovered that many AR apps share the same shortcomings in UX and UI design. Because AR is a new technology, designers are still actively experimenting with and discovering the interface paradigms that work best. Here are some recommendations based on what I’ve learned.
When Placing an AR Object, Positioning & Perspective is Key
Upon the initial placement of a 3D AR object on a surface, position it within the user’s viewport. This ensures that the user understands the object’s context and has a vantage point from which to easily comprehend and potentially interact with the object. This seems like an obvious point, but all too often AR apps place a 3D object into the experience at the wrong scale, such as putting an animated tiger on a table and having its stomach fill the entire screen. More often than not, the user’s perspective is inappropriately skewed when first placing the AR entity. For example, showing the user a stack of blocks from a bird’s-eye view creates contextual confusion. When the perspective is skewed, either adjust it automatically so that the user understands the overall picture at first glance, or provide the user with hints regarding proper placement and perspective.
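One way to avoid the “tiger stomach fills the screen” problem is to compute an initial scale for the object based on how far away it is being placed. The helper below is a minimal sketch of that idea using a simple pinhole-camera model; the function name, parameters, and default values are all illustrative assumptions, not part of ARKit.

```swift
import Foundation

// Hypothetical helper: compute a uniform scale factor so an object of a
// given real-world height (in metres) occupies at most `targetFraction`
// of the vertical field of view when first placed `distance` metres away.
func initialScale(objectHeight: Double,
                  distance: Double,
                  verticalFOVDegrees: Double = 60,
                  targetFraction: Double = 0.5) -> Double {
    // Visible world height at `distance`, for a pinhole camera model.
    let fovRadians = verticalFOVDegrees * .pi / 180
    let visibleHeight = 2 * distance * tan(fovRadians / 2)
    let maxHeight = visibleHeight * targetFraction
    // Only shrink oversized objects; never enlarge small ones.
    return min(1.0, maxHeight / objectHeight)
}
```

For example, a 2 m tall model placed 1 m away would be scaled down to roughly 29% of its size so the user sees the whole object, while a 10 cm model would be left at full size.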
Provide Guidance on the Best Surfaces for Placement and Give the User Restart Options
During the surface detection process, or when a user holds up their camera to place an AR object, provide guidance for the best surfaces to use. At the same time, offer flexibility on the types of surfaces, like a table versus the ground, and allow the camera to detect surfaces with minimal detail. Give the user the option to restart the surface detection process if needed. Furthermore, make it obvious when the user successfully places an AR entity. You can achieve this by modifying the surface detection indicator following placement, or by offering a confirmation message. Lastly, present the user with an option to place the object on a new surface if desired.
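In ARKit terms, offering flexibility on surface types and a restart option boils down to session configuration. The sketch below assumes a SceneKit-based app with an existing `ARSCNView`; it is a starting point, not a drop-in implementation.

```swift
import ARKit

// Start detecting both horizontal surfaces (tables, floors) and vertical
// ones (walls) to stay flexible about where the user can place an object.
func startPlaneDetection(in sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]
    sceneView.session.run(configuration)
}

// Restart surface detection from scratch, e.g. from a "Start Over" button,
// discarding previous tracking data and any anchors already placed.
func restartPlaneDetection(in sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]
    sceneView.session.run(configuration,
                          options: [.resetTracking, .removeExistingAnchors])
}
```

Passing `.resetTracking` and `.removeExistingAnchors` is what makes the restart feel like a clean slate rather than a confusing partial reset.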
Clearly Anchor the AR Object on the Selected Surface
As the user moves around the AR entity, make sure the object stays firmly attached to the chosen surface (unless the object is animated or should move during the experience, such as within a game). The surface should give the user context, like a wall when hanging AR paintings. If the AR object drifts as the user moves, it may cause confusion. It also helps to provide visual cues, such as modifying the surface detection indicator once an object is placed on and anchored to a surface. Lastly, allow the surface to respond to the AR item to make it more believable. For example, a dancing figure on a table could cast a moving drop shadow beneath it.
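In a SceneKit-based ARKit app, that drop-shadow cue can come from a directional light configured to cast shadows onto the detected surface. The snippet below is a sketch under that assumption; the function name is illustrative, and you would still need invisible plane geometry matching the detected surface for the shadow to land on.

```swift
import SceneKit
import UIKit

// Sketch: a downward-angled directional light whose shadow falls from a
// placed AR node onto geometry aligned with the detected real surface.
func makeShadowLight() -> SCNNode {
    let light = SCNLight()
    light.type = .directional
    light.castsShadow = true
    // Deferred shadows can be rendered onto otherwise-invisible geometry,
    // so the shadow appears to fall on the real-world table or floor.
    light.shadowMode = .deferred
    light.shadowColor = UIColor(white: 0, alpha: 0.5)

    let lightNode = SCNNode()
    lightNode.light = light
    // Point the light straight down so shadows land below the object.
    lightNode.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)
    return lightNode
}
```

Because the light direction drives the shadow, an animated object automatically produces the moving shadow described above.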
Use Intuitive Gestures with Instructions on First Launch of AR
When moving AR objects, direct manipulation is best. Use standard, natural gestures such as those shown on Apple’s Gesture Guide when possible. Upon first launch, provide the user with a simple gesture tutorial in order to ease the user into the new experience. For example, show an arrow wrapped around the AR object to indicate rotation with two fingers and a pinch and zoom icon for enlarging and shrinking. Lastly, make sure that gestures don’t conflict with each other or confuse the user. For instance, using a touch and swipe for moving an object and using the same gesture for a rotation action may leave the user feeling perplexed and could hinder the AR experience.
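Keeping gestures distinct is straightforward if each manipulation maps to a different recognizer type. The sketch below assumes a UIKit view controller hosting the AR view; the class name and empty handler bodies are placeholders for your own node-manipulation logic.

```swift
import UIKit

class ARGestureController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Three distinct gestures, so no single touch pattern is
        // overloaded: pan moves, pinch scales, rotation rotates.
        view.addGestureRecognizer(UIPanGestureRecognizer(
            target: self, action: #selector(handlePan(_:))))
        view.addGestureRecognizer(UIPinchGestureRecognizer(
            target: self, action: #selector(handlePinch(_:))))
        view.addGestureRecognizer(UIRotationGestureRecognizer(
            target: self, action: #selector(handleRotation(_:))))
    }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        // Move the AR node along the detected surface.
    }
    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        // Scale the AR node by gesture.scale.
    }
    @objc func handleRotation(_ gesture: UIRotationGestureRecognizer) {
        // Rotate the AR node by gesture.rotation.
    }
}
```

Because pan, pinch, and two-finger rotation have different touch signatures, UIKit can disambiguate them without the swipe-versus-rotate conflict described above.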
How to Design Effective Concept Images of AR Scenes
AR concept image design, such as the image shown above, presented a challenge for me when I first began designing AR apps for clients. Difficulties arose from the fact that the imagery needs to clearly communicate how AR modifies the real-world environment. Through trial and error, I discovered that it is best to present a large phone or tablet image in the foreground. Within the screen of the device, show an augmented interface that clearly reflects the overall functionality of the AR. For the background behind the phone, show an image of the real-world environment (as it relates to your AR experience) without augmentation and slightly blurred. This signals to the viewer exactly what is virtual and what is real life.
The Future of AR
After experimenting with numerous AR apps and designing several myself, I couldn’t help but ponder the future of the technology. I foresee AR pushing the majority of mobile UI designers to learn 3D software. The ability to model and stage realistic 3D objects that are optimized for ARKit or ARCore integration will become an essential skill in the mobile designer toolbox. For mobile designers who want to learn 3D modeling for AR, I recommend finding an online course (Lynda.com is a great resource) focused on Cinema 4D Lite, an ideal 3D program that comes free with Adobe After Effects and integrates seamlessly with AE for animation purposes. I also recommend an online course on Unity (unity3D.com has great free tutorials) for preparing 3D model files to hand off to development.
Furthermore, I believe holding up a phone to augment a scene is only the beginning. This experience is flawed because it can be uncomfortable and inconvenient, and it limits the user’s view to a small screen size. The first step in AR advancement could be a phone with a 3D holographic screen such as the RED Hydrogen. Once wearable technology becomes compact, stylish (sorry Google Glass), and affordable enough for the average consumer, I foresee AR moving into wearable devices like a pair of glasses or goggles. From there, I believe AR will transition into contact lenses. Inverse Innovation wrote an intriguing article on the progress of contact lens tech, and Google has a patent on Smart Contact Lenses. These will give the user full peripheral vision to explore AR in a natural, immersive manner. Users will have the option to turn augmentation on or off or to take the lenses out at any time (avoiding invasive augmentation such as the AR implants in Black Mirror’s infamous episode The Entire History of You).
Ultimately, the prevalence of AR will shift everyday mobile computing away from handheld devices and into wearables, converging AR experiences with the real world in a more fluid, comprehensive way. I believe AR is not a fad or a fleeting piece of technology; it will continue to advance in fields such as healthcare, retail, education, and transportation. AR will become ubiquitous in our mobile world, and therefore learning the UX and UI design essentials for the medium is key for mobile designers moving forward.
For more about AR, read ARKit: Get Ahead of the Enterprise Curve on the POSSIBLE Mobile Insights blog.
Want to keep your brand ahead of the technology curve by implementing an AR strategy? Send us an email at email@example.com and one of our AR experts will be in touch.