Augmented Reality Development

The Innovator's Guide to Augmented Reality

In 2016, Pokémon GO, a simple Augmented Reality (AR) game, attracted more than 65 million monthly active users and generated more than $1.2 billion in revenue in less than a year. The following year, Apple and Google kicked off a mobile app race when they released AR software development kits. The result has generated so much excitement that AR investment records were broken when AR-based startups raised over $3 billion just last year. AR is here to stay, but if you think it’s all fun and games, think again: businesses and enterprises are the ones benefiting most from this fascinating new tech.

Understanding Augmented Reality

Imagine you’re on a street, viewing the shops ahead through your smartphone camera and screen. What if you could see floating, digital restaurant names and reviews superimposed on screen? This is augmented reality. It’s a combination of real and digital, seen through a phone screen, special glasses or headset. Although extra “augmented” digital information is often visual, it can also use senses like sound, touch and even smell, which brings it into the realm of “mixed” reality.

AR’s strength is that it combines the best of both the real and digital worlds, overlaying the internet’s virtually infinite information on top of our daily lives and freeing up our hands during everyday tasks. One of Google Glass’s proposed consumer AR scenarios was displaying a step-by-step recipe in your smart glasses, so you could actually see instructions while cooking. Although it isn’t possible yet, someday such glasses might even point out the next ingredient by highlighting it on the counter, or demonstrate what a pinch of salt looks like with an outline. That’s AR’s power, and someday it will be our reality.

Not so new

You might be surprised to find out that AR has been around for a while. Ivan Sutherland, the “father of computer graphics,” created the first AR head-mounted display system in 1968. However, the term “augmented reality” wasn’t coined until 1992 at Boeing.

Still, AR didn’t see widespread commercial use until about a decade ago. It was confined to research labs and government-funded projects until BMW Mini, National Geographic, Skoda and Coca-Cola began using “AR mirrors” in public. These were basically cameras capturing a live public view and adding eye-catching animations on a huge screen for brand awareness campaigns. [3]

The next range of commercial applications popped up in retail, where companies used AR so customers could easily try products at home, including clothing and furniture. Today, AR has an even broader set of uses in consumer and enterprise settings, as we’ll see very soon.

Immersing in Augmented Reality

We’re on the cusp of some truly immersive, personalized and revolutionary AR experiences. By 2020, an estimated 100 million consumers will be shopping in augmented reality.

Companies are already competing for people’s attention like never before. Now, to win people over and build lasting engagement, they’ll need to craft multi-sensory experiences that connect both intellectually and emotionally. Thankfully, AR goes far beyond browsing. It’s virtual product visualization. Forget catalogs, shelves and sterile stores; people actually see how products will fit into their homes and lives.

No wonder businesses are now so interested in AR: investors are bullish and consumers are slowly growing more receptive. Commercial AR hardware and algorithms are in their infancy but rapidly improving. AR isn’t ready for mass adoption yet, but it’s still attracting huge investment and it’s bringing futuristic industry innovations with it. [5]

Augmented Reality’s many forms

Computer vision (CV) is at augmented reality’s core: it’s a computer program’s ability to understand real-world objects in the camera’s line of sight. Once real-world features and objects are identified, AR’s task is to overlay graphics on top of them.

How AR is positioned

AR applications use various methods to figure out where virtual objects should be placed on top of the real world. Some of these methods are simpler, like marker-based AR, while others use complex computer vision techniques, like simultaneous localization and mapping (SLAM).

 

  • In marker-based AR, physical markers like QR codes are placed in the real world. As soon as the AR program recognizes the marker, it adds virtual objects to the scene. The earliest commercial AR applications were marker-based because they’re relatively simple. In 2008, BMW Mini released an application where magazine readers could hold a printed ad in front of a camera. A 3D Mini Cooper model would appear on the page, and moving the magazine even changed the model’s perspective on screen. [6] The drawback of marker-based AR is that it’s simplistic; it only works in controlled settings with pre-defined, recognizable objects (like that magazine or a QR code). That same simplicity is what makes it ideal for marketing campaigns.
  • Location-based AR apps use GPS, digital compasses and other sensors to determine location and orientation. The Florence Travel Guide app [7] uses these to determine where the smartphone’s camera is pointed, so it knows exactly which monument, cafe or museum is in view. These AR apps are easy to implement and work well in the hospitality and travel industries, where interesting real-world objects have fixed geographic spots. Other useful information, like historical facts, is overlaid once a landmark is recognized.
  • Simultaneous localization and mapping (SLAM)-based AR programs use complex computer vision (CV) algorithms to recognize things like walls, barriers and floors. This state-of-the-art computer vision is the future of AR. Apple, Google and Facebook are heavily invested in SLAM feature recognition, and popular AR development kits like Apple’s ARKit and Google’s ARCore have SLAM capabilities (see the plane-detection sketch after this list). Although surface identification and mapping are the most widely used techniques, new hardware and software are quickly enabling contour identification, meaning complete and accurate 3D world meshes (digital, 3D terrain copies created from a picture or infrared scan of the room) will soon be possible. This means AR will soon be even more realistic.
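To make the SLAM idea concrete, here is a minimal surface-detection sketch using Apple’s ARKit with SceneKit. It’s illustrative only: it assumes an iOS view controller with an ARSCNView wired up as sceneView, and ARCore offers equivalent plane-finding APIs on Android.

```swift
import UIKit
import ARKit

class PlaneFinderViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!   // assumed to be connected in the storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // World tracking is ARKit's SLAM-style mode: it fuses camera frames with
        // motion-sensor data to track the device and detect flat surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit adds an anchor; plane anchors describe detected surfaces.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Found a surface roughly \(plane.extent.x) x \(plane.extent.z) meters across")
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

The same pattern (run a tracking configuration, then react to the anchors it produces) underpins most SLAM-based mobile AR apps.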

 

How AR is drawn

Separate from how an AR app determines object positions, there are different methods for actually superimposing those objects on the world, whether through a device (like a phone or headset) or through projection.

 

  • Outlining AR creates a 2D outline of objects in view. These outlines are especially useful for architecture and engineering. CityViewAR, built for Christchurch, New Zealand, shows AR outlines of old buildings that fell during 2011’s devastating earthquake. Outlining AR is also used for car safety: in foggy, low-light weather, outlining road boundaries helps drivers stay on the road. And in beauty and apparel, outlines put products over shoppers’ bodies and faces. Outlining AR is displayed through camera screens, headsets, mirrors, digital glass or even screens in car dashboards.
  • In superimposed AR, digital objects replace or cover real-world ones. IKEA’s popular AR catalog is an excellent example. With ARKit and ARCore’s rising prominence, superimposed AR is particularly popular with mobile AR developers (a minimal placement sketch follows this list). Its one drawback is that it only works when users are wearing special equipment or looking at phone screens. This can be inconvenient in manufacturing scenarios where workers’ hands are occupied or where they’re wearing protective equipment.
  • In projected AR applications, digital objects are projected onto the real world, so no additional device (like a phone or headset) is needed. Particularly useful in industrial applications, projected AR is highly effective for complex machinery training. Step-by-step visual instructions are projected onto a surface to guide workers through assembly line operations.
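As a rough sketch of superimposed AR on a phone, the snippet below drops a simple virtual box wherever the user taps on a surface ARKit has already detected. It builds on the hypothetical PlaneFinderViewController above; hitTest(_:types:) is ARKit’s original placement API (newer iOS releases add raycast queries), and ARCore’s HitResult plays the same role on Android.

```swift
import UIKit
import ARKit
import SceneKit

extension PlaneFinderViewController {
    // Call once (e.g. from viewDidLoad) to listen for taps on the AR view.
    func enableTapToPlace() {
        let tap = UITapGestureRecognizer(target: self, action: #selector(placeObject(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func placeObject(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)

        // Ask ARKit where the tap intersects a detected real-world plane.
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }

        // A 10 cm box stands in for a real product model (furniture, decor, and so on).
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0.005)
        let node = SCNNode(geometry: box)

        // The hit result's transform positions the node on the surface, so the virtual
        // object appears superimposed on the live camera view.
        node.simdTransform = hit.worldTransform
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```

Apps like IKEA’s replace the placeholder box with a detailed, correctly scaled product model and matching lighting so it blends into the room.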


What about VR, MR and XR?

Augmented reality immerses people in the physical world and layers the virtual on the real; it’s easy to distinguish the digital additions. But you may have also heard of terms like virtual reality (VR), mixed reality (MR) and extended reality (XR). So, how do these differ?

  • Virtual reality (VR) is where users are immersed in a fully artificial, digital environment. Unlike AR glasses, VR headsets such as Oculus, PlayStation VR and Samsung Gear VR are never see-through. Flight simulators, a classic VR application, have long been an effective way to train for high-risk situations; military pilots train in VR before flying real-world, life-threatening missions. VR is sometimes used to help soldiers suffering from PTSD: by carefully recreating battlefield trauma (a technique known as “exposure therapy”), clinicians help the brain reprocess those memories. VR is also emerging as an interesting new filmmaking frontier where viewers can immersively explore scenes and become more emotionally involved. This was demonstrated in the award-winning VR films Notes on Blindness and Clouds Over Sidra, which build empathy for the experience of blindness and for the Syrian refugee crisis.

     

  • Mixed reality (MR) is where the line between augmented and virtual reality becomes blurred. In many ways, mixed reality is advanced AR: real and virtual objects interact with each other, or senses beyond sight, like touch, are added. Microsoft’s HoloLens headset enables interesting applications like virtual co-location, where geographically distributed teammates collaborate on work, interacting with each other’s virtual selves as if they were in the same place. HoloLens can also selectively include and exclude real-world objects. In Microsoft’s Mars simulation, users interact with a completely virtual scene representing the surface of Mars, but also use a physical desk in the same room. The headset combines the real and the virtual in a way that creates a new, mixed reality.

     

  • Extended reality (XR) is shorthand used to refer collectively to AR, VR and MR.

It’s also worth pointing out that, unlike most VR and MR, AR is typically self-contained (within a phone or smart glasses) and untethered (you aren’t connected to a computer). That means mobility is one of its greatest strengths.

AR versus AR cloud

Perhaps in your time reading about AR, you’ve also heard about a mysterious term called the “AR cloud.” What is it, and why are people suddenly so obsessed with it?

The lofty vision

The 2017 releases of Apple’s ARKit and Google’s ARCore led to tens of thousands of iOS and Android AR apps. [8] But millions of people using siloed online technology should remind you of something: the internet of the nineties. It took connecting the dots with Google, a huge search engine, to make the internet a global and accessible knowledge repository. And it really wasn’t until people put their social lives online and created cloud-based workspaces that the internet truly came alive. This shared, worldwide, online social space and workplace created a data revolution. Today, what once happened with HTML and media is happening with our 3D reality: it’s going online. That is the AR cloud’s vision.

When social meets AR

People already share text, images and videos on social media. With the AR cloud, the world becomes a shared screen layered with spatial data. In it, everyone has shared experiences, and this shared reality is even searchable. Today, when one travel company creates an AR app for a district or city, its data is isolated from every other AR app’s data for the same region. The AR cloud will be a massive repository of context-rich spatial maps, unifying all this disparate data and linking virtual objects to their real-world counterparts.

Put another way, imagine the world as a shared screen. This already exists on a small scale with massively multiplayer online (MMO) games. AR cloud is similar, but for real worlds instead of virtual ones. Ori Inbar, an AR cloud evangelist, calls it “the soft copy of the world.” [9]

Augmented Reality’s benefits

There’s a reason why so many developers and businesses are jumping in on AR. Here are some of augmented reality’s major benefits.

 

  • Interactive, hands-free instructions

AR often overlays information on a person’s line of sight, making it an effective, hands-free instruction tool. Complex manuals become automated, step-by-step product, process and equipment instructions, plus assembly-line quality control. Field service technicians hold live, collaborative AR calls with experts and fix complex client-side issues. The result is major cost savings for the field service management (FSM) industry.

 

  • Envisioning the hypothetical in reality

Whether designing home interiors by browsing for just the right furniture, changing wall paint colors or optimizing office space, AR lets us add imaginary objects to the real world and assess the hypothetical in real life like never before. Where designers once had to create mood boards and mockups, AR makes design immersive, 3D and 100% live.

 

  • Virtual co-location

The idea behind AR co-location is that working on a virtual object with remote participants should be exactly the same as working in the same room with them. Imagine collaboratively solving a virtual puzzle with many other people in real-time, while also being able to see it from their different perspectives. That’s the power of co-location.

 

  • Immersion

The success of Pokémon GO proves that AR is highly immersive. John Hanke of Niantic, Pokémon GO’s creator, puts it like this: “To me, the most exciting part of AR is not the graphics […] but untethering of the game experience from a screen and a controller. Moving physically around.” It’s all about adding physicality to experiences—far beyond what’s afforded by Nintendo Wii or even Microsoft’s Kinect. It’s about getting out into the world and “[using] your body the way it’s been evolved over millions of years to perform.” [10]


On the other hand…

Although we’re major tech optimists, even AR has its drawbacks. In the last few years, a few shortcomings have become clear.

 

  • Lukewarm consumer response

At the enterprise level, AR has high impact and serious cost savings, but in the consumer market, the response hasn’t been as hot. However, this is largely a UX issue. Today, most AR simply overlays virtual graphics without an emotional connection, empathy or deeper purpose. With a better understanding of people’s expectations and context, designers will increase people’s emotional investment in AR.

 

  • More context, please

Blippar co-founder Omar Tayeb points out, “It’s not enough to identify a woman and a stroller. A universal visual browser will also understand the relationship. The stroller could have a baby […] the woman is possibly the mother. It’s not just visual, it’s contextual.” [11] AR overlooks contextual relationships. But once many different businesses and people implement and use AR, that interaction data will hopefully join the searchable AR cloud, powering an AR cloud-based AI that’s able to identify a woman, stroller and their relationship.


Learn more about artificial intelligence in our Innovator’s Guide to Artificial Intelligence.

 

  • Software limitations

ARKit and ARCore are still in their early days, and their abilities are limited. AR’s three fundamental problems have been cracked at a basic level: camera orientation, ground plane identification (recognizing where the ground is) and scene lighting (understanding light and shadow so they can be digitally replicated for realism). Phones and headsets track a “6DoF pose” (position and orientation) using visual-inertial odometry (VIO), which fuses the camera feed with accelerometer and gyroscope data to enable 360-degree augmented reality. Realism is great, but these capabilities aren’t yet creating the immersive, empathic and realistic experiences people want. That comes down to designers.
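For a sense of what those basics look like in practice, here is a small, illustrative sketch that reads all three straight off an ARKit frame: the 6DoF camera pose from VIO, the detected plane anchors and the ambient light estimate. It assumes an ARSession that is already running a world-tracking configuration; ARCore exposes equivalent data through its Frame and LightEstimate classes on Android.

```swift
import ARKit

// Inspect a single ARKit frame for the three basics: pose, planes and lighting.
// Assumes `session` is already running an ARWorldTrackingConfiguration with
// plane detection enabled (light estimation is on by default).
func inspect(session: ARSession) {
    guard let frame = session.currentFrame else { return }

    // 1. Camera orientation: a 6DoF pose (position + rotation) estimated by VIO.
    let pose = frame.camera.transform          // 4x4 matrix in world space
    let position = pose.columns.3              // translation column = device position
    print("Device position: \(position.x), \(position.y), \(position.z)")

    // 2. Ground plane identification: each detected surface appears as a plane anchor.
    let planes = frame.anchors.compactMap { $0 as? ARPlaneAnchor }
    print("Currently tracking \(planes.count) surfaces")

    // 3. Scene lighting: ambient intensity (around 1000 is neutral) and color
    //    temperature (Kelvin), used to make virtual objects match the room's light.
    if let light = frame.lightEstimate {
        print("Ambient light: \(light.ambientIntensity) at \(light.ambientColorTemperature) K")
    }
}
```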

 

  • Hardware limitations and price

Mobile devices are not yet powerful enough to process heaps of real-time data, and AR’s constant camera, sensor and network use makes battery life a big limitation. All this drives up end-user AR device costs (we’re still waiting for devices in the $200-$500 range); plus, tech outlets almost unanimously agree that AR headsets could be way sleeker. [12]

 

  • Health and personal injury

Falling down staircases. Bumping into walls. People have even been hurt playing Pokémon GO. [25] Some researchers warn that AR could interfere with our perception of speed and direction, causing a form of motion sickness (a familiar issue for VR users). Revolutionary tech sometimes precedes rules for responsible social use, and “AR hygiene” guidelines haven’t arrived, at least not yet.

 

  • Privacy

When social AR advances significantly, the amount of personal information accessible to total strangers may be alarming to some. Also, not everyone should have access to every virtual space. Industry experts suggest that standards should include a privacy layer so applications have built-in access control.

 

Augmented Reality in today’s world

Now that we’ve covered some of AR’s basics, let’s get into the good stuff: how and why augmented reality is being used.

 

AR for enterprise

Generally speaking, industrial AR is already being used across warehouses and manufacturing facilities for cost savings. Companies are finding it seriously improves complex assembly and warehouse work, with productivity increases of 25-32% reported by Boeing and General Electric. At GE, trainees were 90% more likely to finish complex assembly tasks the first time, and it’s even eco-friendly: instructions that were once on paper printouts are now delivered in AR instead. Immersive, 3D, instructional experiences are a major step up. [13]

AR for consumers

Although consumer AR has been slow to take off, there have been a few successful examples, including Snapchat’s Bitmoji and Pokémon GO. People have been trying out makeup, accessories and apparel with AR mirrors and AR-enabled fitting rooms, while makeup AR apps are hugely popular. With the popularity of try-on apps increasing, many AR scenarios tie strongly into e-commerce.

AR in the wild

AR has been used commercially for nearly a decade. Although many applications have been simple and sometimes even gimmicky, others truly stand out. Here are a few award-winning and industry-acclaimed AR apps setting the gold standard.

 

  • Incredibly enough, Lee Company calculates a $20 return for every dollar invested in AR. [15] Remote experts in central locations connect with field technicians on AR-enabled “see what I see” live support video calls. Previously, techs would either bring parts back to the shop or fly experts into the field; now they use AR for field installations and repairs instead. Thanks to this new tech, Lee saves more than $500 per technician, per month in labor and travel.
  • IKEA Catalog has been finding novel ways to incorporate AR and furniture shopping for a while. Although IKEA began with simple marker-based AR that required placing the physical print catalog on the floor and pointing a phone camera at it, these days it’s no longer necessary. Point the camera at any area and the AR automatically localizes itself, placing the product there. It even matches the lighting and is true to the texture of the fabric, so you can see exactly how the furniture fits in your home. They’re now also using AR to help with furniture self-assembly.
  • Google acquired Word Lens and folded it into Google Translate, adding a new dimension to multilingual translation. If you see foreign text printed on menus, signs, books or elsewhere, Word Lens AR automatically identifies the language and translates it in place. NYT journalist David Pogue called it one of 2010’s best tech ideas, and Word Lens was a finalist for the 2010 Crunchies Best Technology Achievement award.
  • Aero Glass won the 8th annual Auggie Award for best app. Aero Glass is an AR system that could usher in a new aviation era: it overlays navigation aids, geographic location and complex flight procedure instructions—even mission-critical take-off and landing procedures. Unlike Google Glass, which has just one screen for the right eye, Aero Glass runs on Epson’s Moverio BT smart glasses. That means 360-degree, binocular (two-eyed) stereo vision. Unlike single-screen glass-based AR, which usually renders on the right side of the right eye, Aero Glass renders directly in front of you. This is a rare innovation.
  • Schell Games’ Happy Atoms won the 2017 Auggie Awards’ best game, but it’s not just an entertainment app. It’s a gamified, educational molecular chemistry app for kids. The Happy Atoms toy kit includes actual, physical atom and electron blocks that connect to form your own molecule. But what molecule did you create? That’s where AR comes in: hold your toy molecule in front of your phone and Happy Atoms’ AR tells you all about it. This is one truly great way for AR to enter classrooms and make complex concepts fun.

 

Developing and building Augmented Reality Experiences

Even if AR isn’t totally ready for broad adoption, it’s worth considering how to develop and build these experiences to get ahead of the curve. Plus, as seen above, there are still some very compelling reasons to develop AR experiences. So, let’s turn to some of the key considerations for building AR apps.


First: consider your scenario

There are certain commercial areas where AR has proved its worth. Here are some idea pitches for places where AR could be especially useful—see if your business falls into any of these categories.

 

Healthcare

  • AR can overlay internal imaging data. For example, vein heat signatures and live X-ray imagery might be superimposed on patients’ bodies to guide doctors during complex surgical procedures.
  • Apps might show the nearest life-saving devices, including first aid kits and defibrillators, superimposed on a live camera view.
  • An augmented reality headset could show instructions for administering CPR, first aid and other assistance, even for people who are untrained, until professionals arrive on the scene.

Real estate

  • Housing developers could show prospective buyers what a house will look like after it has been built, so customers narrow their home list with fewer visits.
  • As technology advances, a customer in China might remotely evaluate and rent or buy properties in New York City with immersive AR/VR experiences. As a result, agents will spend far less time on showings.
  • Even during physical visits, property information is displayed with AR instead of needing an agent or paper placards.

Fashion, retail and beauty

  • Shoppers could select their apparel and an AR app could create a holographic fashion show where a model walks a virtual catwalk wearing those choices.
  • Virtual changing rooms can be created where shoppers try clothes and beauty products to see how they will actually look.
  • AR can even be used to evaluate body size and shape, then recommend clothing based on fit or style. This could drastically reduce the frustration associated with typical shopping or even power a custom tailored clothing brand.

Travel

  • An app could help customers look for the nearest restaurants, tourist attractions and hotels.
  • AR could provide an always-on, on-site location information system. These apps have existed for nearly a decade, but advances in SLAM and surface detection create more realistic overlays.
  • Your app could even partner with Amazon Echo so that AR is combined with voice assistants, helping travelers make travel decisions.

    Learn more about voice assistants and chatbots in our Innovator’s Guide to Chatbots.

Maps and navigation

  • AR apps could guide you through an airport with arrows on the ground and highlighted overhead signs, then get you to the boarding gate, your seat and the baggage carousel. It could even help you make optimal choices along the way, from online check-in to calling an Uber on arrival.
  • Fire escape plans could be provided for public buildings in the event an alarm goes off.
  • Mapping and navigation are often very accurate with SLAM-based AR, including helping drivers navigate in inclement weather.

Unique marketing campaigns

  • Brick and mortar shops might host virtual, in-store tours, curating buyers’ shopping experiences. This is personalized based on the shopper’s history.
  • Snapchat’s face-changing AR lenses and Facebook’s new AR platform, World Effects, are already used to connect emotionally with users. AR mirrors also offer excellent branding and marketing opportunities.

Home decoration and e-commerce

  • Imagine creating an end-to-end AR experience that helps people measure their space with superimposed guidelines, recommends different furniture based on the current items in the room and even allows them to visualize various layouts as if the furniture was actually there.
  • Or perhaps you could try a new paint color in your home without ever going to the paint store, getting swatches or breaking out the paintbrush, all with the help of an AR paint swatch app.
  • Home decoration AR can be as simple as leveling guides for hanging picture frames, or a tool that visualizes the best spots in the room for art and blinds based on how light falls.

Warehouse logistics

  • Warehouse workers’ AR glasses can show step-by-step instructions, totally automating training manuals and task lists. [18]
  • AR can also be used to plan warehouse storage layouts for more optimal use of floor and shelving.
  • One-fifth of logistics costs happen at the warehouse, much of it from printing item fulfillment lists (called “picking lists”) and then searching for the listed items. Instead of printing lists, an AR heads-up display (HUD) could provide directions to each subsequent item on the list.

Does AR make sense for you?

Beyond the above ideas, AR might just make sense if your business has any of the following elements:

 

  • E-Commerce: AR offers more personalized customer choices than competitors by helping people visualize your products in a real-life setting.
  • Complex tasks: AR solutions are ideal for enterprises who are spending too much money on complex machinery training. Major AR platforms like Microsoft Windows Mixed Reality are helping companies train their workers with AR glasses and HUD instructions.
  • Space planning: AR works well for scenarios that help you visualize how objects will occupy a physical space, whether that’s furniture or warehouse inventory.
  • High travel costs: If your company spends a lot on service-related travel for technicians or other employees, AR-powered interactive calls can guide service people and reduce car and plane trips to almost zero.

 

As mentioned in the limitations section, AR works well when it works, but there are some cases where caution makes sense. When it comes to considering AR for your specific business, some additional considerations to bear in mind are:

 

  • Expensive eyewear: Specialized AR glasses don’t work well in the consumer market, at least not yet: most people don’t have them and they’re just too expensive and silly looking. Smartphone-based AR apps are far better for consumers. However, for industrial and enterprise applications, companies report a solid ROI on AR glasses, in spite of hardware expense. This is why specialized AR hardware is still highly recommended in these cases.
  • No privacy standards: If your business is somewhere where privacy or video/photo consent is required, then AR might not be the right application for you. AR developers haven’t yet created privacy or security standards.
  • VR, maybe? Although industries like real estate have effective AR use cases, VR ultimately offers better and more immersive property-showing experiences. Films and immersive digital walkthroughs provide curated experiences and are better suited for VR than AR.

 

Second: design your product

As a new tech with novel experiences, too much AR innovation may cause cognitive load, push people outside their comfort zone and even make them avoid an experience altogether. With AR, as with many technologies, it’s important to innovate small, incorporate usability study feedback and iterate often. Both Google’s Glass design guidelines and the UX Collective design blog have identified some additional, key considerations for AR experience design. We especially like the following: [14] [24]

 

  • Environment: Understanding and recognizing a user’s environment, including their context, is crucial to providing an immersive AR experience. Today’s AR apps often provide better experiences when they’re in known settings. For example, the environment is understood in industrial AR: a person on an assembly line is in a fixed location performing a known task. In consumer applications, it’s trickier; someone could be anywhere and doing anything. Effective designs narrow down your app’s environment domain. To do this, you have to scope where and how the program is used.
  • Get moving: We’re accustomed to sedentary UI. AR changes all of this. Now people can move while interacting, and interactions change with movements. In a world where UI has literally gone mobile, design that incentivizes movement, action and interactivity is what truly harnesses AR’s potential. New UX patterns are emerging: virtual objects that reveal more information as you get closer are one great example where motion is rewarded and exploration is encouraged.
  • Screen constraints: AR apps are viewed on many different mobile screens and headsets. The best designs respond to different screen sizes and viewing angles (especially key for multi-user experiences.) Phones and tablets are also more limited than glasses because they can’t provide 360-degree experiences. Limited screen real estate means space has to be managed. Users can be trained to scale and move objects. Better yet, use augmented reality’s superpower: mobility. Train people to move their bodies and devices with cues. Then the stage is set and the experience runs as expected.
  • Gestures: “Legacy” interaction patterns like “drag and drop,” “open,” and “close” still have a place in an AR future. New hand gestures are being invented to recreate these interactions in 3D, but don’t go overboard and fatigue people. How users interact also depends on whether they’re using a mobile phone, an AR headset or projected AR. Most AR software supports gestures like tap, swipe, pinch and rotate (a minimal pinch-to-scale sketch follows this list). Some additionally support interactions like air tap (on HoloLens), voice commands, hover and facial expression recognition.
  • Ergonomics: When using gestures and having people move around, best practices include avoiding uncomfortable positions and not asking people to perform repetitive, unnecessary, highly fatiguing, over-complicated or unrewarding tasks. Too many hand gestures can also break the immersion of an otherwise magical experience.
  • Cues: Visual and audio cues can prompt people to follow instructions or learn to use AR for the first time. Visual cues include graphical notifications like arrows, highlights and hovering markers suggesting a pending action. Audio cues provide direct instructions or bring attention to key UI. Take advantage of AR’s mobility and provide cues as people approach objects (or even as they get too far away.)
  • Color and text: For colors, apply the same best practices as print, mobile and web. Beyond your app’s color requirements, keep in mind color blindness and cultural/business context. Also, consider that serif fonts are currently harder to read than sans serif fonts on AR glasses.
  • Lighting: Superimposed objects need to be realistic, and it should be obvious that they’re interactive. This can be done with light, shadow, animation and by accurately recognizing surfaces in the live view (including the horizon, tables, floors and possible obstructions that virtual objects would collide with if they were real!). Being aware of a user’s specific lighting conditions is important: it means making AR objects fit into the scene and giving them a sense of mass.
  • Empathy: This is one of AR’s most crucial elements. If a designer can evoke empathy and emotion in their AR experiences, that’s the definition of success. Likewise, if a designer can practice empathy and put themselves in a user’s shoes, that creates immersive, interesting and enjoyable app journeys.
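To ground the gestures point above, here is a minimal, illustrative pinch-to-scale sketch for a phone-based ARKit app. It assumes an ARSCNView named sceneView and a previously placed object stored in placedNode (both hypothetical names); headset platforms like HoloLens expose their own gesture APIs, such as air tap, instead.

```swift
import UIKit
import ARKit
import SceneKit

class GestureDemoViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!
    var placedNode: SCNNode?   // whatever virtual object the user previously placed

    override func viewDidLoad() {
        super.viewDidLoad()
        // The familiar two-finger pinch, repurposed to resize a 3D object in the scene.
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
        sceneView.addGestureRecognizer(pinch)
    }

    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        guard gesture.state == .changed, let node = placedNode else { return }
        // Apply the pinch incrementally so scaling feels continuous, then reset the
        // recognizer so the next callback reports only the change since this one.
        node.simdScale *= Float(gesture.scale)
        gesture.scale = 1.0
    }
}
```

Keeping gestures few and familiar like this also respects the ergonomics point above: people shouldn’t have to learn a whole new vocabulary of hand motions just to resize a couch.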

 

Third: hire your talent

When building an AR team, consider three different candidate profiles beyond your usual core user researchers, product managers and other employees.

AR Designer

As with flat interfaces, AR interfaces need UI and UX designers to craft design strategies. Some AR UX skills are identical to web and mobile skills, but others are different—and interaction best practices are still evolving with the field. For now, understanding 3D space, light, shadow, mass and depth is key, as is thinking through user gestures in 3D planes. For this reason, 3D game designers and game UX designers often do very well in this role. Also note that designing experiences for phones can differ from designing for AR headsets or projected AR.

3D Artist

These artists work with studio tools such as Maya, 3ds Max and Blender to create volumetric shapes and environments. Artists will frequently be required to optimize highly detailed (high-poly) models for the demanding, performance-constrained world of mobile platforms.

AR Developer

The developer pulls together the 3D model and interactions. AR devs need to understand 3D game engines like Unreal or Unity, plus their respective programming languages. They should have some experience with AR toolkits like Vuforia, ARToolkit, Tango (now deprecated) or the newer ARKit and ARCore. Another big plus is if they’ve worked with MR/AR hardware like Hololens or Google Glass.

When should you hire direct?

Hiring directly for AR makes sense for quick, straightforward applications. In those cases, hiring developers can lead to a fast proof-of-concept. Hiring a producer or program/product manager who previously worked on AR or VR products will also help with communication between business, design and development teams.

On the other hand, agencies like Jakt work well for projects where deep AR expertise is needed. Jakt has extensive AR/VR design and development experience and can be a true partner, hitting the ground running to provide immediate feasibility feedback and ideation in the current AR landscape. We can contextualize your project against past AR projects and market trends. This is especially crucial when AR is such a new field. With little industry standardization, team wisdom and past project lessons are quickly shaping industry standards.

Fourth: select your tech

You’ve got your scenario, design and AR experts. Now you need to settle on your AR technology and platform. How do you decide? It begins with the hardware, followed by the software.

Hardware

Smart glasses, projection or mobile phones?

Smart glasses brands include R-8 by ODG, Microsoft HoloLens (Windows Mixed Reality), Epson Moverio and Google Glass.

Glasses have a hands-free advantage over mobile phones; employees can’t hold phones while working on assembly lines. On the downside, glasses are expensive and don’t have many consumer applications—yet. If you’re building consumer AR, mobile is the way to go. And when it comes to projected AR, there simply aren’t many consumer devices or uses yet.

If you’re using glasses, an important consideration is the number of optical screens. Most AR glasses come with one optical screen in the right eye. If you want full immersion, you’ll need binocular, stereo vision: two displays, one for each eye.

Any hardware you choose should also support one or more of the standard software SDKs like ARKit, ARCore or Vuforia.

Software

Game Engines

Game engines aren’t just for video games. They render 3D graphics that get layered on a camera’s live view. Unity is the de facto standard game engine, supported by virtually every AR SDK and hardware platform.

Software Development Kits (SDKs)

Google’s ARCore and Apple’s ARKit have quickly gained popularity over other SDKs like Vuforia and SLAMSDK. However, there are still some benefits to using licensed solutions like Vuforia: it works on most platforms and specializes in marker-based AR that triggers based on specific, printed 2D or 3D objects. Conversely, ARKit and ARCore are free and aren’t purpose-built for marker-based AR. They work in broad situations and are capable of detecting floors, walls and other barriers (though they can do marker-style image detection too, as the sketch below shows).
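To show what marker-style triggering looks like with the free SDKs, here is a rough, illustrative sketch using ARKit’s image detection. It assumes reference photos of the printed trigger live in an asset catalog group named “AR Markers” (a name chosen purely for this example); Vuforia’s image targets and ARCore’s Augmented Images follow the same recognize-then-attach pattern.

```swift
import UIKit
import ARKit

class MarkerViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Load the printed markers (for example, a magazine ad) from the asset catalog.
        if let markers = ARReferenceImage.referenceImages(inGroupNamed: "AR Markers", bundle: nil) {
            configuration.detectionImages = markers
        }
        sceneView.session.run(configuration)
    }

    // Fired when the camera recognizes one of the reference images in the real world.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        print("Recognized marker: \(imageAnchor.referenceImage.name ?? "unnamed")")
        // A real app would attach its 3D content (say, a Mini Cooper model) to `node` here.
    }
}
```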

Authoring platforms (AR CMS)

Authoring platforms simplify AR; you can create AR-based solutions with minimal programming, but only for specific use cases. ZapWorks, Wikitude Studio and Augment are general-purpose AR authoring studios, while Facebook AR Studio is specifically for Facebook. ZapWorks lets you create AR animations triggered by scanning printed codes called “zapcodes,” while Wikitude specializes in showing information when a camera is pointed at a particular GPS coordinate.

Augmented Reality’s fascinating future

Where does AR go from here? Its possibilities are clearly awesome, but consumers aren’t yet convinced of the everyday value. Part of the issue is that AR just isn’t there yet. Hardware and software limitations need to be addressed and design improvements need to be made so AR connects more on a personal level. There are a few other challenges holding it back that we see changing in the near future:

  • Invisible to visible: In some cases, people don’t even realize they’re using AR. Pokémon GO, Snapchat Lenses and makeup apps may be infiltrating our lives, but the term augmented reality still carries a certain mystery. The result? Consumer AR awareness remains low. As people become more aware of what AR is and how they’re already using it every day, adoption will likely increase.
  • Increased investment: Investors have historically been AR/VR shy, especially as hardware market penetration has been low. However, in the last quarter of 2017, investors doubled down and the trend is continuing. AR tech spending is expected to hit $60 billion in 2020. According to the International Data Corporation, total AR/VR revenue is projected to increase from $5.2 billion in 2016 to over $162 billion in 2020, largely from hardware sales. [19], [20]
  • Overtaking enterprise: Google shut down Glass’s consumer version and has since released a new enterprise edition. Consumer use cases may be looking iffy across the board, but enterprise and industrial AR spending is projected to increase from $247 million in 2014 to $2.4 billion in 2019. [21] In fact, over 150 businesses—including 52 of the Fortune 500—have deployed AR/VR solutions, and it’s anticipated the app market will “hit an inflection point” in 2018 with a meteoric rise in AR adoption rates. By 2021, smart glasses shipments are expected to grow by 227% annually. [23]
  • Cheaper hardware (that we actually want to wear): Beyond expense, the greatest criticism of wearable AR headsets like Google Glass was that they look absolutely ridiculous. In fact, people who wore Google Glass out in public quickly acquired a very specific nickname: “glasshole.” No one cares how you look on the assembly line. But if smart glasses are ever going to take off in public, they need to be indistinguishable from everyday eyewear. Intel’s new Vaunt smart glasses are the first designed with “zero social cost” in mind. Just as smart watches have gone from a bulky, geeky item to a sleek, holiday must-have, we see the same happening for AR smart glasses.

 


AR still isn’t quite ready for mass consumer adoption, but with award-winning apps, increasing investment, broader device adoption and free SDKs from Apple and Google, 2018 is a promising year for major AR innovations. And as people and companies generate massive AR data in the coming years, the AR cloud is a huge tech disruption that’s sure to change our lives. With AR, it’s no longer a question of if, but when.


Want to join the revolution?

Drop us a line—we’ll find a way to make it happen.

 
