The Oxford Dictionary defines augmented reality as “A technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view” (Oxford, 1). It goes on to describe augmented reality, or AR, as “a technology that works on computer vision based recognition algorithms to augment sound, video, graphics and other sensor-based inputs on real-world objects using the camera of your device” (Oxford, 1). Put more succinctly, the purpose of AR is to place graphics, and even audio, on top of a real-world environment in real time. While this much is fairly common knowledge, many people are unaware that there are several distinct types of AR:
- Marker Based Augmented Reality – Uses a camera, typically the one on your smartphone, and some type of visual marker, such as a sign, a car, or another real-world physical object, to produce a result. When the camera detects a marker and deems it significant, specific content or information is displayed over that marker.
- Markerless Augmented Reality – Sometimes referred to as location-based AR, this is one of the most commonly used applications of the technology. It uses the GPS and compass in your device, typically a smartphone, to determine your location and show you broad-stroke information about where you are and what you are interested in.
- Projection Based Augmented Reality – This form of AR projects an image or field of light onto a real-world surface, lets people interact with that projection, senses what they are doing with it, and reacts accordingly. A perfect example is a projected keyboard.
- Superimposition Based Augmented Reality – This form of AR is related to Marker Based AR, but it differs in one major way: it takes a view and completely replaces it with something different. A good example would be an interior decorating application. Take a picture of your living room, then drag and drop pieces of furniture from a catalog and place them in the room to see how things will look and fit.
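To make the marker-based idea above concrete, here is a minimal sketch of the recognition step: a camera frame has already been thresholded and sampled into a small binary grid, which is decoded into a marker ID and matched against registered overlay content. Real systems (for example, OpenCV's ArUco markers) use larger grids plus error correction, and the `OVERLAY_CONTENT` registry here is purely hypothetical.

```python
# Hypothetical registry mapping marker IDs to AR overlay content.
OVERLAY_CONTENT = {
    5: "Show 3D model of product",
    42: "Display maintenance instructions",
}

def decode_marker(grid):
    """Read a 3x3 grid of 0/1 cells, row by row, into an integer ID."""
    marker_id = 0
    for row in grid:
        for cell in row:
            marker_id = (marker_id << 1) | cell
    return marker_id

def content_for_marker(grid):
    """Return the overlay registered for this marker, if any."""
    return OVERLAY_CONTENT.get(decode_marker(grid))

# A camera frame has been thresholded and sampled into a 3x3 grid:
detected = [[0, 0, 0],
            [0, 0, 0],
            [1, 0, 1]]  # binary 000000101 = marker ID 5
print(content_for_marker(detected))  # "Show 3D model of product"
```

If the decoded ID is not in the registry, the lookup returns nothing and the system simply draws no overlay for that frame.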
How does it work?
As you can start to see from the outline of what AR is above, there are really four major components of any AR system: sensors and cameras, processing, projection, and reflection. To get a good understanding of how AR actually works, we will briefly review each of these components.
Sensors and cameras are at the core of every AR solution since their main job is to gather information about what is going on around the user. They are typically on the outside of a device and gather information about what is happening in the “real world”, then transfer that information to a processor to be interpreted. Many people think only of cameras in this part of AR, but sensors provide valuable information such as temperature, the angle at which the device is being held, and even the user’s elevation (perhaps in a 39-story building in NYC). More often than not there will be several cameras and sensors on an AR device, as they all provide different data points to the processor. Some are responsible for gathering depth information, others simply for image capture, and still others for video and other information capture. After the sensors and cameras gather everything they can about a user’s real-world surroundings, the data then needs to be processed.
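One way to picture the hand-off described above is a single synchronized "frame" that bundles every sensor reading before it reaches the processor. The structure below is a simplified sketch; the field names and values are illustrative, not any real device's API.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SensorFrame:
    """One synchronized snapshot of the real world, as gathered by
    the device's cameras and sensors (all fields illustrative)."""
    image: List[int]                 # raw pixels from the camera
    depth_m: float                   # reading from a depth sensor
    accel: Tuple[float, float, float]  # accelerometer (x, y, z) in m/s^2
    tilt_deg: float                  # angle at which the device is held
    altitude_m: float                # e.g., elevation in a tall building

def capture_frame(image, depth_m, accel, tilt_deg, altitude_m) -> SensorFrame:
    """Gather all sensor inputs into one frame to hand to the processor."""
    return SensorFrame(image, depth_m, accel, tilt_deg, altitude_m)

frame = capture_frame([128, 130, 127], 2.4, (0.0, 9.81, 0.0), 15.0, 120.0)
# The processor now interprets the frame as a single unit.
```

Bundling the readings this way keeps the camera image and the sensor values that describe it (depth, tilt, elevation) tied to the same moment in time.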
Processors are the next major component of an AR system, and they are typically so powerful that they act like mini supercomputers in the palms of our hands. When people hear the word “processor” they tend to think only of a CPU, or central processing unit, but many other components come into play in an AR system. Think about everything an AR system needs to function properly in addition to the CPU: a GPU, RAM, flash memory, WiFi, GPS, Bluetooth, an accelerometer, a magnetometer, and even a gyroscope! Each plays a distinct and very specific role in interpreting the data coming in from the cameras and sensors; take one of them out of the picture and the user experience would suffer. After the processors assimilate and interpret the data sent to them by the cameras and sensors, it is time to produce an output for the user. As mentioned above, the output can be either projected or reflected for the user to interact with.
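As a small example of the interpretation work these components do together, here is a sketch of a complementary filter, a common way to estimate a device's tilt by blending the gyroscope (smooth but drifts over time) with the accelerometer (noisy but drift-free, since it senses gravity). The sample rate and blend factor below are assumed values for illustration.

```python
import math

def accel_tilt_deg(ax, ay, az):
    """Estimate pitch from the direction of gravity (noisy, drift-free)."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(angle_deg, gyro_rate_dps, accel_deg, dt, alpha=0.98):
    """Blend the gyroscope's integrated rate with the accelerometer's
    absolute estimate; alpha controls how much each source is trusted."""
    return alpha * (angle_deg + gyro_rate_dps * dt) + (1 - alpha) * accel_deg

angle = 0.0
# Device held still and level: gyro reads ~0 deg/s, gravity is straight down.
for _ in range(100):
    angle = complementary_filter(angle, 0.0, accel_tilt_deg(0.0, 0.0, 9.81), 0.01)
print(round(angle, 2))  # stays near 0.0
```

Even if the estimate starts out wrong, the small accelerometer contribution pulls it back toward the true angle over successive frames, which is exactly the kind of continuous correction an AR system needs to keep overlays locked in place.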
Projection components of an AR system are responsible for displaying the augmented reality pieces onto the real-world background of the user. Projection-based AR works by using a mini-projector, typically forward-facing on a wearable AR headset or a similar device. It essentially transforms any surface into an environment the user can interact with. Today the projection typically takes place on a screen in front of your eyes (think smartphone or tablet), but in the near future it is predicted that AR projectors will become powerful and intelligent enough to eliminate the need for a screen at all, making it possible for real-world surfaces to become part of the AR experience.
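Whether the output lands on a screen or a surface, the system must work out where in the 2D view a virtual object belongs. A minimal sketch of that step is the pinhole camera model below, which maps a 3D anchor point in front of the camera to pixel coordinates; the focal length and screen-center values are assumed, illustrative numbers.

```python
def project_point(x, y, z, focal_px=800.0, cx=640.0, cy=360.0):
    """Project a 3D point in camera coordinates (meters, z pointing
    forward) onto the screen using a simple pinhole camera model."""
    if z <= 0:
        return None  # behind the camera, nothing to draw
    u = cx + focal_px * x / z
    v = cy + focal_px * y / z
    return (u, v)

# A virtual label anchored 2 m in front of the camera, 0.5 m to the right:
print(project_point(0.5, 0.0, 2.0))  # (840.0, 360.0)
```

Note how the division by `z` makes distant objects drift toward the center of the view and nearby ones spread out, which is what keeps an overlay visually glued to its real-world anchor as the user moves.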
Reflection components are the second way an AR environment can be created for the user. These systems function a bit differently from projectors in that they use mirrors to focus and alter the way the user sees the information being presented. Through a mix of light projection and different levels of reflective mirrors and screens, the AR system can not only show the user certain things but also read their input and interaction with those things.
Now that we have a working knowledge of what AR is and the components that enable it, we should look a bit deeper into practical applications of this technology in today’s world.
How is AR being used today?
There are many practical and useful applications of augmented reality in today’s world, but let’s consider a few to get our minds going. In business, think remote collaboration and augmented office spaces; in manufacturing and e-commerce, repairs and product showcasing; and in the travel industry, the ways tours and maps can be enhanced.
In today’s business world our work environments span the globe, with teams spread out across two or three continents, making collaboration more important than ever and next to impossible to achieve. Think about the teleconference: a phone meeting with 5–10 people who cannot see each other, cannot read body language, and constantly talk over one another. What happens? Most people become disengaged and start “multi-tasking”, which we all know means they have completely stopped paying attention and contributing to the discussion. By implementing components of AR into your meetings, you can bring people together in ways a simple teleconference system cannot. Now, switch your mind to office spaces and people who are located in the same place. Currently, many office buildings have meeting and team spaces that are tailored and built for specific purposes. In the very near future, imagine generic meeting spaces with no specific features and a few plain, load-bearing objects, like tables and chairs. Then imagine AR projectors and reflectors creating a project room specific to the needs of that particular team and meeting objective. Need four walls of whiteboards? No problem. Need to present your project timeline on one wall and an outline of objectives on another, and interact with them both at the same time? No problem. AR can easily provide solutions to both the communication and workspace problems we all experience today.
The manufacturing industry has some unique challenges of its own around repairs, and around training employees to fix a wide range of increasingly complex products quickly and completely to ensure customer satisfaction. AR overlays can assist a technician in diagnosing and fixing a problem in real time. Consider an engineer repairing a jet engine: an AR application that shows an overlay of the machinery, with repair information on the side, temperature sensor readings, next steps, and clear directions on which hose to disconnect, could reduce overall repair cycle time, provide on-the-job training, and increase the chances that the repair will be done correctly the first time.
In the growing eCommerce industry, consumers are purchasing products they can’t hold, touch, or see for themselves, which creates anxiety for some and hesitation to buy in others. It can also create a high volume of returns, with dissatisfied consumers sending goods back because the items weren’t what they thought they were purchasing. With the introduction of AR, consumers can now virtually see and manipulate the things they wish to purchase in great detail before they click the “checkout” button.
The last real-world example of how AR is being used today is in the travel industry. Think about what you did the last time you took a vacation and went on a tour. Most likely you were one of many people in a large group, following either a tour guide or a physical map through the sites. With AR, travelers are able to get all the information available about a building, painting, house, or other landmark simply by looking at it; information about the object is displayed on the device they are using to view it.
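Under the hood, "look at a landmark, get its information" comes down to the markerless, location-based approach described earlier: combine the user's GPS position with the device's compass heading and check whether a known landmark falls within the camera's field of view. Here is a simplified sketch; the coordinates and the 60° field of view are assumed values for illustration.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees) from the user to a landmark."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def is_looking_at(user_heading_deg, target_bearing_deg, fov_deg=60.0):
    """Is the landmark within the device camera's assumed field of view?"""
    diff = abs((target_bearing_deg - user_heading_deg + 180) % 360 - 180)
    return diff <= fov_deg / 2

# Hypothetical coordinates: a user standing near a landmark, facing west.
target = bearing_deg(48.8585, 2.2950, 48.8584, 2.2945)
print(is_looking_at(270.0, target))  # True: landmark is in view
```

When the check passes, the app knows which landmark the camera is pointed at and can overlay its name, history, or other details on the live view.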
With our wide and in-depth experience in developing mobile applications powered by frameworks like ARKit for iOS and ARCore for Android, OFS can build AR applications for you. Contact us here to set up a time to talk with us about your questions, ideas, and interest in implementing AR in your apps.
1. (Oxford, 1) – https://en.oxforddictionaries.com/definition/augmented_reality
Ganeshram Ramamurthy is ObjectFrontier’s technical director and heads technology for presales. For many years, Ganesh has been designing and developing enterprise applications across various domains. He has a keen interest in emerging technologies and is now spearheading blockchain initiatives at OFS.