Thursday, March 27, 2014

INTRODUCTION

Augmented reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. It is related to a more general concept called mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. As a result, the technology functions by enhancing one's current perception of reality. By contrast, virtual reality replaces the real world with a simulated one. Augmentation is conventionally in real time and in semantic context with environmental elements, such as sports scores on TV during a match. With the help of advanced AR technology (e.g. adding computer vision and object recognition), the information about the user's surrounding real world becomes interactive and digitally manipulable. Artificial information about the environment and its objects can be overlaid on the real world.


TECHNOLOGY USED IN AUGMENTED REALITY

Hardware
Hardware components for augmented reality are: processor, display, sensors and input devices. Modern mobile computing devices like smartphones and tablet computers contain these elements which often include a camera and MEMS sensors such as accelerometer, GPS, and solid state compass, making them suitable AR platforms.

Display

Various technologies are used in augmented reality rendering, including optical projection systems, monitors, handheld devices, and display systems worn on the body.
Head-mounted
A head-mounted display (HMD) is a display device paired to a headset such as a harness or helmet. HMDs place images of both the physical world and virtual objects over the user's field of view. Modern HMDs often employ sensors for six-degrees-of-freedom monitoring that allow the system to align virtual information to the physical world and adjust accordingly with the user's head movements. HMDs can provide users with immersive, mobile and collaborative AR experiences.
Eyeglasses
AR displays can be rendered on devices resembling eyeglasses. Versions include eyewear that employs cameras to intercept the real-world view and re-display its augmented view through the eyepieces, and devices in which the AR imagery is projected through or reflected off the surfaces of the eyewear's lens pieces.
Contact lenses
Contact lenses that display AR imaging are in development. These bionic contact lenses might contain the elements for display embedded into the lens, including integrated circuitry, LEDs and an antenna for wireless communication. Another version of contact lenses, in development for the U.S. military, is designed to function with AR spectacles, allowing soldiers to focus on close-to-the-eye AR images on the spectacles and distant real-world objects at the same time.
Virtual retinal display
A virtual retinal display (VRD) is a personal display device under development at the University of Washington's Human Interface Technology Laboratory. With this technology, a display is scanned directly onto the retina of a viewer's eye. The viewer sees what appears to be a conventional display floating in space in front of them.
EyeTap
The EyeTap (also known as Generation-2 Glass) captures rays of light that would otherwise pass through the center of the lens of the wearer's eye, and substitutes each ray with synthetic computer-controlled light. The Generation-4 Glass (Laser EyeTap) is similar to the VRD (i.e. it uses a computer-controlled laser light source) except that it also has infinite depth of focus and causes the eye itself to, in effect, function as both a camera and a display, by way of exact alignment with the eye and resynthesis (in laser light) of the rays of light entering the eye.
Handheld
Handheld displays employ a small display that fits in a user's hand. All handheld AR solutions to date opt for video see-through. Initially handheld AR employed fiduciary markers, and later GPS units and MEMS sensors such as digital compasses and six-degrees-of-freedom accelerometer–gyroscopes. Today, SLAM markerless trackers such as PTAM are starting to come into use. Handheld display AR promises to be the first commercial success for AR technologies. The two main advantages of handheld AR are the portable nature of handheld devices and the ubiquitous nature of camera phones. The disadvantages are the physical constraint of the user having to hold the handheld device out in front of them at all times, as well as the distorting effect of classically wide-angled mobile phone cameras compared to the real world as viewed through the eye.
Spatial
Spatial Augmented Reality (SAR) augments real world objects and scenes without the use of special displays such as monitors, head mounted displays or hand-held devices. SAR makes use of digital projectors to display graphical information onto physical objects. The key difference in SAR is that the display is separated from the users of the system. Because the displays are not associated with each user, SAR scales naturally up to groups of users, thus allowing for collocated collaboration between users.
Examples include shader lamps, mobile projectors, virtual tables, and smart projectors. Shader lamps mimic and augment reality by projecting imagery onto neutral objects, providing the opportunity to enhance the object's appearance with a simple unit: a projector, camera, and sensor.
Other applications include table and wall projections. One innovation, the Extended Virtual Table, separates the virtual from the real by including beam-splitter mirrors attached to the ceiling at an adjustable angle. Virtual showcases, which employ beam-splitter mirrors together with multiple graphics displays, provide an interactive means of simultaneously engaging with the virtual and the real. Many more implementations and configurations make spatial augmented reality an increasingly attractive interactive alternative.
A SAR system can display on any number of surfaces of an indoor setting at once. SAR supports both graphical visualization and passive haptic sensation for the end users, who receive the haptic sensation by touching the physical objects themselves.

Tracking

Modern mobile augmented reality systems use one or more of the following tracking technologies: digital cameras and/or other optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, RFID and wireless sensors. These technologies offer varying levels of accuracy and precision. Most important is the position and orientation of the user's head. Tracking the user's hand(s) or a handheld input device can provide a 6DOF interaction technique.
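As a concrete example of sensor-based tracking, the hedged Unity C# sketch below drives a camera's orientation from the device gyroscope (a 3DOF subset of the head/device tracking described above). Input.gyro is the standard Unity API; the class name and the frame-alignment convention are illustrative assumptions.

```csharp
using UnityEngine;

// Illustrative sketch (hypothetical class name): orientation-only tracking
// from the gyroscope. Attach to a camera; full 6DOF pose would also need
// position from vision, GPS, or other sensors.
public class GyroOrientationTracker : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope is off by default on mobile
    }

    void Update()
    {
        // Remap the right-handed device attitude into Unity's left-handed
        // frame; the extra 90-degree rotation aligns the device axes with
        // a landscape camera (a common convention, adjust for your setup).
        Quaternion q = Input.gyro.attitude;
        transform.localRotation =
            Quaternion.Euler(90f, 0f, 0f) * new Quaternion(q.x, q.y, -q.z, -q.w);
    }
}
```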

Input devices

Techniques include speech recognition systems that translate a user's spoken words into computer instructions and gesture recognition systems that can interpret a user's body movements by visual detection or from sensors embedded in a peripheral device such as a wand, stylus, pointer, glove or other body wear.

Computer

The computer analyzes the sensed visual and other data to synthesize and position augmentations.

Software and algorithms

A key measure of AR systems is how realistically they integrate augmentations with the real world. The software must derive real-world coordinates, independent of the camera, from camera images. That process is called image registration, and it uses different methods of computer vision, mostly related to video tracking. Many computer vision methods for augmented reality are inherited from visual odometry. Usually those methods consist of two stages.
The first stage detects interest points, fiduciary markers, or optical flow in the camera images. This stage can use feature detection methods such as corner detection, blob detection, edge detection or thresholding, and other image processing methods. The second stage restores a real-world coordinate system from the data obtained in the first stage. Some methods assume objects with known geometry (or fiduciary markers) are present in the scene; in some of those cases the scene's 3D structure must be precalculated beforehand. If part of the scene is unknown, simultaneous localization and mapping (SLAM) can map relative positions. If no information about scene geometry is available, structure-from-motion methods such as bundle adjustment are used. Mathematical methods used in the second stage include projective (epipolar) geometry, geometric algebra, rotation representation with the exponential map, Kalman and particle filters, nonlinear optimization, and robust statistics.
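To make the first stage concrete, the sketch below is a minimal, illustrative Harris corner detector in C#: it sums a structure tensor over a small window around each pixel and thresholds the Harris response. All names are hypothetical, and production trackers use far more optimized detectors (e.g., FAST plus descriptors).

```csharp
using System.Collections.Generic;

// Minimal, illustrative Harris corner detector over a grayscale image
// stored as float[height, width]. Toy code: no smoothing, non-maximum
// suppression, or sub-pixel refinement, unlike a production detector.
static class HarrisCornerSketch
{
    public static List<(int x, int y)> Detect(float[,] img, float k = 0.04f,
                                              float threshold = 1e-4f)
    {
        int h = img.GetLength(0), w = img.GetLength(1);
        var corners = new List<(int x, int y)>();
        for (int y = 2; y < h - 2; y++)
        for (int x = 2; x < w - 2; x++)
        {
            // Structure tensor summed over a 3x3 window around (x, y).
            float ixx = 0, iyy = 0, ixy = 0;
            for (int v = -1; v <= 1; v++)
            for (int u = -1; u <= 1; u++)
            {
                // Central-difference image gradients.
                float gx = (img[y + v, x + u + 1] - img[y + v, x + u - 1]) * 0.5f;
                float gy = (img[y + v + 1, x + u] - img[y + v - 1, x + u]) * 0.5f;
                ixx += gx * gx; iyy += gy * gy; ixy += gx * gy;
            }
            // Harris response: det(M) - k * trace(M)^2.
            float r = (ixx * iyy - ixy * ixy) - k * (ixx + iyy) * (ixx + iyy);
            if (r > threshold)
                corners.Add((x, y));
        }
        return corners;
    }
}
```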
Augmented Reality Markup Language (ARML) is a data standard developed within the Open Geospatial Consortium (OGC), which consists of an XML grammar to describe the location and appearance of virtual objects in the scene, as well as ECMAScript bindings to allow dynamic access to the properties of virtual objects.
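As a rough sketch of the grammar, the fragment below anchors a named virtual object at a geographic point, loosely following the ARML 2.0 structure; treat the exact element names and namespaces as assumptions to verify against the OGC specification.

```xml
<arml xmlns="http://www.opengis.net/arml/2.0"
      xmlns:gml="http://www.opengis.net/gml/3.2">
  <ARElements>
    <!-- A point of interest anchored at a latitude/longitude -->
    <Feature id="examplePoi">
      <name>Example point of interest</name>
      <anchors>
        <Geometry id="examplePoiLocation">
          <gml:Point gml:id="examplePoiPoint">
            <gml:pos>47.0 13.0</gml:pos>
          </gml:Point>
        </Geometry>
      </anchors>
    </Feature>
  </ARElements>
</arml>
```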
To enable rapid development of augmented reality applications, a number of software development kits (SDKs) have emerged. Some of the well-known AR SDKs are offered by Metaio, Vuforia, Wikitude and Layar.

Augmented Reality in Unity3D

Augmented Reality in Gaming


Merchlar's mobile game Get On Target uses a trigger image as a fiduciary marker.
Augmented reality allows gamers to experience digital game play in a real-world environment. Over the last ten years, improvements in technology have brought better movement detection, making possible devices such as the Wii, as well as direct detection of the player's movements.

CREATING AN AUGMENTED REALITY APP IN UNITY3D


Step 1: Installing the Extension

Go to the Vuforia extension download page to get the package for your development platform, then proceed to the installation instructions below.

Windows installation

The recommended development environment for Android is Microsoft Windows 7 32/64-bit or Windows XP. Note that Unity 4.1 or later is required.
  1. Download the archive file from the download page.
  2. Extract the archive and copy the content (vuforia-unity-android-ios-xx-yy-zz.unitypackage) to a location that is convenient for your development environment.
Note: To make the extension-only package available in your Unity Standard Assets list, copy the package manually to the Standard Packages folder of your Unity installation.

Step 2: Compiling a Simple Project

Create a project

  1. After you have installed the package, create a new project in Unity.
  2. Select vuforia-unity-android-ios-xx-yy-zz.unitypackage from the Import the following packages list.
            OR
  1. Right-click in the Project view of an open project and choose Import Package…
  2. Browse to the vuforia-unity-android-ios-xx-yy-zz.unitypackage you just installed and import it or double-click on the downloaded package. 
When creating the Unity project, avoid spaces in the name if you are targeting iOS, because they cause problems later when you build with Xcode. Note that by double-clicking a package, you can selectively import parts of it.
Note: If you do not see the Vuforia package in the list, go back to Step 1 and manually install the packages.
Next, you need to add a Device Database to your project. You can do this in two ways:
  • Create a target on the Target Manager
             OR
  • Use existing targets from other projects
  1. To use the Target Manager method, use the online Target Manager to create and download a package.
  2. Double-click the downloaded package, or right-click in the Unity Project view and choose Import Package… > Custom Package.
  3. Select the downloaded package.
  4. Click Import to import the target Device Database.
If you are copying the Device Database files from another project, be sure to copy any files located in the Editor/QCAR/ImageTargetTextures folder. These will be used to texture the target plane in the Unity editor.

Following is a summary of the folder contents:
  • Editor - Contains the scripts required to interact dynamically with Target data in the Unity editor
  • Plugins - Contains Java and native binaries that integrate the Vuforia AR SDK with the Unity Android or Unity iOS application
  • Qualcomm Augmented Reality - Contains the prefabs and scripts required to bring augmented reality to your Unity application
  • Streaming Assets - Contains the Device Database configuration XML and DAT files downloaded from the online Target Manager

Add AR assets and prefabs to scene

  1. Now that you have imported the Vuforia AR Extension for Unity, you can easily adapt your project to use augmented reality.
  2. Expand the Qualcomm Augmented Reality folder, and expand the Prefabs folder.
  3. Delete the “Main Camera” in your current scene hierarchy, and drag an instance of the ARCamera prefab into your scene. The ARCamera is responsible for rendering the camera image in the background and manipulating scene objects to react to tracking data.
  4. With the ARCamera in place and the target assets available in the StreamingAssets/QCAR folder, run the application on a supported device, and see the live video in the background.




Inspector view of the ARCamera

  5. Drag an instance of the ImageTarget prefab into your scene. This prefab represents a single instance of an Image Target object.
  6. Select the ImageTarget object in your scene, and look at the Inspector. There should be an Image Target Behaviour attached, with a property named Data Set. This property contains a drop-down list of all available Data Sets for this project. When a Data Set is selected, the Image Target property drop-down is filled with a list of the targets available in that Data Set.
  7. Select the Data Set and Image Target from your StreamingAssets/QCAR project. In this example, we choose "StonesAndChips". (The list is automatically populated from the Device Database XML file that is downloaded from the online Target Manager.) The Unity sample apps come with several Image Targets; to use them, copy them from the ImageTargets sample, or create your own in the Target Manager section of the site.


Inspector view of the ImageTarget
Note: When you added the Image Target object to your scene, a plane object appeared. This object is a placeholder for actual Image Targets. In the Inspector view of the Image Target there is a pop-up list called Image Target. From this list, you can choose any Image Target that has been defined in one of the StreamingAssets/QCAR datasets, so that the Image Target object in your scene adopts the size and shape of the Image Target it represents. The object is also textured with the same image from which the Image Target was created.

Add 3D objects to scene and attach to trackables

Now you can bind 3D content to your Image Target; a scripted equivalent of the following steps is sketched after the list.
  1. As a test, create a simple object (GameObject > Create Other > Sphere).
  2. Add the sphere as a child of the ImageTarget object by selecting it in the Hierarchy list and dragging it onto the ImageTarget item.
  3. Move the sphere in the scene until it is centered on the Image Target. You can also add a Directional Light to the scene (GameObject > Create Other > Directional Light).
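For reference, the same test content can also be created from code. The sketch below is a hypothetical helper (the class name is not part of the SDK) that, when attached to the ImageTarget object, builds a centered child sphere at startup using standard Unity APIs.

```csharp
using UnityEngine;

// Hypothetical helper: builds the same test content as steps 1-3 from code.
// Attach this component to the ImageTarget object in the scene.
public class AttachTestSphere : MonoBehaviour
{
    void Start()
    {
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.parent = transform;              // child of the ImageTarget
        sphere.transform.localPosition = Vector3.zero;    // centered on the target
        sphere.transform.localScale = Vector3.one * 0.3f; // sized relative to the target
    }
}
```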

TrackableEventHandler

The Default Trackable Event Handler (DefaultTrackableEventHandler) is a script component of the Image Target that causes the sphere you just added to appear or disappear automatically – an automatic reaction to the target appearing in the video.
You can override this default behavior – one could also imagine playing a fade-out animation, showing an info screen or playing a sound, for example. For a more detailed description of the ITrackableEventHandler interface, please see 'Responding to Tracking Events' in the Special Options section.
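As a sketch of such an override, the handler below follows the DefaultTrackableEventHandler pattern from the extension's samples: it registers with the TrackableBehaviour and toggles child renderers when the target is found or lost. The class name is hypothetical, and the exact interface details may vary between SDK versions.

```csharp
using UnityEngine;

// Illustrative custom handler (hypothetical class name), modeled on the
// DefaultTrackableEventHandler pattern shipped with the Vuforia samples.
public class MyTrackableEventHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED;

        // Show or hide all child renderers when the target is found or lost;
        // this is also the place to play a fade-out animation or a sound.
        foreach (Renderer r in GetComponentsInChildren<Renderer>())
            r.enabled = found;
    }
}
```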

Adding Dataset load to camera

The Vuforia SDK has the ability to use multiple active Device Databases simultaneously. To demonstrate this capability, you can borrow the StonesAndChips and Tarmac Device Databases from the ImageTargets sample and configure both to load and activate in the ARCamera’s Inspector panel. This allows you to use targets from both Device Databases at the same time in your Unity scene.
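Device Databases can also be loaded and activated from script rather than through the Inspector. The fragment below follows the pattern used in the ImageTargets sample of this SDK generation; class and method names such as ImageTracker and DataSet are assumptions that vary between Vuforia versions, so check them against your SDK's sample code.

```csharp
using UnityEngine;

// Illustrative sketch based on the ImageTargets sample pattern; the Vuforia
// class and method names used here differ between SDK versions.
public class DataSetLoader : MonoBehaviour
{
    void Start()
    {
        ImageTracker tracker =
            (ImageTracker)TrackerManager.Instance.GetTracker(Tracker.Type.IMAGE_TRACKER);

        DataSet dataSet = tracker.CreateDataSet();
        if (dataSet.Load("StonesAndChips"))   // a Device Database in StreamingAssets/QCAR
            tracker.ActivateDataSet(dataSet); // its targets become available for tracking
    }
}
```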





Deploy the application

The next step is to deploy your application to a supported device.

Android deployment process

Unity provides a number of settings when building for Android devices – select from the menu (File > Build Settings… > Player Settings…) to see the current settings. Also, choose your platform now – Android or iOS.
  1. Click Resolution and Presentation to select the required Default Orientation. Note: the Vuforia AR Extension now supports Auto Rotation.
  2. Click Icon to set your application icon.
  3. Click Other Settings. Set the Minimum API Level to Android 2.3 'Gingerbread' (API level 9) or higher. Set Bundle Identifier to a valid name (e.g., com.mycompany.firstARapp).
  4. Save your scene (File > Save Scene). 
  5. Open the build menu (File > Build Settings…). Make sure that your scene is part of Scenes in Build. If not, do one of the following:
  • Use Add Current to add the currently active scene.
  • Drag and drop your saved AR scene from the project view into the Window.
You can now build the application. Attach your Android device and then click Build And Run to initialize the deployment process.
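The Build And Run step can also be automated with a small Unity editor script. This is a minimal sketch using Unity's BuildPipeline API; the menu path, scene path, and output file name are hypothetical placeholders for your own project.

```csharp
using UnityEditor;

// Illustrative editor script: the "Build And Run" step driven from code.
// Place in an Editor folder; paths and names are placeholders.
public static class AndroidBuildScript
{
    [MenuItem("Build/Android AR App")]
    public static void BuildAndRun()
    {
        string[] scenes = { "Assets/MyARScene.unity" };
        BuildPipeline.BuildPlayer(scenes, "firstARapp.apk",
                                  BuildTarget.Android, BuildOptions.AutoRunPlayer);
    }
}
```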

iOS deployment process

Unity provides a number of settings when building for iOS devices (File > Build Settings > Platform > iOS icon).
  1. Before building, select the required Default Orientation. Note: the Vuforia AR Extension now supports Auto Rotation.
  2. Make sure that Target Platform is not set to armv6 (OpenGL ES 1.1). This version of the extension supports only OpenGL-ES 2.0. 
  3. Make sure that Bundle Identifier is set to the correct value for your iOS developer profile.
  4. Now you can choose to build the application. First, save your scene (File > Save Scene).
  5. Open the build menu (File > Build Settings…).
  6. Make sure that your scene is part of Scenes in Build. If this is not the case:
    a. Use Add Current to add the currently active scene.
     OR
    b. Drag and drop your saved AR scene from the project view into the Window.
  7. Press Build And Run to initialize the deployment process.
When building and running apps for iOS, Unity generates an Xcode project. It launches Xcode and loads this project. The Vuforia AR Extension includes a PostProcessBuildPlayer script that performs the task of integrating the Vuforia library into the generated Xcode project. This is run automatically when you select Build from within Unity. Be aware that if you manually change the generated Xcode project, you may need to update the PostProcessBuildPlayer script to avoid overwriting your changes.
The generated Xcode project includes a file called AppController.mm. There are Unity-provided options in this file to tailor the performance of the app for your own purposes. The PostProcessBuildPlayer script sets THREAD_BASED_LOOP as the default because it gives the best visible performance with the samples provided alongside the Vuforia AR Extension. Consider changing these options to whatever gives the best performance for your own application.


Created AR view

Using the application

You should have a printout of the appropriate Image Target in front of you. If you are working with a target from one of the sample apps, the PDFs are located at Editor/QCAR/ForPrint/*.pdf. Otherwise, print out the image that you uploaded to the Target Manager and make sure that the aspect ratio doesn’t change. When you look at the target using the device camera, you should see your sphere object bound to the target. Congratulations, you have successfully augmented reality!

Running in the editor

The Vuforia Unity Extension supports the Play Mode feature, which provides AR application emulation through the Unity Pro Editor using a webcam. Configure this feature through the Web Cam Behaviour component.
To use Play Mode for Vuforia in Unity Pro, simply select the attached, or built-in, webcam that you want to use from the Camera Device menu, and activate Play Mode using the Play button at the top of the Editor UI.
You can also use the standard Unity Play Mode with non-Pro Unity versions, by setting 'Don't use for Play Mode' in the Web Cam Behaviour component.
To use standard Play Mode, adjust the transform of the ARCamera object to get your entire scene in view, and then run the application in the Unity editor. There is no live camera image or tracking in standard Play Mode; instead, all Targets are assumed to be visible. This allows you to test the non-AR components of your application, such as scripts and animations, without having to deploy to the device each time.

Sources:
https://developer.vuforia.com/resources/dev-guide/
http://en.wikipedia.org/wiki/Augmented_reality