Thursday, March 27, 2014

Augmented Reality in Unity3d

Augmented Reality in Gaming


Merchlar's mobile game Get On Target uses a trigger image as a fiducial marker
Augmented reality allows gamers to experience digital gameplay in a real-world environment. Over the last ten years, advances in technology have brought better motion detection, making possible devices such as the Wii and, more recently, direct detection of the player's movements.
CREATING AN AUGMENTED REALITY APP IN UNITY3D


Step 1: Installing the Extension

Download the extension from the Vuforia developer site.

Go to the download page to get the extension package for your development platform, then follow the installation instructions below.

Windows installation

The recommended development environment for Android is Microsoft Windows 7 32/64-bit or Windows XP. Note that Unity 4.1 or later is required.
  1. Download the archive file from the download page.
  2. Extract the archive and copy its contents (vuforia-unity-android-ios-xx-yy-zz.unitypackage) to a location that is convenient for your development environment.
Note: To make the extension-only package available in your Unity Standard Assets list, copy the package manually to the Standard Packages folder of your Unity installation.

Step 2: Compiling a Simple Project

Create a project

  1. After you have installed the package, create a new project in Unity.
  2. Select vuforia-unity-android-ios-xx-yy-zz.unitypackage from the "Import the following packages" list.
            OR
  1. Right-click in the Project view of an open project and choose Import Package…
  2. Browse to the vuforia-unity-android-ios-xx-yy-zz.unitypackage you just installed and import it, or double-click the downloaded package. 
When creating the Unity project, avoid using spaces in the name if targeting iOS, because spaces cause problems later when you build with Xcode. Note that by double-clicking a package, you can selectively import parts of that package.
Note: If you do not see the Vuforia package in the list, go back to Step 1 and manually install the packages.
Next, you need to add a Device Database to your project. You can do this in two ways:
  • Create a target on the Target Manager
             OR
  • Use existing targets from other projects
  1. To use the Target Manager method, go to the Target Manager to create and download a package. 
  2. Double-click the downloaded package, or right-click in the Unity Project view and choose Import Package > Custom Package.
  3. Select the downloaded package.
  4. Click Import to import the target Device Database.
If you are copying the Device Database files from another project, be sure to copy any files located in the Editor/QCAR/ImageTargetTextures folder. These are used to texture the target plane in the Unity editor.

Following is a summary of the folder contents:
  • Editor - Contains the scripts required to interact dynamically with Target data in the Unity editor
  • Plugins - Contains Java and native binaries that integrate the Vuforia AR SDK with the Unity Android or Unity iOS application
  • Qualcomm Augmented Reality - Contains the prefabs and scripts required to bring augmented reality to your Unity application
  • Streaming Assets - Contains the Device Database configuration XML and DAT files downloaded from the online Target Manager

Add AR assets and prefabs to scene

  1. Now that you have imported the Vuforia AR Extension for Unity, you can easily adapt your project to use augmented reality.
  2. Expand the Qualcomm Augmented Reality folder, and expand the Prefabs folder.
  3. Delete the “Main Camera” in your current scene hierarchy, and drag an instance of the ARCamera prefab into your scene. The ARCamera is responsible for rendering the camera image in the background and manipulating scene objects to react to tracking data.
  4. With the ARCamera in place and the target assets available in the StreamingAssets/QCAR folder, run the application on a supported device, and see the live video in the background.




Inspector view of the ARCamera

  5. Drag an instance of the ImageTarget prefab into your scene. This prefab represents a single instance of an Image Target object.

  6. Select the ImageTarget object in your scene, and look at the Inspector. There should be an Image Target Behaviour attached, with a property named Data Set. This property contains a drop-down list of all available Data Sets for this project. When a Data Set is selected, the Image Target property drop-down is filled with a list of the targets available in that Data Set.

  7. Select the Data Set and Image Target from your StreamingAssets/QCAR project. In this example, we choose "StonesAndChips". (The list is automatically populated from the Device Database XML file downloaded from the online Target Manager.) The Unity sample apps come with several Image Targets. To use them, copy them from the ImageTargets sample, or create your own at the Target Manager section of this site.


Inspector view of the ImageTarget
Note: When you added the Image Target object to your scene, a plane object appeared. This object is a placeholder for actual Image Targets. In the Inspector view of the Image Target there is a pop-up list called Image Target. From this list, you can choose any Image Target that has been defined in one of the StreamingAssets/QCAR datasets, so that the Image Target object in your scene adopts the size and shape of the Image Target it represents. The object is also textured with the image from which the Image Target was created.

Add 3D objects to scene and attach to trackables

Now you can bind 3D content to your Image Target.
  1. As a test, create a simple object (GameObject > Create Other > Sphere).
  2. Add the sphere as a child of the ImageTarget object by selecting it in the Hierarchy list and dragging it onto the ImageTarget item.
  3. Move the sphere in the scene until it is centered on the Image Target. You can also add a Directional Light to the scene (GameObject > Create Other > Directional Light).
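The same hierarchy can also be built from code. The following is a minimal sketch using the standard Unity scripting API; it assumes the scene already contains an ImageTarget prefab instance with the default name "ImageTarget", and the script/class name is illustrative:

```csharp
using UnityEngine;

// Builds the sphere-under-ImageTarget hierarchy from a script instead of
// dragging objects in the editor. Assumes an object named "ImageTarget"
// (the default name of the dropped prefab) exists in the scene.
public class AttachSphereToTarget : MonoBehaviour
{
    void Start()
    {
        GameObject target = GameObject.Find("ImageTarget");

        // Create the test sphere and make it a child of the Image Target.
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.parent = target.transform;

        // Center the sphere on the target; scale it down so it sits
        // reasonably on the target plane.
        sphere.transform.localPosition = Vector3.zero;
        sphere.transform.localScale = Vector3.one * 0.2f;

        // Add a directional light so the sphere is shaded.
        GameObject lightObject = new GameObject("Directional Light");
        lightObject.AddComponent<Light>().type = LightType.Directional;
    }
}
```

Parenting the sphere to the ImageTarget is what makes it follow the target's pose: Vuforia moves the target's transform, and all children move with it.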

TrackableEventHandler

The Default Trackable Event Handler (DefaultTrackableEventHandler) is a script component of the Image Target that causes the sphere you just added to appear or disappear automatically – an automatic reaction to the appearance of the target in the video.
You can override this default behavior – you could, for example, play a fade-out animation, show an info screen, or play a sound. For a more detailed description of the ITrackableEventHandler interface, see 'Responding to Tracking Events' in the Special Options section.
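A custom handler can be sketched as follows. This is written against the Vuforia 2.x-era Unity API (ITrackableEventHandler, TrackableBehaviour) as used by the sample's DefaultTrackableEventHandler; the sound-playing behavior and the class name are illustrative assumptions, and an AudioSource component is assumed to be attached alongside the script:

```csharp
using UnityEngine;

// Custom trackable event handler: plays a sound while the target is
// visible, instead of (or in addition to) toggling renderers. Attach to
// the ImageTarget in place of DefaultTrackableEventHandler.
public class SoundTrackableEventHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;
    private AudioSource mAudioSource; // assumed attached to the same object

    void Start()
    {
        mAudioSource = GetComponent<AudioSource>();
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    // Called by Vuforia whenever the target's tracking state changes.
    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
        {
            // Target found in the camera image.
            if (mAudioSource) mAudioSource.Play();
        }
        else
        {
            // Target lost.
            if (mAudioSource) mAudioSource.Stop();
        }
    }
}
```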

Adding Dataset load to camera

The Vuforia SDK has the ability to use multiple active Device Databases simultaneously. To demonstrate this capability, you can borrow the StonesAndChips and Tarmac Device Databases from the ImageTargets sample and configure both to load and activate in the ARCamera’s Inspector panel. This allows you to use targets from both Device Databases at the same time in your Unity scene.
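Instead of ticking the dataset checkboxes in the ARCamera's Inspector, the same databases can be loaded and activated at runtime. The sketch below uses the Vuforia 2.x-era Unity API (TrackerManager, ImageTracker, DataSet); the class name is illustrative, and "StonesAndChips" and "Tarmac" are the sample databases mentioned above:

```csharp
using UnityEngine;

// Loads and activates two Device Databases at runtime, mirroring the
// "Load Data Set" checkboxes on the ARCamera's Inspector panel.
public class LoadMultipleDataSets : MonoBehaviour
{
    void Start()
    {
        // The image tracker owns datasets of Image Targets.
        ImageTracker tracker = (ImageTracker)
            TrackerManager.Instance.GetTracker(Tracker.Type.IMAGE_TRACKER);

        foreach (string name in new string[] { "StonesAndChips", "Tarmac" })
        {
            DataSet dataSet = tracker.CreateDataSet();

            // Loads StreamingAssets/QCAR/<name>.xml and its .dat file.
            if (dataSet.Load(name))
                tracker.ActivateDataSet(dataSet); // both can be active at once
            else
                Debug.LogError("Failed to load dataset: " + name);
        }
    }
}
```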





Deploy the application

The next step is to deploy your application to a supported device.

Android deployment process

Unity provides a number of settings when building for Android devices – select from the menu (File > Build Settings… > Player Settings…) to see the current settings. Also, choose your platform now – Android or iOS.
  1. Click Resolution and Presentation to select the required Default Orientation. Note: the Vuforia AR Extension now supports Auto Rotation.
  2. Click Icon to set your application icon.
  3. Click Other Settings. Set the Minimum API Level to Android 2.3 'Gingerbread' (API level 9) or higher. Set Bundle Identifier to a valid name (e.g., com.mycompany.firstARapp).
  4. Save your scene (File > Save Scene). 
  5. Open the build menu (File > Build Settings…). Make sure that your scene is part of Scenes in Build. If not, do one of the following:
  • Use Add Current to add the currently active scene.
  • Drag and drop your saved AR scene from the Project view into the window.
You can now build the application. Attach your Android device and then click Build And Run to initialize the deployment process.

iOS deployment process

Unity provides a number of settings when building for iOS devices (File > Build Settings > Platform > iOS icon).
  1. Before building, select the required Default Orientation. Note: The Vuforia AR Extension now supports Auto Rotation.
  2. Make sure that Target Platform is not set to armv6 (OpenGL ES 1.1). This version of the extension supports only OpenGL-ES 2.0. 
  3. Make sure that Bundle Identifier is set to the correct value for your iOS developer profile.
  4. Now you can choose to build the application. First, save your scene (File > Save Scene).
  5. Open the build menu (File > Build Settings…).
  6. Make sure that your scene is part of Scenes in Build. If this is not the case:
    a. Use Add Current to add the currently active scene.
     OR
    b. Drag and drop your saved AR scene from the project view into the Window.
  7. Press Build And Run to initialize the deployment process.
When building and running apps for iOS, Unity generates an Xcode project. It launches Xcode and loads this project. The Vuforia AR Extension includes a PostProcessBuildPlayer script that performs the task of integrating the Vuforia library into the generated Xcode project. This is run automatically when you select Build from within Unity. Be aware that if you manually change the generated Xcode project, you may need to update the PostProcessBuildPlayer script to avoid overwriting your changes.
The generated Xcode project includes a file called AppController.mm, which contains Unity-provided options for tailoring the app's performance to your needs. The PostProcessBuildPlayer script sets THREAD_BASED_LOOP as the default because it gives the best visible performance with the samples provided alongside the Vuforia AR Extension. Consider changing these options to whatever gives the best performance for your own application.


Created AR view

Using the application

You should have a printout of the appropriate Image Target in front of you. If you are working with a target from one of the sample apps, the PDFs are located at Editor/QCAR/ForPrint/*.pdf. Otherwise, print out the image that you uploaded to the Target Manager and make sure that its aspect ratio doesn't change. When you look at the target through the device camera, you should see your sphere object bound to the target. Congratulations – you have successfully augmented your reality!

Running in the editor

The Vuforia Unity Extension supports the Play Mode feature, which provides AR application emulation through the Unity Pro Editor using a webcam. Configure this feature through the Web Cam Behaviour component attached to the ARCamera.
To use Play Mode for Vuforia in Unity Pro, simply select the attached, or built-in, webcam that you want to use from the Camera Device menu, and activate Play Mode using the Play button at the top of the Editor UI.
You can also use the standard Unity Play Mode with non-Pro Unity versions by setting 'Don't use for Play Mode' in the Web Cam Behaviour component.
To use standard Play Mode, adjust the transform of the ARCamera object to get your entire scene in view, and then run the application in the Unity editor. There is no live camera image or tracking in standard Play Mode; instead, all Targets are assumed to be visible. This allows you to test the non-AR components of your application, such as scripts and animations, without having to deploy to the device each time.

Sources:
  • https://developer.vuforia.com/resources/dev-guide/
  • http://en.wikipedia.org/wiki/Augmented_reality
