I don't understand why you need to import the iOS SDK when you have already imported vuforia-unity-x-x-x.unitypackage. You don't need to import the iOS SDK.
Just import the Unity SDK, set up your own project (or one of Vuforia's Unity sample projects), and build for iOS if you want to deploy to iOS.

Let's see:

How To Setup a Simple Unity Project using Vuforia's Unity SDK:

It's easy to set up a basic Vuforia project in Unity. Follow these steps to import the Vuforia Unity extension and then add and configure the prefabs and assets used to develop Vuforia apps in Unity.

Create a project

  1. Create a new project in Unity.
  2. Select vuforia-unity-xx-yy-zz.unitypackage from the list under Import the following packages.

To add the extension to an existing project instead:

  1. Right-click in the Project view of the open project and choose Import Package…
  2. Browse to the vuforia-unity-xx-yy-zz.unitypackage you installed earlier and import it, or simply double-click the downloaded package.

When creating the Unity project, avoid using spaces in the name if targeting iOS because this causes problems later when you build with Xcode.

Note: If you do not see the Vuforia package in the list, go back to How to Install the Unity Extension and manually install the package.

Obtain a License Key

You will need to create a license key for your app in the License Manager and add this key to your project.

How To Create a License Key

How To add a License Key to your Vuforia App

Adding Targets

Next, you need to add a Device Database to your project. You can do this in two ways:

  • Create a target with the Target Manager
  • Use existing targets from other projects

To use the Target Manager method:

  1. See Vuforia Target Manager to create and download a package.
  2. Double-click the downloaded package, or right-click in the Unity Project view and choose Import Package… > Custom Package…, then select the downloaded package.
  3. Click Import to import the target Device Database.

If you are copying the Device Database files from another project, be sure to copy any files located in the Editor/QCAR/ImageTargetTextures folder. These will be used to texture the target plane in the Unity editor.

You should now see the following folder structure inside Unity:

Project Folders

  • Editor - Contains the scripts required to interact dynamically with Target data in the Unity editor
  • Plugins - Contains Java and native binaries that integrate the Vuforia AR SDK with the Unity Android or Unity iOS application
  • Vuforia - Contains the prefabs and scripts required to bring augmented reality to your Unity application
  • Streaming Assets / QCAR - Contains the Device Database configuration XML and DAT files downloaded from the online Target Manager

Add AR assets and prefabs to scene

  1. Now that you have imported the Vuforia AR Extension for Unity, you can easily adapt your project to use augmented reality.
  2. Open the /Vuforia/Prefabs folder
  3. Delete the “Main Camera” in your current scene hierarchy, and drag an instance of the ARCamera prefab into your scene. The ARCamera is responsible for rendering the camera image in the background and manipulating scene objects to react to tracking data.
  4. With the ARCamera in place and the target assets available in the StreamingAssets/QCAR folder, run the application on a supported device, and see the live video in the background.
  5. Drag an instance of the ImageTarget prefab into your scene. This prefab represents a single instance of an Image Target object.
  6. Select the ImageTarget object in your scene, and look at the Inspector. There should be an Image Target Behaviour attached, with a property named Data Set. This property contains a drop-down list of all available Data Sets for this project. When a Data Set is selected, the Image Target property drop-down is filled with a list of the targets available in that Data Set.
  7. Select the Data Set and Image Target from your StreamingAssets/QCAR project. In this example, we choose "StonesAndChips". (The list is automatically populated from the Device Database XML file that is downloaded from the online Target Manager.) The Unity sample apps come with several Image Targets. To use them, copy them from the ImageTargets sample, or create your own in the Target Manager section of this site.

Inspector view of the ImageTarget

Note: When you added the Image Target object to your scene, a gray plane object appeared. This object is a placeholder for actual Image Targets. In the Inspector view of the Image Target there is a pop-up list called Image Target. From this list, you can choose any Image Target that has been defined in one of the StreamingAssets/QCAR datasets so that the Image Target object in your scene adopts the size and shape of the Image Target it represents. The object is also textured with the same image from which the Image Target was created.

Add 3D objects to scene and attach to trackables

Now you can bind 3D content to your Image Target.

  1. As a test, create a simple Cube object (GameObject > Create Other > Cube).
  2. Add the cube as a child of the ImageTarget object by selecting it in the Hierarchy list and dragging it onto the ImageTarget item.
  3. Move the cube in the scene until it is centered on the Image Target. You can also add a Directional Light to the scene (GameObject > Create Other > Directional Light).


The Default Trackable Event Handler (DefaultTrackableEventHandler) is a script component of the Image Target that causes the cube you just created to appear or disappear automatically – an automatic reaction to the appearance of the target in the video.

You can override this default behavior by revising the DefaultTrackableEventHandler script or writing your own by implementing the ITrackableEventHandler interface.
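A custom handler along those lines might look like the following sketch, modeled on the DefaultTrackableEventHandler shipped with the extension. The class and method names follow the QCAR-era Vuforia Unity API and may differ in other SDK versions:

```csharp
using UnityEngine;

// Sketch of a custom trackable event handler. It implements
// ITrackableEventHandler and toggles all child renderers when the
// target is found or lost, replacing the default behavior.
public class MyTrackableEventHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        // Register this script so Vuforia calls us on tracking changes.
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    // Called by Vuforia whenever the target's tracking status changes.
    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED;

        // Show or hide all attached 3D content.
        foreach (Renderer r in GetComponentsInChildren<Renderer>())
            r.enabled = visible;

        Debug.Log(visible ? "Target found" : "Target lost");
    }
}
```

Attach this script to the ImageTarget object in place of the default handler to customize what happens when tracking starts and stops.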

Adding Dataset load to camera

The Vuforia SDK has the ability to use multiple active Device Databases simultaneously. To demonstrate this capability, you can borrow the StonesAndChips and Tarmac Device Databases from the ImageTargets sample and configure both to load and activate in the ARCamera’s Inspector panel. This allows you to use targets from both Device Databases at the same time in your Unity scene.
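Besides the Inspector checkboxes, Device Databases can also be loaded and activated from code. The sketch below assumes the QCAR-era Unity extension API (TrackerManager, ImageTracker, DataSet); verify these names against your SDK version:

```csharp
using UnityEngine;

// Sketch: load and activate an additional Device Database at runtime,
// alongside whatever is configured on the ARCamera. "Tarmac" refers to
// the .xml/.dat pair in StreamingAssets/QCAR from the ImageTargets sample.
public class LoadExtraDataSet : MonoBehaviour
{
    void Start()
    {
        ImageTracker tracker =
            (ImageTracker)TrackerManager.Instance.GetTracker(Tracker.Type.IMAGE_TRACKER);

        DataSet dataSet = tracker.CreateDataSet();
        if (dataSet.Load("Tarmac"))
            tracker.ActivateDataSet(dataSet); // targets from both databases are now usable
        else
            Debug.LogError("Failed to load the Tarmac Device Database.");
    }
}
```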

Deploy the application

The next step is to deploy your application to a supported device.

Android deployment process

Unity provides a number of settings when building for Android devices – select from the menu (File > Build Settings… > Player Settings…) to see the current settings. Also, choose your platform now – Android or iOS.

  1. Click Resolution and Presentation to select the required Default Orientation.
  2. Click Icon to set your application icon.
  3. Click Other Settings. Set the Minimum API Level to Android 2.3 'Gingerbread' (API level 9) or higher. Set Bundle Identifier to a valid name (e.g., com.mycompany.firstARapp).
  4. Save your scene (File > Save Scene).
  5. Open the build menu (File > Build Settings…). Make sure that your scene is part of Scenes in Build. If not, do one of the following:
  • Use Add Current to add the currently active scene.
  • Drag and drop your saved AR scene from the project view into the Window.

You can now build the application. Attach your Android device and then click Build And Run to initialize the deployment process.
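The manual build steps above can also be scripted with an Editor menu item, which is convenient for repeated builds. This is a sketch; the scene path and bundle identifier are placeholders from this example:

```csharp
using UnityEditor;

// Sketch: an Editor menu item that mirrors the manual Android build
// steps via BuildPipeline. Place this file in an Editor folder.
public class AndroidARBuilder
{
    [MenuItem("Build/Build Android AR App")]
    static void BuildAndroid()
    {
        // Placeholder values -- substitute your own identifier and scene.
        PlayerSettings.bundleIdentifier = "com.mycompany.firstARapp";
        string[] scenes = { "Assets/MyARScene.unity" };

        BuildPipeline.BuildPlayer(scenes, "firstARapp.apk",
            BuildTarget.Android, BuildOptions.None);
    }
}
```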

iOS deployment process

Unity provides a number of settings when building for iOS devices (File > Build Settings > Platform > iOS icon).

  1. Before building, select the required Default Orientation. Note: The Vuforia AR Extension now supports Auto Rotation.
  2. Make sure that Target Platform is not set to armv6 (OpenGL ES 1.1). This version of the extension supports only OpenGL-ES 2.0.
  3. Make sure that Bundle Identifier is set to the correct value for your iOS developer profile.
  4. Now you can choose to build the application. First, save your scene (File > Save Scene).
  5. Open the build menu (File > Build Settings…).
  6. Make sure that your scene is part of Scenes in Build. If this is not the case:
    a. Use Add Current to add the currently active scene.
    b. Drag and drop your saved AR scene from the project view into the Window.
  7. Press Build And Run to initialize the deployment process.

When building and running apps for iOS, Unity generates an Xcode project. It launches Xcode and loads this project. The Vuforia AR Extension includes a PostProcessBuildPlayer script that performs the task of integrating the Vuforia library into the generated Xcode project. This is run automatically when you select Build from within Unity. Be aware that if you manually change the generated Xcode project, you may need to update the PostProcessBuildPlayer script to avoid overwriting your changes.

The generated Xcode project contains Unity-provided options for tailoring the app's performance. The PostProcessBuildPlayer script sets THREAD_BASED_LOOP as the default because it gives the best visible performance with the samples provided alongside the Vuforia AR Extension. Consider changing these options to whatever gives the best performance for your own application.

Created AR scene

Using the application

You should have a printout of the appropriate Image Target in front of you. If you are working with a target from one of the sample apps, the PDFs are located at Editor/QCAR/ForPrint/*.pdf. Otherwise, print out the image that you uploaded to the Target Manager and make sure that the aspect ratio doesn’t change. When you look at the target using the device camera, you should see your cube object bound to the target. Congratulations, you have successfully augmented reality!

Running in the editor

The Vuforia Unity Extension supports the Play Mode feature, which provides AR application emulation through the Unity Editor using a webcam. Configure this feature through the Web Cam Behaviour component of the ARCamera in the Inspector.

To use Play Mode for Vuforia in Unity, simply select the attached, or built-in, webcam that you want to use from the Camera Device menu, and activate Play Mode using the Play button at the top of the Editor UI.

You can also use the standard Unity Play Mode by checking ‘Don’t use for Play Mode’ in the Web Cam Behaviour component.

To use standard Play Mode, adjust the transform of the ARCamera object to get your entire scene in view, and then run the application in the Unity editor. There is no live camera image or tracking in standard Play Mode; instead, all Targets are assumed to be visible. This allows you to test the non-AR components of your application, such as scripts and animations, without having to deploy to the device each time.