Sunday, December 9, 2018

Oculus VR for Unity - review of Sample Framework unitypackage v1.31

Before I reviewed the full Oculus Sample Framework unitypackage, I reviewed the core Oculus Integration Utilities which are included in the Sample Framework:


For 1.32
This week's 1.32 update brings the scripts and plugins to v1.32 and the SDK to v1.33:
Debug.Log() Console output:
Unity v2018.2.20f1, Oculus Utilities v1.32.0, OVRPlugin v1.32.0, SDK v1.33.0.
UnityEngine.Debug:Log(Object)
OVRManager:Awake() (at Assets/Oculus/VR/Scripts/OVRManager.cs:1078)

Oculus appears to be shifting focus towards putting assets inside the Avatar plugin DLLs, accessible via the CAPI class, and removing some meshes etc. from individual use.

This version 1.32 includes some assets (but not the scenes) under Oculus/SampleFramework/Core, including the locomotion and teleport scripts, which is a good thing and raises my star rating from 3 to 4 :)  (The full Oculus Sample Framework would get a full 5 if it ever comes back to the Asset Store, as would the OVR Updated Interaction unitypackage, which should still be available here https://developer.oculus.com/blog/easy-controller-selection/ along with useful Touch controller meshes.)
Also helping raise the star rating is the inclusion of LeftControllerPf and RightControllerPf, which are essentially standalone animated prefabs of the Oculus Touch controllers. They are used by attaching them under a customised TrackedRemote prefab: set the Controller var to LTouch/RTouch, drag the ...ControllerPf prefabs in to replace both Model vars, then add those "Touch" TrackedRemote prefabs under each of the OVRCameraRig or OVRPlayerController's LeftHandAnchor/RightHandAnchor sub-GameObjects, as sketched below.
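For illustration, here is a minimal sketch of what a TrackedRemote-style script does under the hood (the class and field names are mine, not the actual Sample Framework script): show a hand's controller model only while that controller is connected and currently active.

using UnityEngine;

public class ControllerModelSwitcher : MonoBehaviour
{
    public OVRInput.Controller m_controller = OVRInput.Controller.RTouch; // or LTouch for the left hand
    public GameObject m_model; // e.g. an instance of RightControllerPf

    void Update()
    {
        // OVRInput.Controller is a flags enum, so mask against the active controller.
        bool active = OVRInput.IsControllerConnected(m_controller) &&
                      (OVRInput.GetActiveController() & m_controller) == m_controller;
        if (m_model != null && m_model.activeSelf != active)
            m_model.SetActive(active);
    }
}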

Also new are the social/network sample scenes under Oculus/Platform/Samples.


Review of v1.31
 
Test-developing with a Rift CV1, I have downloaded and tested some core sample scenes in the latest Oculus Integration Utilities v1.31 for Unity. Most scenes have something useful to offer, even if not everything works or is compatible as-is with Unity 2018 (some code tweaking is required for deprecated methods).
- something of note is Oculus/Avatar/Samples/SocialStarter/Assets/MainScene, which loads a custom avatar complete with auto-fading LineRenderer rays & pointer spheres matching those in the default Oculus Home user interface; however, the code which creates those rays and pointers is not exposed to developers
- it appears to be an internal function of the Oculus.Avatar.CAPI class, of which we can only see the headers, because it is an API inside the OVRPlugin.
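Since the CAPI version is not exposed, here is a minimal DIY sketch of the visible technique: a two-point LineRenderer fading to transparent along its length, with a small sphere at the hit point (the names and shader choice are mine, not the internal implementation).

using UnityEngine;

public class SimpleLaserPointer : MonoBehaviour
{
    public Transform rayOrigin;      // e.g. the RightHandAnchor
    public Transform pointerSphere;  // a small sphere primitive (collider removed)
    public float maxLength = 10f;

    private LineRenderer line;

    void Awake()
    {
        line = gameObject.AddComponent<LineRenderer>();
        line.positionCount = 2;
        line.startWidth = 0.01f;
        line.endWidth = 0.005f;
        // Sprites/Default respects per-vertex colours, so the alpha fade works.
        line.material = new Material(Shader.Find("Sprites/Default"));
        line.startColor = new Color(1f, 1f, 1f, 0.8f);
        line.endColor = new Color(1f, 1f, 1f, 0f);
    }

    void Update()
    {
        Vector3 start = rayOrigin.position;
        Vector3 end = start + rayOrigin.forward * maxLength;
        RaycastHit hit;
        if (Physics.Raycast(start, rayOrigin.forward, out hit, maxLength))
            end = hit.point;   // stop the ray (and park the sphere) at the surface
        line.SetPosition(0, start);
        line.SetPosition(1, end);
        pointerSphere.position = end;
    }
}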

That hidden implementation unfortunately does not help developers who need lightweight, high-performance/low-overhead code for mobile platforms like Gear VR, which I am targeting. Importing ONLY the VR subtree from this unitypackage works for SOME scenes which have no Platform or Avatar code dependencies, but if you start using certain prefabs or scripts, a web of cross-dependencies on the AudioManager, Avatar, LipSync, Platform, Spatializer, and VoiceMod subtrees soon emerges, necessitating a monolithic import of bloat, even if you remove the various 'Sample' subtrees.

As official companion unitypackages to the Oculus Integration/Utilities, I have tested the line render technique shown in the Oculus Sample OVR Updated Controller Interaction (separate unitypackage) and certain locomotion scenes in the Oculus Sample Framework (separate unitypackage)
- both are likewise designed for earlier versions of Unity. I used them to test locomotion/teleport/rotation as well as UI interaction, but that brought additional complexity, because the sample scene prefabs carry differing versions of OVR scripts which are incompatible with the same-named scripts in the separate Oculus/VR/Scripts hierarchy.

Additionally, the Oculus Unity UI interaction for Touch/GearVR/Go/Malibu controllers seems broken when dealing with anything more complex than a toggle, button, input field, or slider. (e.g. a Unity UI.Dropdown reverts to selection via a non-visible gaze cursor.)
There is a user-submitted hack for this here: https://forums.oculusvr.com/developer/discussion/39327/gaze-pointers-drop-down-boxes
(thx @jhocking !)
It also bears repeating that Unity UI Navigation must be disabled on all in-scene UI components to avoid the 'stuck focus' bug (a click selects the previously focused UI element rather than the element currently being pointed at) - a scripted version of this workaround is sketched below.
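A minimal sketch of that workaround in code (attach it to the root Canvas; the class name is mine), instead of unticking Navigation by hand in the Inspector:

using UnityEngine;
using UnityEngine.UI;

public class DisableUINavigation : MonoBehaviour
{
    void Start()
    {
        // Switch Unity UI Navigation off on every Selectable under this Canvas.
        foreach (Selectable s in GetComponentsInChildren<Selectable>(true))
        {
            Navigation nav = s.navigation;   // Navigation is a struct: copy, modify, assign back
            nav.mode = Navigation.Mode.None;
            s.navigation = nav;
        }
    }
}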

I will try the hack, and try to use something other than a dropdown or scroll field, but just be aware that not everything integrates well...
Update: I reworked the hack to run via a coroutine only when the dropdown is clicked, rather than every frame in Update(). It currently works for the GazePointer and correctly switches between controller raycasts, so long as activeController on the EventSystem's OVRInputModule is being set correctly. The reworked version is sketched below.
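Here is roughly how the restructured version looks. This is a sketch under my reading of the problem (Unity's Dropdown spawns a "Dropdown List" object at runtime carrying only a GraphicRaycaster, which OVRInputModule's rays cannot hit); the class and field names are mine, and you still need to wire OnDropdownClicked to the Dropdown, e.g. via an EventTrigger PointerClick entry.

using System.Collections;
using UnityEngine;
using UnityEngine.UI;

public class DropdownVRFix : MonoBehaviour
{
    public Dropdown dropdown;   // the problem Dropdown
    public GameObject pointer;  // the same pointer object your scene's OVRRaycaster uses

    // Wire this to the Dropdown's click event (e.g. EventTrigger PointerClick).
    public void OnDropdownClicked()
    {
        StartCoroutine(PatchDropdownList());
    }

    private IEnumerator PatchDropdownList()
    {
        yield return null;   // wait one frame so the "Dropdown List" clone exists
        Transform list = dropdown.transform.Find("Dropdown List");
        if (list != null && list.GetComponent<OVRRaycaster>() == null)
        {
            // Add an OVRRaycaster so controller/gaze rays can hit the list items.
            OVRRaycaster rc = list.gameObject.AddComponent<OVRRaycaster>();
            rc.pointer = pointer;
        }
    }
}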

A number of the issues and missing functions here are mitigated by another unitypackage, "Oculus Sample Framework" v1.31, which very handily also contains the contents of THIS unitypackage as well as all the samples in a single download.



 
And now for the review of the Sample Framework v1.31:

The v1.31 version Oculus Sample Framework unitypackage combines updated/tweaked Sample Framework scenes and assets for a number of scenarios, with the core OVRPlugin 1.31 & Integration contained in the separately importable Oculus Integration Utilities v1.31 package.

Compared to the previous v1.30.1 of the Sample Framework, v1.31 shows an interesting migration by Oculus towards implementing separate LocalAvatar prefabs with animated hand mesh superimposed over controllers, and away from static controller mesh representations attached to the PlayerController/OVRCameraRig.

Highly customisable locomotion with teleport is still supported in the example scene SampleScenes/FirstPerson/Locomotion2/TeleportAvatar, but the latest PlayerController prefab in the core v1.31 Integration package already supports very basic linear motion with snap rotation (without teleport).

That scene uses the GazePointerRing as a pointer, rather than the nice Oculus Dashboard/Home alpha-shaded LineRenderer ray, but you can see and copy how both a straight laser LineRenderer and a curved multi-segment LineRenderer from the controller are handled via the teleport mechanism.
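The curved arc boils down to sampling a ballistic parabola into a multi-segment LineRenderer. A minimal sketch of the idea (the Sample Framework's actual teleport aim handler is more elaborate, and these names are mine):

using UnityEngine;

public class TeleportArc : MonoBehaviour
{
    public Transform aimOrigin;   // controller anchor
    public LineRenderer line;
    public float launchSpeed = 6f;
    public int segments = 20;

    void Update()
    {
        line.positionCount = segments;
        Vector3 pos = aimOrigin.position;
        Vector3 vel = aimOrigin.forward * launchSpeed;
        const float dt = 0.05f;   // time step per segment
        for (int i = 0; i < segments; i++)
        {
            line.SetPosition(i, pos);
            vel += Physics.gravity * dt;   // gravity bends the arc downwards
            pos += vel * dt;
        }
    }
}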

To see how a custom/personalised LocalAvatar works with those nice alpha-shaded ray pointers and animated hands, see the Oculus/Avatar/Samples/SocialStarter/MainScene, which loads a custom avatar (so long as you set up the Oculus App ID via the Oculus Platform/Oculus Avatar editor inspectors).
Bug: "There are 2 audio listeners in the scene. Please ensure there is always exactly one audio listener in the scene."
Fix: disable the Audio Listener on both the "Room Camera" & "Main Camera" GOs, since OVRPlayerController contains an OVRCameraRig with its own Audio Listener (or automate it with the script after these notes)
Bug: the HelpPanel GO on the avatar's left hand has a null material
Fix: drag the Help (Rift) & GearHelp (GearVR) materials onto the OVRPlayerController GO's PlayerController script
(Dragging the Help material directly onto the HelpPanel GO does not fix it in Play mode; it gets set back to null)
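For the audio listener fix, a minimal scripted version that keeps one chosen listener and disables the rest at startup (the component and field names are mine):

using UnityEngine;

public class SingleAudioListener : MonoBehaviour
{
    public AudioListener keep;   // the OVRCameraRig's listener

    void Start()
    {
        // Disable every other AudioListener so exactly one remains enabled.
        foreach (AudioListener al in FindObjectsOfType<AudioListener>())
            if (al != keep)
                al.enabled = false;
    }
}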
 
Basic UI interaction is also demonstrated in the Locomotion2/TeleportAvatar scene.

The overall download size of >500 MB seems large, but it is possible to take just the specific scripts and scene prefabs you need, along with the core Oculus/Avatar (2 MB), Platform (23 MB), and VR (20 MB) asset subtrees, for a more reasonable size. Note that the Oculus/VR subtree contains the actual OVRPlugin core DLLs, so it is not an optional import :D

Start here for Oculus Unity doco:
https://developer.oculus.com/documentation/unity/latest/concepts/book-unity-gsg/
https://developer.oculus.com/documentation/unity/latest/concepts/unity-utilities-overview/#unity-utilities-overview
https://developer.oculus.com/documentation/unity/latest/concepts/unity-reference-scripting/

and here for guides on adding a LineRenderer to a controller using the Sample Framework (written for Gear VR, but the technique also works for Rift Touch and other rotational pointer-type controllers like Go and Malibu):
https://developer.oculus.com/blog/adding-gear-vr-controller-support-to-the-unity-vr-samples/
https://developer.oculus.com/blog/easy-controller-selection/
https://forums.oculusvr.com/developer/discussion/54513/use-gearvrcontroller-with-selection-ray-unity

Happy VR coding with this asset, thx Oculus.

Sunday, January 7, 2018

Augmented Reality Vuforia Demo for Android & Unity


Description

Demonstration of Augmented Reality using the Vuforia libraries, in 2 separate builds for Android (Android Studio & Unity 2017.2)

3D AR object tracking using ImageTarget & Virtual Buttons

Unity native AR integration requires Android v7 / iOS ARKit;
however, adding the Vuforia AR libraries allows backlevel Android support (back to v4.1 as of Unity 2017.2).

Please see the transcript for links & background info.


Transcript + timecodes

@0:00
Hi, This is Morte from PCAndVR,
with a video demonstrating Augmented Reality using Vuforia libraries, on builds for Android (using Android Studio & Unity 2017.2)

@00:14
If you're not familiar with Augmented Reality (AR), it means using a device to synchronise virtual objects into the sensory field (e.g. visual/auditory) through which we experience the real world.
@00:27
A key feature of AR is locking a virtual reference frame into a live camera image of either a specific image target, a 3D object, or a surface (ground plane).  This reference frame becomes a stable centre point for generated 3D objects to exist such that they appear synchronised with the real world in the device's camera image.
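For Unity developers following along: in Vuforia this locking surfaces through trackable event callbacks. A minimal sketch of the pattern, modelled on the stock DefaultTrackableEventHandler (the class name here is my own):

using UnityEngine;
using Vuforia;

public class MyTrackableHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        // Show the target's child 3D objects only while the target is tracked.
        bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED ||
                       newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        foreach (Renderer r in GetComponentsInChildren<Renderer>(true))
            r.enabled = visible;
    }
}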
@00:51
Because an AR object or image target can be tracked in 3D space, it actually provides a form of head tracking for Mixed Reality (XR) if you use a mobile Virtual Reality (VR) head mount with a camera opening such as Samsung's Gear VR or custom Google Cardboard headsets.

@01:12
Without positional head tracking, VR headsets can only provide 3DoF (degrees of freedom) rotation from a single point, and no translational movement in virtual space, adding to motion sickness.
@01:24
AR devices can be something as simple as a recent model mobile phone or tablet with camera.
@01:30
Head mounted AR devices can be more useful for hands-free operation and keeping virtual objects in the field of view, but typically there are weight and portability issues with tethered head mount devices, and image quality limitations on current untethered glass-frame devices.

@01:48
This video will demo 3D AR object tracking using ImageTarget & Virtual Buttons, as they are reliable and simple to implement.
@01:59
First up, we have the Android Studio built app, which largely contains the same feature set as the Unity app, albeit using simpler models and older image targets.
@02:11
The 1st Android Studio example uses the stones image target.  The added 3D object is a teapot, originating from the famous 1975 Utah teapot 3D model.
https://en.wikipedia.org/wiki/Utah_teapot


@02:24
Using the app we can show the teapot from various distances, angles, and even zoom in on the inside of the spout, with the teapot visually appearing fairly stable against the stones image target.

@02:37
The image targets are printed on A4 paper from files in the supplied library, and are designed to be high contrast, with numerous well defined edges and non-repeating features.  This allows detection and tracking even if only a portion of the image is visible in the camera field.

@02:56
The 2nd Android Studio example uses the wood background image target, with four virtual buttons. Each button gets mapped with a virtual rectangle surrounding it to detect occlusion by real-world objects, effectively meaning the button has been pressed. In this case, it changes the colour of the teapot to match the colour of the virtual button covered by my fingers: red, blue, yellow, and green.
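In Vuforia's Unity API, the equivalent virtual-button handling looks roughly like this (the Android Studio demo does the same in Java; the class and field names are mine):

using UnityEngine;
using Vuforia;

// One handler instance per virtual button, each with its own colour.
public class ColourButtonHandler : MonoBehaviour, IVirtualButtonEventHandler
{
    public Renderer teapot;
    public Color colour = Color.red;

    void Start()
    {
        GetComponent<VirtualButtonBehaviour>().RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonBehaviour vb)
    {
        // Fires when the button's area on the image target is occluded.
        teapot.material.color = colour;   // assumes a shader with a _Color property
    }

    public void OnButtonReleased(VirtualButtonBehaviour vb) { }
}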

@03:29
Next, we move to the Unity build of the same type of example features, with image targets and non-interactive 3D objects, and virtual buttons to allow interactive 3D object behaviour using the same image targets.

@03:45
Here we have a standing astronaut, which can jitter if the AR device is not held steady, an effect worsened by the model's height and relatively small target image size,
@03:56
next is an oxygen cylinder which is significantly more stable due to the 3D model being smaller and positioned closer to the target surface,
@04:08
and an animated drone, which while small, can jitter in the hovering position above the target surface.

@04:15
I coded a custom 3D scene of my own, showing an empty game-level platform, using the stones image target and its downloaded .xml file, but I could have used any suitable custom image with the Vuforia web XML generator.

@04:30
Finally, the Unity build's virtual buttons feature shows the astronaut waving after a button press,
@04:38
the oxygen cylinder displaying an infographic dropdown visual,
@04:45
the drone emitting a blue projection field,
@04:49
and the fissure changing from white steam to dark red.

@04:54
In conclusion, we saw some limitations of smaller, less complex image targets in terms of feature detection & target tracking,
with resultant visual instability of mapped 3D objects, worsening further above the surface of the target,
and objects disappearing at low incident angles to the target, where feature detection is not possible.  
@05:16
Both of these limitations can be mitigated somewhat through the use of 3D object targets, or larger complex multiple image targets on 3D objects,
since detection should still clearly see the features of one target even if those of another become occluded or sit at a low angle.

@05:36
For further background info & links, please see the transcript for this video.

@05:41
And that's it, so thanks for watching!
@05:44




Background info

If you are thinking of doing your own Vuforia AR builds for Android:
the 1st demo uses an app built in Android Studio using the Vuforia Android SDK:
https://library.vuforia.com/articles/Solution/Getting-Started-with-Vuforia-for-Android-Development.html

A number of books & online tutorials exist, but they can be confusing due to the deprecated legacy Android Vuforia library API, which used QualComm.* and QCAR.* naming.
Vuforia library naming is now standalone, and Android native builds require Android Studio, replacing Eclipse.
https://developer.vuforia.com/forum/qcar-api/import-comqualcommqcar-cannot-be-resolved
Even so, it required significant manual tweaking and customisation for me to convert the deprecated code from the legacy Eclipse format into the Gradle-based Android Studio framework.
It is really not worth pursuing the Vuforia libraries for Android Studio, since the Vuforia Unity SDK makes the entire process simple and modularised, utilising drag-and-drop GameObjects.

It is now far easier to go straight to Unity for Augmented, Virtual, & Mixed Reality (XR) features.
Unity native AR integration requires Android v7 / iOS ARKit;
however, adding Vuforia AR library support allows backlevel Android support (back to v4.1 as of Unity 2017.2).
Don't bother with the downloadable legacy Vuforia library .unitypackage for Unity 5.x.x or earlier; there are too many deprecations to contend with.


Aside from using the image targets from the supplied library, creating your own image targets for use in an app consists of uploading an image to your Vuforia web account and processing it for the appropriate platform. Internally, Vuforia converts images into feature points, downloaded from the site as an .xml file and used in the app as a form of 2D vector UV mapping. These feature points are used for tracking in conjunction with the Vuforia image detection & processing algorithms, and any chosen 3D objects can be overlaid onscreen, locked to the detected target, so that movement of the device's camera shows the chosen 3D objects from the appropriate angle and distance, matching the real-world target.
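If you download your own target database from the Vuforia site, loading and activating it in Unity looks roughly like this (a sketch: "MyTargets" is a placeholder for your own database name, and the .xml/.dat files are assumed to be in StreamingAssets/Vuforia):

using UnityEngine;
using Vuforia;

public class LoadCustomDataSet : MonoBehaviour
{
    void Start()
    {
        // Defer the work until Vuforia itself has initialised.
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
    }

    private void OnVuforiaStarted()
    {
        ObjectTracker tracker = TrackerManager.Instance.GetTracker<ObjectTracker>();
        DataSet dataSet = tracker.CreateDataSet();
        if (dataSet.Load("MyTargets"))
        {
            tracker.Stop();   // the tracker must be stopped while swapping datasets
            if (!tracker.ActivateDataSet(dataSet))
                Debug.LogError("Failed to activate dataset MyTargets");
            tracker.Start();
        }
    }
}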



This is a general introduction to Vuforia:
https://library.vuforia.com/getting-started.html

This link helps you get started with Vuforia in Unity:
https://library.vuforia.com/articles/Training/getting-started-with-vuforia-in-unity-2017-2-beta.html


Some tips & tricks for Vuforia on Unity:
https://gamedevelopment.tutsplus.com/tutorials/vuforia-tips-and-tricks-on-unity--cms-28744

Vuforia's legal doco allows for free use of a subset of features during app development:
https://developer.vuforia.com/vui/pricing

Vuforia requires setting up a licence for each Unity project/application you develop:
https://library.vuforia.com/articles/Training/Vuforia-License-Manager

If you use the supplied example code from the .unitypackage on the Asset Store (for Unity 2017.2 or later), be sure to use the link below to understand how to successfully apply custom image targets without them being overridden by the sample code:
https://developer.vuforia.com/forum/unity/targets-not-working-201720f3-besides-samples#comment-61739


Google Daydream View VR requires specific compatible Android v7 phones, such as Google's Pixel 2 or Samsung's Note 8/Galaxy S8, but the reference headsets do not allow rear camera visibility, so no AR or XR on those.