Introducing: Holodog!

My newest infatuation is Unity, a game engine that in just a few years has become a prime breeding ground for VR/AR development and experimentation.

Since November, I’ve been collaborating on a Hololens application called “Holodog,” a Tamagotchi-esque virtual pet that you can interact with and actually see in the space around you.

My Holodog team (Estella Tse and Katie Hughes) and I had little experience creating a full-blown application in Unity.  Like the brave warriors we are, we decided to take on the challenge of learning it in a hackathon environment over the course of a weekend.  Forcing myself to learn so much in such a short amount of time made me realize that developing for AR/VR isn’t as out of reach as it had seemed.  Special thanks to our mentor, Livi Erickson, for helping us troubleshoot and guiding us when we needed it.

Here’s an example of our application in action, demoed by the wonderful Estella Tse.  On the left is our holographic dog, Buster, whom Estella can see when she puts on the headset.


A little overview of the Hololens –

If you’re not familiar with the Hololens, it’s an augmented reality headset released by Microsoft in March 2016.  It uses light refraction and amazing physics magic to create the illusion that virtual 3D objects share the space you’re in.  Currently only a developer edition is available; the consumer version has yet to arrive.

The main inputs that you can use to interact with the ‘holo-world’ are:

  1. Gaze
    This is essentially a circular cursor, except it sits in the center of your view in the Hololens.
  2. Gesture
    A large part of interacting with your surroundings is essentially signing with your hands.  Some of the main gestures:
  • Pinching / grasping (what I call it)
  • Scrolling
  • Adjusting / moving things around
  • Getting a menu
  3. Voice
    Hololens has a built-in speech-to-text voice command feature.
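If you’re curious what these inputs look like in code, here’s a minimal sketch using the Unity 5.5-era HoloLens APIs (UnityEngine.VR.WSA.Input for gestures, UnityEngine.Windows.Speech for voice).  The keyword list and log messages are just placeholders of mine:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;      // GestureRecognizer (Unity 5.5-era namespace)
using UnityEngine.Windows.Speech;    // KeywordRecognizer

public class HoloInputSketch : MonoBehaviour
{
    GestureRecognizer gestures;
    KeywordRecognizer keywords;

    void Start()
    {
        // Gesture: listen for the air-tap ("pinch") gesture.
        gestures = new GestureRecognizer();
        gestures.TappedEvent += (source, tapCount, headRay) =>
            Debug.Log("Air tap!");
        gestures.StartCapturingGestures();

        // Voice: built-in speech recognition for simple commands.
        keywords = new KeywordRecognizer(new[] { "sit", "fetch" });
        keywords.OnPhraseRecognized += args =>
            Debug.Log("Heard: " + args.text);
        keywords.Start();
    }

    void Update()
    {
        // Gaze: a ray from the head position along the view direction.
        RaycastHit hit;
        if (Physics.Raycast(Camera.main.transform.position,
                            Camera.main.transform.forward, out hit))
        {
            // hit.point is where the gaze cursor should be drawn.
        }
    }
}
```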

Other features include:

  • Camera
  • There’s a built-in camera in the Hololens, which is pretty useful.  You can livestream what you’re seeing to others and take pictures.  You can also integrate computer-vision APIs to do complex things like image or object recognition using plugins like Vuforia (tutorial for this coming soon!)
  • Spatial mapping
    • This creates a low-poly mesh of your surroundings, making sure that holograms do not show up in the middle of a couch, or in a place that’s not visible to you.
    • There’s also a coordinate system that the Hololens uses, so you can use…
  • …Spatial anchors!
    • You can store the placement of holograms within the local storage of your Hololens.  So if you’re in the same space and restart your device, your holographic dinosaur would be staring at you from the same location where it was set.
  • Spatial sound
    • You can create the effect that a certain sound is coming from a certain location.
  • There are many more features that I don’t have much experience using yet, but feel free to check out the details here!
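As a rough sketch of spatial anchors in code, using the Unity 5.5-era WorldAnchor / WorldAnchorStore APIs (the anchor ID "buster-spot" is just a placeholder of mine):

```csharp
using UnityEngine;
using UnityEngine.VR.WSA;              // WorldAnchor
using UnityEngine.VR.WSA.Persistence;  // WorldAnchorStore

public class AnchorSketch : MonoBehaviour
{
    WorldAnchorStore store;

    void Start()
    {
        // The store is loaded asynchronously from the device's local storage.
        WorldAnchorStore.GetAsync(s =>
        {
            store = s;
            // Re-attach this object to wherever it was last anchored,
            // so it survives a device restart in the same space.
            store.Load("buster-spot", gameObject);
        });
    }

    public void PlaceHere()
    {
        // Pin this object (e.g. the dog) to its current real-world
        // position and persist that placement.
        var anchor = gameObject.AddComponent<WorldAnchor>();
        store.Save("buster-spot", anchor);
    }
}
```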

How you can get started –

To be honest, I began writing a full-blown tutorial on building and deploying a first application to the Hololens, but I realized that most of what I learned came from the Microsoft Developer tutorials, which are actually pretty straightforward for the most part.  Here are the two tutorials I would definitely urge you to do if you’re new to developing for this device:

  1. Build and deploy a basic application
  2. Add scripts to objects and use the Hololens emulator

A few notes on these tutorials:

  • Since documentation tends to be updated less frequently than the technology itself, if you find yourself wondering what the “Hololens SDK” or “Hololens Toolkit” for Unity is, know that you will NOT need it.  Unity now supports the Hololens out of the box, so you do not need this additional SDK (yay!).
  • Once you begin creating event managers, know that you do not have to write and drag separate EventManagers for each input script onto your “OrigamiCollection”.  You can create an empty GameObject, add all your Gaze, Gesture, and Speech managers to it (the scripts should already be available if you search for them; no need to copy/paste from the tutorial), and make it the parent element of everything.  That lets you access those functions from all parts of your application without every object needing to be a child of OrigamiCollection.
  • This tutorial will be useful if you want to do things like capture what you are seeing within the Hololens, take pictures, or let someone view a demo without putting on the headset themselves (there is a latency issue with this method, but so far I haven’t found a better one; please let me know if there is!)
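To sketch the empty-GameObject approach from the second note above: the tutorial’s gaze manager exposes a singleton Instance, so once it lives on one root-level object, any script can reach it without being a child of OrigamiCollection.  Something like this (a simplified version of the tutorial’s pattern, not its exact code):

```csharp
using UnityEngine;

// Attach this (alongside your Gesture and Speech managers) to a single
// empty GameObject at the root of the scene.
public class GazeGestureManager : MonoBehaviour
{
    public static GazeGestureManager Instance { get; private set; }

    // The object the user is currently gazing at, if any.
    public GameObject FocusedObject { get; private set; }

    void Awake() { Instance = this; }

    void Update()
    {
        RaycastHit hit;
        FocusedObject =
            Physics.Raycast(Camera.main.transform.position,
                            Camera.main.transform.forward, out hit)
            ? hit.collider.gameObject
            : null;
    }
}
```

Any other script can then read GazeGestureManager.Instance.FocusedObject, regardless of where it sits in the hierarchy.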

Troubleshooting.  You will most likely run into bugs; that is the nature of development, especially with any emerging tech.

  • I resolved some of them using my mentor Livi Erickson’s guide to troubleshooting Hololens errors (Thanks again!).
  • Whenever I was initially deploying my app in Visual Studio, I received this fun error:

    Unsafe code requires the `unsafe' command line option to be specified

    I ended up creating an smcs.rsp file in my Assets folder and adding the line -unsafe within the file.  It didn’t make me feel great, since this is most definitely a hack, but this was a hackathon and so I persisted.  Here’s where I got that information.  You will probably have to restart Unity and Visual Studio after doing this.
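For reference, the workaround is just a one-line compiler response file (the filename matters: smcs.rsp targets Unity’s smcs C# compiler, placed directly inside the Assets folder):

```
-unsafe
```

That flag is the entire contents of the file.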

  • You might also see this error in Visual Studio if you restart your device after debugging:
    DEP0001 : Unexpected Error: -2145615869

    The majority of the time, restarting Visual Studio entirely fixed this.  You should also pause the debugging process whenever you aren’t using it, since you might accidentally rebuild and trigger this error again.  And remember: when in doubt, restart Visual Studio, because that is probably your issue 🙂

Please feel free to reach out via Twitter @nerdyreddy if you’re curious how we created Holodog!