Hear from Nexus Studios’ Head of Immersive, Pablo Colapinto, about how we teamed up with Niantic to explore the potential of their newly launched toolkit, the ARDK.

An Open Source Sample App From Niantic and Nexus Studios

On top of many other exciting announcements, we at Nexus Studios are particularly pleased to be presenting with our Niantic friends at the Augmented World Expo this week, a day after they launch their Lightship Augmented Reality Development Kit (the ARDK). It’s a great moment to be sharing our vision alongside theirs.

Across the globe, Nexus has developed a well-known practice as pioneers in spatial storytelling. In this practice we use space itself as an instrument to better connect people through story. With XR, the space between us, no matter how vast, is no longer empty. You can do things with this newly immersive space to communicate with each other. You can dazzle your partner with imaginary worlds that bloom in your living room, or send your friends an alternative image of yourself on a night out, your outfit transformed. You can conjure giant versions of your heroes and watch them walk amongst buildings. You can gather your fans at a central location or disperse them on a journey to find treasure through the city’s streets. You can reach your audience anywhere.

Space itself is a budding language full of limitless potential. It is now a directly manipulable instrument of consumer technology; we have folded it into our economy. It can be leveraged in all of these emotionally resonant, innovative ways. There is nothing artificial, virtual, or meta about this ether. It is very real.

And it requires craft to cultivate its more poetic forms. As we explore the future of spatial storytelling together, there are three ingredients we will all need to master; let’s call them the three C’s: Character, Context, and Community. Our ability to create meaningful spatial experiences is driven by these ingredients:

  • Characters need dynamic, real-time behaviors.
  • Context requires machine-learning-driven spatial understanding of your environment.
  • Community is best served by multi-user participation.

With these in mind, we began our adventure with Niantic, who has developed technology to help with all three C’s.

Encouraged by the positive response to our quick Early Explorer prototype (a remote game of two-player hot potato, in outer space; um, I’ll explain another time), we sought to do more to share our learnings. As artisans, we let a plucky spirit of craftsmanship drive our strategic innovation. Folks in the know, like Niantic, recognise that we bring the same meticulous attention to detail to our work across film, animation, and immersive, whether dealing with puppets, motion capture suits, or machine learning algorithms. The care we take to manufacture stories is reflected in the quality of production that delights our audiences. In this light, our desire to do more was a natural progression.

In this spirit we got to work with Niantic crafting their first open source project, The AR Voyage: A Lightship Demo. This free, downloadable, and modifiable Unity project is a collection of micro-game programming examples (including off-the-shelf multiplayer) intended to be inspiring. As such, they have the following characteristics: clarity of purpose, ease of use, and charm of execution. Those are three pretty good characteristics to keep in mind when making anything, by the way.

Our mission was clear: help the developer community better understand how they can take advantage of Niantic’s new Unity APIs to create novel experiences. Now, before you go designing a whole new Pikmin or Pokémon GO game, be advised that these features currently do not enable geo-locating content. But they are still very interesting. In addition to depth estimation, semantic segmentation, and multipeer networking, the API also enables excellent meshing capabilities for discovering the topography of the space around you.

Every journey begins with a toolkit of possibilities. In this case (a short sketch of how two of these combine follows the list):

  • Meshing: Determining the topography of the space around you.
  • Multipeer Networking: Connecting up to 8 users in a remote or co-located joint session.
  • Semantic Segmentation: Categorizing pixels based on real-world material characteristics.
  • Depth Estimation: Assigning a depth value to each pixel (distance to the camera).
  • Occlusion: Hiding virtual objects behind real-world objects.
  • Gameboard: Determining the walkable area around you (i.e. the floor).
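
Before we get to the games, here is a minimal, framework-agnostic sketch of how two of these channels can combine: a per-pixel depth buffer plus a per-pixel semantic mask is enough to answer questions like “where is the nearest visible patch of grass?”. The row-major buffer layout and bit-flag channel encoding below are illustrative assumptions, not the ARDK’s actual data format.

```csharp
// A sketch of combining depth estimation with semantic segmentation.
// Buffer layout and channel encoding are illustrative assumptions.
public static class DepthSemanticsDemo
{
    // Finds the (x, y) of the closest visible pixel whose semantic bits
    // include the requested channel, or null if none is in view.
    public static (int x, int y)? NearestPixelOfChannel(
        float[] depth,      // metric depth per pixel, row-major
        uint[] semantics,   // bit flags per pixel, row-major
        int width, int height,
        uint channelBit)    // e.g. a hypothetical GRASS bit
    {
        float best = float.MaxValue;
        (int x, int y)? bestPixel = null;
        for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
        {
            int i = y * width + x;
            if ((semantics[i] & channelBit) == 0) continue; // wrong material
            if (depth[i] < best)
            {
                best = depth[i];
                bestPixel = (x, y);
            }
        }
        return bestPixel;
    }
}
```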

Aiming to highlight smooth and compelling gameplay across these features, we built four mini games, which I will briefly describe here:

Build-a-Ship
A scavenger-hunt-like mini game that illuminates segmentation, the ability to categorise different pixels according to their real-world material, using Niantic’s own custom machine learning models. Supporting multiple types of natural things around you, including grass, sky, and trees, the Lightship ARDK offers a cool new way to play with the world in your apps. In this way we built a bit of a treasure hunt: you have to find certain things in the real world and collect them to make your airship.
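
To give a flavour of the core collection loop, here is a hedged sketch: tap the screen, ask a segmentation source what material sits under that pixel, and collect it if it is on the scavenger list. The ISemanticSource interface and its ChannelAt call are hypothetical stand-ins for whatever segmentation query you wire in, not actual ARDK names.

```csharp
using UnityEngine;

// Hypothetical segmentation query: what material is under this screen point?
public interface ISemanticSource
{
    string ChannelAt(Vector2 screenPoint); // e.g. "grass", "sky", "tree"
}

public class ScavengerCollector : MonoBehaviour
{
    public string[] wanted = { "grass", "sky", "tree" };
    private ISemanticSource semantics; // assign from your segmentation setup

    void Update()
    {
        if (!Input.GetMouseButtonDown(0) || semantics == null) return;
        string channel = semantics.ChannelAt(Input.mousePosition);
        foreach (string w in wanted)
        {
            if (channel == w)
            {
                Debug.Log($"Collected {w} for the airship!");
                break;
            }
        }
    }
}
```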

A while back, in order to prepare for the ARDK’s release and to better understand the needs of software developers, Niantic established the Early Explorers program. Leaders in XR were invited to experiment with the new Unity-based platform before it was shared with a wider public.

At the time, the platform already included some nifty custom machine learning algorithms running on top of any modern phone’s camera feed (be it Android or iOS). These enabled things such as monodepth estimation (the ability to derive depth information from a single image source), semantic segmentation (the ability to detect which pixels are grass, sky, fur, etc.), and core AR features built on top of these channels, such as real-world occlusion (so that virtual objects could hide behind real ones) and the “gameboard” (so that virtual characters could know which areas of the real-world ground plane were free of obstacles, in order to better navigate around them). An additional major component was their complete multipeer networking solution, for synchronizing the state and location of play for up to 8 players in a session. (Since then, they’ve added meshing, but more on that in a bit.)
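
Occlusion is worth pinning down with a few lines of code: per pixel, if the estimated real-world depth is closer to the camera than the virtual object’s rendered depth, the real surface wins and the virtual pixel is hidden. In a shipping app this comparison runs in a shader; the CPU-side arrays below are just an illustrative sketch of the logic, not how the ARDK implements it.

```csharp
// A minimal sketch of the idea behind real-world occlusion: compare the
// monodepth estimate against the virtual scene's depth, pixel by pixel.
public static class OcclusionDemo
{
    public static bool[] BuildOcclusionMask(
        float[] realDepth,     // estimated real-world depth per pixel (meters)
        float[] virtualDepth,  // rendered virtual depth per pixel (meters)
        int pixelCount)
    {
        var hidden = new bool[pixelCount];
        for (int i = 0; i < pixelCount; i++)
            hidden[i] = realDepth[i] < virtualDepth[i]; // real surface in front
        return hidden;
    }
}
```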

We immediately set out to explore the more intriguing aspects of this off-the-shelf toolkit. For instance, I could scan my space on an Android device, collect a map of which parts were actually my floor, and send that data to another player, who could then see a bird’s-eye-view map of my space and also where I was relative to it. Below you can see a bare-bones example of this in action: Player A is moving around their room looking for a “neutrino” and Player B is tracking her position in real time.
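
A minimal sketch of what that exercise involves, under assumptions of my own: flatten a walkable-area grid plus the scanning player’s cell into bytes, hand them to whatever transport the session exposes, and rebuild the bird’s-eye view on the receiving device. The grid format here is illustrative, not the ARDK’s actual map representation.

```csharp
using System.IO;

// A sketch of sharing a floor map between peers. The format is an assumption.
public class SharedFloorMap
{
    public bool[,] Walkable;      // true where the floor is clear
    public int PlayerX, PlayerY;  // scanning player's current cell

    public byte[] Serialize()
    {
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            int rows = Walkable.GetLength(0), cols = Walkable.GetLength(1);
            w.Write(rows); w.Write(cols);
            w.Write(PlayerX); w.Write(PlayerY);
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++)
                    w.Write(Walkable[r, c]);
            return ms.ToArray();
        }
    }

    public static SharedFloorMap Deserialize(byte[] data)
    {
        using (var r = new BinaryReader(new MemoryStream(data)))
        {
            int rows = r.ReadInt32(), cols = r.ReadInt32();
            var map = new SharedFloorMap
            {
                PlayerX = r.ReadInt32(),
                PlayerY = r.ReadInt32(),
                Walkable = new bool[rows, cols]
            };
            for (int i = 0; i < rows; i++)
                for (int j = 0; j < cols; j++)
                    map.Walkable[i, j] = r.ReadBoolean();
            return map;
        }
    }
}
```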

Without the Lightship ARDK in hand, and without the helpfulness of the Niantic engineering team, I’m not sure how we could have built such a gameplay exercise so quickly.

Snow Fight
A basic multiplayer game with straightforward logic that illuminates multipeer networking, the ability to synchronize spatially and temporally with other players for a shared experience. Supporting up to 8 players, Niantic’s advanced networking protocols allow for fine-tuning of this powerful out-of-the-box feature. In this collaborative game, you have to knock bees out of the air with snowballs.
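
To make that concrete, here is a sketch of the kind of message such a game needs to stay in sync: each throw is broadcast as a small, timestamped payload that every peer replays locally. The sendToPeers delegate stands in for whatever transport the networking layer provides; it is an assumption, not an ARDK API.

```csharp
using System;
using System.IO;

// A sketch of a networked snowball throw. Payload shape is an assumption.
public struct ThrowEvent
{
    public float Px, Py, Pz;  // throw origin in the shared coordinate space
    public float Vx, Vy, Vz;  // initial velocity of the snowball
    public double Timestamp;  // sender clock, for ordering and reconciliation

    public byte[] ToBytes()
    {
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(Px); w.Write(Py); w.Write(Pz);
            w.Write(Vx); w.Write(Vy); w.Write(Vz);
            w.Write(Timestamp);
            return ms.ToArray();
        }
    }

    // Every peer that receives the payload spawns the same snowball with the
    // same origin and velocity, so the shot looks identical on all devices.
    public static void Broadcast(ThrowEvent e, Action<byte[]> sendToPeers)
    {
        sendToPeers(e.ToBytes());
    }
}
```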

Snowball Toss
An easy-to-program game that illuminates meshing, the ability to uncover and interact with the physical topography of your space. This works on all modern Android and iOS phones (no need for a LiDAR sensor). Watching and hearing snowballs thwack and bounce against your walls and tables will always be satisfying, trust me.
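
In plain Unity terms, the loop is refreshingly simple: once meshing has produced colliders for your walls and tables, a thrown snowball is just a rigidbody launched from the camera. A minimal sketch, assuming a snowballPrefab (a sphere with a Rigidbody and SphereCollider) you set up yourself:

```csharp
using UnityEngine;

// Launches a physics snowball from the camera on tap. Once the meshed
// environment has colliders, gravity and normal Unity collisions do the rest.
public class SnowballThrower : MonoBehaviour
{
    public GameObject snowballPrefab; // sphere with Rigidbody + SphereCollider
    public float throwForce = 8f;     // tuning value, pick what feels good

    void Update()
    {
        if (!Input.GetMouseButtonDown(0)) return;
        var cam = Camera.main;
        if (cam == null || snowballPrefab == null) return;

        // Spawn just in front of the camera and fling it forward.
        GameObject ball = Instantiate(
            snowballPrefab,
            cam.transform.position + cam.transform.forward * 0.3f,
            Quaternion.identity);
        ball.GetComponent<Rigidbody>()
            .AddForce(cam.transform.forward * throwForce, ForceMode.Impulse);
        // The thwack against your real wall is just an ordinary collision
        // with the mesh collider generated from the scanned environment.
    }
}
```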

Walkabout
A navigation game that illuminates the gameboard and occlusion features, which use depth sensing and segmentation to automatically recognise the ground and real-world obstacles, and create a tiled board around them. With these features, you can design characters that move around you in a spatially believable way. In our case, we made Captain Doty walk around to build a snowman.
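
One way to think about that navigation, as a sketch: treat the walkable tiles as a grid and breadth-first search for the shortest path from the character to a goal (say, the snowman site). The bool grid below is an illustrative stand-in for the actual gameboard data, not the ARDK’s representation.

```csharp
using System.Collections.Generic;

// Breadth-first search over a grid of walkable tiles, as a character like
// Captain Doty might use to find a clear route across the gameboard.
public static class GameboardPathfinder
{
    static readonly (int dr, int dc)[] Steps = { (1, 0), (-1, 0), (0, 1), (0, -1) };

    // Returns the cells from start to goal, or null if no route exists.
    public static List<(int r, int c)> FindPath(
        bool[,] walkable, (int r, int c) start, (int r, int c) goal)
    {
        int rows = walkable.GetLength(0), cols = walkable.GetLength(1);
        var cameFrom = new Dictionary<(int r, int c), (int r, int c)>();
        var queue = new Queue<(int r, int c)>();
        queue.Enqueue(start);
        cameFrom[start] = start;

        while (queue.Count > 0)
        {
            var cur = queue.Dequeue();
            if (cur.Equals(goal)) break;
            foreach (var (dr, dc) in Steps)
            {
                var next = (r: cur.r + dr, c: cur.c + dc);
                if (next.r < 0 || next.r >= rows || next.c < 0 || next.c >= cols)
                    continue;                              // off the board
                if (!walkable[next.r, next.c]) continue;   // blocked tile
                if (cameFrom.ContainsKey(next)) continue;  // already visited
                cameFrom[next] = cur;
                queue.Enqueue(next);
            }
        }

        if (!cameFrom.ContainsKey(goal)) return null; // no route to the snowman
        var path = new List<(int r, int c)> { goal };
        for (var p = goal; !p.Equals(start); p = cameFrom[p])
            path.Add(cameFrom[p]);
        path.Reverse();
        return path;
    }
}
```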

I hope you can already start to envision how you might combine these features to your advantage, or maybe you’ve thought of some more. Sign up at https://lightship.dev/, download the Sample App code, poke around, and begin to scheme and dream. Participate. And definitely let me know what you are up to.

At Nexus Studios, we are continuing to explore the use of Niantic technology across the arts, sports, travel, education, and other fields. Next year we will release Story Trails, a massive immersive experience commissioned by Unboxed: Creativity in the UK that invites the public to both create and explore the hidden history of their community. Stay tuned for that, and for more projects in the works, as we continue to hone our practice together.
