Posted on Wed 15 Nov 2017
Virtually Useful: Step Four - tools for making in VR
In this episode, Studio Managing Producer Verity McIntosh introduces some practical tools for making creative content in VR.
Posted by Verity McIntosh
Previously: Step Three - Design dimensions (what really works in VR)
As with any medium, it takes time, practice and grim determination to get good at making creative content in VR. It can take a while to uncover your own taste, discover what you like and dislike in other people’s work, try out a number of tools and develop your own style. You may need to build your networks to collaborate with others on things like VR development, 3D modelling, animation, VR artistry and 360 filming, and this of course brings rewards of its own as you get to draw on the different skills, experience and perspectives of those in your wider team. To make a start on your own, however, and to become a better collaborator and commissioner, I share here a few tools and tricks you might wish to explore as you set out on a VR adventure.
Quick caveat: what follows is neither a complete nor a particularly well-balanced list (there is a conspicuous absence of tools for spatial sound, for instance), as I have decided to apply the age-old technique of sharing what I know, rather than attempting to pass off things that I have Googled as knowledge!
Unity and Unreal
These game engines are becoming pretty ubiquitous among those building 3D, computer-generated content for VR. Both are free to start using, with various royalty models, licensing terms and fees becoming applicable once you start earning revenue. They can both be used to make content for most VR headsets and mobile devices.
Unity and Unreal are pro tools, but both have extensive online tutorials and active forums, well populated by people asking and answering questions at various stages of development and learning. There are also libraries of pre-existing characters, objects and environments that anyone can use, which can be helpful when you are getting started.
3DS Max, Maya and Blender
These are all professional 3D computer graphics toolkits that may be familiar to those with experience of CGI or animation in film and television. 3DS Max (formerly 3D Studio Max) and Maya are products from Autodesk, and Blender is a free, open-source alternative.
These tools are used for making really high-quality assets that would be at home in a Hollywood movie. Rendering that kind of material can take days, so it would not work if served up directly into a VR headset. They are generally used as a first step for asset generation, with the assets then imported into something like Unity or Unreal to handle the real-time rendering.
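If you want a feel for what that hand-off looks like in practice, here is a minimal sketch using Blender’s built-in Python API (bpy), run from Blender’s scripting workspace. The object name and export path are placeholders of my own, and FBX is just one common interchange format that both Unity and Unreal can import.

```python
# Minimal sketch: create a placeholder asset in Blender and export it
# for import into a game engine. Assumes Blender's bundled FBX add-on.
import bpy

# Start from an empty scene so only our asset ends up in the export.
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# Add a simple placeholder mesh; a real asset would be modelled,
# sculpted and textured here (or in Maya / 3DS Max).
bpy.ops.mesh.primitive_uv_sphere_add(location=(0.0, 0.0, 0.0))
bpy.context.object.name = "HeroProp"  # hypothetical asset name

# Write an FBX file, ready to be dropped into a Unity or Unreal project,
# where the engine takes over the real-time rendering.
bpy.ops.export_scene.fbx(filepath="/tmp/hero_prop.fbx")
```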
ARKit and ARCore
This year, hot on the heels of the Pokémon Go craze, both Apple and Google have released developer toolkits to make it easier to create robust and user-friendly augmented reality apps for iPhones and Android phones respectively.
It’s worth flagging up that both Apple’s ARKit and Google’s ARCore are only useful if you are developing work for the newest handsets. For Apple that means an iPhone 6S or newer running iOS 11, and for Google’s ARCore the only suitable Android phones are the Samsung Galaxy S8 and Google’s Pixel phones, although this is likely to become less of an issue over time as future handsets are built with the presumption of AR capability.
Easy (easier) tools to get things started
There are a number of great consumer paint, draw and sculpt tools now available for most headsets. Whilst they have been primarily designed to be played with in the home, many of them can also be put to work to create 3D, navigable storyboarding and testing environments. This can help you to test out your ideas and share thinking with collaborators, or pitch concepts directly in virtual or augmented reality, without the need for any programming, CAD or 3D modelling on a computer first. Some of these tools can even be used to create audience-ready works, thanks to an insightful decision on the part of many developers to allow users to ‘save and publish’ their designs to be experienced by others.
TiltBrush – a fantastic 3D painting tool from Google, developed for the HTC Vive and latterly made available for the Oculus Rift. TiltBrush enables you to move around a space, wildly gesturing with your hand controllers, in order to wield a range of brushes and painterly effects. It is a total joy to play with.
A recent and sublime example of this is Draw Me Close, a co-production between the National Theatre and the NFB, in collaboration with Pervasive Media Studio residents All Seeing Eye and with illustrations by Teva Harrison. It explores the relationship between a mother and her son in the wake of her terminal cancer diagnosis.
Image: Draw Me Close © National Theatre
Draw Me Close places one audience member at a time in an HTC Vive headset, and in a room that has been accurately re-created and embellished as a 3D line drawing in TiltBrush. All of the objects you see drawn in TiltBrush have physical equivalents in real life, so you can walk through the door, perch on the bed and (without too many spoilers) reach out and touch the person in front of you. Draw Me Close elegantly blurs the worlds of live performance, virtual reality and animation, and has been stirring strong responses at Tribeca and the Venice Biennale.
Quill – similar to TiltBrush in many ways, developed by Oculus. As the story goes, Quill was built by the team at the now-defunct Oculus Story Studio to allow them to create ‘Dear Angelica’, a rather beautiful story told by a young woman as she reflects on the life of her mother, voiced by Geena Davis.
Image: Dear Angelica © Oculus
Blocks – a low polygon 3D modelling tool from Google.
Medium – a 3D sculpting tool from Oculus.
If TiltBrush and Quill are somewhere between an artists’ palette and MS Paint, Blocks and Medium sit somewhere between a lump of wet clay and a bucket of lego.
Poly – a couple of weeks ago, Google released Poly, a fantastic library of VR and AR objects that can be used and added to by those facing down the challenge of populating expansive virtual worlds. The tool is well integrated with Google’s other creative VR tools, TiltBrush and Blocks, but in theory the assets are ‘platform agnostic’, so should work with Apple’s ARKit as readily as with Google’s ARCore for those creating augmented reality apps for mobile.
A similar and slightly more established library, with more than 1.5 million pieces of 3D/VR content, comes from Paris-based company Sketchfab. It too has good integration with all of the major VR/AR creation tools, although it is likely to be quickly overtaken now that Google are stepping forward.
A-Frame – if you prefer your toolkits to be a little less proprietary and a little more open source, you should have a look at A-Frame, an open-source library for creating browser-based experiences in VR using a suite of tools developed by Mozilla’s MozVR team. This one is aimed at web developers who are likely to be comfortable with HTML, but not confident with the WebGL that would otherwise be needed to wrangle VR on the web. Amongst the tools developed for A-Frame is A-Painter, a TiltBrush-inspired painting programme that can be used in a headset or in a browser.
Tools for getting together
AltSpaceVR – a series of pre-made virtual environments that people from around the world can visit, meet and play in together. There are some generic spaces that you can use to host a virtual gathering, such as a bar or a garden, and there are specific events that you can attend as a virtual audience member, such as comedy gigs, lectures and (I’ve just noticed) communal meditation sessions! These environments are generally pretty lo-res, computer-generated affairs, and you will likely be a similarly lo-res avatar version of yourself – think of the first Mii characters when the Nintendo Wii launched. That said, the experience of milling around amongst a bunch of other people, who are presumably also wandering around their living rooms trying not to whack their shins on a coffee table, is not to be underestimated.
Note: AltSpaceVR did appear to be on the verge of shutting down this summer, but has recently been bought by Microsoft.
Working with 360 video
Some 360 cameras now come with on-board stitching software to knit together all of the images taken into a full, spherical format. For those that don’t, there is the image-stitching tool Kolor, a useful bit of software for those using a number of small cameras such as GoPros to record the full 360 field of view.
A filmmaker’s friend, Adobe’s Premiere Pro has a series of tools to support editing and viewing 360 video and VR.
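Whichever tool does the stitching, the output is usually an equirectangular frame: longitude and latitude map linearly onto the horizontal and vertical axes of a 2:1 image. Here is a minimal sketch of that mapping (the function name and angle conventions are my own):

```python
import math

def equirect_pixel(yaw, pitch, width, height):
    """Map a viewing direction to pixel coordinates in an equirectangular frame.

    yaw   -- longitude in radians, in [-pi, pi], with 0 straight ahead
    pitch -- latitude in radians, in [-pi/2, pi/2], with +pi/2 straight up
    """
    x = (yaw + math.pi) / (2.0 * math.pi) * width
    y = (math.pi / 2.0 - pitch) / math.pi * height
    return x, y

# Example: a point on the horizon directly behind the viewer lands on the
# left edge of a 4096 x 2048 frame, halfway up.
print(equirect_pixel(-math.pi, 0.0, 4096, 2048))  # (0.0, 1024.0)
```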
A quick glimpse of the future
One of the emerging frontiers for VR is the ability to make photo-realistic environments that you can move through. Current 360 video technology locks you to one spot, and in order to make 3D landscapes that can be rendered in real time and moved through, computer-generated environments are your only real option. There are some good experiments being done right now with photogrammetry and depth capture for those interested in getting a bit closer to the real thing, but for the full Star Trek Holodeck experience, we are going to have to wait for light field technology to develop.

Light field (plenoptic) capture is hard to describe succinctly, but loosely it is a way of recording live action within a specific area (volume) from many different angles simultaneously, creating a vast bank of images. An algorithm is then applied that reconstructs how each aspect of the scene would look from any given angle within that volume. In theory, this allows us as audience members to walk through a film set as the action plays out around us. We won’t be able to change what happens, but we could move around, under and through the scene like a friendly ghost, seeing all of the architecture, objects and lighting from every angle, rendered in real time. All of this will need a significant jump forward in camera technology and processor capability to handle the huge amounts of data being captured and crunched.
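For the mathematically curious, one common way to formalise this idea is the two-plane (‘light slab’) parameterisation from the light field literature; the notation below is my own shorthand rather than anything specific to a particular product.

```latex
% Record the radiance L along every ray that crosses two parallel
% reference planes, indexed by where it hits each plane:
%   (u, v) on the first plane, (s, t) on the second.
L(u, v, s, t)

% A pixel of a novel view, for a virtual camera at position c looking
% along direction d inside the captured volume, is then (ideally) just a
% lookup / interpolation of the stored radiance for that ray:
I_{c}(d) = L\big(u(c,d),\, v(c,d),\, s(c,d),\, t(c,d)\big)
```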
There are a few companies working on early versions of this, but we are probably still at least 5-10 years away from having developer- and artist-friendly tools to design for light field. The closest we come at the moment is Lytro Volume Tracer (VT), a new offering from one of the pioneers of light field capture, Lytro. They claim that it is ‘a Light Field rendering solution for CG content, delivering the highest fidelity, most immersive playback experience for mixed reality’. It is unclear at the moment who will be able to get their hands on this tool, or how it really works, so nothing to fit into your 2017 Christmas stocking, sadly. Personally, I think it’s one to watch in the years ahead.
Next: Step Five – It’s all about sound