Hey! Alexis here, Technical Artist at Imaginary Spaces.
Working through my various projects, I find myself wondering time and time again whether things could be done differently if the game engine's tool palette were a little more ergonomic. Over time, I have thus gotten into the habit of cooking up the various tool ideas that occur to me, from a seed-based foliage tool to an automatic greebler.
As with many of my other experiments, this small project started from a core design philosophy of mine: the mouse and keyboard aren't really the best tools for expressing heartfelt, visceral artistic ideas, and as digital creators, we often fall back on our own bodies to act out those ideas. Who hasn't grimaced at a nearby mirror as a reference for an illustration, gesticulated to other team members to describe the pace and gait of a certain character, or (and this is where this project comes in) wiggled their fingers around to explain how a tuft of smoke should behave?
It so happened that we had a Leap Motion hand tracker gathering dust in a box somewhere around the office. I decided to spend about a day seeing how far I could get using it to author particle systems from recorded hand motion data. It obviously wouldn't do all the work - but if it worked, it would save that tedious first iteration that we all hate.
At its core, the tool tracks and records the tips of your fingers as position vectors, then uses that data to infer things like hand velocity - and that data gets fed into the various Unity particle system modules. You can wiggle your fingers to modulate noise, open and close your hand to adjust the size of the particles, and (obviously) mocap complex movement data. Interestingly enough, it doesn't really require a Leap Motion, just two or more Transforms that can be tracked in space - you could probably set it up to work with VR motion controllers or simulated rigidbodies in your scene!
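The actual project lives in Unity/C#, but the inference step described above is engine-agnostic. Here is a minimal Python sketch of the idea under my own assumptions - the function names (`hand_velocity`, `hand_openness`) are illustrative, not from the project: hand velocity falls out of a finite difference on the fingertip centroid, and a simple "openness" measure (fingertip spread) could be remapped to particle start size.

```python
# Engine-agnostic sketch of the inference step: fingertip positions in,
# derived quantities out. All names here are hypothetical illustrations.
from statistics import mean


def centroid(points):
    """Average of (x, y, z) fingertip positions - a rough palm position."""
    xs, ys, zs = zip(*points)
    return (mean(xs), mean(ys), mean(zs))


def hand_velocity(prev_points, curr_points, dt):
    """Finite-difference velocity of the fingertip centroid over one frame,
    e.g. to drive a particle system's start speed or velocity module."""
    (px, py, pz) = centroid(prev_points)
    (cx, cy, cz) = centroid(curr_points)
    return ((cx - px) / dt, (cy - py) / dt, (cz - pz) / dt)


def hand_openness(points):
    """Mean fingertip distance from the centroid: a closed fist gives a
    small value, an open hand a large one - remappable to particle size."""
    cx, cy, cz = centroid(points)
    return mean(((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) ** 0.5
                for x, y, z in points)
```

Because only positions go in, any two or more tracked Transforms would do as a source, which is what makes the Leap Motion itself optional.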
After installing the drivers on my new machine and fighting a little with the Leap Motion Unity integration, I managed to get it to work. It records at runtime but stores the recorded data in a prefab, so the data survives exiting Play Mode afterwards. Working on an actual recorder was pretty fun - it's kind of interesting how, in the end, you can basically mocap into existence complex shapes that were obviously never meant to be built with the particle system curve editors.
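In Unity, that persistence comes from serializing the samples into the prefab; the recorder's core logic, though, is just "append timestamped samples while playing, interpolate them on playback." Here is a hedged Python stand-in for that idea (the class and method names are mine, not the project's), with a plain list standing in for the serialized asset:

```python
# Minimal record-then-replay sketch. In the real tool the sample list is
# serialized into a prefab so it survives Play Mode; a list stands in here.
from bisect import bisect_right


class MotionRecorder:
    def __init__(self):
        self.samples = []  # (time, value) pairs, appended in time order

    def record(self, t, value):
        """Append one sample per frame while recording at runtime."""
        self.samples.append((t, value))

    def evaluate(self, t):
        """Linearly interpolate the recording at time t - much like
        sampling a baked curve when the data drives a particle module."""
        times = [s[0] for s in self.samples]
        i = bisect_right(times, t)
        if i == 0:                       # before the first sample
            return self.samples[0][1]
        if i == len(self.samples):       # after the last sample
            return self.samples[-1][1]
        (t0, v0), (t1, v1) = self.samples[i - 1], self.samples[i]
        a = (t - t0) / (t1 - t0)
        return v0 + a * (v1 - v0)
```

Clamping at both ends keeps playback well defined outside the recorded range, which matters when a particle system outlives the gesture that authored it.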
If (or when) I come back to it, I'll try either building it in Unreal, making it support the HDRP VFX Graph, or hooking it up to Oculus Quest hand tracking (if that makes it to the Rift S)!
Happy hand-waving! You can find more info, the full project, and further documentation in the GitHub repository.