ONE OF THEIR OWN
A digital boy goes to great lengths to fit in with the human kids in his neighborhood.
Shot by Timothy Gillis
Music: INTRA by Mark Fell performed by Drumming Grupo de Percussão
Written, Directed & Animated by Allen Colombo
Produced by Effie Studios | 2020 | 4 min.
Five siblings who live near the walnut farm where we shot played the neighborhood kids.
The actors were all non-professionals except Sélynne Silver, who played Aurora, mom to the digital boy at the heart of the story.
LOCATION: We shot the movie in one day on a walnut farm near the town of Hanford in California's Central Valley. Hanford City Councilwoman Dolores Gallegos generously arranged access for our shoot (and even got us the truck).
When interacting with Ray, actors engaged with track ball rigs, with Rainer positioned just off camera to speak Ray's lines.
TRACKING: I tracked the 46 visual effects shots in Blender to extract camera and track ball motion data. The camera motion data drove a virtual camera in Blender to match the movement of the live-action camera.
The track ball motion data determined where Ray needed to be placed within each shot.
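Under the hood, placing Ray from a camera solve comes down to a projection problem: given the solved camera position, a tracked world-space point can be projected back into pixel coordinates to check the match. A minimal pinhole-projection sketch in pure Python (the function name, resolution, and focal value are illustrative, not Blender's API, and a real solve also accounts for camera rotation and lens distortion):

```python
def project_point(cam_pos, cam_focal_px, point, image_center=(960.0, 540.0)):
    """Project a world-space point through a simple pinhole camera that
    looks down the -Z axis (Blender's camera convention), returning pixel
    coordinates at an assumed 1920x1080 resolution."""
    # Translate the point into camera space.
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:
        raise ValueError("point is behind the camera (-Z is forward)")
    # Perspective divide, then shift into pixel coordinates.
    u = image_center[0] + cam_focal_px * (x / -z)
    v = image_center[1] + cam_focal_px * (y / -z)
    return (u, v)
```

A point straight ahead of the camera lands on the image center; points off-axis shift in proportion to focal length over depth, which is why a good focal estimate matters for a stable track.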
CHARACTER MODEL: MakeHuman is an open-source tool for creating human avatars (and winner of best software name of all time). I used it to create Ray, modified his skeleton in Maya, and re-textured the character in Blender for compositing into the live-action footage.
MOTION CAPTURE: To animate Ray, I acted out the performance and captured the movements with an Xbox Kinect motion sensor, with a PlayStation Move controller fastened to the top of my head to capture neck rotation. iPi Soft let me record the motion data and map it to Ray's skeleton. I exported the animation from iPi Soft as an FBX with keyframes for further polishing in Blender.
For movements requiring large spaces, such as the opening run, I sourced animation from the Rokoko Motion Library in Unity and mapped it to Ray's skeleton in Blender.
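In both cases, mapping capture data to a rig boils down to retargeting: matching each source joint to the corresponding bone on the character and copying its motion across. A minimal sketch of the idea in pure Python (the joint names and data layout are hypothetical; iPi Soft and Blender handle this internally on FBX import):

```python
# Map capture-skeleton joint names to the character rig's bone names.
JOINT_MAP = {
    "Kinect_Spine": "spine",
    "Kinect_Head": "head",
    "Kinect_LeftArm": "upper_arm.L",
}

def retarget_frame(source_frame, joint_map=JOINT_MAP):
    """Return target-bone -> rotation for one frame of capture data,
    dropping any source joints the rig has no counterpart for."""
    return {
        joint_map[joint]: rotation
        for joint, rotation in source_frame.items()
        if joint in joint_map
    }
```

Real retargeting tools also correct for differences in rest pose and bone proportions between the two skeletons; the name map is just the first step.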
PERFORMANCE CAPTURE: I built a head-mounted GoPro rig to record facial performance and mouth movement while canceling out neck movement. Faceware Analyzer converted the facial performance footage into motion data, and the Faceware Retargeter plug-in for MotionBuilder let me apply that motion data to the bones of Ray's face.
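Driving face bones from tracker data is essentially a remapping job: a normalized control value from the tracker (say, how open the jaw is) gets scaled into the bone's rotation range. A tiny sketch of that remap in Python (the ranges and control names are illustrative, not Faceware's actual output format):

```python
def remap_control(value, out_min, out_max):
    """Map a normalized facial-tracker value in [0, 1] onto a bone's
    rotation range in degrees, clamping noisy out-of-range samples."""
    value = max(0.0, min(1.0, value))
    return out_min + value * (out_max - out_min)

# e.g. a hypothetical jaw_open value of 0.5 drives the jaw bone halfway
# through a 0-30 degree range.
```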
LIGHTING AND COMPOSITING: Once animation was complete, I lit Ray in Blender to match the live-action footage and exported to After Effects, where I composited him into the finished edit. Cinematographer Tim Gillis came in to consult on color grading and help finalize the look.
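The layering step itself rests on the standard "over" operator: each pixel of the rendered character is blended onto the live-action plate by its alpha. A per-pixel sketch in pure Python (After Effects does this internally; the function here is just the math, assuming premultiplied alpha and float values in [0, 1]):

```python
def over(fg, fg_alpha, bg):
    """Composite a premultiplied foreground RGB sample over a background
    sample: the background shows through wherever the foreground is
    transparent."""
    return tuple(f + b * (1.0 - fg_alpha) for f, b in zip(fg, bg))
```

Where Ray's alpha is 1.0 the plate is fully covered; where it is 0.0 the plate passes through untouched, which is why a clean alpha out of the render is what makes the composite hold up.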