
Jurango's Technology-Driven Process

The Jurango production pipeline integrates a number of high-tech hardware and software technologies throughout the entire production process. Here are some of those we will employ in this film.

1

Screenplay Writing

We are using Scrivener for writing and editing the screenplay. Scrivener has a number of tailorable features that allow us to track the progress of each scene and bundle all of the series' information (such as character information, set descriptions and details, and scene status) into a single document.

2

Character Creation

We are using the latest version of Reallusion's Character Creator (V4) to create the most realistic characters possible. In addition, a number of plug-ins are being used to enhance this realism: realistic skins, expression wrinkles, scars, tattoos, dirt/paint, makeup, and hair builder assets.

3

Animation Sequencing

Working alongside Reallusion's Character Creator, we are using its sister tool, iClone 8, for capturing many of the animation sequences in the series. This tool includes the ability to integrate and tailor individual character animations into sequences as well as animate entire crowds of characters all at once. These sequences will be transferred to the rendering tool to produce the final image sequences for the scenes.
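To illustrate what sequencing individual animations means in practice, here is a minimal sketch of laying out clips back to back on a timeline track, the way a sequencer arranges separate character animations into one longer sequence. The clip names and durations are purely illustrative, not from the actual production or iClone's API.

```python
# Hypothetical sketch: placing animation clips on a timeline track.
# Clip names/durations are illustrative only.

def layout_clips(clips, gap=0.0):
    """Place clips back to back; return (name, start, end) tuples in seconds."""
    timeline, cursor = [], 0.0
    for name, duration in clips:
        timeline.append((name, cursor, cursor + duration))
        cursor += duration + gap
    return timeline

# Three clips chained into one sequence: each starts where the last ended.
track = layout_clips([("walk", 2.0), ("turn", 0.5), ("wave", 1.5)])
# → [("walk", 0.0, 2.0), ("turn", 2.0, 2.5), ("wave", 2.5, 4.0)]
```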

4

Motion Capture (MOCAP)

While we have acquired a considerable number of animation sequences for the series, there will be occasions when we will need something unique. For this, we will be using the latest technologies for capturing and mapping actors' motions directly onto characters in real time. This will require the use of a studio, two or more cameras, and special software for triangulating motion and mapping it onto the characters.
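The triangulation step above can be sketched in its simplest form. This assumes an idealized two-camera rig with parallel optical axes (not our actual studio setup or the mocap software's algorithm): a marker's depth follows from the disparity between its positions in the two camera images, Z = f · B / d. All numbers are illustrative.

```python
# Idealized stereo triangulation sketch (parallel two-camera rig).
# Real mocap systems solve a more general multi-camera problem.

def triangulate_depth(x_left, x_right, focal_px, baseline_m):
    """Depth of a tracked marker from its horizontal disparity (pixels)."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("marker must have positive disparity")
    return focal_px * baseline_m / disparity

# A marker at x=400 px in the left image and x=350 px in the right,
# with a 1000 px focal length and a 0.2 m camera baseline:
depth = triangulate_depth(400, 350, 1000, 0.2)  # 4.0 metres from the rig
```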

5

Lip-Syncing

Reallusion's iClone tool has built-in features that automatically translate actors' voices (audio) into phonemes, which iClone uses to move the characters' lips. Each sound (phoneme) can then be tweaked to create the most believable lip-syncing possible.
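The idea behind automatic lip-sync can be sketched as a phoneme-to-viseme mapping: each speech sound selects a mouth shape (a viseme) for the character to pose. The table below is a simplified illustrative subset, not iClone's internal mapping.

```python
# Hedged sketch of phoneme-to-viseme mapping for lip-sync.
# The viseme table is a simplified example, not a real tool's data.

VISEMES = {
    "AA": "open",  "IY": "wide",  "UW": "round",
    "M": "closed", "B": "closed", "P": "closed",
    "F": "dental", "V": "dental",
}

def phonemes_to_visemes(phonemes):
    """Collapse a phoneme stream into the mouth-shape keyframes to pose."""
    frames = []
    for p in phonemes:
        shape = VISEMES.get(p, "rest")      # unknown sounds fall back to rest
        if not frames or frames[-1] != shape:  # merge repeated shapes
            frames.append(shape)
    return frames

# The word "map" (phonemes M, AA, P) yields: closed, open, closed
shapes = phonemes_to_visemes(["M", "AA", "P"])
```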

6

Facial Expressions

iPhone Face MOCAP and AccuFACE, acquired plug-ins for iClone, allow us to map actors' detailed facial expressions directly onto the characters. Capture can be done either live or from recorded videos of the actors, which lets us work with actors remotely, record their facial expressions, and map them onto the characters at a later time.

7

Animation Rendering

We will be using the latest version of Unreal Engine to render each of the frames in the animation series. This software has advanced technologies (such as ray tracing, Lumen, and Nanite) that render very realistic, detailed, and complex images in near real time. In addition, we will be using the latest hardware technology for final production rendering: a 13th-generation Intel Core i9 processor with 64GB of memory and an NVIDIA RTX 4080 graphics card.
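A quick back-of-envelope calculation shows what frame-by-frame rendering implies at film frame rates. The per-frame render time here is an illustrative assumption, not a measurement on the production hardware.

```python
# Back-of-envelope render budget at 24 fps.
# seconds_per_frame is an assumed figure for illustration only.

def render_budget(minutes_of_animation, fps=24, seconds_per_frame=2.0):
    """Return (total frames, total render hours) for a stretch of animation."""
    frames = minutes_of_animation * 60 * fps
    hours = frames * seconds_per_frame / 3600
    return frames, hours

# A 10-minute episode: 14,400 frames; at an assumed 2 s/frame,
# about 8 hours of rendering.
frames, hours = render_budget(10)
```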

8

Animation Compositing

We will be using DaVinci Resolve to produce the final animation sequences from the individual frames rendered by Unreal. Resolve comes with many features (such as frame editing, color correction, visual effects, motion graphics, and audio post-production) to produce the most realistic sequences possible.
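Before frames are handed to the compositor, it is worth verifying that the rendered sequence is complete. This small sketch assumes a hypothetical naming convention of frame_0001.png, frame_0002.png, and so on (not our actual output naming) and reports any gaps in the sequence.

```python
# Pre-compositing check: find missing frames in a rendered sequence.
# The frame_NNNN.png naming convention is a hypothetical example.
import re

def missing_frames(filenames):
    """Return the frame numbers absent from a rendered image sequence."""
    numbers = sorted(int(m.group(1))
                     for f in filenames
                     if (m := re.match(r"frame_(\d+)\.png$", f)))
    if not numbers:
        return []
    present = set(numbers)
    return [n for n in range(numbers[0], numbers[-1] + 1) if n not in present]

# Frame 3 never rendered, so it is reported as missing.
gaps = missing_frames(["frame_0001.png", "frame_0002.png", "frame_0004.png"])
```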

[Figure: Animation Pipeline D02. Detailed flow of the Production Phase]

