Jurango's Technology-Driven Pipeline (Process)
The Jurango production pipeline integrates high-end hardware and software technologies throughout the entire production process. Here are some of the technologies we will employ in this film.


1. Screenplay Writing
We are using Scrivener for writing and editing the screenplay. Scrivener has a number of customizable features that allow us to track the progress of each scene and bundle all of the series' information (such as character profiles, set descriptions and details, and scene status) into a single document.

2. Character Design
We are using the latest version of Reallusion's Character Creator (V5) to create the most realistic characters possible. In addition, a number of plug-ins and asset packs are being used to enhance this realism: realistic skins, expression wrinkles, scars, tattoos, dirt and paint, makeup, and hair-builder assets.

3. Animation Sequencing
Working alongside Reallusion's Character Creator, we are using its sister tool, iClone 8, to create all the animation sequences in the series. iClone can integrate and tailor individual character animations into sequences, and it can animate entire crowds of characters at once. These sequences will then be transferred to the rendering tool to produce the final image sequences for each scene.
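The sequencing idea above can be sketched in a few lines: individual clips are laid out back to back on a shared timeline. The clip names and durations below are purely illustrative, not taken from the series.

```python
# Sketch of sequencing individual animation clips on a shared timeline.
# Clip names and durations are hypothetical examples.

def build_sequence(clips):
    """Lay out (name, duration_seconds) clips back to back,
    returning (name, start, end) entries for the timeline."""
    timeline, cursor = [], 0.0
    for name, duration in clips:
        timeline.append((name, cursor, cursor + duration))
        cursor += duration
    return timeline

seq = build_sequence([("walk", 2.0), ("turn", 0.5), ("sit", 1.5)])
print(seq)  # -> [('walk', 0.0, 2.0), ('turn', 2.0, 2.5), ('sit', 2.5, 4.0)]
```

A real sequencer also handles blending between clips and per-character tracks; this sketch shows only the timeline layout.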


4. Facial Expressions
iPhone Face MOCAP and AccuFACE, acquired plug-ins for iClone, allow us to map actors' detailed facial expressions directly onto the characters. Capture can be done either live or from recorded videos of the actors. This lets us work with actors remotely, recording their facial expressions and mapping them onto the characters at a later time.
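Facial capture of this kind typically drives per-frame blendshape weights on the character. As a minimal sketch of one common cleanup step, the snippet below smooths noisy captured weights with an exponential moving average before they are applied to the rig. The blendshape name is an ARKit-style convention used only as an illustration; it is not taken from the tools above.

```python
# Sketch: smoothing captured facial blendshape weights before mapping
# them onto a character. The "jawOpen" channel name is a hypothetical
# ARKit-style example.

def smooth(frames, alpha=0.5):
    """Exponential moving average over per-frame blendshape weights."""
    out, prev = [], None
    for frame in frames:
        if prev is None:
            prev = dict(frame)
        else:
            prev = {k: alpha * frame[k] + (1 - alpha) * prev[k] for k in frame}
        out.append(prev)
    return out

captured = [{"jawOpen": 0.0}, {"jawOpen": 1.0}, {"jawOpen": 1.0}]
print(smooth(captured))
# -> [{'jawOpen': 0.0}, {'jawOpen': 0.5}, {'jawOpen': 0.75}]
```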

5. Lip Syncing
Reallusion's iClone tool has built-in features that automatically translate actors' voices (audio) into phonemes, which iClone uses to move the characters' lips. Each sound (phoneme) can then be tweaked to create the most believable lip syncing possible.
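The phoneme-to-mouth-shape idea can be illustrated with a small sketch. The phoneme grouping and timings below are hypothetical; in production, iClone derives the phoneme timings automatically from the actors' audio.

```python
# Sketch of mapping phonemes to mouth shapes (visemes) for lip sync.
# The phoneme grouping here is illustrative, not iClone's actual table.
PHONEME_TO_VISEME = {
    "AA": "open",   "AE": "open",   "AH": "open",
    "B":  "closed", "M":  "closed", "P":  "closed",
    "F":  "dental", "V":  "dental",
    "OW": "round",  "UW": "round",
}

def viseme_track(phonemes):
    """Convert (phoneme, start, end) tuples into viseme keyframes,
    merging consecutive identical mouth shapes."""
    track = []
    for phoneme, start, end in phonemes:
        shape = PHONEME_TO_VISEME.get(phoneme, "neutral")
        if track and track[-1][0] == shape:
            track[-1] = (shape, track[-1][1], end)  # extend previous keyframe
        else:
            track.append((shape, start, end))
    return track

# "mama" -> M AH M AH
print(viseme_track([("M", 0.0, 0.1), ("AH", 0.1, 0.2),
                    ("M", 0.2, 0.3), ("AH", 0.3, 0.4)]))
# -> [('closed', 0.0, 0.1), ('open', 0.1, 0.2),
#     ('closed', 0.2, 0.3), ('open', 0.3, 0.4)]
```

The per-phoneme tweaking described above corresponds to adjusting individual keyframes in a track like this one.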


6. Body Motion Capture (MOCAP)
While we have acquired a considerable number of animation sequences for the series, there will be occasions when we will need something unique. For this, we will be using the latest technologies for capturing and mapping actors' motions directly onto characters in real time. This requires just one high-resolution camera: no special equipment (suits) or software is needed.
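Markerless capture of this kind typically works by detecting 2D body keypoints in each camera frame and converting them into joint angles that drive the character rig. As a minimal sketch under that assumption, the snippet below computes an elbow bend angle from three detected keypoints; the joint names and coordinates are illustrative.

```python
import math

# Sketch of one step in markerless MOCAP retargeting: computing a joint
# angle from 2D keypoints detected in a single camera frame.
# Joint names and positions are hypothetical examples.

def angle_at(b, a, c):
    """Interior angle at joint b formed by points a-b-c, in degrees."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

keypoints = {"shoulder": (0.0, 0.0), "elbow": (1.0, 0.0), "wrist": (1.0, 1.0)}
bend = angle_at(keypoints["elbow"], keypoints["shoulder"], keypoints["wrist"])
print(round(bend))  # -> 90 (a right-angle arm)
```

A full solver does this for every joint in every frame, then applies the resulting rotations to the character's skeleton.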

7. Animation Rendering
We will be using the latest version of Unreal Engine to render each of the frames in the animation series. This software has advanced technologies (such as ray tracing, Lumen, and Nanite) that render very realistic, detailed, and complex images in near real time. In addition, we will be using the latest hardware for the rendering: a 13th-generation Intel Core i9 processor with 128GB of memory and the most advanced graphics card available (an NVIDIA GeForce RTX 5090) for final production.


8. Animation Compositing
We will be using DaVinci Resolve to produce the final animation sequences from the individual frames rendered by Unreal. Resolve comes with many features (such as frame editing, color correction, visual effects, motion graphics, and audio post-production) to produce the most realistic sequences possible.
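Because the compositor assembles the sequence from individually rendered frames, a routine sanity check is confirming the numbered image sequence is complete before import. The sketch below shows that check; the file-naming pattern is a hypothetical example, not the project's actual convention.

```python
import re

# Sketch: verify a rendered image sequence has no missing frames before
# compositing. The "shot01.####.png" naming pattern is illustrative.

def missing_frames(filenames):
    """Return the frame numbers absent from a rendered sequence."""
    numbers = sorted(int(re.search(r"\.(\d+)\.png$", f).group(1))
                     for f in filenames)
    expected = set(range(numbers[0], numbers[-1] + 1))
    return sorted(expected - set(numbers))

frames = ["shot01.0001.png", "shot01.0002.png", "shot01.0004.png"]
print(missing_frames(frames))  # -> [3]
```

Catching a dropped frame at this stage is far cheaper than discovering a stutter after the sequence has been graded and exported.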


