🎨 3D Artist Guide

Overview

Whether you create Live2D or 3D characters, your rigged models can be used on Hologram.

Steps

  1. Model, rig, and skin your character using any software you are comfortable with.

  2. (Optional) Animate your 3D model using blend shapes or skeletal animations.

  3. Import your character model into Hologram's web studio to try out face tracking.

  4. Use your character model on Google Meet or Discord through the Hologram Chrome Extension.

Requirements (3D)

  1. For rigging, you need to follow the Apple ARKit blend shape naming convention. More on this in the "Rigging (3D)" link below, under "References".

  2. Export your 3D model as a .gltf or .glb file.
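Before uploading, it can help to sanity-check that a `.glb` export is actually a well-formed binary glTF file. The sketch below checks the standard 12-byte GLB header defined by the glTF 2.0 spec (magic bytes `"glTF"`, version 2, total file length); the function name is our own, not part of Hologram's tooling.

```python
import struct

GLB_MAGIC = 0x46546C67  # ASCII "glTF", little-endian


def is_valid_glb_header(data: bytes) -> bool:
    """Check the 12-byte GLB header: magic, container version, total length."""
    if len(data) < 12:
        return False
    magic, version, length = struct.unpack("<III", data[:12])
    return magic == GLB_MAGIC and version == 2 and length == len(data)
```

A `.gltf` export is plain JSON instead, so it can be checked with any JSON parser rather than a binary header.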

File Formats

Our Chrome extension supports 3D characters in glTF format and Live2D characters in Live2D Cubism's standard file types. We will soon support VRM and FBX formats as well.
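As a quick pre-flight check, you can match a file name against the currently supported suffixes. This is a hypothetical helper, not part of the extension; it assumes the Live2D Cubism model file uses the standard `.model3.json` suffix.

```python
from typing import Optional

# Formats the Chrome extension currently accepts (VRM and FBX are planned).
SUPPORTED_SUFFIXES = {
    ".gltf": "3D (glTF, JSON)",
    ".glb": "3D (glTF, binary)",
    ".model3.json": "Live2D Cubism model",
}


def detect_format(filename: str) -> Optional[str]:
    """Return a human-readable label for a supported file, else None."""
    name = filename.lower()
    for suffix, label in SUPPORTED_SUFFIXES.items():
        if name.endswith(suffix):
            return label
    return None
```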

Blendshapes

Our real-time face tracking follows the Apple ARKit blend shape naming convention. However, not every blend shape needs to be created.

Hologram has a tier system for the ARKit blend shapes. Tier 1 consists of the baseline coefficients a character model needs in order to be functional. Tiers 2 and 3 consist of blend shapes that make a character model much more expressive.

Tier 1: eyeBlinkLeft, eyeBlinkRight, eyeWideLeft, eyeWideRight, jawOpen, mouthSmileLeft, mouthSmileRight

Tier 2: eyeSquintLeft, eyeSquintRight, mouthFunnel, mouthPucker, mouthFrownLeft, mouthFrownRight, mouthShrugUpper, mouthShrugLower, browDownLeft, browDownRight, browInnerUp, browOuterUpLeft, browOuterUpRight, noseSneerLeft, noseSneerRight

Tier 3: cheekPuff, mouthLowerDownLeft, mouthLowerDownRight, mouthUpperUpLeft, mouthUpperUpRight, mouthLeft, mouthRight
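The tier lists above can be turned into a simple coverage check: given the morph target names exported with your model, report which blend shapes are still missing from each tier. This is an illustrative sketch, not Hologram tooling; `missing_by_tier` is a name we made up.

```python
# ARKit blend shape names grouped by Hologram's tiers (from the lists above).
ARKIT_TIERS = {
    1: ("eyeBlinkLeft", "eyeBlinkRight", "eyeWideLeft", "eyeWideRight",
        "jawOpen", "mouthSmileLeft", "mouthSmileRight"),
    2: ("eyeSquintLeft", "eyeSquintRight", "mouthFunnel", "mouthPucker",
        "mouthFrownLeft", "mouthFrownRight", "mouthShrugUpper",
        "mouthShrugLower", "browDownLeft", "browDownRight", "browInnerUp",
        "browOuterUpLeft", "browOuterUpRight", "noseSneerLeft",
        "noseSneerRight"),
    3: ("cheekPuff", "mouthLowerDownLeft", "mouthLowerDownRight",
        "mouthUpperUpLeft", "mouthUpperUpRight", "mouthLeft", "mouthRight"),
}


def missing_by_tier(model_shapes):
    """Map each tier to the blend shape names the model does not provide."""
    have = set(model_shapes)
    return {tier: [name for name in names if name not in have]
            for tier, names in ARKIT_TIERS.items()}
```

A model is functional once `missing_by_tier(...)[1]` is empty; clearing Tiers 2 and 3 makes it progressively more expressive.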

Rigging

Our full-body tracking follows the naming standard below.

Testing

The easiest way to test your Live2D or 3D character model with Hologram's face-tracking system is via our web studio.
