Creative Cloud becomes more user-friendly with each release. One of the most exciting additions in 2017 is a beta product called “Character Animator.” Before you read on, let this video speak for itself.
Character Animator allows digital artists to bring still drawings to life more quickly – and more intuitively – than ever before. The program automatically detects facial features with a computer’s webcam and phonemes with its microphone. These inputs are then mapped to layers that have been built into a still drawing in Photoshop or Illustrator. For example, a character illustration may include five (or more) versions of its mouth: a neutral expression, a smile, an open mouth for the sound /O/ (“oh”), a wide grin for /E/ (“ee”), and a curled lower lip for /f/. These layers are toggled on and off automatically as the user’s facial expressions and speech change. The result is an animation that mirrors its artist in real time: eyebrows raise and lower, eyes blink, the head tilts from side to side, and the mouth simulates speech.
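To make the layer-swapping idea concrete, here is a minimal sketch in Python. It is purely illustrative and assumes nothing about Adobe’s internals: the `VISEME_TO_LAYER` table, the `Puppet` class, and `on_viseme_detected` are hypothetical names invented for this example.

```python
# Illustrative sketch only (not Adobe's engine or API): map detected mouth
# sounds (visemes) to drawing layers and toggle which layer is visible.

VISEME_TO_LAYER = {
    "neutral": "Mouth/Neutral",
    "smile":   "Mouth/Smile",
    "oh":      "Mouth/Oh",   # open mouth for /O/
    "ee":      "Mouth/Ee",   # wide grin for /E/
    "f":       "Mouth/F",    # curled lower lip for /f/
}

class Puppet:
    """Holds named layers and shows exactly one mouth layer at a time."""
    def __init__(self, layer_names):
        self.visible = {name: False for name in layer_names}

    def show_only(self, group_prefix, layer_name):
        # Hide every layer in the group, then reveal the requested one.
        for name in self.visible:
            if name.startswith(group_prefix):
                self.visible[name] = False
        self.visible[layer_name] = True

def on_viseme_detected(puppet, viseme):
    """Called each frame with the viseme inferred from the microphone."""
    layer = VISEME_TO_LAYER.get(viseme, VISEME_TO_LAYER["neutral"])
    puppet.show_only("Mouth/", layer)

# Example: the audio analysis reports an "oh" sound, so only Mouth/Oh is drawn.
puppet = Puppet(VISEME_TO_LAYER.values())
on_viseme_detected(puppet, "oh")
print([name for name, shown in puppet.visible.items() if shown])  # ['Mouth/Oh']
```

The point of the sketch is the lookup-and-toggle pattern: the artwork supplies one layer per mouth shape, and the detected sound simply selects which of those layers is visible on any given frame.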
While Character Animator greatly reduces the time required to produce a moving puppet, it’s important to remember that any form of animation is time-consuming. When using Character Animator, each expression must be drawn and fine-tuned at the beginning of the process. The character must then be rigged before it will move properly – that is, the program must be told which layer is the “left eye,” which is the “right eye,” the “nose,” the “mouth,” and so on. Character Animator is not fully automatic, but it offers a major improvement to the process of 2D animation – and it’s a lot of fun to use!
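Rigging can be pictured the same way: in the short sketch below (again with hypothetical names, not Adobe’s actual format), the rig is just a mapping from feature tags to the layer names found in the exported artwork.

```python
# Illustrative sketch only: a rig tells the engine which drawing layer plays
# which anatomical role. Tag and layer names here are hypothetical.

RIG_TAGS = {
    "left eye":  "Head/LeftEye",
    "right eye": "Head/RightEye",
    "nose":      "Head/Nose",
    "mouth":     "Mouth",   # the group containing the viseme layers above
}

def validate_rig(rig, available_layers):
    """Report any rig tag that points at a layer missing from the artwork."""
    missing = [tag for tag, layer in rig.items() if layer not in available_layers]
    if missing:
        raise ValueError(f"Unrigged features: {', '.join(missing)}")
    return True

# Example: the layer names exported from Photoshop all match the rig, so the
# puppet is fully rigged and ready to be driven by the webcam and microphone.
artwork_layers = {"Head/LeftEye", "Head/RightEye", "Head/Nose", "Mouth"}
validate_rig(RIG_TAGS, artwork_layers)
```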