Adobe Character Animator is a desktop application that combines live motion capture with a multi-track recording system to control layered 2D puppets drawn in Photoshop or Illustrator. It is used to produce both live and pre-recorded animation.
Character Animator imports layered Photoshop and Illustrator documents as puppets, to which behaviors are applied. Puppets are then placed into a scene, which can be viewed in the Scene and Timeline panels. Rigging is set up in the Puppet panel, though basic rigging is fully automatic when layers follow specific naming conventions such as Right Eyebrow and Smile. Properties of selected elements, including behavior parameters, can be examined and changed in the Properties panel. Live inputs include a webcam (for face tracking), a microphone (for live lip sync), the keyboard (for triggering layers to show or hide), and the mouse (for warping specific handles).
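As a rough illustration of the naming-based auto-rigging described above, a source Photoshop document might be organized like this (a hypothetical sketch; the exact layer set and group markers vary by puppet and version):

```
Puppet.psd
  +Head              ("+" prefix marks a group that warps independently)
    Right Eyebrow
    Left Eyebrow
    Right Eye
    Left Eye
    Mouth
      Neutral
      Smile          (one of the mouth shapes used for lip sync)
      ...
  +Body
```

Layers whose names match these conventions are tagged automatically on import; everything else can be rigged manually in the Puppet panel.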
Final output of a scene can be exported as a sequence of PNG files plus a WAV file, or as any video format supported by Adobe Media Encoder. Live output can be sent to other applications running on the same machine via the Syphon protocol (Mac only) or Adobe Mercury Transmit (Mac and Windows). Scenes can also be dropped directly into After Effects and Premiere Pro via Dynamic Link, avoiding an intermediate render.
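A PNG sequence and its companion WAV file can be muxed into a video outside of Media Encoder as well. A minimal sketch using ffmpeg (the file names, frame rate, and codec choices here are illustrative assumptions, not Character Animator defaults):

```shell
# Assumed export names: Scene_00000.png, Scene_00001.png, ... plus Scene.wav
FPS=24
CMD="ffmpeg -framerate $FPS -i Scene_%05d.png -i Scene.wav \
-c:v libx264 -pix_fmt yuv420p -c:a aac -shortest Scene.mp4"
# Print the command for inspection; run it with: eval "$CMD"
echo "$CMD"
```

`-shortest` stops encoding when the shorter of the two inputs ends, which keeps audio and video durations aligned.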
System requirements:
- Intel, 64-bit processor
- macOS 10.13 or later
- 8 GB of RAM
- GPU with at least OpenGL 3.2 support
Size – 1.07 GB