
Recent developments in deepfake technology are pushing the boundaries of reality manipulation to an alarming extent. Alibaba Group’s Institute for Intelligent Computing has released Animate Anyone, a generative video technique that significantly steps up the game.

  • The Newest Innovation: Animate Anyone outperforms predecessors such as DisCo and DreamPose at creating convincing video deepfakes. It has crossed the chasm from rudimentary academic experiment to content that looks real unless scrutinized closely.
  • How It Works: The technology extracts details such as facial features and poses from a reference image and maps them onto a sequence of slightly different poses, producing a convincing series of frames. The challenge, however, lies in the ‘hallucination’ problem: the model must plausibly invent details, like the movement of a sleeve or hair, when a person turns.
  • Not Yet There: Animate Anyone is a significant improvement on past attempts, though it has yet to achieve perfection. The technique uses an intermediate step to retain both coarse and fine details, improving the quality of the final result.
  • What the Future Holds: The potential for misuse of this technology is considerable. Given a single high-quality image, a malicious actor could make a person appear to do or say almost anything, especially when the technique is combined with facial animation and voice-capture technology.
  • Upcoming Advances: While the technology is currently too complex for public use, the rapid evolution in the AI world suggests that it may not remain that way for long. The team is actively working on preparing the demo and code for public release, and although there’s no specific release date yet, their commitment to sharing the technology is firm.
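To make the pipeline described above concrete, here is a purely illustrative toy sketch of its high-level structure: an appearance encoder summarizes a reference image, and a generator combines those features with one pose per frame. None of these functions or shapes come from the actual Animate Anyone model (which is a diffusion-based system); they are hypothetical stand-ins using plain arrays.

```python
import numpy as np

# Toy stand-ins for the pose-guided animation pipeline described above.
# These are NOT the Animate Anyone APIs; they only mimic the data flow:
# reference image -> appearance features; (features, pose) -> frame.

def extract_appearance(reference_image: np.ndarray) -> np.ndarray:
    """Toy 'appearance encoder': summarize the reference as per-channel means."""
    return reference_image.mean(axis=(0, 1))  # shape (3,)

def render_frame(appearance: np.ndarray, pose: np.ndarray) -> np.ndarray:
    """Toy 'generator': fill a frame with the reference appearance and
    stamp each pose keypoint as a bright dot, standing in for pose conditioning."""
    h, w = 64, 64
    frame = np.ones((h, w, 3)) * appearance
    for x, y in pose:
        frame[int(y) % h, int(x) % w] = 1.0
    return frame

def animate(reference_image: np.ndarray, pose_sequence: list) -> list:
    """Map one reference image onto a pose sequence, one frame per pose."""
    appearance = extract_appearance(reference_image)
    return [render_frame(appearance, pose) for pose in pose_sequence]

reference = np.random.rand(64, 64, 3)                    # stand-in reference photo
poses = [np.random.rand(17, 2) * 64 for _ in range(8)]   # 8 frames of 17 keypoints
frames = animate(reference, poses)
print(len(frames), frames[0].shape)  # -> 8 (64, 64, 3)
```

The real system replaces each toy function with learned networks, and the ‘hallucination’ problem arises precisely because `render_frame` must invent pixels the reference image never showed.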

The implications of this technology are potentially far-reaching and we might soon face a wave of realistic ‘dancefakes’ that could shake our sense of reality.


