AI-driven Animations Bring Digital Avatars To Life

You’ve never seen sprites move like this.
 
Even with the assistance of automated animation features in modern game-development engines, bringing on-screen avatars to life can be an arduous and time-consuming task. However, a recent string of advancements in AI could soon help drastically reduce the number of hours needed to create realistic character movements.
 
Take basketball games like the NBA2K franchise, for example. Prior to 2010, the on-screen players — be it Shaq, LeBron, KD or Curry — were all modeled on regular-sized people wearing motion-capture suits.
 
“There was a time when NBA2K was made entirely of animators and producers,” 2K’s Anthony Tominia told the Evening Standard in 2016. However, even when the developers began bringing in the NBA players themselves, they were still faced with the costly and time-consuming challenge of capturing their body motions for each movement — dribbling the ball, shooting, jumping — and then translating that data to their in-game avatars.
 
“With motion capture, the data that we have is all that we have, in the sense that if we capture somebody dribbling a ball across the room at a particular speed, then we have it at that speed,” Jessica Hodgins, professor of computer science and robotics at Carnegie Mellon University, told Engadget. “We don’t have the ability to easily adapt it to turning, or running at a different speed, or dribbling with a different pattern.”
 
However, a new system developed at CMU in conjunction with California-based DeepMotion Inc. could help slash production times for dribbling animations. It uses a "deep reinforcement learning" technique to generate lifelike dribbling motions in real time through trial and error. Essentially, the system learns to animate dribbling through practice. Lots and lots of practice.
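To give a flavor of that trial-and-error loop, here is a deliberately tiny, hypothetical sketch — not the CMU/DeepMotion system itself. A toy "dribbler" practices choosing a push force to keep a ball near a target height, and a standard Q-learning update gradually rewards the choices that work:

```python
import random

# Illustrative toy only: the environment, names, and reward are
# assumptions for demonstration, not the CMU/DeepMotion model.
N_STATES, N_ACTIONS = 5, 2   # ball-height buckets 0..4; actions: 0 = soften, 1 = push harder
TARGET = 2                   # the agent is rewarded for keeping the ball at this height

def step(state, action):
    """Deterministic toy physics: each action nudges the ball height by one bucket."""
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    reward = 1.0 if nxt == TARGET else 0.0
    return nxt, reward

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Learn purely by practicing: try actions, observe rewards, update estimates."""
    rng = random.Random(seed)
    q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]  # value estimate per (state, action)
    for _ in range(episodes):
        s = rng.randrange(N_STATES)
        for _ in range(10):  # one short practice rollout
            # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
            if rng.random() < eps:
                a = rng.randrange(N_ACTIONS)
            else:
                a = max(range(N_ACTIONS), key=lambda x: q[s][x])
            nxt, r = step(s, a)
            # Q-learning update: move the estimate toward reward + discounted future value.
            q[s][a] += alpha * (r + gamma * max(q[nxt]) - q[s][a])
            s = nxt
    return q

q = train()
# The learned policy pushes the ball up when it is too low and softens when too high.
policy = [max(range(N_ACTIONS), key=lambda a: q[s][a]) for s in range(N_STATES)]
```

The real system replaces this handful of buckets with a physics simulation and a deep neural network, but the principle is the same: no motion-capture data dictates the movement; the controller discovers it through millions of practice attempts.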

Source: Engadget