OVERVIEW

UN REAL EMOTION is a project that began as an exploration of new ways to show emotion in the Metaverse. Current technology still focuses on carrying people's motion into the Metaverse, not their facial expressions or emotions.

In the following video, three interviewees answer questions while their emotions are shown in an original way: the sphere hovering behind each of them expresses their emotion through changing colors and shapes.

MetaHuman Research

Virtual Human

UN REAL EMOTION started with the creation of virtual humans representing different races and nationalities. They were created using several programs, including Unreal Engine and After Effects.

The characters' looks are intentionally not tied to any specific race or nationality, representing a future with fewer boundaries. These universal characters have varied eye colors, skin tones, and bone structures. All of these features, including their outfits, were carefully adjusted to avoid feeling unnatural while still giving each character distinctive traits.

Once the characters were designed, we interviewed different models while capturing their facial movements with the Live Link Face app. This allowed the virtual humans to mirror the facial expressions of real humans.
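
As a rough illustration of this step, the sketch below reads a facial-capture take, assumed to be exported as a CSV of per-frame blendshape coefficients, and smooths the curves before they drive a character's facial controls. It is a minimal sketch of the general idea, not the project's actual pipeline: the file name, column names, and smoothing window are illustrative assumptions.

    # Minimal sketch (not the project's actual pipeline): load a facial-capture
    # take exported as CSV and smooth the per-frame blendshape curves before
    # retargeting them onto a virtual human. The file name and column names
    # ("JawOpen", "MouthSmileLeft", ...) are illustrative ARKit-style assumptions.
    import csv
    from collections import defaultdict

    def load_blendshape_curves(path):
        """Return {blendshape_name: [value per frame]} from a capture CSV."""
        curves = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                for name, value in row.items():
                    if name.lower() in ("timecode", "frame"):
                        continue  # skip non-blendshape columns
                    try:
                        curves[name].append(float(value))
                    except ValueError:
                        pass  # ignore cells that are not numeric
        return curves

    def smooth(values, window=5):
        """Simple moving average to soften capture jitter."""
        half = window // 2
        return [
            sum(values[max(0, i - half): i + half + 1]) /
            len(values[max(0, i - half): i + half + 1])
            for i in range(len(values))
        ]

    curves = load_blendshape_curves("interview_take_01.csv")
    smoothed = {name: smooth(vals) for name, vals in curves.items()}
    # `smoothed` can then drive the corresponding facial controls of the character.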

Emotion Visualization

Emotion visualization is another key point of this project. Today's technology inevitably has limits in reproducing human facial expressions in the virtual world. Our ability to express and read emotion decreases significantly there, which led us to look for another way to show emotion in the virtual world.

Instead of merely presenting the analyzed emotions as numbers and graphs, we wanted them to be shown and felt more intuitively. To do so, we first analyzed the emotions in our real-human interview footage using Morphcast, an AI that recognizes facial expressions and emotions. The extracted data was then translated into a sphere-shaped object: the object hovering behind each virtual human changes its size, shape, and color according to the emotion data, following our mapping algorithm.
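
The sketch below illustrates one way such a mapping could work: per-frame emotion scores are converted into a radius, a surface-distortion amount, and a color for the sphere. The emotion names and the mapping itself are hypothetical assumptions for illustration; they are not Morphcast's actual output schema or the project's actual algorithm.

    # Minimal sketch of the idea, not the project's actual algorithm: per-frame
    # emotion scores (field names here are hypothetical, not Morphcast's real
    # output schema) are mapped to the sphere's radius, surface distortion, and
    # color so the object can be animated frame by frame.
    import colorsys

    # Hypothetical hue anchors per emotion (0..1 around the color wheel).
    EMOTION_HUES = {"happiness": 0.14, "surprise": 0.55, "sadness": 0.62, "anger": 0.0}

    def sphere_params(scores):
        """Map {emotion: score in 0..1} to sphere size, distortion, and RGB color."""
        total = sum(scores.values()) or 1.0
        weights = {k: v / total for k, v in scores.items()}

        # Stronger overall emotion -> larger sphere; more "agitated" emotions
        # (anger, surprise) -> more surface distortion.
        intensity = max(scores.values(), default=0.0)
        radius = 0.5 + 0.5 * intensity
        distortion = weights.get("anger", 0.0) + 0.5 * weights.get("surprise", 0.0)

        # Blend the hue anchors by emotion weight, then convert to RGB.
        hue = sum(EMOTION_HUES[e] * w for e, w in weights.items() if e in EMOTION_HUES)
        r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.9)
        return {"radius": radius, "distortion": distortion, "color": (r, g, b)}

    print(sphere_params({"happiness": 0.7, "surprise": 0.2, "sadness": 0.05, "anger": 0.05}))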

Interview

The final video consists of the virtual humans answering interview questions designed to draw out emotional changes in the interviewees. The interviews are conducted in different languages, anticipating a near future in which there is no language barrier.

Please find the full-length video here

Credits

Creative Director / Producer
Toshiyuki Hashimoto (The Shift)
Art Director / Character 1 Voice
Jiyu Park (The Shift)
3D Artist / Designer
Mika Hirata (The Shift)
3D Engineer
Seiya Nakano (aircord)
Interactive Developer
Sana Yamaguchi (aircord)
Sound Researcher
Maho Ishizaka (aircord)
Character 3 Voice
Diana Ganea
Character 2 Voice
Jonathan Campbell