Say hello to the E2-Mask Z. This face-covering headset uses AI to read expressions of happiness, sadness, anger, and more, and then renders those emotions via an anime avatar.
The headset was developed by the Hirata Takegawa Lab, and it should certainly be possible to swap the anime girl for a different avatar. The idea is that this digital face mask could be worn by dentists to help calm children, by people attending parties or job interviews, or even by actors during theatrical plays.
Below the screen, a gap covered by see-through cloth preserves the wearer's field of vision, so the contraption can be worn in public.
Here is the official explanation from the project’s website:
To enable digital facial augmentation with an avatar in a real space, we propose a digital face mask display system that integrates a lightweight flexible display with a thin facial expression recognition system. The thin wearable facial expression recognition system was implemented with photo reflective sensor arrays which can measure facial expressions at 40 feature points distributed across an entire face. We investigated a ten-class facial expression identification model based on an SVM training algorithm. The trained model achieved an average accuracy of 79% when identifying the facial expressions of multiple users.
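The ten-class SVM approach described above can be sketched as follows. This is an illustrative toy, not the lab's actual pipeline: the sensor data is synthetic, and the RBF kernel and scikit-learn library are assumptions (the source only says the model used "an SVM training algorithm" on 40 feature points).

```python
# Toy sketch: train a ten-class SVM on 40-dimensional feature vectors,
# mimicking the 40 photo reflective sensor readings described above.
# All data is synthetic; kernel and hyperparameters are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N_SENSORS, N_EXPRESSIONS, SAMPLES_PER_CLASS = 40, 10, 50

# Each expression class gets its own mean reading per sensor,
# plus Gaussian noise to simulate variation across users.
centers = rng.normal(0.0, 1.0, size=(N_EXPRESSIONS, N_SENSORS))
X = np.vstack(
    [c + rng.normal(0.0, 0.8, (SAMPLES_PER_CLASS, N_SENSORS)) for c in centers]
)
y = np.repeat(np.arange(N_EXPRESSIONS), SAMPLES_PER_CLASS)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Standardize sensor values, then fit the multi-class SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

On real sensor data the reported figure was a 79% average accuracy across users; the synthetic clusters here are easier, so the toy score will differ.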
See the E2-Mask Z in action below: