Dynamic Realtime Animation Control

Our project aims to build an application that detects the user's facial expressions and gestures in real time and maps them onto animation software, which then renders a 2D/3D animation live for broadcast. At its core, the project combines facial keypoint detection, keypoint meshing, and emotion and gesture tracking with animation. The final rendered animation can be projected or broadcast to any application that requires webcam access.

(Figure: the three-step pipeline)

MOTIVATION

To look presentable on any video call at any time.
To protect yourself and maintain your privacy on the internet.
To replace webcam video streaming, which consumes more bandwidth, with a lower-bandwidth animated feed.

OBJECTIVES

To build a pose-detection model using OpenCV and TensorFlow.
To build an emotion-detection model.
To animate a rig driven by these models in live rendering software such as Blender or Three.js.
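The objectives above form a three-stage pipeline: detect keypoints from a webcam frame, classify emotion, then drive a rendered avatar. The following is a minimal Python sketch of that data flow; every function and class name here is an illustrative stub (not the project's actual API), and in the real system each stub would wrap an OpenCV/TensorFlow model or a Blender/Three.js renderer.

```python
# Hypothetical sketch of the three-stage pipeline. All names are
# illustrative stubs; the real stages would call into OpenCV,
# TensorFlow, and a Blender/Three.js rig.
from dataclasses import dataclass, field


@dataclass
class Frame:
    pixels: list  # placeholder for a webcam image buffer


@dataclass
class Keypoints:
    points: list = field(default_factory=list)  # facial/body landmarks
    emotion: str = "neutral"


def detect_keypoints(frame: Frame) -> Keypoints:
    # Stub: a real version would run the OpenCV/TensorFlow pose model.
    return Keypoints(points=[(0.5, 0.5)])


def classify_emotion(kp: Keypoints) -> Keypoints:
    # Stub: a real version would run the emotion-detection model
    # on the detected landmarks.
    kp.emotion = "happy" if kp.points else "neutral"
    return kp


def render_animation(kp: Keypoints) -> str:
    # Stub: a real version would drive the avatar rig in Blender
    # or Three.js and return a rendered frame.
    return f"avatar: {len(kp.points)} keypoints, emotion={kp.emotion}"


def pipeline(frame: Frame) -> str:
    # Compose the three stages: detect -> classify -> render.
    return render_animation(classify_emotion(detect_keypoints(frame)))


if __name__ == "__main__":
    print(pipeline(Frame(pixels=[])))
```

In the real application this loop would run once per captured webcam frame, and the rendered output would be fed to a virtual-camera device so that other apps see it as a normal webcam.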

OUTPUT

(Demo images: working pipeline and final output)

CONTRIBUTORS

Harsh-Avinash
Seshank-k
Nishita-Varshney
Aaryan Bhatiya Ghosh