Published Jan 26, 2018
This post shares a side project for extracting ‘reaction faces’ from YouTube videos.
Sample output from the video ‘PLOTCON 2016: Hadley Wickham, New open viz in R’: reaction faces labeled Neutral, Scared, and Happy.
The code requires OpenCV and Keras (the emotion model was trained with a TensorFlow backend). To use the reaction face finder, cd into the newly cloned repo’s directory and run:
#command format
python youtube_react_face.py -y YOUTUBEURL -o OUTPUT [-m MAXFRAMES]
#example command
python youtube_react_face.py --youtubeURL https://www.youtube.com/watch?v=tVb0g0-JCfk --output output
#required
-y YOUTUBEURL, --youtubeURL YOUTUBEURL
    url for youtube video to find reaction faces in
-o OUTPUT, --output OUTPUT
    dir to output reactions to
#optional
-m MAXFRAMES, --maxFrames MAXFRAMES
    stop processing video frames after this many (default 5000)
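To make the flags above concrete, here is a minimal, hypothetical sketch of how a script like youtube_react_face.py could parse them and step through video frames with OpenCV. It is not the repo’s actual code, and it assumes the YouTube video has already been downloaded to a local video.mp4 (the real script works directly from the URL you pass it).

# Hypothetical sketch of the CLI wiring and frame loop (not the repo's actual code)
import argparse
import cv2

parser = argparse.ArgumentParser()
parser.add_argument("-y", "--youtubeURL", required=True,
                    help="url for youtube video to find reaction faces in")
parser.add_argument("-o", "--output", required=True,
                    help="dir to output reactions to")
parser.add_argument("-m", "--maxFrames", type=int, default=5000,
                    help="stop processing video frames after this many (default 5000)")
args = parser.parse_args()

# assumption: the video behind args.youtubeURL was already saved as video.mp4
cap = cv2.VideoCapture("video.mp4")
frame_count = 0
while cap.isOpened() and frame_count < args.maxFrames:
    grabbed, frame = cap.read()
    if not grabbed:
        break
    frame_count += 1
    # detect faces and classify emotions here, writing reaction crops into args.output
cap.release()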
Before classifying the emotion of a face, the program has to find the face. This project currently uses a Haar cascade face detector. This style of detection is great for speed, but it is prone to false positives, so every now and then the output will show a scared tie or an angry shirt.
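For context on that detection step, here is a minimal sketch of Haar cascade face detection using OpenCV’s bundled frontal-face cascade, paired with a placeholder Keras emotion classifier. The cascade file ships with OpenCV, but the model filename, input size, and label list below are assumptions for illustration, not the project’s actual model.

# Minimal sketch of Haar cascade detection + a placeholder emotion classifier
import cv2
import numpy as np
from keras.models import load_model

# OpenCV ships this cascade file with the opencv-python package
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# assumed file name and labels; the real model, input size, and classes may differ
emotion_model = load_model("emotion_model.hdf5")
EMOTIONS = ["angry", "scared", "happy", "sad", "surprised", "neutral"]

def reaction_faces(frame):
    """Return (label, face crop) pairs for faces detected in a BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = detector.detectMultiScale(gray, scaleFactor=1.1,
                                      minNeighbors=5, minSize=(30, 30))
    results = []
    for (x, y, w, h) in boxes:
        # resize the grayscale face to the (assumed) 48x48 input the model expects
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
        roi = roi.reshape(1, 48, 48, 1)
        probs = emotion_model.predict(roi)[0]
        results.append((EMOTIONS[int(np.argmax(probs))], frame[y:y + h, x:x + w]))
    return results

Raising minNeighbors or minSize is the usual way to trade away some of those ‘scared tie’ false positives, at the cost of occasionally missing a real face.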
Huge thanks to Adrian at PyImageSearch for his book, Deep Learning for Computer Vision with Python, which gave me all the tools I needed to build the emotion model. Since I followed the book so closely when creating the model, I don’t feel it would be appropriate to share that code here.