It’s here!!!! I worked on the Messenger Cowatch project, which lets you watch videos together with friends. Check out its official release page for details on how it works and more.
Here’s a pic of me testing it out with my intern mentor.

My contribution can be broken into 2 main parts:
1. Using Machine Learning (ML) to make video suggestions
2. Creating a Visualization Tool for development and debugging
1. Using Machine Learning to make video suggestions
In the Watch Together feature, there are multiple tabs that organize suggested videos.

We build and test numerous ML models to determine which one performs best (“best” usually meaning the one that drives the most engagement and satisfaction). It’s basically impossible to filter through every single video on Facebook to determine which one is best for you, so instead we build a set of smart shortlists from a set of “seeds” and filter down from there.
One example of a seed I implemented pulled currently playing live videos from FB Pages a user engages with. I got to work with many of FB’s ML modeling tools. Most of ML engineering is actually running experiments and making decisions; there is definitely not as much coding as in something like Prod Engineering.
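To make that concrete, here’s a rough Python sketch of what seed-based candidate generation could look like. Everything in it (the `Candidate` shape, the stub data, the `score` stand-in for the model) is hypothetical and just illustrates the pipeline: seeds produce shortlists, the model scores only those candidates, and the top results become suggestions. None of this is FB’s actual code.

```python
from dataclasses import dataclass
import random

@dataclass
class Candidate:
    video_id: int
    seed: str          # which shortlist this candidate came from
    score: float = 0.0

# Stub data standing in for real service calls.
ENGAGED_PAGES = {"user_1": ["page_a", "page_b"]}
LIVE_VIDEOS = {"page_a": [101, 102], "page_b": [203]}

def seed_engaged_page_live_videos(user_id):
    """Hypothetical seed: live videos from Pages the user engages with."""
    return [
        Candidate(vid, seed="engaged_page_live")
        for page in ENGAGED_PAGES.get(user_id, [])
        for vid in LIVE_VIDEOS.get(page, [])
    ]

def score(user_id, video_id):
    """Stand-in for the ML model's predicted engagement score."""
    return random.random()

def suggest_videos(user_id, seeds, k=20):
    # 1. Build shortlists from each seed instead of scanning every video on FB.
    candidates = [c for seed_fn in seeds for c in seed_fn(user_id)]
    # 2. Score only the shortlisted candidates with the model.
    for c in candidates:
        c.score = score(user_id, c.video_id)
    # 3. Surface the top-k highest-scoring videos.
    return sorted(candidates, key=lambda c: c.score, reverse=True)[:k]

print(suggest_videos("user_1", seeds=[seed_engaged_page_live_videos], k=2))
```

In practice there would be many seed functions feeding the shortlist, which is what makes the approach tractable: the expensive model only ever sees a few hundred candidates instead of every video on the platform.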
2. Creating a Visualization Tool for development and debugging
As we code and make changes, we want to see how the suggestions change in real time. We don’t want to sit through the whole process of shipping code to Android/iOS, and be dependent on those teams, just to test ML recommendations. At the same time, we don’t want to stare at a dictionary of numerical ids and try to guess what is going on.
So instead, we built a middle-man tool that visualizes the suggestions and surfaces lots of debugging info. This way we get the best of both worlds.
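As a toy illustration of the middle-man idea (not the actual internal tool, which was far richer), here’s a sketch that takes the raw id/score output a ranker might emit, joins it against made-up metadata, and prints something a developer can actually read:

```python
# Minimal sketch: turn raw (video_id, seed, score) tuples from the ranker
# into a readable table. The metadata and field names are invented.

VIDEO_METADATA = {
    101: {"title": "Cooking live!", "page": "page_a"},
    102: {"title": "Q&A stream", "page": "page_a"},
    203: {"title": "Game night", "page": "page_b"},
}

def render_suggestions(raw_output):
    """raw_output: list of (video_id, seed, score) straight from the ranker."""
    header = f"{'score':>6}  {'id':>5}  {'seed':<20}  title"
    rows = [header, "-" * len(header)]
    for video_id, seed, s in sorted(raw_output, key=lambda r: -r[2]):
        meta = VIDEO_METADATA.get(video_id, {"title": "<unknown>"})
        rows.append(f"{s:6.3f}  {video_id:>5}  {seed:<20}  {meta['title']}")
    return "\n".join(rows)

print(render_suggestions([(101, "engaged_page_live", 0.91),
                          (203, "engaged_page_live", 0.54)]))
```

Seeing the seed each suggestion came from, next to its score and title, makes it immediately obvious how a code change shifted the recommendations, without touching the mobile apps at all.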

Creating this debugging tool was one of my favorite parts: I liked working on something so visual, and I liked helping my team and knowing they found my tool useful.
Overall, I am so grateful I got to have this experience!
