Facebook’s AI research might one day teach you how to play drums

A new research project led by Facebook’s AI team envisions future AI that continually analyzes people’s lives through first-person video, capturing everything they see, hear, and do in order to assist them with daily tasks, most likely via augmented reality (AR) glasses.

The project, called Ego4D, aims to teach AI to comprehend and interact with the environment from a first-person perspective, the same way humans do. According to Facebook’s team, AI generally learns from third-person pictures and videos, but its next-generation AI will learn from recordings that effectively step into your shoes.

“We brought together a consortium of 13 universities and labs across nine countries, who collected more than 2,200 hours of first-person video in the wild, featuring over 700 participants going about their daily lives,” reads Facebook’s blog post about the project. Facebook says that participants, including residents of the United Kingdom, Italy, India, Japan, Saudi Arabia, Singapore, the United States, Rwanda and Colombia, recorded videos of themselves doing daily tasks, like playing sports, shopping, gardening and more.

Beginning next month, Facebook researchers will be able to access this data to develop Ego4D. According to the social network, the dataset is the world’s most extensive collection of unscripted first-person video.

Facebook has developed five benchmark challenges for developing smarter and more useful AI assistants, including:
Episodic memory: What happened when? (e.g., “Where did I leave my keys?”)
Forecasting: What am I likely to do next? (e.g., “Wait, you’ve already added salt to this recipe”)
Hand and object manipulation: What am I doing? (e.g., “Teach me how to play the drums”)
Audio-visual diarization: Who said what when? (e.g., “What was the main topic during class?”)
Social interaction: Who is interacting with whom? (e.g., “Help me better hear the person talking to me at this noisy restaurant”)

At the moment, no AI system can perform the tasks listed above. However, Facebook sees Ego4D as laying the groundwork for this kind of capability in future AR computing. According to the social network, systems trained on Ego4D might one day power wearable cameras and in-home helper robots, which rely on first-person cameras to navigate their surroundings.

While the technology sounds intriguing and helpful, privacy is always a concern with projects like this, especially when it comes to Facebook. An AI that records and analyzes every step a person takes effectively turns the user into a walking surveillance device.

However, according to a Facebook spokesperson (via The Verge), the project is still in its early days and “privacy safeguards” will be introduced in the future.

Source: Facebook
