Amazing New First-Person-View Machine Learning Dataset (Ego4D)

Found an amazing dataset of more than 3,000 hours of first-person-view video footage of people doing normal everyday tasks, totaling more than 5 TB of data! For example: taking out the garbage, playing board games, riding a bike, etcetera...

Last updated 2022-02-18: the dataset is now live!

Note: I re-uploaded the video to YouTube; it was originally hosted on a random Google Drive link.

Egocentric 4D Perception (EGO4D)

They have an awesome latent space explorer

The link below is the live, interactive view you see above.

Ego4d

Okay, cool but WTF is it?

It's a massive collection of everyday-life video shot from people's own eye-level vantage point, which is how humans see the world.

It has multiple datasets (see the sketch after this list):

Transcriptions of everything

Eye tracking

Object Segmentation

Reconstruction of 3D environments

Motion capture
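
Once you have the annotation files on disk, you can poke at them with plain Python to see what each one contains. This is just a minimal sketch: the directory path and the assumption that the annotations ship as JSON files are mine, so adjust it to whatever the official download actually gives you.

```python
import json
from pathlib import Path

# Hypothetical location -- wherever you point the downloader at.
# The v1/annotations layout is an assumption; check the schema docs
# on ego4d-data.org for the real file names.
ANNOTATIONS_DIR = Path("~/ego4d_data/v1/annotations").expanduser()


def summarize(json_path: Path) -> None:
    """Print the top-level structure of one annotation JSON file."""
    with json_path.open() as f:
        data = json.load(f)
    print(f"\n{json_path.name}")
    if isinstance(data, dict):
        for key, value in data.items():
            size = len(value) if isinstance(value, (list, dict)) else value
            print(f"  {key}: {type(value).__name__} ({size})")
    else:
        print(f"  top-level {type(data).__name__} with {len(data)} entries")


if __name__ == "__main__":
    for path in sorted(ANNOTATIONS_DIR.glob("*.json")):
        summarize(path)
```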

Why is this important?

New datasets = new amazing algorithms to solve cool problems.

Problems like:

"Computer where did i leave my keys?"

Having a robot understand who is talking to whom in a conversation.

and so much more!

Problems like these aren't really solvable without really good data. I predict this will be a big generative source of quality machine learning research to come. 😃

2-hour talk given by the creators

Highlights

Source: timestamp 2:19, https://drive.google.com/file/d/1oknfQIH9w1rXy6I1j5eUE6Cqh96UwZ4L/view

So where do I download it?

You can't yet. :/ It's being released at the end of Feb 2022, so you're probably reading this when it's available!

Update! The dataset is now live!

Looks like it was released yesterday! Nice! Today is 2022-02-18.

LINK

https://ego4d-data.org/#download
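
To actually grab the data, you first sign the license agreement on that page, then use the official ego4d downloader (installed with `pip install ego4d`) with the AWS credentials they issue you. A rough sketch of calling it from Python is below; the flag names are my best recollection of the docs, so double-check them on the download page.

```python
import subprocess

# Sketch of invoking the official downloader CLI (from `pip install ego4d`).
# The flag names are a best-effort recollection of the ego4d-data.org docs --
# verify them there. You also need the AWS credentials issued after signing
# the license agreement.
subprocess.run(
    [
        "ego4d",
        "--output_directory", "~/ego4d_data",
        "--datasets", "annotations",  # start small; the full_scale video is several TB
    ],
    check=True,
)
```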

For the nerds, here is the research paper!

Ego4D: Around the World in 3,000 Hours of Egocentric Video
We introduce Ego4D, a massive-scale egocentric video dataset and benchmark suite. It offers 3,025 hours of daily-life activity video spanning hundreds of scenarios (household, outdoor, workplace, leisure, etc.) captured by 855 unique camera wearers from 74 worldwide locations and 9 different countri…

Direct PDF link

https://arxiv.org/pdf/2110.07058.pdf

Get involved!

Important Dates | EPIC@CVPR22

Egocentric 4D Perception (EGO4D)
A large-scale first-person video dataset, supporting research in multi-modal machine perception for daily life activity

Takeaway

I hope y'all ML researchers make great use of this dataset!

I'm excited to see what comes next!

A great new class of dataset that unlocks awesome robotics and VR applications.

Thanks for reading! 😎

Author

by oran collins
github.com/wisehackermonkey

If you want to help me out and give a donation, here's my Monero address: 432ZNGoNLjTXZHz7UCJ8HLQQsRGDHXRRVLJi5yoqu719Mp31x4EQWKaQ9DCQ5p2FvjQ8mJSQHbD9WVmFNhctJsjkLVHpDEZ

I use a tracker that is privacy-focused, so if you block it that's cool; I'm big on blocking stuff on my own machine. I'm doing it to see if anyone is actually reading my blog posts... :)