We developed AnySense, an open-source iPhone application that collects and streams data from the iPhone's sensor suite. Download it from the App Store and try recording some data yourself!
https://apps.apple.com/us/app/anysense/id6742254654
Specifically, we enable streaming and collection of:
- RGB, depth, and motion data
- Audio data from internal or external microphones
- External sensor data streamed over Bluetooth
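To give a concrete sense of what consuming a multiplexed sensor stream like this involves, here is a minimal Python sketch of a length-prefixed framing scheme. The frame layout, sensor type codes, and function names are illustrative assumptions for this post, not AnySense's actual wire protocol.

```python
import struct

# Hypothetical frame layout (assumption, not the app's real format):
# a fixed header of (timestamp_us, sensor_type, payload_length),
# followed by the raw payload bytes for that sensor reading.
HEADER_FMT = "<QBI"  # uint64 timestamp, uint8 type, uint32 payload length
HEADER_SIZE = struct.calcsize(HEADER_FMT)

# Illustrative type codes for the modalities listed above.
SENSOR_RGB, SENSOR_DEPTH, SENSOR_AUDIO, SENSOR_BLE = 0, 1, 2, 3

def encode_frame(timestamp_us: int, sensor_type: int, payload: bytes) -> bytes:
    """Pack one sensor reading into a length-prefixed frame."""
    header = struct.pack(HEADER_FMT, timestamp_us, sensor_type, len(payload))
    return header + payload

def decode_frames(stream: bytes):
    """Yield (timestamp_us, sensor_type, payload) tuples from a byte buffer."""
    offset = 0
    while offset + HEADER_SIZE <= len(stream):
        ts, kind, length = struct.unpack_from(HEADER_FMT, stream, offset)
        offset += HEADER_SIZE
        yield ts, kind, stream[offset:offset + length]
        offset += length

# Example: interleave a depth reading and a Bluetooth sensor reading.
buf = encode_frame(1_000, SENSOR_DEPTH, b"\x00" * 8)
buf += encode_frame(1_016, SENSOR_BLE, b"\x2a")
frames = list(decode_frames(buf))
```

Because each frame carries its own timestamp and type tag, readings from different sensors arriving at different rates can share one stream and be demultiplexed on the receiving side.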
The goal is for apps like AnySense to help scale robotic data collection the same way smartphones helped scale vision and language data. With that scale, we can do things like train visuotactile policies on human-collected data and deploy them directly on a robot! Check out the demo video on our website!
If you're interested, contribute to our project on GitHub: https://github.com/NYU-robot-learning/AnySense