Tracking Demo and Wearable Updates
Jan 18, 2018 • Lior Lustgarten
We’re two weeks into the term, and it’s time for some updates!
First Tracking Demo
Our vision system is now at the stage where we can track the ring and the screen at the same time! The Android app uses the phone’s camera to capture frames in real time. Each frame is processed through our vision system to locate the markers at the corners of the screen and the marker on the wearable. White lines are drawn over the camera image to show where the screen was detected. The screen is then normalized to remove perspective distortion, and we can use this cleaned-up view (seen in the bottom-right corner of the app) for element identification.
Proof of concept demo for ring and screen tracking
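For the technically inclined, here’s roughly what that pipeline looks like. This is a minimal sketch rather than our actual implementation: it assumes OpenCV’s ArUco module for the fiducial markers, with hypothetical IDs 0–3 placed at the screen’s top-left, top-right, bottom-right, and bottom-left corners.

```python
import cv2
import numpy as np

# Hypothetical marker setup: ArUco IDs 0-3 at the screen corners (TL, TR, BR, BL).
DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def normalize_screen(frame, out_size=(480, 800)):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY)
    if ids is None:
        return None

    # One representative point per marker: its first detected corner.
    pts = {int(i): c[0][0] for i, c in zip(ids.flatten(), corners)}
    if not all(k in pts for k in range(4)):
        return None  # all four corner markers must be visible this frame

    src = np.float32([pts[0], pts[1], pts[2], pts[3]])

    # Draw the white screen outline over the camera image, as in the demo.
    cv2.polylines(frame, [np.int32(src)], True, (255, 255, 255), 2)

    # Warp the detected quad to a fronto-parallel ("normalized") view that
    # downstream element identification can work with.
    w, h = out_size
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, M, (w, h))
```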
Ring Prototype Progress
3D printed finger tracking ring in use
Wearable - Haptic Feedback
One of our goals is to add haptic feedback to our project. Vibration as a mode of communication to the user is intuitive and immediate: for example, if a vibration occurs each time the user’s finger passes over an interactable element, the user can explore the screen much more quickly.

The platform for our first haptic prototype is the Raspberry Pi Zero W. It’s relatively small, easy for us to program, and has Bluetooth built in for communicating with the phone. We’re also testing the similar SparkFun ESP32 Thing as an alternative.
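Here’s a minimal sketch of what the Pi side of this prototype could look like. It assumes PyBluez for the RFCOMM link to the phone and a vibration motor driven from a GPIO pin; the pin number and the one-byte “buzz” protocol are placeholders, not our final design.

```python
import bluetooth  # PyBluez
import RPi.GPIO as GPIO

MOTOR_PIN = 18  # assumed GPIO pin driving the vibration motor's transistor

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOTOR_PIN, GPIO.OUT)

# Listen for the phone on an RFCOMM (Bluetooth serial) socket.
server = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
server.bind(("", bluetooth.PORT_ANY))
server.listen(1)
client, addr = server.accept()

try:
    while True:
        data = client.recv(1)
        if not data:
            break  # phone disconnected
        # Placeholder protocol: any nonzero byte means "finger is over an
        # interactable element", so turn the motor on; a zero byte turns it off.
        GPIO.output(MOTOR_PIN, GPIO.HIGH if data != b"\x00" else GPIO.LOW)
finally:
    client.close()
    server.close()
    GPIO.cleanup()
```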
These are the parts we’re using for our proof of concept:
The setup for using the app, demonstrated with the phone’s Camera app
Moving Forward
As our system takes shape, we’re looking to get more feedback. We have been guided so far by insights gained from interviews with people who are blind or visually impaired. Soon we plan to have a functional (if rough around the edges) demo, and we’re excited to hear what people think! If you’re interested in trying it out and giving us feedback, reach out to our team email: watvisionteam@gmail.com.