Empowering people to communicate with caretakers and loved ones.
Vocable AAC allows people with conditions such as MS, stroke, ALS, or spinal cord injuries to communicate using an app that tracks head movements, without the need to spend tens of thousands of dollars on dedicated technology.
Vocable uses ARKit to track the user's head movements and determine where on the screen the user is looking. This allows the app to be used completely hands-free: users can look around the screen and make selections by lingering their gaze on a particular element.
For users with more mobility, the app can be operated by touch.
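To illustrate the idea, here is a minimal, simplified sketch of pairing ARKit face tracking with a dwell timer. This is not Vocable's actual implementation: the class name, the dwell threshold, and the screen-projection math are placeholders.

```swift
import ARKit
import UIKit

// Sketch: drive an on-screen cursor from ARKit face tracking and select
// the view under the cursor after a dwell period. Names, the dwell
// threshold, and the projection math are illustrative placeholders.
final class HeadTrackingViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()
    private var dwellStart: Date?
    private let dwellThreshold: TimeInterval = 1.0 // hypothetical dwell time

    override func viewDidLoad() {
        super.viewDidLoad()
        guard ARFaceTrackingConfiguration.isSupported else { return } // needs TrueDepth
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARSession delivers anchor updates on the main queue by default.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint estimates the point the user's eyes converge on,
        // expressed in the face anchor's coordinate space.
        updateDwell(at: project(face.lookAtPoint))
    }

    private func project(_ point: simd_float3) -> CGPoint {
        // Placeholder: a real implementation combines the face transform with
        // the camera's projection matrix to map into screen coordinates.
        CGPoint(x: CGFloat(point.x), y: CGFloat(point.y))
    }

    private func updateDwell(at point: CGPoint) {
        guard let target = view.hitTest(point, with: nil) else {
            dwellStart = nil // gaze left all selectable elements; reset the timer
            return
        }
        if dwellStart == nil { dwellStart = Date() }
        if Date().timeIntervalSince(dwellStart!) >= dwellThreshold {
            dwellStart = nil
            // Activate `target` here, e.g. by sending it a primary action.
            _ = target
        }
    }
}
```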
Use a list of common phrases provided by speech-language pathologists, or create and save your own.
Type with your head or your hands.
For the current progress on features, please visit the project board.
For a high-level roadmap, see the Vocable Roadmap.
We'd love to translate Vocable into as many languages as possible. If you'd like to help translate, please visit our Crowdin project. Thanks for helping people communicate all around the world! 🌎🌍🌏
We love contributions! To get started, please see our Contributing Guidelines.
- iOS 13.0 or later
- An iOS device with a TrueDepth camera (required for head tracking)
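Because the TrueDepth camera is what enables head tracking, an app can check for support at runtime and fall back to touch-only input on other devices. A brief sketch (the fallback handling shown in comments is an assumption, not Vocable's actual logic):

```swift
import ARKit

// Face tracking requires a TrueDepth camera, so check availability at
// runtime. On unsupported devices, fall back to touch input.
if ARFaceTrackingConfiguration.isSupported {
    // Offer hands-free mode: run a session with ARFaceTrackingConfiguration.
} else {
    // Hide the gaze cursor and rely on touch input only.
}
```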
External contributors will need to provision their own devices and code-sign Vocable in whatever way is convenient for them in order to run the app on a device.
Internal/WillowTree contributors can follow the steps outlined in the Technical Onboarding page.
Matt Kubota, Kyle Ohanian, Duncan Lewis, Ameir Al-Zoubi, and many more from WillowTree 💙.
vocable-ios is released under the MIT license. See LICENSE for details.
vocable-android is available on Google Play and is also open-source.