The next generation eye-to-speech technology for reading.
Our deep-tech startup is developing software that combines machine learning and eye-tracking to provide a seamless, natural reading experience that is accessible to everyone.
We believe that everyone should have the opportunity to read and learn without interruption.
Our software is designed to help individuals with dyslexia read fluently, provide a concentration-friendly space for those with ADHD, and make language learning faster and more effective for everyone.
Hear what you see in real time
With an eye-tracker connected, all you have to do is look at the text to hear exactly what it says without having to do anything manually.
You control every function with your eyes alone, from the reading speed to what to read next.
This simulates the inner voice we hear when reading or thinking about something, much like the voice you hear as you read this sentence.
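At its core, the idea described above is to map a gaze point to the word being looked at and speak that word aloud. The sketch below illustrates only the gaze-to-word step; the `Word` layout, coordinates, and function names are illustrative assumptions, not our product's actual API.

```python
# Minimal sketch: find which word a gaze point lands on.
from dataclasses import dataclass

@dataclass
class Word:
    text: str
    x: float       # left edge of the word's bounding box, in pixels
    y: float       # top edge
    width: float
    height: float

def word_at_gaze(words, gaze_x, gaze_y):
    """Return the word whose bounding box contains the gaze point, or None."""
    for w in words:
        if w.x <= gaze_x <= w.x + w.width and w.y <= gaze_y <= w.y + w.height:
            return w
    return None

# Example: one line of text laid out left to right.
line = [
    Word("Hear", 0, 0, 40, 20),
    Word("what", 50, 0, 40, 20),
    Word("you", 100, 0, 30, 20),
    Word("see", 140, 0, 30, 20),
]
hit = word_at_gaze(line, 55, 10)  # gaze falls inside "what"
```

In a real pipeline, the word returned here would be handed to a text-to-speech engine, and the tracker's raw gaze stream would be smoothed first, since fixations jitter around the true reading position.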
Meet the team
Our differences complement each other, and our values connect us.
Chief Operating Officer
Lee is a young entrepreneur who is passionate about technology, education, and equality. With previous experience, a strong interest, and a talent for organization, she was the perfect fit for the role of COO. Lee has always been a competitive individual, so don't be afraid to challenge her to anything.
Chief Executive Officer
At just 19 years old, Wilma already has broad experience as an entrepreneur, speaker, and offensive cybersecurity specialist, with deep knowledge of technology. As a hacker both online and offline, she is passionate about everything from breaking into systems to breaking down barriers through smart new innovations.
As part of our ongoing commitment to innovation, we are currently working on iTrack Focus, a feature that uses eye-tracking technology to help individuals improve their focus and concentration. With iTrack Focus, the program automatically detects which parts of the screen the user is focusing on and darkens the rest of the screen, making it less distracting.
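The dimming behavior described above can be sketched as a simple per-pixel rule: no darkening inside a circle around the gaze point, ramping up outside it. The radius, opacity values, and function name below are assumptions for illustration, not the shipped implementation.

```python
# Sketch of the iTrack Focus dimming idea: darken everything outside
# a soft-edged circle centered on the user's gaze point.
import math

def dim_opacity(px, py, gaze_x, gaze_y, focus_radius=150.0, max_dim=0.8):
    """How strongly to darken the pixel at (px, py): 0.0 inside the
    focus circle, fading up to max_dim outside it."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= focus_radius:
        return 0.0
    # Fade the dimming in over one extra radius so the edge is soft
    # rather than an abrupt cutoff.
    t = min((dist - focus_radius) / focus_radius, 1.0)
    return max_dim * t
```

A compositor or overlay window would evaluate something like this per region rather than per pixel, and the gaze point would be smoothed over recent samples so the focus circle does not flicker with every micro-movement of the eyes.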