Wilson Fisk using ASL tech in Echo
Screengrab via Disney Plus/Marvel Studios

Is the ASL technology used in ‘Echo’ real?

Technology has been moving forward in leaps and bounds, but did this particular jump happen?

In “Taloa,” the fourth episode of Marvel’s grim new Disney Plus series Echo, something wild happens.


Spoilers ahead for Echo.

Well, lots of wild things happen. We get to watch Vincent D’Onofrio stomp a guy, and the first memorable case of forcible contact lens application in MCU history takes place. But the part that undoubtedly stuck with viewers, thanks to its futurist vibes and its visually singular presentation, was what happened after four grown men forced the show’s main character to put in a contact lens: Maya, who is deaf, starts seeing people’s spoken words interpreted in real time as computer-generated sign language, projected onto her contact lens. In turn, her sign language is processed through an automated translator and played as spoken words in an earpiece worn by her problematic father figure, Wilson Fisk.

The whole thing has everything that audiences love: the tech of tomorrow, and the promise of being able to do something hard, like learning ASL, without having to try. But is it real?

Where can I get some sweet Echo ASL lenses?

Like so much of the technological goodness promised to us by the Marvel Cinematic Universe, Echo’s hand-tracking, voice-interpreting, millimeter-thin contact lens system is not, at present, a reality. That’s the bad news. 

The good news is, there’s so, so much good news. Tech like the stuff shown in Echo has been in the works for so long that there’s promo footage from Microsoft promising it could be achieved using the Xbox Kinect. There are even real-time interpretation apps that’ll get the job done, albeit not through a magical AR contact lens, and with some issues.

Programs like SLAIT use advanced body tracking and artificial intelligence to interpret sign language as it is signed. Critics have pointed out that they’re mostly good for direct, word-for-word translation, and that they still struggle to pick up on the intricacies of human communication – the grace notes and subtleties that make a phrase like “eat my shorts” an insult instead of a dietary recommendation.
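For the curious, here’s a rough sense of what “body tracking plus AI” can look like in practice. The sketch below is purely illustrative (SLAIT’s actual system isn’t public): it leans on Google’s open-source MediaPipe hand tracker to pull 21 landmarks per hand from a webcam feed, then matches them against a handful of hypothetical sign templates, which is a long way from real ASL recognition.

```python
# Illustrative only: SLAIT's actual pipeline isn't public. This sketch shows the
# general "body tracking + AI" idea using MediaPipe's hand tracker and a toy
# nearest-template classifier. The sign templates below are hypothetical
# placeholders, not real ASL recognition.
import cv2
import mediapipe as mp
import numpy as np

def landmarks_to_vector(hand_landmarks):
    # Flatten the 21 tracked hand points (x, y, z) into a 63-value feature vector.
    return np.array([[lm.x, lm.y, lm.z] for lm in hand_landmarks.landmark]).flatten()

# Hypothetical "learned" templates; a real system would train on thousands of
# signers and also read grammar, movement, and facial expression.
sign_templates = {}  # e.g. {"HELLO": np.ndarray of shape (63,), ...}

def classify(vector):
    if not sign_templates:
        return "unknown"
    # Nearest-template lookup: the closest stored hand shape wins.
    return min(sign_templates, key=lambda name: np.linalg.norm(sign_templates[name] - vector))

cap = cv2.VideoCapture(0)
with mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                print(classify(landmarks_to_vector(hand)))
cap.release()
```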

Additionally, squeezing the tech into a display that you can then fit on top of your pupil will take a lot of elbow grease. A company called Mojo Vision has demonstrated a super bare-bones AR contact lens prototype, but as of 2022, CNET reported that power consumption and wireless interference remained ongoing issues. It’ll be a while before the slick-as-snot animated displays seen in Echo are realistic.

Still, ask anyone born before 1970 and they’ll tell you that there was a time when the idea that a phone would fit in something smaller than a suitcase was pure science fiction. In short, keep an eye out.

