[Noisebridge-discuss] How long will it take to have real personal HUDs?

Sai Emrys noisebridge at saizai.com
Sat Nov 21 06:17:13 UTC 2009


The tech: an unobtrusive true HUD personal user interface.

I.e.:
* display that overlays the user's full field of view (not merely
a small screen in the corner of one eye, and without requiring bulky
goggles)
- bonus: stereo vision *
* stereo earbuds
* integrated camera
- bonus: camera sees from the user's own perspective (e.g. via a lens,
fiber optic, or mirror mounted at/between the eyes)
- extra bonus: camera sees outside the visible spectrum, e.g. thermal
IR, near-IR, UV, etc.
* practically usable input device (EEG? glove? speech recognition?)
- yes, I know there are some partially workable models already (e.g.
Twiddler, Emotiv), but AIUI they're not quite ready for heavy use yet
* standardized connection interface (e.g. USB, Bluetooth) + standard
APIs (as already exist for webcams, keyboards, etc.) so any personal
device can easily connect to it
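That last bullet is really the software analogue of USB HID: applications
code against one agreed interface, and any compliant device works without
device-specific glue. A minimal sketch in Python (all class and method names
here are hypothetical, just to illustrate the shape of such an API):

```python
from abc import ABC, abstractmethod


class HUDDisplay(ABC):
    """Hypothetical standard interface a HUD vendor would implement."""

    @abstractmethod
    def draw_text(self, x: int, y: int, text: str) -> None:
        """Render text at pixel coordinates (x, y) in the overlay."""


class StubHUD(HUDDisplay):
    """Stand-in device; a real driver would talk to actual hardware."""

    def __init__(self) -> None:
        self.framebuffer = []

    def draw_text(self, x: int, y: int, text: str) -> None:
        self.framebuffer.append((x, y, text))


def show_notification(hud: HUDDisplay, msg: str) -> None:
    # Application code depends only on the standard API,
    # so it runs unchanged on any compliant HUD.
    hud.draw_text(0, 0, msg)


hud = StubHUD()
show_notification(hud, "battery low")
print(hud.framebuffer)  # [(0, 0, 'battery low')]
```

The point is the separation: `show_notification` never touches hardware
details, which is exactly what makes webcams and keyboards plug-and-play
today.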

A recent story on _On the Media_ made me think of this again. I know
some of you are actually working on prototypes of this.

My question for y'all: how long do you think this will take?

I'm just curious how accurate y'all'll be. :-P

My guess: 6±1 years until real retail use.

- Sai

* not that this would help *me* any, as I have strabismus and
therefore absolutely no stereo/fusional vision whatsoever**. OTOH, I
think a single-eye HUD would be sufficient for this use.
** yes, I see depth just fine. Stereo vision is only required for a
small minority of depth cues. But I can't use Magic Eye books,
binoculars, 3D glasses, etc.
