Replacements For Mouse And Keyboard? 29
Qrious writes "I read Tad Williams' "Otherland" a while ago and was fascinated by the interface that is used for controlling computers with your hands. Now I am wondering how far technology is in this field. Aren't replacements for mouse and keyboards around? Gloves and a virtual keyboard? A flick of the wrist that changes back and forth between keyboard and pointing device? Are these things already made? If not, why not? I for one would like to work with something faster than a keyboard. Typing in the air or on a table with some kind of gloves instead of a keyboard would take some getting used to, but would be a lot faster once you are used to it. Anyone else have ideas for some other way of interacting with computers that would make it easier on the hands and faster to control?"
There are certainly a lot of radical extensions of the keyboard as we know it -- chording keyboards like the BAT and the Twiddler, deeply ergonomic ones like the Kinesis, and many variants on the one-handed keyboard.
IBM of course is now selling ViaVoice for Linux, as well. But are there interfaces that are even more out there available now?
Ideas (Score:3)
I believe that a touchscreen is a good replacement for a mouse, and a trackball would be an even better replacement under certain circumstances (such as smooth navigation, as opposed to point-and-click). Inspired by the non-existent technology of the Starfire project [asktog.com], I've been pondering the idea of using a combination of a large screen for viewing documents, with no controls on it; a small touchscreen for application-specific controls; a trackball for smooth motion; and a keyboard for typing text. I could operate GIMP entirely via the touchscreen, or via a combination of touchscreen and trackball. The trackball could be the 3-axis type, which would allow some interesting 3D navigation (but Linux is short on apps which would make use of it, so far). Eventually the keyboard (and some other input modalities) could be gradually replaced by voice recognition.
The metawidget [ampr.org] idea (that link is getting old, I need to write about my more recent ideas) would be useful in such a system to separate "control" functionality from the document and view parts of the MVC pattern. GIMP for example would keep a socket connection open with the touchscreen display software, on which it would exchange messages about which controls to make available, and receive messages about which controls were selected. So the palettes, the toolbox, and some context-sensitive stuff would be the main things on the touchscreen, and there would be enough real estate to have many more of them available at once, so that most actions can be done with fewer clicks. Cascading menus should be replaced with something more appropriate for punching a touchscreen (when you "dive in" to the next level, the next level replaces the current level; or perhaps with a menu structure that resembles the NeXT file browser). Eventually I will get around to putting up a website at www.metawidgets.org to discuss these ideas.
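The control-channel idea above -- an application exchanging messages with the touchscreen software about which controls to offer and which were selected -- can be sketched as a tiny message protocol. This is a hypothetical sketch only: the message format, field names, and functions here are made up for illustration, not part of the actual metawidget design.

```python
import json

# Hypothetical wire format for the metawidget idea: the application tells
# the touchscreen process which controls to display, and the touchscreen
# replies with the control the user punched. In practice these strings
# would travel over the open socket connection described above.

def offer_controls(controls):
    """Build a message advertising the controls an app wants displayed."""
    return json.dumps({"type": "offer", "controls": controls})

def parse_selection(message):
    """Extract the selected control from a touchscreen reply."""
    msg = json.loads(message)
    if msg.get("type") != "selected":
        raise ValueError("unexpected message type")
    return msg["control"]

# An app like GIMP might offer its toolbox over the socket:
offer = offer_controls(["brush", "eraser", "clone", "smudge"])
# ...and later receive the user's choice back:
choice = parse_selection('{"type": "selected", "control": "brush"}')
```

The point of keeping the protocol this dumb is exactly the MVC separation mentioned above: the app never knows how the controls are drawn, only which one fired.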
In short, I think improving ergonomics ought to be done in a holistic way rather than just putting more bells and whistles on existing devices like mice and keyboards. And there is more than one path to experiment with. I like touchscreens but they are not practical in every situation.
Now, I'm going to go ramble on a bit here [ampr.org] about my ergonomic workstation idea, in case you haven't already had enough...
Good old fashioned keyboard (Score:1)
Alternate mice (Score:1)
How about a glove that understands sign language? (Score:3)
Good basic idea, but not ASL (Score:3)
However, you could use some restricted set of sign symbols with a very restricted "grammar". This would be analogous to saying "open program", "close window", "reboot" etc to your computer instead of saying "I want to calculate my gas mileage and then check my email". The advantage in using gesture instead of speech would be that the physical motion *may* be easier to detect than the speech units.
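A restricted sign vocabulary with a restricted grammar is easy to sketch: the recognizer only ever has to match a detected gesture sequence against a small fixed table, rather than parse free-form signing. The gesture names and commands below are invented for illustration.

```python
# A minimal sketch of a restricted gesture "grammar": each command is a
# short fixed sequence of recognized hand shapes, so the recognizer only
# matches against a tiny vocabulary instead of full sign language.

COMMANDS = {
    ("fist", "open"): "open program",
    ("fist", "down"): "close window",
    ("point", "circle"): "reboot",
}

def interpret(gestures):
    """Match a detected gesture sequence against the known commands.

    Returns the command string, or None if the sequence is not in the
    vocabulary (the safe default for ambiguous motion).
    """
    return COMMANDS.get(tuple(gestures), None)
```

Returning None on any unrecognized sequence is the whole advantage of the restricted grammar: the computer never has to guess what a stray motion meant.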
Of course, since we are now talking about an arbitrary set of physical motions that are intended to convey instructions to the computer we are right back to "virtual keyboards" or something very like it...
--
Misc. (Score:1)
Re:Good basic idea, but not ASL (Score:1)
And I think we all know exactly what "sign symbol" should be used for the reboot command, no?
On second thought, I'm not decided on that - either "the finger" or "banging head on keyboard", depending on the circumstances of the reboot... :=]
________________________
Not with modern technology... (Score:1)
I think that the next stage of interfaces will be cybernetic implants. People will then be able to control computers just by thought. However, I think that this is still a long way off.
Re:Misc. (Score:1)
--
Close... But not quite (Score:2)
I know of a handicapped person who has lost all motion below his neck and has a MOUTH piece that interprets tongue movement as a keyboard; by simply pressing his tongue against different parts of his mouth he can type letters on the screen. Admittedly, it doesn't qualify him as a master at playing Quake 3, but it certainly allows him some leeway with respect to accessing computers.
Why doesn't he use speech recognition, you might ask? Well, I'm sure he does, but speech recognition requires that speech conform to certain "standards" and that the program recognizes the word/command issued. The keyboard is much more flexible when you're doing something more complex than writing a letter.
In keeping with this "bodily" interaction, there's been substantial progress in the military with tracking eye movement (don't ask me why), thus allowing you to control a pointer on a screen simply by looking at it. Right and left clicks are done by simply closing the right or left eye. But because "normal" eye movement requires that your eyelids close once in a while, closing BOTH eyes results in nothing.
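The wink-versus-blink rule described above reduces to a few lines of logic: one closed eye is a click, both closed is an ordinary blink and must be ignored. This is a toy sketch with made-up names, assuming an eye tracker that reports a boolean per eyelid.

```python
# Sketch of the blink-filtering rule: a wink on one eye is a click,
# but closing both eyes is treated as a normal blink and ignored.
# Inputs are booleans from a hypothetical eye tracker.

def classify(left_closed, right_closed):
    """Turn eyelid state into a pointer event, or None for no event."""
    if left_closed and right_closed:
        return None            # normal blink: do nothing
    if left_closed:
        return "left-click"
    if right_closed:
        return "right-click"
    return None                # both eyes open: just tracking
```

A real system would also need a minimum hold time, since an ordinary blink briefly passes through a one-eye-closed state as the lids move.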
I would agree that sign language would be nice, but interpreting it is somewhat difficult, because I might move my hands/fingers slightly higher/faster/closer, or I might simply bend my joints a bit more than someone else. This could be very difficult for a machine to interpret, as it would not only have to transform the movements into letters/words but also sometimes make determinations about what was meant in special situations: did he accidentally move an extra finger, or did he forget one? The computer would then have to make determinations from the context of what was being said. Additionally, how would the gloves know if I touched my nose or my forehead? Two significantly different things in sign language.
I doubt that will be feasible in the near future, especially since people who can sign can also use a keyboard, but the idea is nice.
However, for commercial/private use, I think that the virtual glove/glasses combination would be much more likely to succeed in the near future, as these handicap aids are no easier to use than the keyboard/mouse combo -- which, I think, will determine the ultimate success of any replacement for these wonderful but low-baud-rate input devices.
Re:Ideas (Score:1)
If you mean a Spaceball [qualixdirect.com] kind of thing (6 degrees of freedom, translation and rotation in each space axis), then this would probably suck for what you want. About the only thing these are good for is manipulating 3D models or scenes, and even then they're questionable.
The problem is that to rotate, for example, you twist and then hold the thing in its twisted state. The same goes for translation, you displace the ball in the right direction and then hold it there. For broad scrolling stuff, like you mentioned above, this would be worse than a mouse.
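The distinction the comment is drawing is between a *rate* controller (a Spaceball: holding it displaced keeps the view moving) and a *position* controller (a mouse: motion maps directly to displacement). A toy sketch of rate control, with made-up units, shows why holding the ball twisted gets tiring for broad scrolling:

```python
# Rate control: a constant displacement accumulates motion every frame,
# so you must keep the device held off-center for as long as you want
# to move. Units and frame timing here are illustrative only.

def rate_scroll(displacement, dt, steps):
    """Integrate a held displacement over `steps` frames of length `dt`."""
    position = 0.0
    for _ in range(steps):
        position += displacement * dt
    return position

# Holding the ball 2 units off-center for one second at 60 frames/s
# scrolls 2 units; to go twice as far you must hold it twice as long.
distance = rate_scroll(2.0, 1 / 60, 60)
```

With a mouse, by contrast, one quick flick covers the same distance and then your hand rests, which is why a rate controller is worse for broad scrolling.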
Why gloves aren't around much... (Score:2)
A company (VPL) was soon born - the glove was manufactured and sold, along with a few other virtual reality interface devices, including a body suit, and an HMD. The glove idea - well, that was just too unique - and thus, was patented.
When the bottom dropped out of the VR market in the mid-90's, VPL was one of the first to go (though I think they may have been resurrected in some fashion - I have seen a web site). However, the patent lives on. It is broad enough to cover almost any conceivable use of a glove/computer interface - in enough configurations of sensing apparatus (they used fiber optics in the commercial glove, but worked with Mattel on the PowerGlove, which used resistive strain gauges) - that no one has since been able to build a more cost-effective glove.
In fact, about the only innovation I have seen in a while was a glove that used the touching of fingers and finger/palm to make gestures, which could then be interpreted (and actually, this seems like a much better way to go)...
My site has further info, if anyone cares to visit...
I support the EFF [eff.org] - do you?
Re:Fatigue, pain, syntax, and semantics (Score:1)
----
why a virtual keyboard? (Score:1)
Uh, *why* would typing on air or a table be faster? Someone has already mentioned tactile feedback and the need for the "spring-back" effect to reduce fatigue (and think about how much your fingers would hurt if you typed on a *table*), but there is also the question of how *your* perception of the difference between the "k" and "l" keys would compare to *the computer's*. Another factor in fatigue is the dreaded "gorilla arm" syndrome, which pretty much doomed touchscreens for most applications.
If you're wanting to go with a "virtual" interface, why stick with limiting concepts like a mouse and keyboard?
--
Re:Alternate mice (Score:1)
Re:Eye tracking (Score:1)
Neil
Fatigue, pain, syntax, and semantics (Score:2)
Emails - I cringe at the thought of someone sending me a voice created email. Just listen to the voicemail you get sent on a daily basis. Can you imagine it in writing and going on for two pages?
Coding - Try speaking a shell, Perl, or C++ program. Did you remember all the "." and ";"? Did you remember to declare those variables? I recall a study about our ability to speak and reason at the same time. Humans are not very good at it. (Quite a statement there, I just realized.)
Perhaps we should just give up on the QWERTY board and go back to the most efficient design. Staying QWERTY helped when we were converting from typewriters and needed to make it easy on everyone. Now we need to jump ahead to efficient typing.
As for mice, something else is needed. I do not want to remove my hand from the keyboard while typing. While surfing, etc. I do not want to use my keyboard unless I am typing a URL. (Perhaps a laser pointer on my hand?). However, a mouse or "precision" instrument is going to continue to be needed for 3d rendering, graphics, etc.
Try putting a piece of paper on the wall and writing your name. See the problem? Pens are incredibly refined instruments. They relieve much of the pressure on your hand, provide a stable surface and are able to support writing in almost any direction for extended periods of time. I believe UI are going to evolve the same way, slowly, over time and to a simplified fashion combining multiple technologies.
Hopefully, discussions like this will give someone here an idea that we can all benefit from.
Here's what I want. (Score:2)
Re:Won't work (Score:1)
Whoosh...whoosh...whoosh...mouse everywhere!
Granted, pointers wouldn't really be necessary, but it would be the worst thing ever...
ditch the mouse, keyboard, glove, etc. (Score:3)
Have a look at the MIT Media Lab [mit.edu] Responsive Environments Group [mit.edu] and their current projects [mit.edu], specifically responsive surfaces [mit.edu] as well as the Tangible Media Group [mit.edu] and their senstable [mit.edu] and m etaDesk [mit.edu] projects.
These projects all make use of various methods of sensing user movement without the need to attach extra bits of stuff to the person. One of the projects that I have heard about (I think it is part of metaDesk) uses an array of theremin-like* devices to sense the user's hand positions over a table top. The devices are able to sense both hands, along with palm and finger orientations.
* A theremin is an electronic musical instrument which the musician plays by passing her hand between a radio transmitter and receiver, interrupting the electromagnetic field and producing various tones and sounds. This is similar to the effect you can see on TV or radio reception as people move around relative to the receiver antenna (obviously this doesn't happen if you have cable TV).
Re:Alternate mice (Score:1)
Eye tracking (Score:1)
Re:Good basic idea, but not ASL (Score:1)
So your computer won't need to understand you any more than it does now.
Re:why a virtual keyboard? (Score:1)
How about a glove? (Score:1)
Oh and for the computer geeks who think a touch screen would be nice, hah! I'd love to see you trying to see through twinkie filling.
LiNT
Re:Alternate mice (Score:1)
Handykey Twiddler (Score:2)
--
Eric is chiseled like a Greek Goddess
Re:Good basic idea, but not ASL (Score:2)
It doesn't need to. I see no reason that you couldn't use sign language to write a letter or a story.
Why are we bothering with physical apparatus??? (Score:1)