Technology

Replacements For Mouse And Keyboard?

Qrious writes "I read Tad Williams' "Otherland" a while ago and was fascinated by the interface that is used for controlling computers with your hands. Now I am wondering how far technology is in this field. Aren't replacements for mouse and keyboards around? Gloves and a virtual keyboard? A flick of the wrist that changes back and forth between keyboard and pointing device? Are these things already made? If not, why not? I for one would like to work with something faster than a keyboard. Typing in the air or on a table with some kind of gloves instead of a keyboard would take some getting used to, but would be a lot faster once you are used to it. Anyone else have ideas for some other way of interacting with computers that would make it easier on the hands and faster to control?"

There are certainly a lot of radical extensions of the keyboard as we know it -- chording keyboards like the BAT and the Twiddler, deeply ergonomic ones like the Kinesis, and many variants on the one-handed keyboard.

IBM, of course, is now selling ViaVoice for Linux as well. But are there interfaces available now that are even more out there?

This discussion has been archived. No new comments can be posted.

  • by ecloud ( 3022 ) on Monday August 21, 2000 @07:54PM (#838333) Homepage Journal
    The usual objection against virtual keyboards is lack of tactile feedback. It's not just to let you know that you pressed a key, but to reclaim some kinetic energy; it takes less work if your fingers bounce back from the keys instead of you having to pull them back. Touchscreens are intuitively compelling, and also bad with respect to this feedback if used exclusively - you're punching fingers down against an immovable object and it could probably be rather fatiguing. Your finger is absorbing most of the impact energy. But maybe the problem is not so bad with gloves, because you don't have to move your fingers quite as forcefully in the first place, and vibratory or audio feedback would be enough to let you know that you hit a key. I'd probably get tired of having to put on the gloves, myself; like how could you eat or drink and compute at the same time while wearing them? (I do this a lot, I'm eating dinner right now.)

    I believe that a touchscreen is a good replacement for a mouse; and a trackball would also be an even better replacement under certain circumstances (such as smooth navigation, as opposed to point-and-click). Inspired by the non-existent technology of the Starfire project [asktog.com], I've been pondering the idea of using a combination of a large screen for viewing documents, with no controls on it; a small touchscreen for application-specific controls; a trackball for smooth motion; and a keyboard for typing text. I could operate GIMP entirely via the touchscreen, or a combination of touchscreen and trackball. The trackball could be the 3-axis type which would allow some interesting 3D navigation (but Linux is short on apps which would make use of it, so far). Eventually the keyboard (and some other input modalities) could be gradually replaced by voice recognition.

    The metawidget [ampr.org] idea (that link is getting old, I need to write about my more recent ideas) would be useful in such a system to separate "control" functionality from the document and view parts of the MVC pattern. GIMP for example would keep a socket connection open with the touchscreen display software, on which it would exchange messages about which controls to make available, and receive messages about which controls were selected. So the palettes, the toolbox, and some context-sensitive stuff would be the main things on the touchscreen, and there would be enough real estate to have many more of them available at once, so that most actions can be done with fewer clicks. Cascading menus should be replaced with something more appropriate for punching a touchscreen (when you "dive in" to the next level, the next level replaces the current level; or perhaps with a menu structure that resembles the NeXT file browser). Eventually I will get around to putting up a website at www.metawidgets.org to discuss these ideas.

    In short, I think improving ergonomics ought to be done in a holistic way rather than just putting more bells and whistles on existing devices like mice and keyboards. And there is more than one path to experiment with. I like touchscreens but they are not practical in every situation.

    Now, I'm going to go ramble on a bit here [ampr.org] about my ergonomic workstation idea, in case you haven't already had enough...
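The metawidget scheme described in the comment above, in which an application exchanges control-offer and selection messages with a separate touchscreen display over a socket, can be sketched as a tiny message protocol. Everything here is invented for illustration: the message names, the control names, and the choice of newline-delimited JSON as the wire format. This is not an actual metawidget implementation.

```python
import json
import socket

def offer_controls(sock, controls):
    """App -> touchscreen: announce which controls to display."""
    msg = {"type": "offer", "controls": controls}
    sock.sendall((json.dumps(msg) + "\n").encode())

def report_selection(sock, control):
    """Touchscreen -> app: tell the app which control the user touched."""
    msg = {"type": "selected", "control": control}
    sock.sendall((json.dumps(msg) + "\n").encode())

def read_message(sock):
    """Read one newline-delimited JSON message from the socket."""
    buf = b""
    while not buf.endswith(b"\n"):
        buf += sock.recv(1)
    return json.loads(buf)

# Demo over an in-process socket pair standing in for the real connection.
app_side, screen_side = socket.socketpair()
offer_controls(app_side, ["brush", "eraser", "zoom"])
print(read_message(screen_side))  # the touchscreen learns what to show
report_selection(screen_side, "brush")
print(read_message(app_side))     # the app learns what was touched
```

A real version would keep the connection open for the life of the application and send a fresh "offer" whenever the context-sensitive controls change.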

  • I can't really see us getting rid of keyboards and mice any time soon. A virtual keyboard could be pretty cool, but there are some issues that would need to be worked out, such as the feedback. As for mice, trackballs are pretty cool, and touch screens are nice too, but the fingerprints all over the screen would get really annoying really quickly. As a programmer, there's no way I'd even attempt to code in something like ViaVoice; that is just lunacy!
  • Actually, I've wondered about how static computer interfaces are myself. One thing that sparked my interest was the Diamond mouse that you could pick up off a surface and use, but I found it was a lot more limited than I'd hoped. What I'd like to find is a gyroscopic mouse that can interpret not only linear movement (as a cursor across a window screen), but also twists, pitches, and so on. Of course I could just be playing too many flight sims and FPS ;)
  • Building motion-sensing gloves that are used by pretending you have a keyboard is silly. I think someone should develop some (relatively skeletal) gloves that understand sign language; then you could enter data using an existing gesture set that probably doesn't have any RSI problems surrounding it.
  • by FascDot Killed My Pr ( 24021 ) on Tuesday August 22, 2000 @04:22AM (#838337)
    You can't use sign language for the same reason you can't use spoken language--computers can't yet decode a simple sentence.

    However, you could use some restricted set of sign symbols with a very restricted "grammar". This would be analogous to saying "open program", "close window", "reboot" etc to your computer instead of saying "I want to calculate my gas mileage and then check my email". The advantage in using gesture instead of speech would be that the physical motion *may* be easier to detect than the speech units.

    Of course, since we are now talking about an arbitrary set of physical motions that are intended to convey instructions to the computer we are right back to "virtual keyboards" or something very like it...
    --
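The restricted gesture "grammar" described above amounts to a lookup from short token sequences to commands, exactly as with restricted speech commands. A minimal sketch, with the gesture tokens and command names made up for illustration:

```python
# Each recognized gesture is reduced to a token by the (hypothetical)
# recognizer; short token sequences then map directly to commands.
COMMANDS = {
    ("open", "program"): "launch",
    ("close", "window"): "close-window",
    ("reboot",): "reboot",
}

def interpret(gesture_tokens):
    """Map a recognized gesture sequence to a command, or None."""
    return COMMANDS.get(tuple(gesture_tokens))

print(interpret(["close", "window"]))  # -> close-window
print(interpret(["calculate", "gas", "mileage"]))  # -> None: free-form
```

The second call fails by design: free-form sentences are outside the grammar, which is the whole point of restricting it.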
  • This might be off-topic, but there is a company (whose URL I can't remember) that manufactures foot pedals as replacements for the shift, ctrl, and alt (or meta, if you grok EMACS) keys. It might take a little getting used-to, but it would make those games of Quake a lot more interesting (select weapon with left foot pedal 1, fire with right foot pedal 2, or something like that.)
  • ...you could use some restricted set of sign symbols with a very restricted "grammar"...

    And I think we all know exactly what "sign symbol" should be used for the reboot command, no?

    On second thought, I'm not decided on that - either "the finger" or "banging head on keyboard", depending on the circumstances of the reboot... :=]
    ________________________

  • Mice/trackballs/touchpads are ideal for desktop control because they just require the hand to be moved across a flat surface. Touchscreens, or gloves which are waved around, would be very tiring on the arm. Keyboards are very user-friendly and don't require memorising gestures for each letter/symbol. Voice recognition is all right but is only designed for word processing. When a system is produced which does not require correction, people might have the option of buying computers without keyboards.

    I think that the next stage of interfaces will be cybernetic implants. People will then be able to control computers just by thought. However, I think that this is still a long way off.

Your fingers are *much* faster than your feet at operating switches, so doing this would be (no pun intended) shooting yourself in the foot. Incidentally, your fingers are also significantly faster than your thumb; a lot of Jeopardy players use their index finger rather than thumb to operate the buzzer.
    --
  • Actually, something much like this is already around in the form of handicap-aid tools. Additionally, some military technology could also be used.
    I know of a handicapped person who has lost all motion below his neck and has a MOUTH piece that interprets tongue movement as a keyboard; by simply pressing his tongue against different parts of his mouth he can type letters on the screen. Admittedly, it doesn't qualify him as a master at playing Quake 3, but it certainly allows him some leeway with respect to accessing computers.
    Why does he not use speech recognition, you might ask? Well, I'm sure he does, but speech recognition requires that speech conforms to certain "standards" and that the program recognizes the word/command issued. The keyboard is much more flexible when you're doing something more complex than writing a letter.
    In keeping with this "bodily" interaction, there's been substantial progress in the military with tracking eye movement (don't ask me why), thus allowing you to control a pointer on a screen by simply looking at the screen. Right and left click are done by simply closing the right or left eye. But because "normal" eye movement requires that your eyelids close once in a while, closing BOTH eyes will result in nothing.

    I would agree that sign language would be nice, but interpreting it is somewhat difficult, because I might move my hands/fingers slightly higher/faster/closer, or I might simply bend my joints a bit more than someone else. This could be very difficult for a machine to interpret, as it not only has to transform the movements into letters/words but also sometimes has to make determinations about what was meant in special situations: did he accidentally move an extra finger, or did he forget one? The computer would then have to make determinations based on the context of what was being said. Additionally, how would the gloves know if I touched my nose or my forehead? Two significantly different things in sign language.

    I doubt that will be feasible in the near future, especially since people who can sign can also use a keyboard, but the idea is nice.

    However, for commercial/private use, I think that the virtual glove/glasses combination would be much more likely to succeed in the near future, as these handicap aids are not easier to use than the keyboard/mouse combo, which I think will determine the ultimate success of a replacement for these wonderful but low-baud-rate input devices.
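The eye-tracking click scheme described above (one closed eye clicks, both eyes closed is an ordinary blink and is ignored) reduces to a small piece of decision logic. A sketch, with the eye-state input model entirely hypothetical:

```python
def classify(left_closed, right_closed):
    """Turn one eye-state sample from a tracker into a pointer action."""
    if left_closed and right_closed:
        return "blink"        # both eyes shut: an ordinary blink, ignored
    if left_closed:
        return "left-click"
    if right_closed:
        return "right-click"
    return "move"             # both open: the pointer just follows the gaze

print(classify(True, True))    # -> blink (nothing happens)
print(classify(False, True))   # -> right-click
```

A real system would also need to debounce: a single-eye state held only for a frame or two is more likely tracker noise than an intentional wink.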

  • The trackball could be the 3-axis type which would allow some interesting 3D navigation.

    If you mean a Spaceball [qualixdirect.com] kind of thing (6 degrees of freedom, translation and rotation in each space axis), then this would probably suck for what you want. About the only thing these are good for is manipulating 3D models or scenes, and even then they're questionable.

    The problem is that to rotate, for example, you twist and then hold the thing in its twisted state. The same goes for translation, you displace the ball in the right direction and then hold it there. For broad scrolling stuff, like you mentioned above, this would be worse than a mouse. :) Even for 3D stuff, a 3D mouse [qualixdirect.com] might be better if you want broad navigation (though of course 3D mice have their own set of problems).
  • Once upon a time (1990-91) there was a very innovative company called VPL, whose founder was a very bohemian-type dude named Jaron Lanier (I think I got that right). He developed what we consider today the first "glove" interface. His original glove, according to legend (and fact, actually), was built by him on a kitchen table, so he could play air guitar with his C=64.

    A company (VPL) was soon born - the glove was manufactured and sold, along with a few other virtual reality interface devices, including a body suit, and an HMD. The glove idea - well, that was just too unique - and thus, was patented.

    When the bottom dropped out of the VR market in the mid-'90s, VPL was one of the first to go (though I think they may have been resurrected in some fashion - I have seen a web site). However, the patent lives on. It is broad enough to cover almost any conceivable use for a glove/computer interface - in enough configurations of sensing apparatus (they used fiber optics in the commercial glove, but worked with Mattel on the PowerGlove, which used resistive strain gauges) - that no one has since been able to build a more cost-effective glove.

    In fact, about the only innovation I have seen in a while was a glove that used the touching of fingers and finger/palm to make gestures, which could then be interpreted (and actually, this seems like a much better way to go)...

    My site has further info, if anyone cares to visit...

    I support the EFF [eff.org] - do you?
  • Quite often I use my graphics tablet for things other than Photoshop. The pen tracks just like a mouse, but it's far less strenuous to use for extended periods of time, as you're moving a little under an ounce instead of several ounces like a standard mouse. I've got mine set up so that, in IE 5.0 (for Mac), one of the buttons on the pen is mapped to command, allowing me to command-drag with the pen to scroll. Quite cool and much easier than mousing about.

    ----
  • Typing in the air or on a table with some kind of gloves instead of a keyboard would take some getting used to, but would be a lot faster when you are used to it.

    Uh, *why* would typing on air or a table be faster? Someone has already mentioned tactile feedback and the need for the "spring-back" effect to reduce fatigue (and think about how much your fingers would hurt if you typed on a *table*), but there is also the question of how *your* perception of the difference between the "k" and "l" keys would compare to *the computer's* idea. Another factor in fatigue is the dreaded "gorilla arm" syndrome, which pretty much doomed touchscreens for most applications.

    If you're wanting to go with a "virtual" interface, why stick with limiting concepts like a mouse and keyboard?
    --

  • You can pick up a regular mouse and use it. That's how I use mine. I have the mouse turned around, the part of my palm under my thumb on the "left" button, my thumb on the top of the mouse, and the rest of my fingers around the bottom. When I "right" click, I just move my thumb over there. A left click is just a squeeze (remind you of any recent mice *cough*Mac*cough*?). My middle finger operates the ball as a trackball. I find that this position is better because it keeps the wrist in a neutral position. Try it! YMMV, of course.

  • Canon makes a SLR camera (EOS 3) that can track eye movement and focus on that point. The viewfinder is divided into 45 squares. Whatever square you are looking at is the one it will focus on. Why can't they make goggles that do the same thing for monitors?

    Neil
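Mapping a gaze position to one of 45 squares, as described above, is simple arithmetic if you treat the view as a 9x5 grid. (The EOS 3's actual focus points are not laid out as a uniform grid; the grid here just follows the comment's description, with gaze coordinates assumed normalized to [0, 1).)

```python
COLS, ROWS = 9, 5  # 9 x 5 = 45 squares

def gaze_to_square(x, y):
    """Return the index (0-44) of the square a normalized gaze falls in."""
    col = min(int(x * COLS), COLS - 1)  # clamp so x = 1.0 stays in range
    row = min(int(y * ROWS), ROWS - 1)
    return row * COLS + col

print(gaze_to_square(0.0, 0.0))  # top-left corner -> square 0
print(gaze_to_square(0.5, 0.5))  # dead centre -> square 22
```

For a monitor, the same function would pick which on-screen region the pointer (or focus) snaps to; finer control would need a denser grid or raw coordinates.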
  • Consider the tasks we perform on a daily basis.

    Emails - I cringe at the thought of someone sending me a voice created email. Just listen to the voicemail you get sent on a daily basis. Can you imagine it in writing and going on for two pages?

    Coding - Try speaking a shell, Perl or C++ program. Did you remember all the "." and ";"? Did you remember to declare those variables? I recall a study about our ability to speak and reason at the same time. Humans are not very good at it. (Quite a statement there, I just realized.)

    Perhaps we should just give up on the QWERTY board and go back to the most efficient design. Staying QWERTY helped when we were converting from typewriters and needed to make it easy on everyone. Now we need to jump ahead to efficient typing.

    As for mice, something else is needed. I do not want to remove my hand from the keyboard while typing. While surfing, etc. I do not want to use my keyboard unless I am typing a URL. (Perhaps a laser pointer on my hand?). However, a mouse or "precision" instrument is going to continue to be needed for 3d rendering, graphics, etc.

    Try putting a piece of paper on the wall and writing your name. See the problem? Pens are incredibly refined instruments. They relieve much of the pressure on your hand, provide a stable surface and are able to support writing in almost any direction for extended periods of time. I believe UI are going to evolve the same way, slowly, over time and to a simplified fashion combining multiple technologies.

    Hopefully, discussions like this will give someone here an idea that we can all benefit from.
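The layout-efficiency point above can be made concrete with a toy metric: the fraction of letters in a sample text that sit on the home row under QWERTY versus Dvorak. This is only an illustration, not a real typing-effort model (real comparisons weigh finger travel, alternation, and row reaches, not just home-row hits).

```python
# Home-row keys of each layout (QWERTY's ";" counts as a home-row key
# but is not a letter, so it never matches here).
QWERTY_HOME = set("asdfghjkl")
DVORAK_HOME = set("aoeuidhtns")

def home_row_fraction(text, home_row):
    """Fraction of the letters in text that lie on the given home row."""
    letters = [c for c in text.lower() if c.isalpha()]
    if not letters:
        return 0.0
    return sum(c in home_row for c in letters) / len(letters)

sample = "the quick brown fox jumps over the lazy dog"
print(round(home_row_fraction(sample, QWERTY_HOME), 2))
print(round(home_row_fraction(sample, DVORAK_HOME), 2))
```

Even on this tiny pangram, Dvorak's vowels-plus-common-consonants home row scores noticeably higher, which is the usual argument for it.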

  • Two optical mice with 7-10 buttons each - one for each hand - used in conjunction with an eye-tracking system & a reliable voice recognition program.


  • Despite your eyes not focusing on a single point, what about those who use mouse trails in Windows? It could get a bit annoying to watch someone with mouse trails on reading a document.

    Whoosh...whoosh...whoosh...mouse everywhere!

    Granted, pointers wouldn't really be necessary, but it would be the worst thing ever...

  • by dutky ( 20510 ) on Tuesday August 22, 2000 @07:24AM (#838352) Homepage Journal

    Have a look at the MIT Media Lab [mit.edu] Responsive Environments Group [mit.edu] and their current projects [mit.edu], specifically responsive surfaces [mit.edu], as well as the Tangible Media Group [mit.edu] and their Sensetable [mit.edu] and metaDesk [mit.edu] projects.

    These projects all make use of various methods of sensing user movement without the need to add extra bits of stuff to the person. One of the projects that I have heard about (I think it is part of metaDesk) uses an array of theremin-like* devices to sense the user's hand positions over a table top. The devices are able to sense both hands along with palm and finger orientations.

    * A theremin is an electronic musical instrument which the musician plays by passing her hand between a radio transmitter and receiver, interrupting the electromagnetic field and producing various tones and sounds. This is similar to the effect that you can see on either TV or radio reception as people move around relative to the receiver antenna (obviously this doesn't happen if you have cable TV).

  • That dang mouse ball keeps pushing up into the mouse casing, where it's quite difficult to turn by hand (I'm using an IntelliMouse, but most others are structured similarly). It's a little better when I take the ball cover off, but then it falls out too easily. Maybe you've just got a knack to it.
  • As a replacement for the mouse, an eye tracking device would be pretty nice. You should be able to move the mouse pointer by just looking where it should go. The mouse buttons could be mapped to some key of the keyboard. One could even eliminate the mouse pointer (usually you know where you are looking at). People could develop the ability to move both eyes independently, thus having two "mouse pointers" and working twice as fast! (useful e.g. for drawing lines in GIMP or marking text or reading two /. articles at the same time).
  • Sign language, however, has a set of 26 strokes for letters... Then, it's just acting like a keyboard except no RSI, and after you've learned it it's faster than speaking [letters]!

    So your computer won't need to understand you any more than it does now.
  • Actually, I wouldn't type in the air. I'd let my hands hang wherever they were comfortable and wriggle my fingers a little to type. What's more ergonomic than your hands on your knees when you're sitting? Or your arms swinging back and forth as you're walking?
  • I always thought it would be nice to have something similar to a glove. Anyone remember that Nintendo glove they had? How about the movie with Fred Savage called The Wizard? It was in there. Seriously though, I think it would be very easy to move your hand side to side, and up or down, to mimic the movement of a mouse. There could be some issues with wrist strain, but it might help to have the user hold on to a ball or something. Blech... just an idea.

    Oh and for the computer geeks who think a touch screen would be nice, hah! I'd love to see you trying to see through twinkie filling.

    LiNT

  • "Diamond mouse"...is that what you call this thing I picked up at a flea market? It plugs in as a PS/2 mouse, but is a ring you slip on your index finger. Small trackball on top, left-click with a trigger inside the ring, and two mouse buttons on top near the trackball. It's basically a thumb-operated mouse with an extra left button which you squeeze as a trigger.
  • HandyKey [handykey.com] - mouse, keyboard and shortcut device operated by one hand.

    --
    Eric is chisled like a Greek Godess
  • That's like saying you can't type english on your keyboard because your computer doesn't understand english.

    It doesn't need to. I see no reason that you couldn't use sign language to write a letter or a story.

  • We can already cybernetically enhance a blind person's vision (a totally blind man had nerves stimulated to simulate a character), a deaf person's hearing (hearing aids) and a disabled person's muscles (prosthetics, I think that's how you spell it). How long will it be before a full-colour screen, inner-ear audio implants, and using thought impulses for "typing" (or straight from thought to text?) for input? The technologies are here, albeit very young. In the future, the next person you meet could be doing a mental search for your face in their augmented memories. Think about it.
