Technology

Top Research Labs in Human-Computer Interaction? 184

legLess writes: "Jakob Nielsen's latest Useit column lists his opinion of the best HCI research labs, from 'The Dawn of Time' (1945) 'til now. Xerox PARC made the list each decade, naturally. He says that future HCI research is in jeopardy, partly due to Universities backing away from 'real-world' research, and partly because 'HCI has rarely been the first priority of new research organizations, so by the time research managers recognize the need for it and build up a world-class HCI team, it's often too late.' Is he right about the best labs? Is he right about his other conclusions?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Microsoft made it to the top in 2000-2010? I wonder what they were doing?
    • If you haven't noticed by now, they are making quality, usable, innovative software for both the general consumer and corporate worlds. Oh, yeah, and they are making money from it. That should tell you something.
      • I would certainly believe that technical excellence does not equate to "making money".

      • by Anonymous Coward
        I'm running with Internet Explorer 6.0 right now.
      • I've noticed that beating up people is also profitable -- especially if you are innovative and manage to force them to submit to repeated beatings by telling them there isn't any other option available.

        Yep, making money is a surefire sign of innovation.

        --
        cscx: "Konke.. erm.. konceq.. erm.. that linax web browser thingy.. yeah that.. it's a monopoly"

        brought to you by the "hey look at me, I can say moronic things with impunity using a pseudonym" association of inbred programmer wannabes.
      • If you haven't noticed by now, they are making quality, useable, innovative software for both the general consumer and corporate worlds.

        This is both questionable and totally beside the point. Microsoft Research tends to be on the theoretical side of things--they make mostly prototypes and publish papers, rather than software they expect to sell someone. Complimenting Microsoft Research isn't complimenting the current state of Microsoft products--but rather their possible future ideas.

        If you go here, I think you'll see they aren't actually selling most of the stuff they're researching today: http://research.microsoft.com/

    • MS has perhaps the best research team (at least nowadays) when it comes to HCI stuff. Think about it -- who invented that top-notch joystick? Natural shape keyboards? Wheeled mouse? MS on all three.

      Granted, if only their software groups could listen to them saying what a good error message is, we'd be set. (Sorry, ranting about the error message in Outlook -- "Operation could not be completed. Object Not Found" -- when clicking Send & Receive.)
      • Re:Microsoft? (Score:2, Interesting)

        by Ubergrendle ( 531719 )
        "MS has perhaps the best research team (at least nowadays) when it comes to HCI stuff. Think about it -- who invented that top-notch joystick? Natural shape keyboards? Wheeled mouse? MS on all three."

        Joystick - Thrustmaster. Copied by MS.
        Wheel mouse - Logitech. Copied by MS.
        Natural keyboard - Not sure, but I had previously seen ergonomic models by IBM and Logitech long before MS got into the peripheral scene.

        I suggest MS is being cited for its GUI design and consistency across product lines more than anything else.
        • Joystick? How about Boeing or Lockheed? Weren't most topnotch computer joysticks based on fighter aircraft joysticks?
          • The early popular analog joysticks from Thrustmaster were largely based on commercial designs (e.g., F16, F15, etc.). They used it as a selling point, being identical in feel to the "real thing".

            They also cost >$100 when basic joysticks were running $10. The throttle was another $100 or two.

            Nowadays some of the better joysticks [saitekusa.com] aren't based on real fighter-jock ones, but they're also way cheaper ($79 for the X45, about $30-40 for a combo joystick/throttle with numerous buttons and hats). Thrustmaster has also come down in price because the market has expanded, plus competition has forced lower prices.

            Who first invented the force feedback style controllers? I think Nintendo was the first to popularize them, but as this thread shows, popularize != invent.
      • MS has perhaps the best research team (at least nowadays) when it comes to HCI stuff.

        Though somewhat off-topic of the original HCI article, Microsoft also has the best research team in the world on 3D graphics, natural language processing, and a few other fields. Billions of dollars can rent an awful lot of talent.

        • Microsoft also has the best research team in the world on 3D graphics, natural language processing, and a few other fields

          Yup, it's one of the areas where they actually do produce interesting ideas.

          The work on 3D interfaces such as Task Gallery [microsoft.com] is pretty cool.

          I'll stop before this turns into a Monty Python-style 'what has Microsoft ever done for us' rant.

          Why is it that the good stuff at Microsoft stays hidden in the research labs?

    • Msft R&D is pushing the envelope beyond human/computer interaction to human subjugation to computer domination. In the not-too-distant future, most drone employees will have a PC for a supervisor, will take direction from one, will be instructed what to do, what to buy, who to pay, etc., while the programmer bosses will be freed from the drudgery of workforce management for more creative lifestyles, like golf, boating, etc.
    • If you want to see what research Microsoft is up to, go to http://research.microsoft.com/ [microsoft.com]

      They are working on some interesting stuff.
  • Jakob Nielsen's latest Useit column lists his opinion of the best HCI research labs, from 'The Dawn of Time' (1945) 'til now.

    Interesting definition of when the dawn of time took place.. :)

  • I'd like to get into HCI professionally but I have no credentials. Can anyone point out a good place to start? Even if it's just a few books? What kind of degree do you need, and what are the best schools?

    -dbc
    • Two great places to get info & book links are
      http://www.useit.com
      http://www.asktog.com

      the two people behind those sites are considered to be among the best in the field
    • by Anonymous Coward
      Degrees in HCI vary quite widely. Some come from psychology, industrial engineering, computer science, or just about any other field, since human factors and HCI are really a combination of disciplines.

      A couple of good places to start would be the ACM SIGCHI http://www.acm.org/sigchi/ and the Human Factors and Ergonomics Society http://www.hfes.org/

      If you are looking for a crash course, both CMU and the University of Michigan put on one- or two-week courses every year.

      Info about CMU's Course is at http://www.hcii.cs.cmu.edu/

      and info about U of M's course is at
      http://www.umich.edu/~driving/shortcourse/index.html

      Hope this helps.
    • Carnegie Mellon has the best graduate degree in HCI. They even offer undergraduate courses, which I believe they are unique in doing.
    • HCI is a broad field, and its practitioners have a wide range of degrees, backgrounds, expertise, etc. Most either have degrees in psychology or in computer science (e.g., me), but some have degrees in art or design. Although HCI is not viewed well in some CS departments/schools, there are some where it's well-supported, such as U.C. Berkeley, Virginia Tech, Georgia Tech, and Maryland (College Park) (to name some I can think of off the top of my head). Also, there are a small but increasing number of schools that offer degrees in HCI. Carnegie Mellon offers a professional Master's and PhDs in HCI, for example, at the HCI Institute [cmu.edu]. (Full disclosure: I currently work at the HCII.) As another comment said, schools or departments of information science/technology are becoming more prevalent, and would provide a suitable background for HCI.

      Then again, you don't necessarily need a degree in HCI, CS, or psych at all. For example, if you're coming from the programming side (as I suspect many here on /. are :) ), you could get a job building user interfaces, which is mostly programming with some HCI component. Then you could migrate pretty smoothly to doing higher-level, design type work, which would be more HCIish and less CSish.

      As far as books, here are a few I like:

      • The Design of Everyday Things, by Don Norman.
      • Programming As If People Mattered, by Nathaniel Borenstein.
      Dan Olsen and Ben Shneiderman have written good HCI/UI (user interface) books, too.

      If you want to see what the cutting edge of HCI is, check out proceedings and journals, such as the ACM conference on HCI (Human Factors in Computing Systems [acm.org], a.k.a. SIGCHI) or the ACM Symposium on User Interfaces Software and Technology (UIST) [acm.org].

  • While there has obviously been a great contribution by Xerox PARC in the field, methinks Nielsen included it in every decade more to make a point than because they really deserved it. What have they done in the past five years anyway? The past ten? Not all that much that I can see. Their list of accomplishments [xerox.com] reads like high-tech marketing mumbo-jumbo, and makes some pretty far-reaching claims (object-oriented programming?).

    Nielsen's piece is more important to read because of its (rightful) insistence on HCI as something which is rarely considered when it should be.

      Their list of accomplishments [xerox.com] reads like high-tech marketing mumbo-jumbo, and makes some pretty far-reaching claims (object-oriented programming?).

      Yes, OO and GUI were developed at PARC, but Xerox had no idea what they had in their hands, and let it slip away. Steve Jobs visited them on a corporate junket, and that's where the Macintosh came from (true story). A bit later, Jobs came out with NeXTStep. This illustrates that engineers need marketing and vice versa.

      This would be embarrassing if not for the fact that IBM did exactly the same thing with RDBMS and indeed the PC, but it's got to rate alongside the greatest corporate blunders of all time.
      • No, not quite. OOP started with Simula and systems like Sketchpad, but PARC did have a big role in the 1970s and 1980s.

        Also, Xerox had every idea of what they had in their hands and tried hard to market it. But, as you may discover yourself some time, even with great technology and great marketing, having a huge success is still luck.

        Even Apple priced itself out of the early market with the Lisa. Apple only hit a marketable formula the second time around, and with the Macintosh they ended up selling something that looked a lot like the original Xerox systems but was much more primitive.

      • Steve Jobs visited PARC and saw the Star user interface, which was one of the first WIMP UIs. When he came back and described what he saw to the Lisa development team, Steve swore he saw overlapping windows in the UI. It turns out that he didn't see overlapping windows -- the Star didn't have them -- but the Lisa team managed to implement them to meet Steve's dream.
    • "and makes some pretty far-reaching claims (object-oriented programming)? "

      The first true object-oriented programming language, Smalltalk, was developed at Xerox PARC. It's not really a far-reaching claim.
  • by Telastyn ( 206146 ) on Friday April 12, 2002 @11:49AM (#3329757)
    This might be dumb/silly, but isn't it more that universities usually give out research funds by department, that departments rarely ever share, and that this sort of research requires both CS/CE knowledge *and* psychology?

    • The University of Minnesota has most of their computer science professors working with the medical school professors. Seems they see a lot of applications for computers in medicine, but the time it takes to become an expert medical doctor means you don't have time to become an expert programmer.

      I've always said that computers alone are useless. Combine a computer with some other field and it is extremely useful.

      • I agree with this assessment. I got a Masters in Computer Science from the University of Minnesota, and one of the courses I took was from the Psychology Department, named "The Psychology of Human/Computer Interactions," and the course was EXCELLENT. I don't know that they really offer much to students after this course, but the one I had was great!
    • This is a cross-disciplinary field, and isn't taken too seriously by computer science departments OR psychology departments. Even Don Norman, whose work in the cognitive psychology of learning is still pretty important, probably couldn't get hired by a psych department nowadays. Consequently, in the past 5-10 years, a new trend has emerged--HCI researchers are finding homes in--of all places--the library. Increasingly, universities are starting "Schools of Information", or other similarly-named departments. These often combine HCI, Library science, design, aspects of cognitive science, and sometimes aspects of business school economics and sociology. And they are typically well-funded, both internally (via university "information initiatives") and externally (via corporate and government grants). Furthermore, they are frequently "Professional" schools, offering masters degrees to people who go off and work in all corners of the IT industry.

      Of course, this doesn't mean anything "Good" is going on in these schools. But many of Nielsen and Norman's colleagues who haven't found cushy jobs as consultants at NNG are the people who founded these schools. I don't think the future of HCI really has much to worry about.
    • The problem you describe is called "Reductionism", which most universities suffer from. Reductionism tries to divide human knowledge up into a bunch of unrelated pigeonhole categories, like Science, Art and Humanities.

      HCI spans many categories, which makes it hard to fit into one pigeonhole. That suggests reductionist categorization is the wrong approach to education, not that the HCI people belong segregated with the humanities people.

      It's the hard computer science people who need to get out of the department more often.

      -Don

  • Boston College [bc.edu], though it lacks a graduate program in CS, is still doing some really interesting work in HCI. The CameraMouse [bc.edu] and EagleEyes [bc.edu] use computer vision and muscle electric potential, respectively, to control the mouse cursor. While this is mainly a user-assistive technology, they're continuing to develop the technology, and at some point one of these could move into the mainstream of HCI.
  • CMU's Human Computer Interaction Institute (http://www.hcii.cmu.edu/) is worth a look - B.S., M.S., and Ph.D. degrees are offered.
  • My HCI teacher (Score:5, Informative)

    by ajiva ( 156759 ) on Friday April 12, 2002 @11:56AM (#3329804)
    Ken Perlin was once a guest lecturer at my HCI class at Stanford. This guy has so many good ideas; check out his web page:

    http://mrl.nyu.edu/~perlin/

    A lot of his work is Java/Web-based, so it's really easy to look at and get a feel for how it would work.
  • I vote that Amazon should be included here since the net is more commercialized now, and they patented the "one-click" way of e-commerce.

    I suggest he should put up a survey and include "Cowboy Neal" among the choices.
  • by AVee ( 557523 )
    Human-computer interfaces aren't much of an item any more. I think he is looking in the wrong direction. He is looking at how people interact with their home PC. If you look at the last few years, the way we interact with our PCs hasn't changed much. But PCs haven't changed much either. Yes, there is more disk space, faster CPUs, etc., but how they work and what we do with them hasn't changed much. That means there isn't much need for a better interface.

    Only a new technology needs a new interface; the way we currently interact with PCs has been around for some time now and everybody is fine with it. If you want to see interesting things, I think you should be looking at newer devices like mobile phones, PDAs, etc.
    • I don't think Nielsen is limiting his scope to the PC. It's hard to tell because he doesn't say why each of his picks makes the list, but each of his choices for the current decade has a strong investment in ubiquitous computing [xerox.com], as well as speech understanding and generation, which have obvious implications for mobile phone interfaces.
    • If you look at the last few years, the way we interact with our PCs hasn't changed much.

      That's the problem. Today's computer user is not a highly technically literate professional the way they were a decade or two ago. The average Joe now has a PC, Mac or whatever sitting on his desk. By your own admission, interfaces have not developed to support this new class of user in performing his tasks.

      Added to which, I think the state of interfaces at present is pretty sucky even for the expert user. For a long time, the productivity in most offices was known to drop significantly when "old fashioned" tools went out in favour of modern computers. Has anyone ever seen anything to suggest that this is not still the case?

    • But PCs haven't changed much either. Yes, there is more disk space, faster CPUs, etc., but how they work and what we do with them hasn't changed much. That means there isn't much need for a better interface.

      No, that's wrong. Networks and especially the internet have fundamentally changed the way we use our computers. It's not really a new technology, but it allows new ways of interacting. And many users don't understand the consequences of this, which is why spyware and email worms work so amazingly well, despite the fact that if you understand how your computer works, it is really easy to not be affected.

      I believe that this is an interface problem: computers no longer communicate the consequences of an action the user is about to take in an appropriate way. (Actually they never had appropriate interfaces, but that wasn't a problem because until recently people didn't communicate with untrusted computers over untrusted networks all the time.)

  • It's worth noting that Don Norman [jnd.org], the former VP of Apple's Advanced Technology Group and the author of The Design of Everyday Things (among others) is currently a professor at Northwestern University. He's teaching a class this quarter, the future design of everyday things (sorry--login required for the class page), and it's fascinating!

    Josh
    • Look at the list of things he cares about. DVD menus are too complex? People can't deal with home theaters? Who cares? Why is that even worth spending any time on?

      Most people pop DVDs into their player and play them. Some people play around with the menus, many people don't. It's a leisure activity. It doesn't have to be efficient. It doesn't have to be simple. And if some people don't get it, it doesn't matter. If it's quirky, that's part of the charm. If customers don't like it, the manufacturer's focus groups will let them know.

      Tog had a similarly irrelevant column tearing apart the MacOSX dock. Come on, what's the problem? It looks nice, people like it, and anybody with an IQ greater than 80 can use it. Optimizing it doesn't save anybody any real amount of time.

      Guys: spend some time thinking about some real stuff, stuff that matters. Saying semi-obvious things about trivial little features really isn't interesting.

      • No, Norman is right. (Disclaimer, I saw Norman's keynote at UIST '94).

        I'm a geek. And I thanked heaven the day I realized that my new VCR would set its own clock. I have an ancient '80s-vintage VCR that I still can't remember how to program without the manual. And I hate the DST time change, because I have no clue how to set/change the time on my daughters' digital watches (four unlabeled buttons -- too small to really press properly -- none of which has the obvious function of time set).

        If we are to enter the era of what Norman calls "ubiquitous computing", then we've got to make it so you don't need to THINK at all to use the damn puppies.
        • And I hate the DST time change, because I have no clue how to set/change the time on my daughters' digital watches (four unlabeled buttons -- too small to really press properly -- none of which has the obvious function of time set).

          The lower left button should toggle the stopwatch function. Hold it down for a second or two to access alarm set. Press it once again to access time set. Ask one of your daughters to help if you're still stuck.
        • No, Norman is right. (Disclaimer, I saw Norman's keynote at UIST '94).

          I didn't say he was wrong; I said many HCI researchers are addressing irrelevant problems.

          And I hate the DST time change, because I have no clue how to set/change the time on my daughters' digital watches (four unlabeled buttons -- too small to really press properly -- none of which has the obvious function of time set).

          If you can't deal with it, why did you buy that watch? Or if your daughter bought it, why do you set it for her? It's her problem. There are plenty of analog watches with a crown; I have one. Other people may want the features that are accessed through those buttons.

          If we are to enter the era of what Norman calls "ubiquitous computing",

          The era is already here, and Norman didn't even invent the term.

          • If you can't deal with it, why did you buy that watch? Or if your daughter bought it, why do you set it for her? It's her problem.

            Can you say "grandparents"? I knew you could.

            Other people may want the features that are accessed through those buttons.

            The point is, it's not obvious what all these buttons do, they're hard to press, and *SETTING* a watch should be a fairly obvious function, and simple to do.
    • It's also worth noting that when Apple sought to create a computer with a radically different user interface, they studied (and eventually paid for) the philosophy and designs set forth by Xerox's PARC.

      That's not to say Apple hasn't had its own things to say about the subject. They used a simplified version of the interface for the Lisa (we remember how that did), and have continued developing it in-house since then to try to perfect the Macintosh.

      (Come to think of it, given how Apple chewed through CEOs and how many different design shifts there were in those years, it's amazing that the Mac's interface didn't get shuffled around more than it did.)

      (Well, if you don't count Gil Amelio, Steve Jobs, Rhapsody, and X.)

      In 1984 they did publish the Apple Human Interface Guidelines, which some people still look to, but PARC is where the GUI fun started.

  • by j09824 ( 572485 ) on Friday April 12, 2002 @12:12PM (#3329910)
    I have followed HCI research on and off for the last decade, and I think it's largely missing the point. Just look at some really long-lived and successful real-world user interfaces: musical instruments, typewriters, cars, bicycles, electronic devices, etc. What makes a user interface successful is a very complex mix of factors. Intuitiveness and efficiency, the two criteria that are the focus of much HCI research, are only minor factors; style, design, power, simplicity, and physical constraints are often much more important--and they should probably be for computer interfaces as well.

    Or, in different words, if musical instruments were designed like software, instead of violins and pianos, we'd probably only be getting those electronic children's books that play a melody when you touch different parts on the page. Kind of intuitive and easy, but not exactly very powerful or interesting.

    • An interesting counterpoint. However, as an aside, I'd like to note that the trombone, being one of the first brass instruments, later evolved into the euphonium. :) Easy interface: buttons and semi-permanent adjustable valves. A lot easier for a beginner to use than one giant main valve. However, the other valves on it allow for the same flexibility as is found in the trombone. Interesting, now that I think of it! lol
    • if musical instruments were designed like software, instead of violins and pianos, we'd probably only be getting those electronic children's books that play a melody when you touch different parts on the page. Kind of intuitive and easy, but not exactly very powerful or interesting.

      A better example of this is the bicycle. It is not too intuitive and has a rather steep learning curve. If "pedal vehicles" were designed by a focus group we would only have Big Wheels. Sometimes a steep learning curve is worth the payoff.

    • Nielsen's over-emphasis on regularity in interfaces makes him unpopular with designers. It's hard to make an interface both compelling and easy, but not impossible. Nielsen, like many other "usability experts," is taking the easy and more elementary road. His popularity comes from putting in print what people think intuitively (face validity) -- similar to Maslow's popularity in pop psychology. Maslow came up with a "theory" (the need hierarchy) that people in general could understand and "apply to their lives," which was hugely popular but completely non-science.

      Sure, Nielsen is right that you have to keep things in their place, not change their location, and make information easy to find through its organization, but that's Usability 101.

      I'd recommend The Psychology of Everyday Things by Norman, and the Tufte books. These sources, unlike Nielsen, leave designers with their creativity intact.
    • (* Or, in different words, if musical instruments were designed like software, instead of violins and pianos, we'd probably only be getting those electronic children's books that play a melody when you touch different parts on the page. Kind of intuitive and easy, but not exactly very powerful or interesting. *)

      More "old == good" bias here. One could make such buttons analog sensors, that respond different ways depending on how hard you pressed, back-and-forth finger movement, etc. You have many "axis" and variables that could possibly be measured.

      However, that would make the book cost like $2000, something nobody will pay for.

      Just like old-style musical instruments, you get what you pay for.

      It is a matter of resources, and not old==good.
    • Most HCI research does take into account successful "real world" interfaces. The first part of Jef Raskin's The Humane Interface talks about, among other things, a very non-intuitive-looking old shortwave radio that's very easy to use, and why so many people prefer knobs in car radios to the array of buttons most modern implementations seem to have.

      You're making the assumption that HCI research is about dumbing things down for the user. I don't think that's true at all--to put it inelegantly, it's about making the interface get out of the way, to be as transparent as possible to the task. Raskin takes an awful lot of heat from people infuriated by his dismissive attitude toward skinnable interfaces, but if you actually look at his research, he's advocating interface designs which are very powerful--e.g., entering commands in a text editing field by typing them in the text stream and pressing a [command] key, or navigating entire document collections with incremental searches. This is not the UI equivalent of "electronic children's books," and that's an unfair dismissal of HCI research as a whole.

      Most HCI researchers are dismissive of current GUIs because they're not making any attempt to change the paradigm. "If it works, don't fix it" sounds nice, but if we followed that too slavishly, we'd be steering our cars by reins--computers have changed sufficiently since the early '80s (in volume of information, at the very least!) that it's worth considering the thought that productivity could be improved if we were trying to do more than make our interfaces translucent and shadowed.

      • '...we'd be steering our cars by reins...'

        As a side note.

        Cars first came with tillers, like on a boat. Think about which way you would push the tiller to turn left?

        Boats originally had tillers but changed to wheels. It had to do with larger boats and the need for mechanical purchase to steer. The original wheels steered the wrong way; that is, turning the wheel left caused the vessel to turn right. The reason was that everybody was used to the tiller!

        Lastly, all this horsepower and I'm still not able to control my computer by thought alone. Jesus, would the comp-sci guys get off their asses and hop to it. I've got a world to conquer and this keyboard is just too slow.
  • by cheekymonkey_68 ( 156096 ) <[ten.ku.urugbew] [ta] [dcma]> on Friday April 12, 2002 @12:19PM (#3329958)
    HCI has rarely been the first priority of new research organizations

    That's true, but the real failing has been its use in industry; HCI is rarely the first priority there either, often being seen as expensive, time-consuming, and something separate from the traditional design process.

    How many projects actually fail because the developers designed the system that the client wanted, not what the users would realistically use on a day-to-day basis?

    The most practical aspects of HCI focus on understanding the user, and most modern software design methodologies take account of this... but actual use of HCI in real life is really lacking.

    It's one of the main reasons projects fail in the long term; OK, poor project management and vague requirements do the most damage, but it's still pretty important.

  • Some of the best HCI work has been done in areas like aircraft control.
    I don't think anyone would disagree that the Eurofighter development team has put a lot of research into HCI.
    Car manufacturers are also doing a lot of good HCI work.
    Nokia managed to develop an efficient interface with a low learning curve, which is a fairly major achievement.
    I think things like touchtone phones and remote control devices should have made the list.
  • Check out Georgia Tech's GVU pages (some profs hold appointments in both psych and CS)

    GaTechGVU [gatech.edu]
  • by Dr. Zowie ( 109983 ) <slashdotNO@SPAMdeforest.org> on Friday April 12, 2002 @12:31PM (#3330014)
    First thing I thought was, "hmmm... haven't we understood hydrochloric acid for a long time now?"
  • He says that future HCI research is in jeopardy

    I *strongly* disagree with him on this. In fact, the opposite is true. It is only in the past few years that universities and industry have realized that there is a HUGE demand for human factors or HCI specialists.

    Engineering departments are also realizing that undergrads can benefit greatly by taking a human factors course in product/system design.

    If anyone is interested in bringing human factors into their engineering education, I suggest you look at Kim Vicente [utoronto.ca], who is trying to make human factors a part of every engineer's education.

  • My two cents (and I am biased) is that the future of HCI involves the human and the computer sharing a common frame of reference.

    My dissertation research [greatmindsworking.com] involves developing a software system that will allow a computer to acquire a lexicon grounded in visual experiences. Thus words to a computer start to have some "meaning" rather than just being based on other words.

    I'm working through the Robotics Research Lab at LSU [lsu.edu].

  • Other Rankings (Score:4, Informative)

    by yerdaddie ( 313155 ) on Friday April 12, 2002 @12:53PM (#3330141) Homepage
    How rigorous. Usability pundit picks pet criteria and decides that these are the top HCI labs. Those interested in the real state of the field instead of opinion might take a look at the more rigorous listings available:

    Top Research Labs by Topic, 1978 and 1997 [businessweek.com]

    Where Researchers Want to Work [businessweek.com]

    BusinessWeek's Top 20 US Research Labs [businessweek.com]

    Google Cache of 1999 US News ranking of User Interaction Grad Schools [216.239.51.100]

    MIT Technology Review Corporate R&D Scorecard [technologyreview.com] (Requires subscription)

    HCI Academic Article Impact Rankings [nec.com]

    I think that few of the people on avant garde of HCI research take Jacob Neilsen very seriously. He is a usability specialist, not an interface researcher.

    • Re:Other Rankings (Score:3, Insightful)

      by Watts Martin ( 3616 )
      I think that few of the people on avant garde of HCI research take Jacob [sic] Neilsen [sic] very seriously.

      So, instead of the Nielsen Norman Group, we should be listening to Business Week? Only one of the lists you linked to was about HCI research--an automatic indexer of published journal articles, many of which--even in the Interface Design subsection--are only loosely connected to research toward making more usable interfaces, which, yes, is what Nielsen (rightly) harps on.

      NN/g may not be "avant garde," but they're taken seriously by businesses [nngroup.com], which makes your counterpoint of Business Week's lists faintly ironic. You don't need to be an interface researcher to make observations about the state of applied usability research; you need to be someone who studies usability in applications for a living.

  • HCIL for Kids (Score:2, Informative)

    by Anonymous Coward
    The University of Maryland's Flagship branch in College Park has a Human-Computer Interaction Lab that focuses in part on making NEW technologies for kids. This includes computer software, and cool interactive toys (think Teddy from A.I.). They have a team of children who help with the design process, and are overall doing all kinds of really neat things. I think they should have at least received an honorable mention, if only for including kids in the research process, and making _new_ technologies.
    The kid-oriented website is here:
    http://www.cs.umd.edu/hcil/kiddesign/

    The HCIL exists under the umbrella of the UM Institute for Advanced Computer Studies, their grown-up page is here:
    http://www.cs.umd.edu/hcil/

    Like several other responses, I thought the list was entirely too random, and didn't include nearly enough explanation of who got picked and why.
  • Does anyone else besides me think that Jakob Nielsen is an idiot? I've read several things by this guy, and have yet to agree with him on anything substantial, and I'm a UI fanatic.
    • I'm behind you; most of his points are ridiculous. Things like recommending all links be blue because changing the colour would cause the user to lose "several seconds" of "cognitive overhead"... give me a break! (The convention he's defending is sketched at the end of this post.)

      I wouldn't go so far as to call the guy an idiot though, because overall he does have a point. That point being that most web designers don't think as much about how people will use the interface as about how good it will look.

      That attitude tends to produce navigation where you have to SEARCH to find the links on the page. While that's great for a piece of art or for a popular band's website where people want to have fun, that REALLY sucks for business.

      The guy's got the right idea: the web needs to be built with the user in mind. He doesn't seem to have the knack for really understanding a user, though; he takes a much too scientific approach to something that is clearly an art.
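
      For reference, here is the convention at issue sketched as a minimal stylesheet. This is purely illustrative -- the hex values are typical browser defaults (an assumption), not anything Nielsen prescribes:

          <style type="text/css">
            /* Leave links at their familiar defaults instead of restyling them:
               blue for unvisited links, purple for visited ones. */
            a:link    { color: #0000EE; }  /* unvisited link: typical default blue */
            a:visited { color: #551A8B; }  /* visited link: typical default purple */
          </style>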

      • "That point being that most web designers don't think as much about how people will use the interface as much how good it will look." That would be the one thing that I agree with him on. But, have you ever seen his website [jakobnielsen.net]? Sure, it may be easy to use... if you can stay awake long enough to click on a link ;-) Actually, I'll admit that it looks much better than it used to. Once you get beyond that one point (which seems to me like it should be totally obvious), I can't think of anything I've ever read from him that wasn't ridiculous.
  • HCI labs are expensive items; it ain't cheap to get either multi-disciplinary personnel or more single-discipline people.

    The best in the business at the moment are HCIL Maryland, M$ Redmond (even if they never implement their research!), Xerox PARC, and Nokia's Research Lab in Finland (who ain't got a Nokia?).

    Others that I know more about personally are Prof. Stephen Brewster's group, the Glasgow Interactive Systems Group (http://www.dcs.gla.ac.uk/gist/), and my own group, the Interaction Design Centre (http://www.ul.ie/~idc). (Gotta mention it! :-)

    HCI is one area which still needs both more available research and more universal courses on the topic.

    Research is definitely needed in new technology, as it requires investigating both current and possible HCI methods and techniques. And just as with technology, social and personal interactions are not static either; these need to be further examined, as in CSCW (Computer-Supported Cooperative Work), an offspring of HCI.

    The requirement for more universal courses is obvious in that I've seen friends and students design UIs and winced at the end result. Until every programmer or software engineer is taught simple HCI principles, interfaces will continue to pain the user.

    One easy book to read on this subject is Jef Raskin's "The Humane Interface".

    The best place to see what the top research labs in HCI are is the current research literature, such as the ACM's (http://www.acm.org) CHI section; this really is the best place to find academic research on the topic. To find the best corporate research, just find a successful product that uses an interface and there you go!
  • When I read the short article a week or so ago, I remember wishing it contained more information. He has a bit about his criteria for the list, but I would have liked to see at least a short blurb about why each lab deserved their ranking. I know the Alertbox posts are not meant to be long involved discussions, but Nielsen's columns usually contain more analysis than this one.
  • I am a software engineer who spends 90% of my work time trying to find out what a couple of thousand people want to see in one system. I didn't have any formal training in school, and now I am paying for that. From a professor's point of view, computational problems are the more interesting and complex ones. This is because CS nowadays is taught from a userLESS perspective. Most projects/assignments are never going to be used by anyone in a real-world environment, so why bother designing a usable CHI for them? I agree with the author of the article. I am somewhat disappointed that CHI is not a big factor in the educational arena, while anyone who designs big systems knows that without a good CHI there is no product.
  • The IEEE just released a new publication called "IEEE Pervasive Computing: Mobile and Ubiquitous Systems [computer.org]". You can track down a dead-tree edition (got mine in the mail a couple of days ago) or read it online if you have a digital subscription.

    The first (paper) issue even includes a reprint of Mark Weiser's "The Computer for the 21st Century", Scientific American, 1991 article. A very interesting read, seeing how far things have and have not gone in ten years.
  • impressions (Score:4, Informative)

    by anothy ( 83176 ) on Friday April 12, 2002 @01:31PM (#3330384) Homepage
    Well, other people have already noted that he's too focused on human-workstation/server interaction (rather than broader human-computer interaction, which includes the range of computers people don't think about as computers, like microwaves and air traffic control systems). But let's look at it within that frame.
    Easy stuff first: today. I think it's laughable that he'd include Microsoft rather than Apple, particularly given the criteria he states. Microsoft is very much doing evolutionary progressions on their Win95 UI on the desktop, and very unimpressive stuff in the WebTV realm. Apple, on the other hand, took a much more dramatic jump in the Aqua development. Further, Apple does a much more thorough and complete job of UI definitions, work that MS has largely just ignored, leaving it up to the app designer.
    It's also quite interesting that Bell Labs didn't make it in the '80s. It was 1981 when Rob Pike wrote the first bitmap window system for Unix, and that decade when Bell Labs created the jerq, blit, and DMD series of multi-tasking graphical terminals -- pioneering work that led directly to much of what came after, particularly much of the Xerox PARC and Bellcore work following it.
    His "fall of the good" observation is distressing, and I agree with it, but not with his reasoning. Xerox and Bell Labs certainly hadn't "peaked" in any real sense by their respective appearances in the list (okay, Xerox maybe by its third).
    The article is less useful without notes on why a given place made the list. I certainly hope X wasn't a positive contributing factor for MIT, for example! To my knowledge, MIT did more interesting things in the '90s. And I confess total ignorance as to what PARC has done since 2000. I'd really like to know, but he doesn't say.
    I think the author's assertions about HCI research in universities are bogus. While research universities may have avoided "real-world" research in the past, today that's nearly reversed: many universities are indistinguishable from corporate R&D arms. In particular, given CS departments' increasing trend towards vo-tech training over broad educational foundations, this becomes more and more true. But this just changes the cause, not the problem. Now universities aren't likely to be involved in pioneering HCI research because they're doing much smaller, more incremental-improvement sorts of stuff.
  • One of the primary difficulties faced by HCI within industry is that the field is still ill defined and misunderstood by those who are practitioners of software development. A very common view is that HCI is the study of how to make software that is easy to use for the first-time, naïve user. For instance, one comment posted about this story states "...if musical instruments were designed like software, instead of violins and pianos, we'd probably only be getting those electronic children's books that play a melody when you touch different parts on the page. Kind of intuitive and easy, but not exactly very powerful or interesting."

    This same misperception that HCI is only about software for naïve users may also explain why it is so well embraced by the major players in enterprise web development and not in other areas such as application software. In the world of web development it is widely accepted that all users are naïve users. (This is partly why HCI practitioners such as Jakob Nielsen are able to be so prolific in the area of web software.) However, in application development, the common view is that the software is being developed for "expert users" and that catering to the needs of the naïve user through HCI will only dilute the program's capabilities needed by the "experts".

    This same attitude also leads software development teams to think that they can create user interfaces for naïve users simply by creating a lot of dialog boxes and wizards. (Yuck!)

    The fact is that the field of HCI is much broader than this common and simplistic understanding. While HCI does have something important to say about the way applications are designed for the naïve user, this aspect of usability is only one component of HCI. HCI also has a lot to add to the design of software systems to be used by "expert" users.

    People such as the ethnographer (who works to understand how the end user gets their work done) and the information architect (who designs user interfaces for information-rich software systems) are also working within the field of HCI. Their contributions are probably most useful when developing software systems that are not geared towards the naïve user, such as Photoshop or even an enterprise application. In these applications it is even more important that the software accommodate the user and fit within the user's normal workflow.

    I have put together a short paper giving information about the different roles that exist in the domain of user interface software and how these roles fit together to form a loose user interface software development process. It is available at http://www.bobowen.org [bobowen.org]. I also recommend that software development practitioners get and read About Face by Alan Cooper for a better understanding of how user interfaces can be designed without resorting to all these dialog boxes.

  • I really wish people wouldn't worship the ground Nielsen walks on. He SO does not deserve it. Just because he was one of the first to make some common-sense suggestions to help web sites download faster does not mean he is an expert in HCI. It just means that he was too cheap to get a modem faster than 9.6 kbps.

    I have been developing web pages commercially for 5 years. Frames do have a use, as do embedded images. The W3C is smarter than Nielsen. They have foresight and an understanding of how people like to present their content.

    Take a look at the source code of http://www.useit.com/: uppercase HTML tags, unquoted attributes within tags, single HTML tags such as img, br, and hr without closing forward slashes at the end (see the before/after sketch at the end of this post). He doesn't know what he is talking about. And worst of all, he uses Verdana, an ugly, unreadable font that is not as suitable as Arial, Helvetica, and sans-serif for viewing text on computer screens.

    One reason new technologies are created is to enhance the education and entertainment that can be provided by online content systems. If the content provided is dry and boring (e.g., www.useit.com), viewers are going to learn less and be less satisfied with their experience.

    Nielsen should take a reality check and leave the publication of usability papers to people who are experts, not those who just claim to be.
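
    To make the markup complaints concrete, here is a hypothetical before/after sketch of the kind of tags being criticized -- an illustration only, not actual useit.com source:

        <!-- Criticized style: uppercase tags, unquoted attribute values,
             void elements (img, br, hr) left unclosed -->
        <IMG SRC=photo.gif WIDTH=100 ALT=Photo>
        <BR>
        <HR>

        <!-- XHTML-style equivalent: lowercase tags, quoted attribute values,
             void elements self-closed with a trailing slash -->
        <img src="photo.gif" width="100" alt="Photo" />
        <br />
        <hr />

    Both render the same in practice; the second is simply valid XHTML.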
  • Hmm, is this a bit like a baseball 'World Series'? Surely not every 'Top Research Lab in Human-Computer Interaction' in the last 50 years is from the US....


  • The Vegas line on this one is PARC by 5. I mean, just look at the depth of this team. They were breaking new ground with graphical user interfaces, they had some serious talent, they weren't motivated by the constraints of the marketplace. The MS 2001 team had big bucks, but I'm just not sure they had the drive and motivation. You know, heart counts for a lot in these matchups.

    Seriously, though - I know that Nielsen is trying to stimulate discussion about the role of HCI labs and generate interest in the history of HCI. But ranking HCI labs over "history" just seems a bit silly to me.

  • Microsoft spends billions on human-computer research. I worked in speech recognition research there for a couple of years. They routinely do a survey of what the universities are doing, and share code from CMU and MIT. Microsoft usually has several projects researching the Next Big Thing, be it speech, natural language, vision, AI, or just new mouse designs. They do make some progress, but it is very slow.

    They are not getting their money's worth. Oddly, they don't expect to. It's pure research; some people say it's the only pure research in industry today. Possibly there is a good reason for the demise of the other pure research labs.

    For those of you who want to do research on some pie-in-the-sky concept after your PhD, Microsoft is a great place to be, as it pays well and gives a fairly long leash.

    MIT and CMU are both leaders in HCI. MIT is for bright team players, and functions pretty similarly to Microsoft... transitioning from MIT to Microsoft is pretty smooth. CMU is apparently for mad-scientist loners. This is where the really radical stuff gets done. Of course, you need a big brain for either :)
  • In his article Nielsen bemoans:
    It's striking that only two of the 12 research medals went to universities. I think this is because university departments seem to view the best HCI research as both too mundane and too resource intensive. Many academics disdain research topics that are closely connected to real-world needs.
    From my experience, this might be largely because academic efforts network more readily than corporate labs do. That experience might be closer to filling a book than a Slashdot post, so I'd better mention only where it all began.

    Back in the mid '80s, inspired by Nielsen Norman Group [nngroup.com] partner Bruce Tognazzini to explore the synthesis of graphical user interfaces and online information services, my then trade-press hat was enough to get me in to have a chat about user interface research with Professor Peter Poole [thoughtware.com.au], the then relatively new head of the Computer Science department at my alma mater, the University of Melbourne.

    At that interview Poole was dismissive of HCI as something best left to commercial interests, but before the end of the '80s, through his role as chairman of an IFIP Technical Committee, he and I finished up in the Napa Valley at an IFIP working conference on Engineering for Human-Computer Interaction [tls.cena.fr].

    During those years, I had opportunities to follow a few of the interconnected strands of inspiration variously categorised under Hypertext, Computer-Supported Cooperative Work, and the broader Computer Graphics communities, and share in the early work and inspiration coming from institutions in the form of Brown's Intermedia and MIT's Notes (pre-Lotus), and from independents like Ted Nelson and Doug Engelbart.

    Meanwhile, Prof Poole was making the University of Melbourne Australia's gateway to the Internet and creating a supportive campus-wide IT infrastructure that would allow a few early initiatives to be explored, especially educational multimedia. But as is so often the way of academia, the benefit became spread much wider than Melbourne through the natural progression of individual careers.
  • I'm surprised UIUC hasn't been mentioned for our present endeavors in HCI. There's a lot of money and work flying around here.

    The huge building known as the Beckman Institute houses AI and HCI research, with primary intermingling occurring among the CS and Psychology departments. Human-Computer Intelligence Interaction [uiuc.edu]
    ...and then there's my favorite baby project on campus, Active Spaces. Active Spaces [uiuc.edu] is just a part of the CS department, separate from Beckman, and is researching ways to gadgetize the new CS building, aka the Siebel Center [uiuc.edu] (currently under construction).
