Ask Slashdot: Unity/Gnome 3/Win8/iOS — Do We Really Hate All New GUIs? 1040

Brad1138 writes "You see complaints about the 'next gen' GUIs all over the place, but do we really all hate them? Personally, I don't like them — I tried very hard to like Unity in Ubuntu 11.04/11.10 before giving up and switching to Mint (I am very happy there currently). But is it the vocal minority doing all the complaining, or is it the majority? Are we just too set in our ways?"
  • by Nimloth ( 704789 ) on Tuesday November 08, 2011 @09:11PM (#37993706)
    What answer do you expect on Slashdot?
    • by Anonymous Coward on Tuesday November 08, 2011 @09:28PM (#37993878)

      What answer do you expect on Slashdot?

      The submitter is probably looking for answers from people who use Linux. There are a few of those on Slashdot so it's probably a good place to ask.

      Like the submitter, I have tried hard to like Unity but really can't do it. I can see how it might be a good idea for netbooks with small screens where you run all apps maximised, but that's not how I work. I have a big multi-screen setup at work and a modest-sized laptop screen at home. In both cases I like to work with multiple overlapping windows - which is not a mode where Unity shines.

      The shared menu bar at the top doesn't work for me - I would prefer it to be in the app window, close to where my mouse is already. I also dislike the fact that the menu options aren't visible until you move your mouse over the bar.

      I also prefer focus-follows-mouse - this just doesn't work with the shared menu bar as your focus changes while moving the pointer from app window to menu bar.

      The complete lack of support for applets is a real pain point for me too. I like having SSHMenu available, I like having the CPU and network monitors and I like having shortcut icons on the panel where I choose to put them.

      Things which I found easy to do with a single mouse click in GNOME 2 now require multiple key presses or, worse, a mix of mouse clicks and keypresses. I've been sticking with GNOME 2 in the meantime, and it frustrates me that that option is disappearing while the available alternatives seem worse than what I have now.
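
      (For reference, the focus-follows-mouse behaviour mentioned above can at least be set from a terminal in GNOME 2; a minimal sketch, assuming Metacity and gconftool-2 are present:)

          gconftool-2 --type string --set /apps/metacity/general/focus_mode mouse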

      • by wizkid ( 13692 ) on Tuesday November 08, 2011 @10:55PM (#37994828) Homepage

        Slashdot is always going to give you a tainted answer. Slashdotters hate everything. If you're curious about it, though, check out the downloads of Linux Mint. They've had record downloads and are zooming up on Ubuntu at a record clip, according to most web stats. I switched to Linux Mint instead of upgrading Ubuntu desktop; now it's on my laptop and workstation, and I'll probably try moving my company workstation to it when upgrade time comes along.

        I still like Ubuntu Server! But I hate Unity with a passion. Read what they plan on doing with GNOME 3! They're talking about putting GNOME 2 stuff back on top of GNOME Shell, making a hybrid GNOME desktop.

        The Linux desktop developers keep thinking that making the desktop simpler is the answer, and they're doing it by taking away options. Microsoft, meanwhile, is busy putting options back into its desktop. Someone needs to go beat the GNOME and Ubuntu developers with a clue stick.

        Damn, now I'm sounding like a Slashdotter. Rats!

      • by Idbar ( 1034346 ) on Tuesday November 08, 2011 @11:40PM (#37995176)
        No, he's not. That's just part of the question, but as you showed, the Slashdot reader is obviously biased. The headline clearly includes Win8 and iOS.

        I'm assuming the question is whether you want every OS to become the average user's OS, where you click big buttons, rather than the extreme power user's, where you preferably type most of your commands and tend to be more tech-savvy.

        And, as the GP asked, I wonder as well whether this is the proper target audience for such a question, particularly since I clearly hate limiting interfaces, yet many people have found that iOS, with all its limitations on the user, provides them with what they need.
      • The shared menu bar at the top doesn't work for me - I would prefer it to be in the app window, close to where my mouse is already. I also dislike the fact that the menu options aren't visible until you move your mouse over the bar.

        The problem isn't the global menu at the top; Mac OS has had that since 1987 when MultiFinder came out. The real problem is the second thing you stated: a mystery meat [webpagesthatsuck.com] global menu.

        • by Ami Ganguli ( 921 ) on Wednesday November 09, 2011 @07:27AM (#37997444) Homepage

          The global menu is the problem for a significant number of people.

          I could learn to live with just about every other Unity change, but the global menu drives me bonkers. When my visual focus is on an app window on my second display, I expect to be able to go to that window in order to work with the app. I don't expect to have to click on the window first, then move my mouse to the other display in order to access a menu that (oddly) isn't located anywhere near my application window.

          I've never understood why Apple sticks with this setup. It made sense with the original Mac, which had a tiny screen and didn't really support multi-tasking. It's a huge usability problem for modern desktops with multiple, large displays.

      • Comment removed (Score:5, Interesting)

        by account_deleted ( 4530225 ) on Wednesday November 09, 2011 @04:02AM (#37996452)
        Comment removed based on user account deletion
    • Re: (Score:3, Insightful)

      by Blakey Rat ( 99501 )

      The open-source, anybody-can-contribute-ideas, rapid-release development methodology would be the perfect way to prototype new UI ideas and designs - if the community weren't a bunch of whiny luddite complainers.

      • by im_thatoneguy ( 819432 ) on Wednesday November 09, 2011 @12:54AM (#37995622)

        Nope, I think you have it completely backwards.

        It's open source, so unless you're a developer, your opinion is going to be derided, disregarded, and dismissed.

        If non-programmers want input on their products then they need to pay developers to prototype their ideas. A developer's idea of a great UI and innovative interface is:

        $ convert label.gif +matte \
            \( +clone -shade 110x90 -normalize -negate +clone -compose Plus -composite \) \
            \( -clone 0 -shade 110x50 -normalize -channel BG -fx 0 +channel -matte \) \
            -delete 0 +swap -compose Multiply -composite button.gif

        "So efficient!"

  • by YodaToad ( 164273 ) on Tuesday November 08, 2011 @09:15PM (#37993736)

    The problem I have with all the new GUIs coming out is that it all seems like change for the sake of change.

    • Re: (Score:3, Funny)

      no, silly! It's change for the sake of requiring the purchase of new hardware to run the new GUIs!
      • by ArcherB ( 796902 ) on Tuesday November 08, 2011 @09:52PM (#37994196) Journal

        no, silly! It's change for the sake of requiring the purchase of new hardware to run the new GUIs!

        Close. The idea is to have the same interface on every device you buy.

        Personally, I think that sucks. See, I used to wish I could run Windows on my phone. Then, one day, I was able to run a VNC client on a phone... or an iPad, or something small, I don't remember, and connect to my desktop. Again, I don't remember if the desktop was GNOME, KDE, or Windows, but I've tried them all and the results were the same. They all sucked! They sucked big time. It was impossible to bring up or pull down any menus. There was no right-click, drag, or shift-click. Scrolling using scroll bars, or even accurate clicking, was next to impossible. Well, let's just say it sucked. The problem is that the desktop OS was not made to run on a four-inch screen with single-touch or even multi-touch as its interface.

        So now, rather than making a desktop GUI fit on a phone, they are trying to make the phone GUI fit on the desktop. The results are exactly what you would expect. Most of the right-click functionality is gone. In Unity or GNOME 3, if you right-click on the menu bar across the top, nothing happens. Gone is the right-click "Run As" dialog. Gone is right-click "add to bar". Basically, the right click has been removed from much of the GUI. Gone are the nested menus. Instead of "gnome-foot"/"start-button"/"K" -> System -> Whatever-You-Want-To-Run, you now have something like this:

        Move the mouse to the left side of the screen and wait. Did anything happen? No? Move to the top left. OK, how about now? Do you see the program you want to run? I'm not sure what the icon looks like. Just mouse over everything and wait over each icon; it should tell you what each app is. Is it there? No? OK, click on the top icon, click in the box, and start typing what you are looking for. Don't remember what it was called? It's like a CD ripper but it's not called that? Hmmmm. I don't know what you'd type in. I know on the old system you would go to Multimedia and look for it. Nowadays... well, I guess you are just fucked. See, the developers that made your GUI didn't think that ABCDE [lly.org] was important enough to include on the main tool bar, so you don't get to run it if you don't know what it's called.

        Oh, but if you were on a phone, this would look awesome!

        • by shadowfaxcrx ( 1736978 ) on Tuesday November 08, 2011 @10:32PM (#37994584)

          You hit the nail square on the head. I think what's happening is that the suits are running the companies now rather than the nerds, so we're seeing what happens with suits everywhere: they see something work once, and they try to force it to work everywhere. This is why one good product quickly finds itself surrounded by 50,000 lesser clones of itself.

          "Hey, Survivor was a big hit! Quick, make me a billion more "reality" shows!"

          MS and others are looking at the wildly successful smartphone and assuming that it's entirely the OS that makes it successful. They're right, of course, but for the wrong reasons. The OS makes these phones successful because it makes something so freaking small very usable. The miniaturization adaptations used by iPhone/Android/etc. are only slick on miniature things. Shove them onto a 25" widescreen that you don't want to be touching all the time, and the concept that was so loved on the phone is going to be hated on the big(ger) screen.

           

        • by Grishnakh ( 216268 ) on Tuesday November 08, 2011 @10:39PM (#37994650)

          Gone are the nested menus. Instead of "gnome-foot"/"start-button"/"K" -> System -> Whatever-You-Want-To-Run, you now have something like this:

          Everything you wrote is spot-on, except the part here about the "K". KDE is the one heavyweight Linux DE that hasn't drunk the one-UI-for-every-device kool-aid. The "K" menu is still there in the latest 4.7 release, and it isn't going anywhere. Furthermore, there are actually multiple UIs you can select in KDE; the default one for desktop PCs is "plasma-desktop" (you can see it running with "ps"). But if you're running a netbook, there's a different one optimized for netbooks, called "plasma-netbook". More are coming for other devices (namely tablets; I don't think we'll see KDE on any phones soon; the tablet one will be touch-oriented, as you'd expect). Unlike the other morons, the KDE guys have a totally different philosophy: they believe that different devices should have different UIs, though those UIs can share most of the same underlying libraries and other software services.
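
          (A quick sketch of checking and switching shells, for illustration; kquitapp and plasma-netbook ship with KDE 4, though not every distro installs the netbook shell by default:)

              ps -e | grep plasma        # plasma-desktop on a PC, plasma-netbook on a netbook
              kquitapp plasma-desktop    # stop the desktop shell cleanly
              plasma-netbook &           # try the netbook shell in the same session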

          Say what you will about KDE and their 4.0 screw-up, Nepomuk, etc., but in this area they have exactly the right idea.

          • by ArcherB ( 796902 ) on Tuesday November 08, 2011 @10:44PM (#37994712) Journal


            Spot on. I installed KDE on one of my Linux boxes and really liked it. I installed Trinity on another Linux box and may even like that more. We'll see how it stands up to more use.

        • by Anonymous Coward on Wednesday November 09, 2011 @12:01AM (#37995314)

          Close. The idea is to have the same interface on every device you buy.

          It's like when I went to the auto parts store for a new drive shaft.

          They gave me a propeller.

          Me: But sir, I don't need a propeller for my car.
          Salesman: Airplanes are newer than cars and they are awesome. They use propellers. You know you can propel a car with a propeller, right?
          Me: A drive shaft would give me more acute control than a propeller.
          Salesman: You'll thank me for making your car careen wildly with this propeller once you realize the cool factor.

        • by l3v1 ( 787564 )
          The idea is to have the same interface on every device you buy.

          The idea is to transition from the personal computer to the corporate computer, tailoring and shrinking the freedom we're used to on PCs. It might be far-fetched, and hopefully we'll always have Linux (:) Casablanca, well, well), yet it still seems this train has started, from wherever you look at it.

          Coming back to the ground: the Apple, MS, and GNOME-dev idea of consolidating small/big/touch/non-touch/etc. interfaces is probably the craziest idea
    • ^^^THIS^^^

    • Not necessarily. (Score:5, Insightful)

      by MrEricSir ( 398214 ) on Tuesday November 08, 2011 @09:22PM (#37993822) Homepage

      Not to defend any of the new-ish UIs, but the conventional UI model has always sucked. Every moment I spend moving a window around or resizing it is frankly wasted time. Same with launching programs or organizing my menus.

      If we can abandon the model where the user has to fiddle with a bunch of unnecessary crap just to use their computer, that would be a step forward.

      Thing is, I'm not sure any of the new UIs are quite there; they made radical changes but only minor usability improvements.

      • Re: (Score:3, Insightful)

        by telekon ( 185072 )

        The perfect UI for 90% of all use cases has existed for decades. I think In The Beginning Was The Command Line [cryptonomicon.com] should be required reading for all of those "Intro to Computer Literacy" classes they tend to require of college freshmen (or did about 6 years ago when I was still taking classes). I can see GUIs for Photoshop or Final Cut or whatever, but the vast majority of my computer usage is spent in bash/zsh and vim. And I'm not even describing my coding/sysadmin work, this is home use. As far as GUIs go, I

        • Comment removed (Score:5, Insightful)

          by account_deleted ( 4530225 ) on Tuesday November 08, 2011 @10:01PM (#37994280)
          Comment removed based on user account deletion
          • by lorenlal ( 164133 ) on Wednesday November 09, 2011 @12:09AM (#37995378)

            As someone who has supported users for years... On whatever F-ing interface they've had to use... 3270-Mainframe, Windows, AIX, Solaris (CLI or GUI), it all came down to one simple thing:

            How do I do my job?

            Frankly, it doesn't matter a damn what the interface is. For most business grunts, end users, whomever, it really doesn't matter what's in front of them, because they'll learn how to use it. If it works, and if it works *reliably*, then the end users end up loving it. I've heard the phrases "It's ugly as sin, but it works" and "It looks nice, but I can't use it" often enough.

        • by Oligonicella ( 659917 ) on Tuesday November 08, 2011 @10:26PM (#37994522)
          "but the vast majority of my computer usage"

          You do understand that you and I (IT guy that I was/am) are a *very* small percentage of the user base?

          "And I'm not even describing my coding/sysadmin work, this is home use."

          You're certainly not describing the home use of anywhere near a meaningful percentage of users either.

          In the business world I of course used nothing but shells and flat text editors. On my home system, it makes no sense at all to confine myself that way.
          • Re: (Score:3, Insightful)

            by ediron2 ( 246908 )

            You do understand that you and I (IT guy that I was/am) are a *very* small percentage of the user base?

            Heh, then why don't car companies design cars for little old ladies from Pasadena, and not for performance geeks? Hell, aside from backwater stretches in UT and MT, where in the US do most cars even need a speedometer that shows more than 80 MPH? Why do most office apps have deep functionality reserved for mail merge, legal forms, dynamic embedding, cross-functionality like functions in docs and database

        • by Blakey Rat ( 99501 ) on Tuesday November 08, 2011 @11:10PM (#37994940)

          The perfect UI for 90% of all use cases has existed for decades. I think In The Beginning Was The Command Line should be required reading for all of those "Intro to Computer Literacy" classes they tend to require of college freshmen (or did about 6 years ago when I was still taking classes).

          You're trolling us, right?

          Computers today (I dunno, maybe you're writing that Slashdot message from 1986) do things like organize photos, edit videos, surf the web, serve as media centers, and compose WYSIWYG documents -- all things the CLI is godawful at dealing with. That's maybe 99% of what people use their computers for.

          For the other 1%? Sure the CLI's fine. But who gives a fuck? We want an interface for the 99%.

  • The CLI has one big disadvantage - it is very difficult to figure out how to do something if you do not already know it (and do not or cannot google it). On top of that, different systems use different commands for the same thing.

        So, let's say I am an MS-DOS user who has installed Linux and wants to do something. OK, so I get the directory listing with "dir", though it does not display how much free space is left, and now I want to delete a file. I type "del somefile" and bash does not recognize the command. So, what other
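
        (For what it's worth, those DOS habits can be papered over with a few lines in ~/.bashrc; a minimal sketch - the aliases are illustrative, not anything standard:)

            alias dir='ls -l'    # directory listing; unlike DIR, it does not report free space
            alias del='rm -i'    # -i asks before each removal, a safer default for newcomers
            df -h .              # free space on the current filesystem, which DIR showed inline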

      • So use multiple monitors and maximize your windows - one per monitor. Problems solved and productivity jumps.

        And for the small stuff, like email, that doesn't require a horking big display, run it off a separate laptop or smart device.
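
        (That maximize-everything habit can even be scripted; a sketch using the wmctrl utility, assuming it is installed:)

            wmctrl -r :ACTIVE: -b add,maximized_vert,maximized_horz   # maximize the active window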

        • Re:Not necessarily. (Score:4, Informative)

          by arth1 ( 260657 ) on Tuesday November 08, 2011 @11:53PM (#37995268) Homepage Journal

          So use multiple monitors and maximize your windows - one per monitor. Problems solved and productivity jumps.

          Please explain to my boss that I need more than a dozen monitors to be more productive. Yes, I use that many windows, and often twice that, at the same time; and no, they cannot be tiled either. If I can't have overlapping windows in a sane way (i.e. no auto-raising the focused window), I lose productivity, because I really do need dozens of simultaneous windows without the risk of forgetting any of them.

      • Not to defend any of the new-ish UIs, but the conventional UI model has always sucked. Every moment I spend moving a window around or resizing it is frankly wasted time.

        True, but the problem with new UIs is that they attempt to solve the problems of existing solutions by pretending that the use cases that made those solutions exist are no longer relevant. E.g. manual window management is too tedious? Let's ditch windows and run everything full-screen!

      • Not to defend any of the new-ish UIs, but the conventional UI model has always sucked. Every moment I spend moving a window around or resizing it is frankly wasted time. Same with launching programs or organizing my menus.

        Sorry, I don't follow you at all here. Driving yourself to the grocery store is wasted time too, but most of us do it anyway because we can't afford chauffeurs, we don't have cars that drive themselves, and we can't afford butlers and personal chefs to go buy our food for us and prepare i

      • by CAIMLAS ( 41445 )

        The most usable UIs I've used, in this regard, are tiling window managers:

        * awesome [naquadah.org], by far the best
        * xmonad [xmonad.org], a marginal second
        * ion, because its author (as far as I can tell) started the idea now in vogue, and did a good implementation. That, and he's a vitriolic asshole who deserves honorable mention.

        They're usable almost solely with a keyboard, but a keyboard you do need. Throw a launcher on there, and it'd be the bee's knees for anything with a keyboard. I've used one of the above on 4" touchscreens up thro

      • by macraig ( 621737 )

        I share that frustration, and for a decade or more I've been using add-ons like Shove-It and HandyThing and others to try to automate some of that "window management". You'd think a so-called window manager would be doing that, huh? It's testimony to just how pervasive the tunnel vision is in the developer camps at Apple, Microsoft, and Canonical that they completely ignore the existence of such utilities and the fact that the only reason they exist is because they're solving problems that the not-so-aptl

    • by TheGratefulNet ( 143330 ) on Tuesday November 08, 2011 @09:29PM (#37993886)

      I started using windowing systems at DEC, with DECwindows. My first WM was twm, after trying and hating the Motif WM; this was in the late '80s, IIRC. After leaving DEC I moved over to Sun systems, grabbed twm, and pretty much stayed there for 15 years or so. Lately, I 'upgraded' to fvwm 1.4 as my window manager.

      Notice anything? There's no 'desktop', and I don't have any need for one. I'm quick to open a term window of some kind and do things in it; if a graphical app pops up, so be it: move its window, place it, and use it.

      Drag to trash? Really? People feel they need a desktop for that?

      Indicators work for me (new mail, battery, etc.). No need for GNOME or any proc-to-proc comms.

      I don't need my windows to 'shake' as I drag them across. Opaque move has kept me happy for 20 years, and it's all the 'decoration' one really needs.

      I guess I don't see the draw of a desktop once you have a very powerful CLI shell (term windows) at your disposal.

      My system is very fast, with GHz-class CPUs but NO 'desktop' pile of daemons and procs that sit around and talk to each other behind my back ;) fvwm really does all you need in a windowing environment.

      As long as I can disable their desktop stuff and simply start my own WM, I'm happy. Think of all the free RAM and CPU cycles I have, too.

      Unity? Oh please! As a famous politician once said, 'go fuck yourself!'.
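
      (For reference, that 'start my own wm' setup is only a couple of lines; a minimal ~/.xinitrc sketch for startx, with fvwm standing in for whichever WM you prefer:)

          #!/bin/sh
          # hand the X session to a bare window manager; no desktop daemons started
          exec fvwm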

      • As a laptop user, there are two problems that prevent me from using a stand-alone WM instead of a full DE:
        - Power management: switching between profiles when the power is plugged in or not.
        - Network connections: Wi-Fi and broadband. iwconfig/ifconfig/wpa_supplicant can serve me, but it's too much of a hassle. And the broadband connection is a real pain: the last time I tried to use wvdial, I could get a connection but suffered random system hangs. After that I stuck with GNOME 2, and I'm there still.
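
        (The manual route dismissed above looks roughly like this; a sketch run as root, with the SSID, passphrase, and interface name as placeholders:)

            wpa_passphrase 'HomeSSID' 'passphrase' >> /etc/wpa_supplicant.conf   # append a network block
            wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant.conf               # associate in the background
            dhclient wlan0                                                       # then pull a DHCP lease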

        Installing GNOME power man

    • Re: (Score:3, Informative)

      Agreed. And one excellent example is OS X Lion. I have to wonder why it didn't make the list.

      Lion represented a modest but decent upgrade in many respects. However, in their effort to bring the "benefits" of iOS to the desktop they rather dropped the ball.

      Their new "Full Screen Mode" is great, I suppose, on an iPad. On a desktop system with two monitors, like my workstation, it is worse than useless, because it fills one screen but blanks the other. I end up with less working space than I had when I be
  • Ask me before you make the changes. Don't make the changes and then say `try it... try and get used to it... this is better`.

    Unity is not better. It was fine before. There are other areas of Ubuntu which could have been improved first, and you should have made Unity an option, not the only choice.

    I'm now sort of happy with Xubuntu, but there's no point in pissing off loyal fans this way. It adds nothing but resentment and confusion.

    • by shish ( 588640 )

      Ask me before you make the changes.

      So that you can tell Henry Ford that you want a faster horse?

      • by ArcherB ( 796902 ) on Tuesday November 08, 2011 @10:04PM (#37994306) Journal

        Ask me before you make the changes.

        So that you can tell Henry Ford that you want a faster horse?

        Analogy fail.

        How's this:

        Henry Ford moved the steering mechanism to the floorboard, and all drivers must now steer with their feet. The Ford Motor Co. says that this is better because it frees up your hands to hold and read the newspaper while drinking your cocktail. It doesn't matter that I like the old steering wheel and work better with it. Ford has refused to include one in any new models, even if the driver requests it, because Henry Ford has done the research and determined that steering with the feet is better, end of story.
        Third-party companies are offering modifications to add a steering-wheel-like device to the car. XFCE Co. has created handlebars that fit over where the old steering wheel used to be. FVWM offers a set of vice grips that clamp onto the rod that used to hold the steering wheel. Other companies have varying solutions, but it is up to the driver to install and maintain whichever one they go with.

        Or, they can buy a Chevy.

        Guess which one I chose.

    • by Gerzel ( 240421 )

      The problem is that they did ask - but they asked new users of newer tablet-style devices.

      A lot of the problems I see arise because designers right now are trying to make one-size-fits-all interfaces that work for desktops, laptops, and tablets alike, and it just doesn't work that well.

  • by Anonymous Coward on Tuesday November 08, 2011 @09:15PM (#37993744)

    But yeah, I REALLY dislike the dumbing down of GUIs, hiding everything behind big buttons to make it "touch-screen friendly" and just not considering the power user. I was fine with the Netbook Editions of Linux distros (even though I never used any for more than testing), but this is ridiculous. We have more screen space and screen resolution than ever before, and now it's all nice boxes with rounded corners? Sheesh.

  • People also hated... (Score:5, Interesting)

    by CannonballHead ( 842625 ) on Tuesday November 08, 2011 @09:16PM (#37993748)

    ... KDE 4, Windows 7, Windows Vista... some people hate ALL GUIs.

    Me? I like Windows 7. I find it nicer and faster than XP's interface, actually. I also like GNOME better than KDE in general, though of KDE I preferred 3.x to 4.x. I have not tried GNOME 3/Unity yet, so I can't comment there.

    I sometimes wonder how long this debate has gone on. I'm guessing people hated Windows 95 compared to 3.1 (or the equivalent Mac OS version changes). People probably tried to show how a monitor was a disadvantage compared to the teletype; after all, with teletypes you had a permanent hard copy and didn't risk losing it! ... (I have no source for this, I'm just speculating ;) )

    I do think there are some things that don't make sense, though - such as touch-screen GUIs used on non-touch screens, or the other way around.

    • by epyT-R ( 613989 ) on Tuesday November 08, 2011 @09:43PM (#37994074)

      With Aero enabled the blitting is faster, but I wouldn't say it responds faster than XP; it's a bit slower. And I'm comparing both with all animations disabled.

    • by msobkow ( 48369 ) on Tuesday November 08, 2011 @10:06PM (#37994328) Homepage Journal

      People hate change.

      End of story.

    • by 0123456 ( 636235 ) on Wednesday November 09, 2011 @12:44AM (#37995582)

      I'm guessing people hated Windows 95 when compared to 3.1 (or equivalent Mac OS version changes).

      Everyone I knew thought Windows 95 was great; even those who used Suns had to admit that it was better in many respects.

      Because... drum roll please... IT WAS A BETTER UI THAN THOSE THAT PRECEDED IT.

      People don't hate change, they hate SHIT THAT'S WORSE THAN WHAT THEY'VE GOT ALREADY.

      Which part of this is so hard to understand? Why, instead of three clicks to start a program from a menu or two clicks to start it from a desktop icon, will life be better if I have to move the mouse to the corner of the screen, wait for some stupid animation to bring up a full-screen overlay, hunt down some random icon which I hope is the right program, or take my hand off the mouse to type in what I hope is the name of the program, and then probably wait for some more stupid animations while it starts up? What problem is this solving? Why is this supposed to be better?

      The Win95 interface was the best thing Microsoft ever did for computing, which is why pretty much everyone else has copied it. GUI design has mostly been downhill since then.

      • GUI design has mostly been downhill since then.

        Mostly agree about Win95 being pretty good.

        There was one big problem, though: the Start menu was organized by software maker, then by program. After you've installed a good number of programs, it gets hard to find your stuff.

        The one big improvement from GNOME/KDE/freedesktop.org was organization by category. Genius. You can give an Ubuntu 10.04 computer to a 5-year-old kid, and he can easily find and try all the games without bothering you. Your father (
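
        (That category organization comes from the freedesktop.org desktop-entry spec: each application declares its categories, and the menu is generated from them. A minimal sketch of a .desktop file, with a hypothetical app name:)

            [Desktop Entry]
            Type=Application
            Name=Example Ripper
            Exec=example-ripper
            Categories=AudioVideo;Audio;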

  • But is it the vocal minority doing all the complaining, or is it the majority?

    Brother, it's *always* the vocal minority doing all the complaining. The majority (aka 'the great unwashed masses') will generally take whatever is shoved down their throats.

    -x

  • by afabbro ( 33948 )
    ...because we're using desktops, not tablets.
  • by Anonymous Coward on Tuesday November 08, 2011 @09:17PM (#37993770)

    If there's one thing we should learn from these ordeals, it's that people claiming to be "UI designers" should be shunned. Every commercial and open source project needs to limit the involvement of these people. They can make icons, but that's where it should end.

    GNOME, Firefox, and Windows all had far more usable UIs when actual software developers were in charge of making the decisions. This isn't surprising, though. Software developers are mainly concerned with creating software that works, and that works well. "UI designers", on the other hand, are more interested in creating software that looks "pretty", even if it's damn impossible to use productively. Usability does not come from gradients and curved corners.

    • by kiwimate ( 458274 ) on Wednesday November 09, 2011 @12:03AM (#37995320) Journal

      Err... I don't think you have actually met any real UI designers, ever, in your life, or even read about what they do. Or else you're thinking of people who claim to be "UI designers" and confusing them with people who actually do HCI [wikipedia.org]. I suspect what you are actually talking about is a graphic designer; but that is very different from someone who designs user interfaces based on well-known HCI principles.

      It's about far more than making things look pretty (and actual software developers are NOT the experts in that field, either, by the way). It's about studying how to make things usable. I am not an HCI expert, but I work with one, and I know that when she starts a project she sits down with users, interviews them, spends time observing how they work, until she understands the processes they go through better than anyone. Then she works with the developers to implement something that's usable, that makes sense, based on scientific research principles about how people work.

      Software developers are not interface designers. That's not their job. It's a different discipline, and when it's done properly it's magic. Software designers might or might not understand the workflows and the business processes. (Usually not, in anything but the simplest possible businesses.) None of this is a criticism of developers; it's recognition that they are experts in their field, and these projects work best when you get other experts in other fields working side by side with your developers.

      It's actually kind of frightening you got modded up +5 insightful. You're saying the equivalent of claiming a server administrator is the best at development. He or she might be really good at writing scripts, but real enterprise level software development is not even on the same plane.

      • by rshol ( 746340 ) on Wednesday November 09, 2011 @11:19AM (#37999516)

        The first fallacy of HCI is that it starts with things like user surveys. Users always say they want contradictory things, like "make the same interface work well on a desktop and a phone". Anybody can see these are mutually exclusive, but users say that sort of thing all the time. Users can never tell you what they actually want/need until you give them what they ask for.

        The second fallacy here is that HCI is somehow scientific. HCI types try to sound scientific - there are statistics and measurements, and even so-called laws - but interface design is not scientific, because its acceptance is based on individual preference. It's sort of like saying "We have statistically analyzed popular music and produced the ultimate song based on users' requests and what they listened to before". These UIs are the UI equivalent of the Monkees or Milli Vanilli.

        Designing UIs based on telemetry, user studies, or Fitts's "Law" does not ensure a good UI; some common sense must be used as well. The new and improved Windows 8 interface, for example, does not permit multiple overlapping windows, and its browser does not run plugins. Those are considered features, not bugs. Statistics will not fix stupid.

      • It's about far more than making things look pretty (and actual software developers are NOT the experts in that field, either, by the way). It's about studying how to make things usable. I am not an HCI expert, but I work with one, and I know that when she starts a project she sits down with users, interviews them, spends time observing how they work, until she understands the processes they go through better than anyone. Then she works with the developers to implement something that's usable, that makes sense, based on scientific research principles about how people work.

        You hit the nail on the head. And you also demonstrated *why* the UIs for things like Unity sucked.

        They did some of that for the Unity UI.

        ...after they wrote it.

        And then, when the results came back indicating serious flaws, they shipped it anyway. (Study is here if you're curious.) [canonical.com]

        And now, one year later, lo and behold, people are bitching about the same things. But yeah, Shuttleworth, I'm sure it's just that we're stick-in-the-mud "power users", right?

        The entire Unity fiasco reeks of a group of self-pro

  • by Tyrannosaur ( 2485772 ) on Tuesday November 08, 2011 @09:17PM (#37993778)
    I want my things to be loaded as quickly as possible. I don't care about flashy desktop effects that make things slower.
    • by Twinbee ( 767046 )

      Don't forget about GUI latency in general - those barely noticeable sub-second delays that ruin an application through death by a thousand cuts. Visual Studio 10 springs to mind.

  • Not all of them (Score:2, Interesting)

    by Anonymous Coward

    I generally like the KDE 4 design, though they need to work on reducing CPU usage and latency. In my opinion, it's the only one that does it right, in that the tablet/netbook and desktop interfaces are cleanly separated and easily switched between. Programs do not need to be compiled for two different GUIs, and users can pick which interface to use without having to bother with the other.

    http://www.youtube.com/watch?v=OjO5X1ADUrE is an example.

    My main problem with "next gen" GUIs is that they are too force

  • Been using it for 10 years. Yes, they had issues with 4.0 and 4.1 and I stuck with 3.5 until 4.2 came out. But from 4.2 and on I'm liking it. It does everything I want it to and looks pretty too.

    Some don't like that the entire workspace is composed of widgets, but I think it's a great concept. I can customize my desktop to suit my style, and just about everything is adjustable/customizable.

  • by ThorGod ( 456163 ) on Tuesday November 08, 2011 @09:26PM (#37993852) Journal

    For many people, in my experience, hate passes the 'urge to talk about it' threshold more quickly than love does. Plus, if you're pissed, you want to be heard; if you're happy, who cares who's talking? (Side note: the more visible something is, the more attention any changes to it will get. "New Coke", for example.)

    I think that's what's going on with the latest GUIs. Change always has its detractors, and GUIs see *tons* of use.

  • Give me one that doesn't suck and I won't hate it.

    My current ire is directed toward Google for its new Gmail interface. What a joke.

  • by Dracos ( 107777 ) on Tuesday November 08, 2011 @09:27PM (#37993856)

    The /. crowd is generally more knowledgeable about computers and their interfaces. UI teams are dumbing down their interfaces to cater to the lowest common denominator of user. The simplification has reached a point where even median-level functionality is not just hidden but removed. The targeted users don't know any better (and likely never will), but we do.

    These new interfaces are just too simple for us.

    • Have you ever watched a lowest-common-denominator user try Office with the Ribbon for the first time? They hate change for the sake of change as much as we do.

  • Show me how these new UIs produce a benefit for the end user. That's it.

    I'm a bitter KDE4 user. From everything I can tell, they did it to make the code "neater" and for window candy.

  • I still fail to see why anyone but Grandma would want a UI that even Grandma could run.
  • Need for change... (Score:5, Insightful)

    by Junta ( 36770 ) on Tuesday November 08, 2011 @09:38PM (#37994010)

    Talk about an article just asking for rants. I'll chip in my rant...

    I think the challenge is that the UI paradigm preceding this generation is just too mature, and way too many UI developers have a hard time justifying their continued work. The MATE and Trinity projects forked out of an apparently strong desire to keep things as they are, with some confidence that they won't magically bit-rot away; but those projects are far from 'glamorous' and really don't have much of substance left to actually *do* - the job is pretty much already complete.

    Now a whole generation of UI designers are largely pretending that computers *didn't* catch on everywhere, and that some mythical large mass of people cannot cope with the UIs that all evidence suggests are working just fine. For a time they were sated with the genuine issue of UI design not scaling down to ~4" screens, but now they are seized with the silly notion that there must be *one* UI to rule all form factors. MS decides their Metro UI is the answer for phones/tablets/desktops (despite not even making sufficient headway in the handset arena to prove that out in the most likely case); nearly every review of the Metro UI in Windows 8 suggests a degree of awkwardness in the laptop and desktop case. Apple decides the iOS experience should dominate the OS X world (Apple is a bit of a special case: they can pretty much do *anything* and their loyal userbase will lap it up; it's more like a fashion brand, and they probably see minimal difference in business results between the times they truly deliver an enriching experience and the times they make missteps). GNOME 3 pisses away tons of screen real estate on oversized default titlebars to accommodate imprecise touch interaction regardless of context, whilst also hiding its 'dock' for fear of wasting real estate.

    A large part of this is what I think is a bad assumption: that tablets will just logically displace all laptops/desktops. The iPad has seen commercial success (for reasons I think are more fanboy-driven than a 'genuine' revolution), and now a ton of companies are wondering why they can't reproduce those results and get people off their laptops, and they assume something must be 'wrong', since tablets are *obviously* the way of the future.

    Anyway, if you want the UI paradigm to continue as it has been, throw your weight behind MATE (or see if MGSE successfully decrapifies GNOME 3) or Trinity. Elect not to upgrade from Windows 7 if you prefer that (though you are at the mercy of MS in that scenario, and you cannot force them to keep Windows 7 going). Alternatively, prove me wrong by embracing KDE 4, GNOME 3, Metro, and full-screen OS X apps as you get off my lawn.

  • by lkcl ( 517947 ) <lkcl@lkcl.net> on Tuesday November 08, 2011 @09:54PM (#37994204) Homepage

    I installed KDE 4 for a friend's friend. It took me 3 days to set up, because their ISP is very unreliable, at the extreme end of a broadband connection, and they get 15k/sec (not kidding).

    It all installed; I ran it, logged them in... and could I understand what the fuck was going on? Not a chance. It was incredibly embarrassing. I spent 15 minutes _failing_ to do something as simple as setting their background image. First we couldn't find it - I had to log in at the console and use "find . | grep {filename}". Then we couldn't find how to even _change_ the background image. On standard desktops, it's right-mouse, click "set background", done.

    They are now so angry with me over how I told them that Linux is great and that Windows would result in their bank account details being stolen (a virus destroyed the bootloader, which is why I was called in) that they are no longer speaking to me.

    Now - you tell me that it's a great idea that KDE spent an entire multi-million-euro EU grant merely copying the UI of the most vilified and failed version of Windows ever, known as "Vista", and then make yourself known to me some day face-to-face, and I'll punch your fucking lights out.

    GNOME - I've never installed GNOME, so I don't know about it. But personally I'm sticking with fvwm, and I'm going to install LXDE for people from now on. It's basic, it works, it's a known paradigm, and it's quick.

    Eventually I'll get round to finishing pyjdwm https://sourceforge.net/projects/pyjdwm/ [sourceforge.net] though, and the first version _will_ copy the "standard" paradigm: window, bar, cross, menu at bottom. Maybe :)

    • by wvmarle ( 1070040 ) on Tuesday November 08, 2011 @10:54PM (#37994816)

      It all installed; I ran it, logged them in... and could I understand what the fuck was going on? Not a chance. It was incredibly embarrassing. I spent 15 minutes _failing_ to do something as simple as setting their background image. First we couldn't find it - I had to log in at the console and use "find . | grep {filename}". Then we couldn't find how to even _change_ the background image. On standard desktops, it's right-mouse, click "set background", done.

      So, if I understand your story correctly, you're trying to give your friend a computing solution that you had never even looked at yourself? No wonder you ran into trouble.

  • by wierd_w ( 1375923 ) on Tuesday November 08, 2011 @09:56PM (#37994228)

    The way I look at this issue is that these UIs are being written not because there is some outstanding need for new features, but because the vendors that made them want to look like they are still innovative and agile.

    Sticking with the same tried-and-true UI, and simply optimizing every bit of code that makes it work to the point of polished perfection, is not what gets non-experts excited about purchases. What does is "the shiny!"

    These days, I could clearly see a need for a very efficient and simple UI system for cross-device remote purposes. The less the window manager has to do to present information, the better it would be for that purpose. However, that is not the direction the UI is being pushed in.

    Realistically, in terms of functionality, you could build a useful UI using blitting tech from 20 years ago and be just fine.

    Instead, we are using more processing and memory capability to run Solitaire than entire megacorps had in their computing labs in that period. (That DX10-certified GPU you have rendering Aero for you, so that Solitaire can present pixel-shaded 3D cards, can crank out more FLOPS than a Cray supercomputer from the 90s. Think about what that means when it is a requirement for playing Solitaire.)

    Clearly, the UI designers simply reject the KISS principle of engineering, and they do so because "we can, and resources are cheap."

    This is the biggest reason that I hate nearly all of the newer-generation UI flavors.

  • They Don't Work (Score:5, Insightful)

    by bky1701 ( 979071 ) on Tuesday November 08, 2011 @10:02PM (#37994288) Homepage
    The problem, to me, is not that the UI has changed. I'm generally OK with changes, even bad ones. I can deal with it.

    What becomes an issue is when all the GUIs out there seem to have showstopping bugs. KDE 4 is a great example. I haven't used it in about 6 months, because it was nearly too glitchy to use, and the constant graphical errors were starting to make my head hurt. I'm sure someone will tell me "KDE 4 works now!", but that's a lie and you know it. KDE 4 "worked" when I was forced back to Windows because I could barely use Firefox without having a seizure, or at least slamming my keyboard through my monitor. I didn't even use the first releases of KDE 4: they wouldn't run. I only went to 4 at all when programs began to require Qt 4.

    Yes, my ATI drivers had a hand in this, but that's part of the problem itself: why do all new GUIs demand glossy, sugar-coated rendering at the cost of my processing power? Why do they do so especially when they are aware of the driver issues their user base constantly faces? Most GUI projects only want to look "cool" and seem new, not actually provide a usable product. That is evident in the horrible (or even nonexistent) support for software rendering. For the record, even KDE 4's non-accelerated mode rendered incorrectly.

    I used to be the biggest proponent of Linux around, but it is really difficult to advocate something when its quality is dropping so quickly and you yourself are barely able to operate it. Linux-sphere developers don't care about the user anymore; they care about themselves and doing what they want. This is evident in how almost every Linux-oriented project is now run as a dictatorship. Do not question project leaders; they know best. It wasn't always that way, and it needs to go back. The reason we are seeing more forks of major projects than ever before is precisely that. "My way or the highway" invariably leads to forks.

    Meanwhile, Windows still seems to have no issues. I hate that I am using it, but I actually have things I need to do. I can't rely on a system that is built on so many flawed components and only gets worse with every release. It's time for Linux developers to pull their heads out of their asses and start working to make a usable product again, or others will start jumping ship, too.

    Another example of all this is Blender. Blender was a love-it-or-hate-it GUI. Eventually, if you forced yourself to use it, you would love it and no longer want to use anything else. Getting to that point was more brutal than anything, but it was arguably worth it. So what did the developers do in the most recent version? Completely change the UI. Every hotkey changed, the menu layout was completely flipped around, and in general all the things users had gotten used to are no longer as they were. The worst part is, it is still impossible to put it even close to how it was. I'm not convinced this change was in any way for the good: it's still as hard to learn as ever, and of course, now EVERYONE has to learn it again. Why was this done? Who knows; certainly not me. I frankly don't care, either, as I no longer use Blender, nor will I ever use it again. And, yet again, Maya and 3DS keep on.
  • by Animats ( 122034 ) on Tuesday November 08, 2011 @11:15PM (#37994966) Homepage

    Phone screens and tablets are output-mostly devices. Their primary function is content delivery, not content creation. Inherent in the touchscreen concept is that pointing, dragging, and viewing work well, but input is slow and difficult.

    Exporting the output-mostly metaphor to desktop machines is painful for people who do any significant input or content creation. But that's what seems to be happening. This reflects what the average user is now doing with a computer - watching TV. A third of Internet traffic is now Netflix.

    Incidentally, while the low end is struggling with point and drag UIs, the high end of 3D animation and engineering systems is finally getting that problem solved. 3D content creation systems have been painful for two decades. Finally, programs like Autodesk Inventor have managed to make 3D drawing and navigation fluid, without requiring vast numbers of hotkeys or multiple 2D views. You do, however, need something with a sharper point than a finger, like a mouse or tablet, to get work done in that space.

  • by s0litaire ( 1205168 ) on Tuesday November 08, 2011 @11:48PM (#37995230)

    ... is that these changes were imposed on the community over a very short time-scale.
    In the case of Unity, it was first introduced in 11.04 as the default (with GNOME fallback as an option), and then as the standard in 11.10 (GNOME fallback removed). There was a vocal group that had problems with Unity, felt it was not ready for prime time, and felt that Canonical was only rushing Unity out the door (so to speak) to keep up with GNOME 3's GNOME Shell - which has just as many haters as Unity!

    I think if they had slowed down the introduction - making it an option instead of the default in 11.04, then the default in 11.10, and then the standard in 12.04 with more public support - it would probably have let the community drift towards Unity once all the bugs were ironed out.

    Unity is bearable on my 15-inch laptop (though it gets tiring after a while), and I put up with it. My desktop is on Ubuntu 10.10, and that is probably the last version of Ubuntu it will see! I'm looking at Mint or another flavour that keeps a GNOME 2-style interface.

  • by FellowConspirator ( 882908 ) on Wednesday November 09, 2011 @12:30AM (#37995520)

    Major changes to a GUI are an expensive (time AND money) venture. They aren't made without reason, and if it were just a matter of changing the proverbial drapes, all you'd need to do is develop a simple theming system once and you'd be done. Changes are being made because of a perception that there's something wrong, or that people are changing the way they use their computers - and they're right. Think how much more you use a browser and a mail-contacts-calendaring uber-client now than you did 10 years ago.

    Chrome that provides feedback or contextual cues is good design. That's true for physical hardware, and it's true for software; people are naturally very visual. Changing layouts and interactions to handle different modes of input (touch and gestures as opposed to keyboard or mouse movement) is also very important.

    What's happening now is that developers of GUIs are awakening to the fact that the elements of the UI define the ergonomics of interaction. Just like in the physical world, you can't turn screws with a hammer or pound nails with a screwdriver (you can, but not effectively). GUIs are no different. To make a GUI an efficient means of operating a computer, you need to consider the means of input, the ratio of input to output, and the most frequent operations so that you can remove as much overhead as possible from the interaction. The use of appropriate cues, consistency in the UI, and references to well understood symbols or real-world objects are effectively symbolic documentation and can be very efficient.

  • by renegadesx ( 977007 ) on Wednesday November 09, 2011 @01:15AM (#37995728)
    I think it's more that the new GUIs are less functional, oversimplified to the point where they become unusable. Remember Linus' rant regarding GNOME 2? The one where he famously said "if you treat your users as idiots, only idiots will use it"?

    I think it's the same sort of issue: most people had more tolerance for fewer options than Linus did. He preferred KDE 3, which, while nice, was pretty bloated at the time. The new GUIs crossed the line for a lot more people.

    My issue with iOS is that things like multitasking and options were made a headache because Jobs wanted one button; the walled-garden approach also limits what you can do with the device.

    My issue with GNOME 3 is that it removed basics like minimising and maximising windows (something I do quite a bit). My issue with Windows 8 is that too: although I see its potential on tablets, using Metro on a PC is terrible, given that you have to mimic finger swipes with your mouse. It makes no sense.
  • by epine ( 68316 ) on Wednesday November 09, 2011 @01:33AM (#37995862)

    It's the classic trap of optimizing the solution domain instead of the problem domain. The center of the usability world is not the computer, but the human skills and limitations of the person interacting with the computer.

    A computer ought to be like a good bartender, who just knows on the first look that you aren't going to start with a parasol sticking out of a maraschino cherry. It ought to greet me with an offer to try out a unique Slivovitz or the vintage Calvados rather than offering me the pitcher of Coors Lite for one buck less than yesterday as if I care.

    There's a guy on Wikipedia with nearly 800,000 edits. Shouldn't the computer make certain assumptions about his work process rather than popping up an interface suited to his grandmother? If I sit down at a computer I've never used before and plug in my iPod, shouldn't it notice that I've never listened to a three-minute pop song since I bought the device, but I do have 16GB of hour-long lectures in the areas of technology, psychology, politics, history, and economics? Should my 30 years of keyboard experience not be taken into account? Or my 100-500 Google searches per day, 300 days per year, for most of the last decade?

    The bartender should just know that I need tabs and desktops, or failing that, some reasonable way to spread out.

    The ultimate human assistant is the one who knows exactly how much bandwidth you have available, and when and how to interrupt you with new information or a better approach.

    The interface I deserve is the one designed for F-35 fighter pilots, where they actually do give a shit about your cognitive limits and about making it possible to reach them. The start menu is just another deck chair on a biplane. I'm sick of interfacing with the computer. Wake me up when the computer interfaces with me.

  • by FyberOptic ( 813904 ) on Wednesday November 09, 2011 @01:44AM (#37995906)

    I don't hate new GUIs, particularly for mobile devices, where it's still a relatively new area and companies are still learning how best to do it. But for desktops, where work actually gets done, I just see no reason to take away something that's worked perfectly for years. Microsoft nailed it with the start button/task bar/system tray interface. We've used it for over 15 years now, and it's been cloned countless times for its sheer functionality. But for some reason, many Linux distros, particularly Ubuntu, think that cloning OS X is the way to go. You know, OS X, the operating system whose GUI has barely changed in 30 years aside from gaining a dock. A GUI which was designed to handle one application at a time due to hardware limitations. And a dock which, I might add, is one of the most uninformative task-management devices ever created. It's fine for grandma to see whether she has her email client open, but not for someone who wants to see how many browsers, directories, or terminals they have open, and where those windows are or what they're currently doing.

    Don't get me wrong, I'm cool with Microsoft trying something new, in an effort to bridge the desktop and the mobile device. But I want the ability to disable it on my desktop machine. Right now you can't without breaking shit. But this is Microsoft, and they're pretty well known for configurability and backwards compatibility, so I have a feeling nobody is going to be forced to use it on the final product.

  • by SmallFurryCreature ( 593017 ) on Wednesday November 09, 2011 @04:37AM (#37996634) Journal

    There are countless futuristic movies with some fantastic interface or sentient computer that makes ordinary tasks we never actually do seem so much more convenient. When have you really checked a detailed weather forecast before going out? I live in Holland; the weather will be grey and rainy with the wind blowing from all corners at once. Same with checking mail or arranging meetings. The sci-fi movie never happens. Or take the Star Trek computer. The interface the TNG crew uses seems so fluent, but have you noticed how what they do on the keyboard never has any relation to what is happening? That's because it ain't real. But how many touchscreen fanboys wanted a computer with a touchscreen keyboard because of it?

    Same thing with speech control: it sounds nice, but it needs to exist in a world where "help" is not a long Google session.

    The interface of tomorrow isn't happening because the tech of today just ain't there, and PART of that tech is our own body. My voice is very different in the morning. If I had to use a voice command to turn the lights on, it would remain very dark. Coffee first... but how do I get Mr. Coffee to recognize my groggy voice?

    The existing standard GUIs on the desktop are very much based on the idea that you have a surface on which you arrange windows containing applications or parts of an application. It ain't perfect, but it works well enough, since it means all each application developer has to do is present a rectangular box that either fits all screens (a dialog) or can be resized (see the sketch below). It is fairly easy... so easy, in fact, that on netbooks a LOT of windows and dialogs appear too far down and are cut off. They can't even get that right.
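
    To make that "all the app owes the window manager is a resizable rectangle" contract concrete, here is a minimal sketch in Python/Tkinter (the toolkit is chosen purely for illustration; nothing here comes from the comment itself). The application declares a rectangle and a minimum size; placement, stacking, and decoration are entirely the window manager's problem.

        # Minimal sketch: the whole contract between an app and the WM.
        import tkinter as tk

        root = tk.Tk()
        root.title("A resizable rectangle")
        root.minsize(300, 200)  # the one hint we give; the WM does the rest
        tk.Label(root, text="Everything else is the window manager's job").pack(
            expand=True, fill="both")
        root.mainloop()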

    But Unity suddenly wants to throw this away and present a smarter way of doing the same thing, but differently... and it doesn't quite work, and most of us have years if not decades of experience doing it the standard way.

    There may be room for a joystick-driven car, but if it crashes every time I sneeze, I am not going to unlearn my steering-wheel skills.

    GNOME and Unity are not just changes we do not want; their basic functionality was broken at launch. Both crashed, had zero customization, and removed widgets people had come to rely on. This would be like introducing a joystick-controlled car that crashes when you sneeze, with no windscreen, no passenger seats, no luggage space, and a range of half a mile. You can then bleat on about how good the joystick is; the hate for all the other stuff will kill your idea forever.

    GNOME 3 and Unity should have stayed research projects for at least another year and only launched for real once they had feature parity with the software they replaced.

    As for Metro... am I the only one having flashbacks to Active Desktop? I am typing this in a fullscreen browser; like my toes, I haven't seen my desktop in years. Somewhere out there there must be people who run one app at a time, who have just one tab open in Opera (mine are so small it takes total mastery of subpixel clicking to hit one), and who, when they are done, close everything so the desktop can reappear.

    It is not that we are stuck in the past with your basic window managers; it is that everything else has been tried AND deemed NOT to work. Try this one: reassure an Apple user that you don't think he's a mere fanboy, then ask him to speak honestly about the unified menu on a large-screen setup. Handy, no? Having to move your mouse for miles to get to the menu. (People who use OS X just for Photoshop and moved their menus onto their touch-pen thingy don't count; you bought an expensive gadget AND spent ages learning it precisely to get away from an on-screen menu that's out of easy reach.)

    Maybe, like so many other things we have just gotten used to, the standard desktop GUI just works. And if it isn't perfect, then at least it is better than the usual attempts to fix it through half-finished code implementing barely-thought-out ideas that only apply in a few cases.

    Ta

  • by ledow ( 319597 ) on Wednesday November 09, 2011 @05:47AM (#37996976) Homepage

    If getting used to a new interface costs me more time than the interface will save me over its lifetime, then it's pointless to switch. It doesn't matter who I am. Novice computer users may only run one or two programs, and those all from the desktop, but they will struggle if you change things. Advanced users might have 10 common programs and dozens of handy little utilities, and you can't put them all on the desktop. So the novice will acclimatise to a new interface at about the same pace as the advanced user. First rule of UI: don't piss off the established/advanced user, and don't cater only to them.
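
    That first sentence is just break-even arithmetic. A quick sketch in Python, with every figure invented purely for illustration:

        # Break-even for learning a new UI (all figures are assumptions).
        relearning_cost_s = 10 * 3600      # say 10 hours to get back up to speed
        saving_per_action_s = 0.5          # the new UI shaves half a second per action
        actions_per_day = 200

        days_to_break_even = relearning_cost_s / (saving_per_action_s * actions_per_day)
        print(days_to_break_even)          # 360.0 -- nearly a year before it pays off

    If the next redesign ships before you break even, the switch never pays for itself.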

    Similarly, if the interface costs me more in CPU, loading times, hunt-for-the-program time, etc., then it's pointless. It's like defragging a modern laptop hard drive: the time I save in fewer seeks is VASTLY outweighed by the time it takes the damn thing to defrag. I really don't need or want fancy Aero-accelerated sidebars and clocks, thanks. No, honestly. No matter how cool they are, they will be among the first things I switch off.

    Although there are obviously reasons for GUIs aimed at other uses, every machine I have is set up to do what I want as quickly as possible, with no messing about. Fancy graphics are disabled. Stupid menu items are removed (Help on the Start Menu in XP? Just how often did ANYONE ever use that?). Timeouts for UI elements are set to their lowest (e.g. Start Menu flyouts in XP; see the sketch below). Desktop elements that are unnecessary are removed: everything from screensavers to backgrounds to sounds to anything that tries to throw crap onto my taskbar.
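
    For the curious, that XP flyout timeout is the well-known MenuShowDelay registry value (a REG_SZ holding milliseconds, default "400") under HKEY_CURRENT_USER\Control Panel\Desktop. A minimal Windows-only sketch of the tweak, using Python's standard winreg module:

        # Drop the Start Menu fly-out delay to zero (log off/on to apply).
        import winreg

        with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                            r"Control Panel\Desktop",
                            0, winreg.KEY_SET_VALUE) as key:
            # "0" = no delay before menus fly out; restore "400" to undo.
            winreg.SetValueEx(key, "MenuShowDelay", 0, winreg.REG_SZ, "0")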

    "Intelligent" menus that adjust to my usage are disabled (*I* can't predict what menu items I will need next, or most, so I'm *certain* that it can't either). Shortcut keys are used infinitely more than browsing through a menu for the right option (so even changing a keyboard shortcut to something new messes me up for almost all future versions of that UI - I still have to edit Opera's config so that Ctrl-N gives me a new tab and not a new window and it's been like that for about 5 versions now). Take note designers - no keyboard navigation from day one means I won't use it. If your desktop is too context-driven, keyboard navigation is impossible, nonsensical or too confusing.

    Menu bars are flat colour. Window icons are simple and clean. Hell, give me a modern equivalent of the Windows 3.1 desktop (and by modern I mean in what it can do, not what it looks like - I'm always scared of "Modern" themes and tend to stay on "Classic" themes for as long as I use a machine) and I'll be more productive. "The desktop is a customisable programs window with subwindows" was always such a wonderful idea compared to "The desktop is a random dumping ground for whatever junk you or your programs want".

    I *will* happily spend some time customising a UI if you give me the option, and most of those customisations will be to turn crap off. I don't want two clicks to get to a particular open Office window (stupid taskbar "grouping" costs clicks and stops me finding the right file, so I just Alt-Tab instead, or turn it off). Is the Windows key or Ctrl-Esc REALLY the only way to open the Start menu from the keyboard? How long would it take to let the user customise that? Similarly, why isn't the Windows key the default for opening menus in most Linuxes, and why can't I even customise it to BE that key if I want?

    What's quicker? Going into the Start menu with the mouse and waiting for menus to fly out, then scrolling down and searching for the program I want, or just pressing the same keystrokes every time, without having to explicitly assign a shortcut to every program? (Hint: Start, P, I, O runs Opera for me unless I install another program starting with O into my Internet folder on my Start Menu - and YES, that categorisation is invaluable.) From a clean desktop, I can start a handful of my programs quicker than they can load, and sometimes quicker than the Start menu itself can appear.

  • by e**(i pi)-1 ( 462311 ) on Wednesday November 09, 2011 @10:38AM (#37999020) Homepage Journal
    What's nice in Linux is that it's possible to work with different window managers. I use a minimal window manager myself (Blackbox). Sometimes I switch to KDE, sometimes to GNOME, or explore Unity. Installing a new window manager is an apt-get away. I sometimes wish this were possible in OS X.
  • I had to deal with GNOME 3 when I had to "upgrade" one of my users' systems to FC 15, and I *loathe* it: screens of icons that vanish unless you roll over them, transparency - it's all eye candy for the sake of eye candy. It also goes vehemently against the *nix and F/OSS idea that you do things the way *you* want to, not the way M$ (or whoever) wants you to.

    The concept that "screen real estate is valuable" seems to have passed them by. I'll put up with everything being fullscreened on my netbook, *NOT* on anything else.

    And, of course, the idea that you might want to use your processing power for, um, doing things, or work, rather than spending so many cycles doing *nothing* other than running eye candy, also passed them by.

                      mark, who runs all 600k IceWM at home
