

Ask Slashdot: Are Linux Desktop Users More Pragmatic Now Or Is It Inertia? 503
David W. White writes "Years ago those of us who used any *nix desktop ('every morning when you wake up, the house is a little different') were seen as willing to embrace change and spend hours tinkering and configuring until we got new desktop versions to work the way we wanted, while there was an opposite perception of desktop users over in the Mac world ('it just works') and the Windows world ('it's a familiar interface'). However, a recent article in Datamation concludes that 'for better or worse, [Linux desktop users] know what they want — a classic desktop — and the figures consistently show that is what they are choosing in far greater numbers than GNOME, KDE, or any other single graphical interface.' Has the profile of the Linux desktop user changed to a more pragmatic one? Or is it just the psychology of user inertia at work, when one considers the revolt against changes in the KDE, GNOME, Unity, and Windows 8 interfaces in recent times?"
Classic Desktop (Score:5, Insightful)
What is a "Classic Desktop" and in what way are the other GUIs being discussed not "Classic Desktops"?
Re:Classic Desktop (Score:4, Informative)
I RTFA and didn't quite find the answer to your question.
I think it means users are conservative. Other than KDE users, most people are using something that undoes GNOME's "upgrades".
Me, I use KDE. But that doesn't mean I don't use several GNOME apps. Disk space and RAM are cheap enough nowadays that you don't have to choose one or the other.
Re:Classic Desktop (Score:5, Insightful)
I RTFA and didn't quite find the answer to your question.
I think it means users are conservative.
Exactly. Once you have a setup that works for you, you don't change it. There are enough other things to tinker with anyway. (New kernels, interesting applications, even games.) Then you get older. I have used icewm for 15 years, why should I change? Many things have gotten better over time - even LaTeX has improved. But "desktops" haven't. The alternatives are just different. Not better. Not more efficient. Cooler perhaps, but I won't bother re-learning anything just for cool.
They waste so much time developing GUI stuff, when positioning is all I use the window manager for. Work is done on the command line, or in a few graphical applications. The window manager is just for positioning stuff. Not for effects, not for configuring "look and feel" or anything else. Configuration is in /etc/ where it belongs - and is accessed exclusively via command line.
Re: (Score:3)
you do know... (Score:3)
...that pressing the Super key (aka the Windows key) and typing is not an innovation exclusive to Windows 7, don't you?
IIRC Win8 retains that ability, though I don't use that OS. My regular desktop is GNOME 3 and it works just like that too.
The thing with Win8 and GNOME3 is that there is so much angst over what amounts to the introduction of a full screen launcher to replace a stale but familiar cascading drop down menu launcher. In both cases once you launch the same old apps all that crap is out of sight.
Of the
Re:you do know... (Score:5, Insightful)
The thing with Win8 and GNOME3 is that there is so much angst over what amounts to the introduction of a full screen launcher to replace a stale but familiar cascading drop down menu launcher. In both cases once you launch the same old apps all that crap is out of sight.
If the change is really so insignificant... why the hell would you change it?
Oh, because it's 'stale', and God forbid, we can't have anything 'stale' when we could have NEW and SHINY.
Re: (Score:3, Insightful)
Because everyone is going crazy on the idea of using the exact same interface for desktops and mobile phones. Even though it simply doesn't make sense. They are different devices, with different physical interfaces and different usage styles.
I mean, it makes very much sense to use the same underlying technology. But one user interface to rule them all does not work well.
Re:you do know... (Score:4, Insightful)
If the change is really so insignificant... why the hell would you change it?
Oh, because it's 'stale', and God forbid, we can't have anything 'stale' when we could have NEW and SHINY.
Great question.
People are always telling objectors that the changes are both insignificant, and also so absolutely essential that they just need to get with the program. Doesn't make a whole lot of logical sense.
Re: (Score:3)
After a lot of tuning my Ubuntu desktop is down to this: a bottom bar with the names of the open windows; the Applications and Places menus; a very seldom used icon to minimize all windows; the Netspeed and the System Monitors applets; the generic applet that collects application icons for Skype, Shutter, Dropbox, keyboard layout switch, network manager, logout and the HH:MM clock.
I use ALT-F2 to run Firefox, Thunderbird and Emacs in the rare cases I have to close them. Basically they boot
Re:Classic Desktop (Score:5, Insightful)
Re:Classic Desktop (Score:4, Informative)
Yeah, KDE is a freaking classic desktop. At least as long as you don't switch to the tablet look of it.
GNOME too has always tried its best to show a familiar look, until 3/Shell.
Re: (Score:3)
Why would you want to do that?
Is there a new rule that desktops have to look the same as tablets now? Why wasn't I consulted?
Re: (Score:2)
One reason for doing so would be that you're running KDE on a tablet.
Another one could be that you're running it with a touch monitor, either at home or, say, at a presentation kiosk somewhere.
A third alternative: because you like the clean look of it.
A fourth could be that you decided to develop it because it could be done using Plasma and you're checking it out.
A fifth: that you, by accident or curiosity, clicked the Activities widget and picked it.
And so on.
Lots of reasons. I'm totally fine with such an option existing
Re: (Score:3)
Re: (Score:3)
Yup - this is why I run KDE. It is about as clean as xfce interface-wise, but it has the searchable launcher that most of us like, and it is extremely tweakable with applets/widgets/etc. You can basically stick anything anywhere (a desktop in your task bar, a window pager on your desktop, etc).
I keep it fairly classic, but I appreciate the fact that in any of the native apps I can just use a fish:// URL to browse files on remote ssh servers, automounting works, and all that. I still tend to use the comma
Not-yet-fully-functional GUI shouldn't be default (Score:3)
All I can figure is that the author either believes that it's not classic if we can customize a GUI to the point that it's no longer "classic" looking
In my opinion, a desktop that can be easily customized to act "classic" is classic enough, so long as users are made aware of this customizability. But there's a practical problem with presenting too many options for customizability. See the section "The Question of Preferences" in this article [ometer.com].
or is judging it based on the first few releases when it wasn't fully functional as a 'classic' desktop yet
A not-yet-fully-functional GUI shouldn't be shipped as the default GUI of a GUI-oriented operating system until such time as it becomes fully functional.
Re:Classic Desktop (Score:4, Insightful)
Posting this from Ubuntu 10.04 LTS, and I consider pre-Unity as a "classic desktop," and it is Gnome.
Seriously, I have nothing against change, but I think there should be a cross-distro standard desktop that JUST FREAKING STAYS THE SAME. There should also be bleeding-edge environments for more adventurous people. Why shouldn't people have a choice? But it would be nice to install most any popular version of Linux and get a standard desktop.
I think you're thinking too hard and the author is (Score:5, Insightful)
using too many words. He means that users of personal computers (as opposed to mobile devices) want simply a "desktop."
As in, the metaphor—the one that has driven PC UI/UX for decades now.
The metaphor behind the desktop UI/UX was that a "real desktop" had:
- A single surface of limited space
- Onto which one could place, or remove files
- And folders
- And rearrange them at will in ways that served as memory and reasoning aides
- With the option to discard them (throw them in the trash) once they were no longer needed on the single, bounded surface
Both of the "traditional breaking" releases from KDE and GNOME did violence to this metaphor; a screen no longer behaved—at least in symbolic ways—like the surface of a desk. The mental shortcuts that could draw conclusions about properties, affordances, and behavior based on a juxtaposition with real-world objects broke down.
Instead of "this is meant to be a desktop, so it's a limited, rectangular space on which I can put, stack, and arrange my stuff and where much of my workday will 'happen'" gave way to "this is obviously a work area of some kind, but it doesn't behave in ways that metaphorically echo a desk—but I don't have any basis on which to make suppositions about how it *does* behave, or what affordances/capabilities or constraints it offers, what sorts of 'objects' populate it, what their properties are,' and so on.
I think that's the biggest problem—the desktop metaphor was done away with, but no alternative metaphor took its place—no obvious mental shortcuts were on offer to imply how things worked enough to allow users to infer the rest. People have argued that the problem was that the new releases were too "phone like," but that's actually not true. The original iPhone, radical though it was, operated on a clear metaphor aided by its physical size and shape: that of a phone—buttons laid out in a grid, a single-task/single-thread use model, and very abbreviated, single-option tasks/threads (i.e. 'apps' that performed a single function, rather than 'software' with many menus and options for UX flow).
Though the iPhone on its surface was a radical anti-phone, in practice, the use experience was very much like a phone: power on, address grid of buttons, perform single task with relatively low flow-open-endedness, power off and set down when complete. KDE4/GNOME3 did not behave this way. They retained the open-endedness, large screen area, feature-heavy, and "dwelling" properties of desktops (it is a space where you spend time, not an object used to perform a single task and then 'end' that task) so the phone metaphor does not apply. But they also removed most of the considered representations, enablements, and constraints that could easily be metaphorically associated with a desktop.
The result was that you constantly had to look stuff up—even if you were an experienced computer user. They reintroduced *precisely* the problem that the desktop metaphor had solved decades earlier—the reason, in fact, that it was created in the first place. It was dumb.
That's what he means by "classic desktop." "Linux users want a desktop, not something else that remains largely unspecified or that must instead be enumerated for users on a feature-by-feature basis with no particular organizing cultural model."
Re:I think you're thinking too hard and the author (Score:5, Insightful)
After 20 years of experimentation, the conclusion is that the desktop metaphor is probably too complex for the average user. Power users appreciate floating windows, file hierarchies, multiple screens, notification bars, hierarchical menus, etc. Meanwhile the more typical user maximizes one window at a time, clicks icons, and saves everything in the same place. The "phone/tablet" model is much closer to the average person's mental map of how a computer should work.
The problem is that Linux users are 'power users' almost by definition so KDE/Gnome were terrible places to experiment with replacing the desktop metaphor.
Re:I think you're thinking too hard and the author (Score:4, Insightful)
I wish I could disagree, but I help so many users who run one program full screen. I just sit back and shake my head as they constantly switch from one program to another instead of arranging the program windows to see everything they need at one time.
It really starts to piss me off when they have two monitors and switch between two programs, both on the main screen, both full screen. Then they wonder why it takes so long to get things done.
Spoken like an arrogant developer. (Score:3)
Do they continue to be gainfully employed as a digger, yet still dig with their bare hands?
What do they and their boss know about their productivity and job requirements that you don't?
What are they digging for? Is it likely to be damaged by a spade? Are they relying on the tactile sensation in their hands as they dig to make critical digging decisions of some kind? What is the cost of spades? What is the urgency of this dig? Is the limited supply of spades reserved for cases in which rapid digs are needed,
Re:I think you're thinking too hard and the author (Score:5, Insightful)
Except that the desktop cannot work using the phone/tablet model because user expectations do not suggest that metaphor when they sit at a desktop.
Even if the desktop metaphor was too complex to master, users still sit down at a desktop and think, "now where are my files?" because they intend to "do work in general" (have an array of their current projects and workflows available to them) rather than "complete a single task."
As was the case with a desk, they expect to be able to construct a cognitive overview of their "current work" at a computer—an expectation that they don't have with a phone, which is precisely experienced as an *interruption to* their "current work." KDE, GNOME, and most recently Windows 8 made the mistake of trying to get users to adopt the "interruption of work" mental map *as* the flow of work. It's never going to happen; they need to be presented with a system that enables them to be "at work." In practice, being "at work" is not about a single task, but about having open access to a series of resources that the user can employ to *reason* about the relatedness and next steps across a *variety* of ongoing tasks. That's the experience of work for most workers in the industrialized world today.
If you place them in a single-task flow for "regular work" they're going to be lost, because they don't know what the task is that they ought to be working on without being able to survey the entirety of "what is going on" in their work life—say, by looking at what's collected on their desktop, what windows are currently open, how they're all positioned relative to one another, and what's visible in each window. À la Lucy Suchman (see her classic UX work "Plans and Situated Actions"), users do not have well-specified "plans" for use (i.e. step 1, step 2, step 3, task 1, task 2, task 3) but are constantly engaged in trying to "decide what to do next" in-context, in relation to the totality of their projects, obligations, current situation, etc. Successful computing systems will provide resources to assist in deciding, on a moment-by-moment basis, "what to do next," and resources to assist in the construction of a decision-making strategy or set of habits surrounding this task.
The phone metaphor (or any single-task flow) works only once the user *has already decided* what to do next, and is useful only for carrying out *that task*. Once the task is complete, the user is back to having to decide "what to do next."
The KDE and GNOME experiments (at least early on) hid precisely the details necessary to make this decision easy, and to make the decision feel rational, rather than arbitrary. An alternate metaphor was needed, one to tell users how to "see what is going on, overall" in their computing workday. The desktop did this and offered a metaphor for how to use it (survey the visual field, which is ordered conceptually by me as a series of objects). Not only did the KDE and GNOME not offer a metaphor for how to use this "see what is going on" functionality, they didn't even offer the functionality—just a series of task flows.
This left users in the situation of having *lost* the primary mechanism by which they'd come to decide "what to do next" in work life for two decades. "Before, I looked at my desktop to figure out what to do next and what I'm working on. Now that functionality is gone—what should I do next?" It was the return of the post-it note and the Moleskine notebook sitting next to the computer, from the VisiCalc-on-green-screen days. It was a UX joke, frankly.
The problem is that human beings are culture and habit machines; making something possible in UX is not the same thing as making something usable, largely because users come with baggage of exactly this kind.
You're quite wrong, and it's not "theorizing," (Score:4, Interesting)
there are 30 years of detailed field research on this. Again, see Suchman's "Plans and Situated Actions," Dourish's "Where the Action Is," etc., or visit the ACM digital library and look at usability research (i.e. involving observation of real people in real settings) in CSCW, HCI, etc.
You have one basic fact wrong: they *do* have to think about what it's "time" to do.
Users in computer-at-desk contexts do not have a detailed roadmap for what to do on a click-by-click basis, either from their boss or inside their heads. They have a general set of goals for, say, the quarter ("Get this project launched"), perhaps the week ("Make sure everyone is on-task and progress is being made; keep the CTO apprised of any roadblocks"), and the day ("Put together charts and graphs for Wednesday's meeting to detail progress").
But it is *these* tasks that are "theoretical" quantities. They translate into dozens and dozens of clicks, mouse movements, UI interactions, and so on, many of them interdependent (or, in Suchman/Dourish terms, indexical—that is to say, order-important and constitutive of an evolving informational and UI flow context).
The user may have "Tell bob about tomorrow's meeting" already decided, but they are imagining Bob and imagining Bob *at* the meeting. From there, activity is practical and adaptive. They emphatically do *not* have this in their heads:
- Take mouse in right hand ...
- Flick mouse to lower-left to establish known position
- Move mouse 5 inches toward right, 0.5 inches toward top of desk to precise location of email icon
- Click email icon
- Wait 0.4 seconds for email window to appear
- Move mouse 7.2 inches toward top of desk, 2 inches toward left to precise location of To: field
- Click to focus on field
- Type "Bob"
- Wait 0.1 seconds for drop-down with completions to appear
- Hit down arrow three times to select correct Bob
- Press enter
You laugh, but in fact this is precisely what you're suggesting: that users have a roadmap already. They don't. That's why we invented the GUI—to provide a visual catalogue of available computing resources and an indication of how to access them on an as-needed basis. Then, the user has to decide, in the moment, what was needed. Every single attempt to make things more "simple" or more "efficient" by presenting *only* that one thing that designers imagined to be needed at a given time—the "obvious" next step—has led to users that either feel the system is useless, that fight it to get it to do what they want, or that simply go around the system (I'll just do this task offline, on a pad of paper). You can make very telling changes to users' productive workflows and levels of productivity by changing orderings or locations of icons, etc. Marketers also know this very well on the web (google "page hotspots" to see the research about positioning of advertising and how deeply it affects CPC and other factors in online marketing).
At a less granular level, something like "Get this project launched" is also not available in a detailed roadmap to a user. Go ahead, ask them to elaborate on the precise set of tasks involved in their big quarterly responsibility. They'll come up with 20, 30, maybe even 80 split into four or five sub-areas. But getting the project launched for an average middle manager over the course of a quarter involves tens of thousands or even hundreds of thousands of discrete actions, gestures, etc., some computing-based, some not, with the computing-based ones split across dozens of applications and contexts.
It cannot be mapped out because it is contingently assembled—it has to be done on an as-we-go-basis. So the tasks in the "to do list" (and, in fact, in cognitive behavior) are theorized ("Create a new instance of the platform on test VPN, set up credentials for team") rather than existing as a detailed, moment-by-moment list of actions. This is why user docs people actually have to sit down and use the system, and int
Re: (Score:2)
What is a "Classic Desktop" and in what way are the other GUIs being discussed not "Classic Desktops"?
Tail fins & Chrome.
Well, scratch Chrome.
Re: (Score:2)
Classic desktop means Amiga-style desktop, where the initial icons are which disks are in the drive, and double-clicking them opens a window containing more icons.
That came from the Mac, not the Amiga.
Re: (Score:3)
That came from Xerox PARC, not the Amiga.
FTFY
Re: (Score:2)
TFA doesn't say... my guess: "something that has roughly the same interface metaphors as win9x".
Yup, or System 7 / OS 8 / OS 9. These interfaces did rule the personal computing world back when the personal computer was an easily definable device that sat on your desk, so I think it makes sense to call them classic if we're talking about personal computing.
No, UI designers went crazy. (Score:5, Insightful)
Re: (Score:2)
Microsoft was talking about the shift away from desktops towards tablets in 1999. What happened in 1999-2008 was that sales were still solid and no one wanted to endanger the core product by making the radical shifts needed for a dual-purpose system. You can agree or disagree with Microsoft, but let's not pretend that tablets were not something Bill Gates was focused on heavily as the next step of the GUI, from pretty much the time the Windows 95 GUI got the kinks out.
Re: (Score:3, Interesting)
Re: (Score:2, Interesting)
They didn't want a Windows 8 disaster, so they made the UI a desktop one, complete with a non-touch-friendly Start button.
The Office team sabotaged it too, by making sure the fonts were not LCD-friendly for freaking 7 years. They didn't like the tablet.
The infighting at MS was INSANE during Ballmer's tenure. Now it is starting to change, but out of necessity, as the fruity company they laughed at and left for dead is more powerful.
If I had written that last sentence in 1999 I would have been laughed at and modded down to -1.
Re: (Score:3)
Why is "dual purpose system" a good thing?
It is no sin to make a different GUI for a device with a 7" screen that is controlled by a touchscreen and runs on a few watts, a device with a 21" screen that is controlled by a keyboard and mouse and runs on a hundred watts, and a device with a 4" touchscreen whose power draw is measured in milliwatts.
Microsoft's failure was to think that we wanted a consistent user experience for all these things.
Re: (Score:3)
Re: (Score:2)
Re: (Score:3)
It took a little getting used to at first, but after a week of using it daily, I found it quite easy to work with.
Android is easy to work with the first day, what's your point?
Re: (Score:3)
Re:No, UI designers went crazy. (Score:5, Informative)
Of course, the Mac desktop is just a hi-res version of the Amiga (toolbar at the top for the active window, task bar, ... were all Amiga desktop features).
Temporal anomaly in your argument. The Mac launched Jan 24th 1984. The Amiga didn't launch till 1985.
Re: (Score:2)
Re: (Score:3)
I would agree, that visually NeXT had more in common with the Amiga direction than Apple.
Possibly. But don't forget that the Amiga UI was largely copied from Mac OS. As are all UIs with application drop down or pull down menus.
Productivity (Score:5, Insightful)
Everything has to do with productivity. Sure, we all like a bit of novelty, and it's fun to tinker with new features of a desktop or user interface, but the majority of these innovations are never used (if the user has the choice). The recent Linux desktops (GNOME mostly), however, have forced a new set of heuristics on a user base that increasingly uses Linux for productivity and not just tinkering.
It's a waste of time to have to learn a new way of doing everything when the existing ways work already. That is why 'classic desktop' is favored. It works, and although new things might work, they have not proven to work better.
Re:Productivity (Score:5, Insightful)
Revolt against changes? (Score:5, Insightful)
I don't see it as a "revolt against change" but a revolt against change for the sake of change (enter GNOME 3 and Windows 8 as exhibits A and B).
Re: (Score:2)
I smirk and giggle when I see XP users all pissed off and furious at MS, on Slashdot of all places, which *historically* gets excited about change and new things. To me it proves people will resist change no matter what, as there is no rational reason to go out of your way and install XP on a new i7 after spending 2 weeks running hacks and reverse-engineered drivers to get it to boot?! It is because they like the pretty blue taskbar and green hills and being in a familiar environment. ... do not t
Re:Revolt against changes? (Score:5, Insightful)
Most XP users use it because their current PC is good enough for what they do and they do not want to reinstall Windows or buy a new PC. If not for DX11-only games, I would still be using XP on my old PC (I built a new PC in November). The 3GB of RAM was a bit limiting, but not enough to justify 1) spending a lot of money on new hardware and 2) the pain of reinstalling Windows.
As for why Metro is bad while the Android UI is good: Metro is a good UI ... on a phone or tablet, but not on a desktop. Just like I would not use the Android UI on my desktop, I will not use the Metro UI either.
A tablet has a relatively small screen and is operated by touch. You need big buttons so that it is easier to touch them. A desktop has a large screen and is operated by keyboard/mouse. Metro UI places 5cm x 5cm or larger buttons, while I can easily click 1cm x 1cm icons, so it wastes screen space and makes me move the cursor further.
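(A rough way to quantify that trade-off is Fitts's index of difficulty. The sketch below is Python, and the millimeter figures are invented to roughly match the sizes mentioned above, so treat it as an illustration, not a measurement.)
import math

def fitts_id(distance_mm, width_mm):
    # Shannon formulation of Fitts's index of difficulty, in bits:
    # bigger targets are easier to hit, longer travel is harder.
    return math.log2(distance_mm / width_mm + 1)

print(fitts_id(300, 50))   # ~2.81 bits: a 5cm tile, 30cm of cursor travel
print(fitts_id(300, 10))   # ~4.95 bits: a 1cm icon at the same distance
print(fitts_id(900, 50))   # ~4.25 bits: big tiles spread targets apart, so travel grows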
A tablet is usually used for one task at a time. I use my desktop with many windows open, most of them overlapping. If I had to use one full screen window at a time, I would be much much slower. I full-screen only two types of software - video players and games, everything else runs in windows that are usually considerably smaller than the screen.
The start menu takes up a small portion of the screen, but allows me to choose from many items. The start screen takes up the whole screen (there goes my context) and allows me to choose from a smaller list of items. Oh, and desktop programs are not on it by the way (at least for RTM Win8, don't know about Win8.1).
Another gripe just with Windows 8 UI - it gives no indication that some text can actually be clicked to do something.
Different interface for different devices (that have different uses). After all, I would not want to use this [theonion.com]
Re: (Score:2)
Different devices, different expectations. Then again, I haven't really made much use of Metro. What I saw from Unity (I briefly interacted with it at school) is that it works well enough for using one or two applications at a time. It seemed confusing, though, especially when trying to find some things.
The inertia of muscle memory (Score:4, Interesting)
I'd like to have something like the Win 7 Start Menu, but XFCE with the Panel on the bottom is (a) Good Enough, and (b) easy on the brain, since I frequently switch between my Linux box and the company's Windows 7 Enterprise laptop that sits right next to it.
Re: (Score:3)
Re: (Score:3)
Quick access to frequently used programs. It's a personal preference, so I'm not going to argue about which form of "quick access" is the best.
Pragmatism (Score:5, Insightful)
If you can't have a consistent experience across even one day, why get too reliant on customizations and shortcuts?
Back in the day, I had to switch between Data General (terminals), MacOS, and Amiga keyboards and UIs on a daily basis between work and home. These days, of course, everything has changed - now I bounce from Linux to Android to OSX, and more than occasionally Windows too. It's just never paid off to build a super-custom setup when you can't stick with it.
I use Linux for my main desktop at home partly because it is so quick and easy to reinstall - just keep your data on a backed-up server and you can virtually forget about maintenance or troubleshooting. Get used to the default setup and just reinstall whenever you run into something you can't work around - 15 minutes to get back to a familiar desktop is quicker than any full restore-from-backup I'm aware of. (I actually like Linux internals but every time I learn something, I end up forgetting it before I need it a second time; it gets frustrating...)
I'm aware I'm giving up a fair amount of potential productivity and convenience. I don't care any more. I'm just happy when I remember not to try and touch the monitor on my wife's iMac.
I got friends and colleagues who, for example, use Dvorak. More power to 'em. They're younger and more stubborn than I, and most of the time they have one laptop they use both at home and at work. As a wise man once remarked, I'm older now, I got to move my car on street-sweeping day, I can't be doing just anything I want any more...
Re: (Score:2)
Change vs. Churn. (Score:3)
When version N+1 was probably an improvement, getting motivated to go poke it until it works was easier. Now version N+1 may have some cool new feature; but it'll probably have 8 regressions, the pointless removal of something you liked, and probably tentacles. Why bother?
Re: (Score:2)
I wasn't really unhappy with Linux 10 years ago, and a few years ago Ubuntu and others even started to polish it up to be really nice (remember the One Hundred Paper Cuts project?). I don't know what happened then, but at some point it all went downhill.
They started to constantly break my user interface, by randomly changing things, removing features, or just creating new bugs. Now I am even scared to upgrade, because some program I rely on might not work anymore (or just disappear because it was coded against some obsol
Re: (Score:2)
...they've just been changing random things in some horrible mockery of genetic drift.
The first time I encountered Ubuntu Unity, I did think of Alien: Resurrection and the lab full of failed clones begging for death.
Beg The Question Much? (Score:5, Insightful)
"Pragmatism" versus "Inertia"? What a strange choice that doesn't align with pro/con argumentation.
FWIW, let's look at a continuum of Linux/Unix desktop users instead. We know that a core group will tend to prefer a minimalist X desktop such as IceWM for the least impact on hardware performance. Many users prefer desktops like XFCE, Razor-qt, LXDE, and others that offer lightweight but fuller and more integrated experiences than the truly minimalist ones, acknowledging that the load on a system tends to increase as more features are included and deciding strategically to suit their usefulness-efficiency preferences. At the other end of the spectrum are those users who want an entire desktop environment in which all the bells and whistles are integrated into a particular look and feel, as characterized by KDE and GNOME, but understandably with a heavier load on the underlying hardware. So, I suppose pragmatism enters into such choices. To each their own, and having such choices is wonderful. Inertia? There are those who will say "I use KDE because I learned on it and I'm used to it", but this also is a pragmatic choice and not one of "inertia".
Summary of Linux on the desktop (Score:2, Informative)
GNOME: the desktop that COULD be awesome, if only the dev team actually cared about performance, polish and a reasonable feature-set. Overall this desktop has the best feel and most potential, but sadly it is never quite realised.
KDE: at first this desktop seems powerful and feature-rich, but after a week of using it you realise how little its devs care about usability and sane defaults. Not everybody wants to make a career out of tweaking their desktop.
Unity: has SOME nice usability aspects, but it is only
Re:Summary of Linux on the desktop (Score:4, Informative)
Mate devs, however, aren't resting on their laurels. Mate is being adapted to integrate with the OS more, and to use more modern, up-to-date, and maintained libraries. No one was maintaining GConf anymore, and GTK+ and GNOME moved on to GSettings with a dconf backend. Now Mate 1.6 uses GSettings instead of GConf. A natural progression (though I wish GSettings used plain text files instead of dconf), and it works well. Also there is movement to migrate Mate to GTK+3.
Whether or not this duplicates effort with regard to Cinnamon, and whether it can be kept up, I don't know. But Mate is fairly feature-complete even as it stands. GTK+2 still works fine for now. It's not going to stop working of its own accord. Things like Wayland will likely force its abandonment, but time will tell.
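(For the curious, here is a minimal sketch of what that migration means in practice, using PyGObject's Gio.Settings from a running MATE session. The schema and key names are the stock MATE ones as far as I know, but treat them as illustrative.)
from gi.repository import Gio

# GSettings replaces GConf: keys live in installed schemas and are
# persisted by the dconf backend rather than GConf's per-user XML tree.
settings = Gio.Settings.new("org.mate.interface")
print(settings.get_string("gtk-theme"))            # read the current GTK theme
settings.set_string("gtk-theme", "TraditionalOk")  # write a new value
Gio.Settings.sync()                                # flush pending writes to dconf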
Re: (Score:3)
Summary: if GNOME would stop reshuffling the deck chairs and spend a few releases on performance, polish and features real-world people care about, they could easily become the most popular desktop. They've done 99% of the work, but for some reason are blind to that crucial last 1%. Given that this is probably never going to change, the Linux desktop is pretty much an exercise in futility and inefficiency.
You may want to take a peek at elementaryOS. A few of my friends, on seeing what I've done with my laptop, have described their "Pantheon" desktop as "Gnome that doesn't suck". Pantheon was originally forked from Gnome, though it's taken a life of its own... as of now, it's only officially supported on elementary, where it's the default DE.
Unity's been tolerable (Score:3)
Re: (Score:2)
"Classic?" Or Just Uniform (Score:2)
All of them offer:
- A desktop
- some kind of task bar (top, bottom, left, right - doesn't really matter)
- some form of menus for getting to stuff
- some kind of file manager application
There may be some things that are very different from one to the other (Lord knows that when I switched to a Mac I found some of their choices thoroughly obscure) but in the big picture most desk
Re: (Score:2)
Actually it is more political than you imagine.
KDE was not pure (L)GPL; it had dual licensing for money, etc. It was the biggest FUD campaign ever pulled off successfully; even Microsoft failed to do something on this scale.
All of this was over 10 years ago, but that's what really created the GNOME project. And they won't be finished until all functionality of KDE is completely removed from your desktop, leaving you with a single mouse pointer, single mouse button and a single window, full screen.
Re: (Score:3)
And here, you make the mistake most FOSS advocates make - You actually believe (or at least, "care about") what you just said.
I like open source. I use open source. I've rolled my own kernels, I've even modified them to fix an early broken multi-PCI bus enumeration routine. And yet...
I don'
Re: (Score:2)
Nothing the GP said was incorrect - perhaps you've misread it. I thought the GP was referring to the FUD/backlash against KDE, which lasted many, many years longer than the actual licensing dilemma itself (less than a year?).
So yes, politics/belief/FUD drove the creation of GNOME, and that mis-maneuver by the Qt/KDE project - despite being quickly rectified - had repercussions that lasted much of a decade, despite the indifference of pragmatic users such as yourself.
Is it really that hard to figure out? (Score:2)
Re: (Score:2)
Linux users have the option of choosing a different desktop environment or window manager.
I predict that 2014 will be the Year of the Linux Desktops. ;-)
unlike on Windows or OS X, where there is only one path forward, at best having some kludge solution that may or may not be reliable.
As a recent victim of Windows 8, I've just tried the "Start8" add-on, and it looks promising. I hear that "Classic Shell" also is good. So, it looks like we have multiple paths forward. I don't know if the add-on approach is what you mean by "kludge", but it seems to be quite popular in other cases, e.g. Firefox. Since I'm used to the Windows 7 interface and basically like it, it's nice to have a form of choice that helps me get back to whe
Counterpoint (Score:5, Informative)
I just (five days ago) spent two days huddled with a half dozen other developers in the corner of a large conference room filled with IT people in Chicago. We were testing our various implementations of a new protocol that we expect to see in wide use during the next two years.
I had brought a brand new laptop, for various unfortunate reasons, on which I had just installed the complete stack of software I needed the night before in the hotel room. I put Ubuntu 13.10 on it because I happened to have that particular distro on a flash drive that was at hand just then and I was in a hurry.
Things worked out. The laptop worked well and I got my part done. Thing is, I spent that rather intense period of time using Unity. For development and testing of software. Really.
I get it. Unity is fast and effective, particularly on the limited real-estate of a laptop screen where you end up switching rapidly among full screen applications.
I've avoided Unity like the plague on desktop hardware where I have multiple, large displays, and I think I'll continue doing that. However, on a laptop that is not running external displays, Unity works pretty well. You can navigate quickly with mouse or keyboard and avoid fussing with things. The fixed position of the large icons (although too large by default) on the sidebar is particularly useful.
So, bust out the fangs and hate me down with your mod points; I found a use for Unity and said so on Slashdot.
Re:Unity caused me to switch to Windows (Score:2)
How great for you. You admitted it does not work except in a limited, netbook-like sense, for one task at a time.
I do more, and I do not like where Linux is going. So in 2011 I switched to Windows 7 and never looked back ... until Windows 8 :-(
The Pragmatic vs Tweaking war rages on (Score:4, Interesting)
My wife has a mildly customized XFCE setup, and she loves it. It almost never gets changed or tweaked.
Re: (Score:2)
I've been using the same since about 10.04, but I really wish they'd solve the "panel crashing on login" issue. I put a shortcut on the desktop to re-launch it, but it's kind of annoying.
Still, it's my only real complaint, which puts it head and shoulders above the other options.
What Do You Need a Desktop For (Score:5, Interesting)
All those other guys can keep their all-encompassing UI vision. I don't want their kool-aid. I'm glad I get a choice in Linux. I may have to occasionally beat my head on the computer for days at a time when something stops working, but at least I can avoid having some corporate assholes or desktop environment programmers who like the smell of their own farts ramming their bullshit down my throat.
post internet stock crash (Score:5, Interesting)
I think it has a lot to do with when you came up. When I came up with computers in the 1980s and 1990s, we had hard problems and solved them. It was a world of rapidly growing IT spending, with IT taking on more and more tasks. After Y2K the technology sector began to get very conservative; the focus was on cost cutting and reliability. Far more like the world of the late '70s and early '80s, of the mainframes and minis that the PCs had replaced. What's exciting now is that mobile devices have brought back that enthusiasm for change and excitement again. They haven't caught up with desktops, but at least they are creating a generation of developers who are used to a market that grows and expands rather than staying put at minimal cost.
I watch the threads on any kinds of change whether it be ubiquitous computing (Windows 8), IPv6 (networking), Wayland, the new hardware designs... and there is a pervasive pessimism among younger IT, a terrible can't do attitude.
Back in the 1990s, when Linux was coming up, we watched GUIs die: FVWM, AfterStep, Sawfish, amiwm, OpenLook (olwm), Blackbox... Systems grow, change, and die, leaving behind better ones. What's terrible is that the new generation wants stagnation. Either GNOME 3 succeeds or it doesn't. But regardless of what happens, the work on GNOME advances the ecosystem.
Re: (Score:3)
... and there is a pervasive pessimism among younger IT, a terrible can't do attitude... What's terrible is that the new generation wants stagnation.
That is silly. Some old codgers are terrible at programming and only got there because they got in early, then decry that the world is going to hell in a hand basket because of "new generations". From what I've seen, young people are often the most sane (except those from overseas that come because they are cheaper) and have been robbed blind by older generations that pass the buck onto them. Young coders have to deal with all the problems that old people foisted on them because they couldn't solve, as well
Re: (Score:3)
#first-world-problems
It is such a good thing because it makes modern innovations available to people in countries where $200 is more like some months' salary than it is like three days'. This, in turn, enables those people to potentially contribute in big ways, such as becoming software engineers and developing local solutions to local problems, and often contribute in small ways, such as bug repor
Is it just me? (Score:2)
Does the summary make sense to anyone?
Re: (Score:3)
No sense at all. And the article makes damn little sense either.
Desktop != tablet != touchscreen (Score:2)
The more I look at the changes in OS UIs lately, the more I get the impression that the whole cross-platform thing has lost its grip on reality.
Sure, I like my tablet, my smartphone, my laptop... and I live with the smudged display I have on my tablet and my smartphone, since they do not really bother me. Probably because I can easily overlook those smudges; but since I can't overlook them on my normal monitor (or laptop display; or my glasses for that matter), I'm no friend of UIs which seem to be de
Linux UI as drying cement (Score:2)
What exactly is a "classic" desktop anyway? Are we talking classic Windows? Classic Mac OS? There's a constellation of UI paradigms that work. Some of them are mutually incompatible; you can't use them simultaneously. If you want to come up with something new, it has to actually work better than what we had before. If it merely works "as well" as what it's replacing, then users won't be happy. You're changing things for the sake of change. So from those choices you pick the ones you think work best together
The key is that it now works (Score:3)
In many ways I think it is less that we don't tweak and more that the machines come pre-tweaked.
Obviously this is not for everyone as we all know those people who must spend a full day getting a new machine just the way they like it.
But if I had a new machine built from scratch tomorrow I would say that 50 percent of the few minutes of tweaking would be spent changing the IDE defaults for a few keys and whatnot. The bulk of the rest would be eliminating stupid default icons and putting up a few that I frequently use (Terminal, etc)
I just spun up a Raspberry Pi, and with the Arduino IDE sitting right on the desktop I'm not sure that I'll make a single change at this point. Any changes going forward will be 100% in support of critical functionality.
Everyone Hates The New UIs (Score:4, Insightful)
All I really want in a UI is the ability to switch between these apps without having to mentally switch contexts. On a non-touch computer, a menu list of installed apps+taskbar with a stacking window manager is ideal.
Linux users are not the only ones who are rejecting the new UIs. Everyone hates how windows 8 works.
There is clearly a need for new UIs for touch based machines. The mistake is trying to create one UI that works for both worlds - this is the mistake Win8 and GNOME3 made.
Re: (Score:3)
"All i really want in a UI is the ability to switch between these apps without having to mentally switch contexts. On a non-touch computer, a menu list of installed apps+taskbar with a stacking window manager is ideal."
Try WindowMaker. There is no desktop, it's the 'root window' - a floating thing of light and color. There is no button to get your cluttered list of apps+taskbar, there are two buttons with which to get whichever one you need at the moment, and they are both on your mouse. Just click the appr
We grew up (Score:4, Insightful)
*nix desktops (Score:3)
If you want a UNIX desktop that just works, then you get a Mac.
Classic desktop popular with users, not UI guys (Score:3)
Everyone wants a classic desktop, but no vendor wants to provide one. Microsoft wants everyone on Metro so they can take a cut of sales through the App Store. The KDE and Gnome teams want to experiment because it's more fun than maintaining a tried-and-true design. Apple is seemingly holding the line for now, but all it takes is one bad VP in the UI team and OSX will become a clone of iOS.
UI designers don't like the desktop metaphor for a variety of complicated philosophical reasons. They think it would be easier for people to learn how to use computers if it was abandoned. Maybe they're right about that – iOS has been very successful among non-technical users because it simplifies things a lot more than a standard W.I.M.P. design – but once you get beyond casual use and into doing real work, multitasking becomes a necessity, and there is still nothing better than a "classic desktop" for that.
I want what Ironman had (Score:3)
Give me that holographic 3-D translucent panel that I can throw data at by waving my hands around. As long as it runs a kernel with a UNIX philosophy and I can compile the entire thing from source like my current Gentoo distro I'll be happy.
All I ask is that you don't F' it up. If you make the decision that I don't *also* need a keyboard and a console window because 'who uses VI anymore to program when you can wave hands around' then you're full of it. I'm the one to decide if hand-waving is better, not you. If you toss out a half-done re-write like KDE 4.0 with regressions on every major integrated application, you deserve the hate. If you break the entire metaphor like Unity or Windows 8 did for the sake of some other platform you deserve the hate. If you abandon decades of proven philosophy on a whim just because, you deserve the hate.
On the other hand, if you have something truly unique, revolutionary, game-changing, bring it on. If it is truly a step forward the world will quickly abandon the old in favor of your new, my old self included. It's when you try replacing the old way forcibly in favor of your new that you fail. That's not your job. That's my (the user's) job.
FYI (Score:3)
Desktop should be an intelligent canvas (Score:3)
What you see is various software packages all reinventing what should only have to be done once, right?
Various people have invented corkboard ideas; on the Mac, Stickies is Post-it notes and Scrivener is a research and writing system with a corkboard as part of it. I have seen various drag-and-drop style interfaces for drawing UML or configuring networking. One package I am involved in now has a canvas where you can drag and drop nodes in a flowchart.
Personally I had an idea for a tool that would draw on the desktop and define regions of it.
Currently the desktops I have seen are just a blank screen that inevitably gets filled up with crap which then has to get put somewhere, or it is just a few shortcuts. The menu bar (on a Mac), the trash can and the Dock are the only actually functioning items.
I would like to propose that the desktop should be an object-oriented scriptable canvas with some intelligence: with storage, networking, layers, the ability to transport them between instances and platforms, and something that actually helps you do your work. Smalltalk comes to mind. Anyway, my two cents. There is a lot of screen real estate, but none of the operating systems actually do anything useful with it.
The drawing tools that are out there in PowerPoint, LibreOffice or whatever are pitiful and unintuitive, so it takes a lot of work to make something useful, and you don't use them in a meeting to illustrate something; you go to a whiteboard and scribble something illegible. Or you get out a big piece of paper. I'm saying a strong canvas with simple, unbloated widgets in place of the desktop would be extremely useful as a standard computing component, instead of the tons of little widgets that solve little bits of the problem.
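(To make the proposal concrete, here is a toy sketch in Python/tkinter of a scriptable canvas with draggable notes. Everything here is invented for illustration, and it has none of the storage/networking/layers the idea calls for.)
import tkinter as tk

class StickyCanvas(tk.Canvas):
    # A toy "intelligent canvas": scriptable, with draggable text notes.
    def __init__(self, master):
        super().__init__(master, bg="white", width=640, height=400)
        self._last = None
        self.tag_bind("note", "<ButtonPress-1>", self._grab)
        self.tag_bind("note", "<B1-Motion>", self._drag)

    def add_note(self, x, y, text):
        # Notes are plain canvas items; a real version would persist them.
        return self.create_text(x, y, text=text, anchor="nw", tags="note")

    def _grab(self, event):
        self._last = (event.x, event.y)

    def _drag(self, event):
        # "current" names the item under the pointer.
        x0, y0 = self._last
        self.move("current", event.x - x0, event.y - y0)
        self._last = (event.x, event.y)

root = tk.Tk()
canvas = StickyCanvas(root)
canvas.pack()
canvas.add_note(30, 30, "TODO: review patch")
canvas.add_note(220, 140, "meeting @ 3pm")
root.mainloop()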
Ideals (Score:3)
The Windows 95 style desktop captures an ideal well. MacOS 9 did as well, and so does the original Mac OS X design. The trouble is that they are far from perfect, but small deviations from the ideal make for no real improvement, and large explorations away from the ideal tend to be horrible. KDE was probably the first to really wrestle with this horror, and they seem to be past the worst of it, though I have no idea where they're headed; GNOME is trying to do the 'HUD Overlay Control / MoreThanAToyDashboard' thing, which would be good if they work out how to do it well. But the Windows 95 style, with GNOME 2 additions like desktop toolbars you can just stuff what you like in, is a good ideal.
I wish they would invest more effort in making it easy to repurpose what is there: make a simple Gnome Terminal a 3-line program:
w = Gnome.Window(menu=DefaultMenu, content=Terminal.Default)
w.content.run("/bin/bash",["-"])
w.show()
and make customising what the terminal window does as easy as this. Do likewise for getting a browser component running (recall the Cocoa browser demo on Mac OS X?)
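(For comparison, here is roughly what that takes today with PyGObject and the VTE widget. A sketch assuming GTK 3 and VTE 2.91 are installed, and using the deprecated-but-simple spawn_sync for brevity.)
import gi
gi.require_version("Gtk", "3.0")
gi.require_version("Vte", "2.91")
from gi.repository import Gtk, Vte, GLib

win = Gtk.Window(title="Tiny Terminal")
term = Vte.Terminal()
# Spawn a shell inside the terminal widget.
term.spawn_sync(Vte.PtyFlags.DEFAULT, None, ["/bin/bash"], None,
                GLib.SpawnFlags.DEFAULT, None, None)
win.add(term)
win.connect("delete-event", Gtk.main_quit)
win.show_all()
Gtk.main()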
We should be working less to come up with new stuff, and more on making what we have rock-solid, dead-simple, and as trivially easy to implement as possible. I wrote a little note about this idea on my Wiki: http://thewikiman.allsup.co/Im... [allsup.co] where the idea is to write your program in your ideal imaginary language, and then build from the language you have towards that ideal, striving to make as much of what you produce reusable in other projects (and to share it with others so that they don't have to repeat your progress: DRY is good, but DRIP (Don't-Repeat-If-at-all-Possible) would be a better principle, since with proprietary software the DRY principle is violated whenever a different developer has to rewrite functionality because of legal or lack-of-source issues).
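(A tiny Python illustration of that "ideal imaginary language" style; every name here is invented. You write the top-level code as you wish it read, then grow small shims underneath until it runs.)
import subprocess

# The program as you wish you could write it:
def deploy(project):
    build(project)
    upload(project, host="staging")  # hypothetical target host
    announce(project + " deployed")

# The shims that grow the real language toward the ideal one:
def build(project):
    subprocess.run(["make", "-C", project], check=True)

def upload(project, host):
    subprocess.run(["rsync", "-a", project + "/dist/",
                    host + ":/srv/" + project + "/"], check=True)

def announce(message):
    print(message)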
Re: (Score:3, Funny)
Would be funny to have a "Score: --1" for your post.
Re:I'm using FVWM... (Score:5, Insightful)
Good for you. To answer the question, though, I think it's the psychology of efficiency. If the tools aren't efficient for the brain to categorize/understand, it's not practical as an interface (desktop or otherwise). The problem with Metro isn't that it's different; it's that it's too much visual clutter for the brain to process quickly. This is reflected in GNOME/KDE in that, while neatly organized, they rely on memory association of images to functions. Icons are everywhere these days, so those associations aren't as strong, or that part of the memory is too overloaded to access efficiently. Non-graphical interfaces suffer from something similar in the ability to remember all the commands and their associated flags.
The classic desktop organizes things in groupings, lists, etc., and while there are icons associated, the overriding organization of alphabetical text gives the brain shortcuts to compartmentalize information where it can, or simply to analyze it, because all the information is there (whereas in KDE etc. you must hover to get the info one icon at a time).
Re: (Score:3)
Re: I'm using FVWM... (Score:2)
Re: I'm using FVWM... (Score:2)
GNU Midnight Commander.
Re: (Score:2)
Re: (Score:3)
Laugh if you want, I'm working on a product that's still shipping fvwm2 (recently updated from fvwm95...)
It's... sufficient, and the devil whose face we know - all of its shortcomings, bugs, and workarounds are well known and documented - unlike a "cutting edge" desktop that throws you a mystery quirk every so often that nobody knows about.
Re: (Score:2, Insightful)
My car has four wheels. Works best at the moment.
ftfy
What if a car came along that didn't have wheels? Would you not buy it simply because it didn't have wheels?
Wheels on cars only work best because you haven't experienced anything better... but that doesn't mean that wheels will always work best.
Change for the sake of change sucks, but innovation stems from change, and innovation can also lead to change for the better.
Re: (Score:2)
"what about if a car came along that didn't have wheels?"
They have. It's called a "hovercraft" and proponents saw them as the wave of the future.
They are inefficient, lack positive steering or braking (good luck stopping one on a downgrade) and remain in the niche markets they suit.
If a future wheel-free car is offered, I won't need to "try" it to determine if it suits my requirements. I can infer that from what I see it do.
"Desktops" are crap period (Score:2)
It was a lousy metaphor when first proposed and it remains a lousy metaphor today.
And ironically while the article defines a "classic desktop" with icons on it (how gauche!) it goes on to mention WindowMaker, which offers a root window metaphor instead. That's my personal favorite.
"Desktops" in the sense of KDE or GNOME are just too creepy, too cluttered, too always trying to make you do things their way. They include WAY too much garbage I will not use and do not want. KDE *is* more tolerable than GNOME fo
Re: (Score:3)
I love WindowMaker, but I just wish it would get with the times. It seriously needs to add some (optional) eye candy. Would it be too hard to either add a compositor or at the very least add support for one of the many X compositors out there (e.g. xcompmgr and compton)? Real transparency, that is really all I want.
Seriously, I love WindowMaker, but 1997 was 17 years ago. It would be nice if it didn't look like it was still stuck in 1997.
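(For what it's worth, you don't have to wait for WindowMaker itself: a standalone compositor can run alongside any WM. Here is a sketch in Python of what an autostart hook might do; the programs are real and the flags are the commonly documented ones, but check your installed versions.)
import shutil
import subprocess

# Prefer compton, fall back to xcompmgr; either composites on top of the WM.
for compositor, args in [("compton", ["-b"]),    # -b: fork to background
                         ("xcompmgr", ["-c"])]:  # -c: soft shadows
    if shutil.which(compositor):
        subprocess.Popen([compositor] + args)
        break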
Re: (Score:3)
I've been a GNOME user since around 2001; to say things were pretty rough back then is an understatement... In 2012 I switched to KDE. I finally had a machine with 16GB of RAM to run it on (FWIW KDE seems slightly better at running on limited hardware now, but still...). Its defaults made me angry, though (especially Konsole - seriously, no keyboard shortcuts to hit a specific tab? Tabs at the bottom [opposite edge to the menus and title bar]?), but I can actually repair it a lot quicker than fixing Unity/GNOME.
It's
Re: (Score:2)
About the same obsession teenagers have with the way their room looks: they are probably going to spend a lot of time in it (YMMV), it's the equivalent of home (the rest of the house is the parents' territory, and thus hostile), and it's also an extension of who they are.
If it is for work, you just deal with it. It might be "wrong" enough that you need to try to cause change, but ultimately it's not about you, it's about getting the work done. You'll notice that offices are usually slightly personalized. You