The End of Unix? 451
Unix may never become big on the desktop, but that battle is still being fought and probably won't be over for a few years. However, we shouldn't forget where the great strength of Unix lies: the network. Unix -runs- the Internet. I don't think any other operating system can say that. The Internet started on Unix, the Internet was built on Unix, and unless something better comes along (which implies we don't have "better" yet), the Internet will die running Unix.
Of course Unix, like any other modern OS, must change over time to accommodate new technologies and methodologies, but I see Unix being better able to adapt in today's fast-changing Information Technology world than other operating systems based on monolithic kernels.
What do you think? Am I missing something? Is there a Unix killer in the works that I might have missed?
Unix is on the way out!!! (Score:1)
Now, switched to what? That's a good question. Windows seems to be attractive at the current time, but let's face it: The type of slobbering invalids attracted to Windows are generally not the people that you would expect to pass their genes onto a new generation. By and large, these people will have electrocuted themselves in the shower or killed themselves in freak blender accidents before they have the opportunity to reproduce.
So Windows and Unix are both out. So where are we at, then? Well, unless I miss my guess, I foresee a future where both BeOS and Hurd have huge followings. This, of course, is notwithstanding the fact that these operating systems are currently being used only by sociopaths, homosexuals, and extremely fat people. That's okay, though.
At any rate, we can agree on one thing: We live in interesting times. Let's see how this all pans out.
pass the jimson weed, brother (Score:1)
Brother, I would like to introduce you to the concept of 'the enterprise'. Or even 'the internet'. Your grasp of what is important in large-scale computing seems to be relevant to a thing called 'ms-dos'. Here in the real world, we like the idea of 'sharing' and 'interoperability'. 'Privilege systems' and 'user rights' are nice things, too, but I guess you're perfect enough to run everything as 'root', or as we like to call it, 'firehose-mode'.
Okay, enough cuteness. Moral: you're an idiot.
But that's okay; most of us are idiots. Your particular fuck-up is to assume that everyone uses computers just like you. Which is a stupid thing to believe, about computers or any human endeavor.
I'll let you re-think that post, and point out more of your logical flaws later.
ciao!!
Re:I think it's meaningless... (Score:1)
1. a next-gen distributed architecture based on the Tao [tao-group.com] VM (think of it as a language-agnostic generalised VM a bit like Java, but that can run on real hardware too).
2. the "classic" Amiga OS was extended to PowerPC with WarpOS [haage-partner.com] (no relation to OS/2) microkernel. This allowed the user community to use more modern hardware, such as G3 accelerators, and 3D gfx cards.
The Amiga OS design also lives on in the form of AROS [aros.org] - the "Amiga Research OS" - which recently received the blessing of Amiga itself.
For more amiga info, go to www.amiga.org [amiga.org]
Re:Un*x is also popular in real-time embedded syst (Score:1)
Tektronix sells Logic Analyzers with Windows 95 embedded into them. That's a pretty "out there" environment where the OS takes a beating and survives.
Here's the clue: They aren't encouraging (or allowing, in many cases) "Hackers" to add all kinds of kludges and DLL hell. Nope.
Run in an embedded setting, where the hardware is completely captured, specified, and validated, it's not that risky a proposition.
Not that it matters in this discussion. Too many zealots in these parts for any semblance of reason.
Re: Re:I have SEEN the future (Score:1)
USENET not dead! (Score:1)
Well, on the moderated groups, the S/N ratio is usually very good. e.g., compare soc.culture.japan with soc.culture.japan.moderated.
I download and archive select Usenet groups, and... umm... a lot of them are a total loss.
Well this was true from day one. Nothing has changed. More crap sure, but more content too.
The Linux groups have some of the highest traffic, but also the largest volume of clueless noise.
We're all clueless about something, and that's what the Linux groups are for: to ask questions, not to post brags about how you're tunneling IP packets over email to get around the worst firewalls. And the groups are good at this, too. When I have a question, I'll search the groups; failing that, I'll search on deja.com for the stuff that expired off my server; failing *that*, I'll post. And I usually get the answer I need within a few minutes to a few days. And that thread gets snarfed by deja and other news archivers to help someone else later on down the line.
The real action has shifted to private forums like /. and listservers.
Yeah, but the problem of everyone migrating from a common forum (USENET) to Slashdot and freshmeat and kernel.org and redhat, the mailing lists, etc., is the fragmentation. You've lost the world audience and the single searchable source for answers. I can't know that my question was answered on some obscure mailing list I never knew I could subscribe to. Or worse, you can only discuss whatever the officially sanctioned topic(s) of the day are. (see /.)
The bulk of the Usenet traffic these days is purely Binary attachments. The most popular NNTP clients on Windows machines strip off and throw away all 'text' content, keeping only the attachments. NewsBin, Pluckitt, etc.
And these programs are good at doing graphics. When I want to download pr0n or sailor moon images, I'll use GUI news readers, otherwise, it's good ol' trn for the bulk of my news reading. I usually hang out on rec.arts.anime.misc. Most non-binary groups with a lot of traffic, excepting the flame or controversial topic groups, have maintained a decent S/N ratio for the last 10 years and more.
IMO, USENET is better than ever because there are more people using it. Most of the noise lands in the abandoned groups, or the flame groups anyway, so who cares?
Internet (well ARPAnet) predates UNIX (Score:1)
Re:monolithic random comments (Score:1)
It's already been done. Check out either BeOS or QNX. Both are top-notch, extremely stable (especially QNX), blindingly fast, and don't carry all of the extra baggage/bloat that legacy OSes have.
More reliable? (Score:2)
Let's see: the JVM most likely gets compiled by a C/C++ compiler, then runs on a native operating system. So you've effectively added a layer of complexity where problems can occur, compared to C/C++ on said native OS.
As for Java's usefulness, it's not a bad language, but let's be honest with our evangelism.
----------------------------
Many meandering mammoth murmurs, merrily musing (Score:2)
It is ALSO a truism that you don't get something for nothing. There's always a trade-off. With networked computers, your trade-off is in delays: delays within the network, transmitting the information, but also delays within the computer, which now has a real-time device to constantly monitor for potential traffic.
It follows that setting up a network of streamlined machines, each dedicated to a narrow range of tasks, will out-perform a single, monolithic system, IF AND ONLY IF the savings from the parallelising is greater than the penalty imposed by the network.
The future of Operating Systems, as I see it, will consist of intelligent distribution of tasks over a wide-area network, and where any program is parallelized at run-time by the primary OS. However, I also predict that 99.9% of all software will be kept local and run serially.
It would seem to follow that Unix will be extended to support run-time parallelization and fairly sophisticated task/net load-balancing.
It would also seem to follow that there will be a blurring between =PROGRAMS= and the network, NOT the machines and the network, OR the OS and the network. Why should there be? If it's not visible to the user, it won't last.
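The IF AND ONLY IF condition above is easy to play with using toy numbers. A minimal sketch (all figures hypothetical, and the function name is made up):

```shell
# Does splitting a job across n machines beat one monolithic box?
# parallel time = serial time / n + network overhead
parallel_check() {
    awk -v serial="$1" -v n="$2" -v overhead="$3" 'BEGIN {
        parallel = serial / n + overhead
        # The network wins only if parallel < serial.
        print (parallel < serial) ? "win" : "loss"
    }'
}

# Hypothetical figures: 12 s of work, 4 machines, 2 s network penalty.
parallel_check 12 4 2     # prints "win"  (12/4 + 2 = 5 s beats 12 s)
parallel_check 12 4 10    # prints "loss" (overhead eats the gain)
```

Whether the trade-off pays off is just that one comparison: work divided across machines, plus the network penalty, against the serial time.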
Unix as a philosophy and the Legacy Myth (Score:2)
This is patently false. What Unix _is_, is a system of tools which have been refined for 30+ years. AFAIK, 'Legacy' should be used to describe something which is no longer developed or supported. At least this is the general use of the word.
New doesn't mean better, and in many cases, it can mean untested or unproven. Unix is proven technology. But above all it's a proven philosophy.
Call Unix whatever you like. Text-book Sys V, BSD, or clones like Linux. The underlying philosophy is largely the same. THIS is why Unix has been around for decades (besides technical merits).
In the philosophical sense, I doubt Unix is going to die, ever.
Re: (Score:2)
Re:Unix is already doing this. (Score:2)
There is no such thing as "Iron-clad" validation, but validation tied to something physical (retinal scans?) will come pretty close to this in less than 10 years, IMHO.
Everyone always seems to think that retinal scans are a decent method of validation.
However, all a retinal scan will do is provide a key. Furthermore, it's much more secure to remember your key than to carry it around in your body, where it can be taken from you (think: decapitation). Until we have mind-reading technology, I'll stick with passwords, thanks.
Hamish
Re:hey, give BeOS a chance... (Score:2)
-jwb
an Anti-UNIX operating system (Score:2)
Re:No replacement (yet) (Score:2)
While I happen to think that all the Unices are probably superior platforms for this type of application, don't forget about some of IBM's offerings: AS/400, OS/390, and so forth.
Big Blue is active in the development of Apache, and I would be surprised if they weren't sneaking parts of Apache into their other OSes, which have anemic TCP/IP suites, to say the least.
Just a thought.
Big iron matters (Score:2)
The more stuff you have on the user's machine the more your support costs, as it breaks, needs to be replaced, etc, etc. When you look inside large companies you see a greater and greater tendency towards web-based and other centralised systems.
Simon
UNIX is dead--hah!! (Score:2)
The reason is simple: both the commercial UNIX variants and the "freeware" FreeBSD and Linux variants have extended the life of UNIX far beyond what was possible in the past.
The UNIX of 2000 can do multiprocessing, multithreading, SMP support, powerful networking, and even massively parallel processing via Beowulf clusters. And engineers are already pushing UNIX so it supports 64-bit processors like the Compaq (née Digital) Alpha and the upcoming Intel Itanium CPUs.
In short, I just don't see any replacements for UNIX in the near future. The technology in Windows 2000 may be a major competitor, but it won't be a UNIX replacement.
The Feel of the OS (Score:2)
1 - Engineers / Computer People need a command line interface
2 - Regardless of the virtual reality smell-o-vision interface, there will be a command line at the kernel of a real (READ: non-consumer) OS.
3 - The idea of a filesystem hierarchy isn't going anywhere.
We should be able to plop down at any box and know that binaries are in
Re:Unix will never die (Score:2)
MVS, VMS. That's two.
There are others, such as the OS on Tandem NonStop systems.
Re:UNIX, the OS of the past, present, and future (Score:2)
Always has, always will.
UNIX didn't have good network support until BSD UNIX. I used to run V7 UNIX on a PDP-11; the only networking it supported was UUCP.
2. UNIX has superior reliability
Therefore, UNIX was coded for rock-solid stability.
Don't make me laugh. UNIX was written at a research laboratory as a research project. Error recovery in the UNIX kernel was limited to calling panic(). Running out of memory or disk space did bad things to the stability of the system. The file system tended to self-destruct if power failed or the system crashed. UNIX applications often didn't check for errors. If they did check for errors, they didn't attempt to recover, they just called exit(). Kernel device drivers assumed that the hardware was in perfect operating order. DEC used to burn in VAX systems with UNIX because it was such a good system hardware diagnostic. Any flakey hardware would crash the system. VMS would run just fine on many systems with multiple hardware problems and glitches.
Re:There is no answer. (Score:2)
Never mind the scores of 'unix-like' embedded RTOSes?
And I know at least one vendor (my former employer, VenturCom (at vci.com)) had an embeddable version of Novell's UnixWare.
Unix and Unix-influenced OSes have always been out there in every market. The Unix-dieback was a dieback of the workstation and a little bit in the lightweight server area.
--Parity
There will come a time ... (Score:2)
1) that UNIX will be killed off by Windows or BeOS or
I think that if something is to replace UNIX, it will have to be something _significantly_ better than UNIX. I don't know of anything else currently out there that can claim this (although I will admit to not having looked at Plan9 and Hurd).
2) that UNIX is immortal because it is somehow 'above' operating system paradigms?
This is clearly wrong, as the whole UNIX model is based on a hierarchy of files, where a file is a stream of bytes/characters. As a consequence of this, most of the tools that make UNIX so powerful are text-stream based.
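That text-stream base is also exactly what gives the classic tool chain its leverage; a throwaway pipeline (the passwd-style lines here are made-up sample data) does what would be a small program elsewhere:

```shell
# Everything is a text stream, so generic filters compose.
# Made-up lines standing in for /etc/passwd:
sample='root:x:0:0::/root:/bin/sh
www:x:80:80::/var:/bin/false
me:x:1000:1000::/home/me:/bin/sh'

# Count the distinct login shells, most common first:
echo "$sample" |
    cut -d: -f7 |      # pull out the shell field
    sort | uniq -c |   # tally duplicates
    sort -rn           # biggest count on top
```

The two /bin/sh users come out on top; none of the four filters knows anything about passwd files in particular.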
I see room for a new sort of operating system that works with a much higher level of abstraction throughout, including type management and garbage collection as part of the OS (relegating streams of text to where you really just want to store text information), high level messages between components allowing easy distribution, different views of the file system (or maybe not even so strongly file-system based), and maybe taking some component ideas from Apple's failed OpenDoc to end monolithic applications.
However, I don't think that there is anything out there that comes even close to this, and there won't be for a while, and even if there is, it will take years to catch on.
UNIX will only be overtaken when the problems of the high level of complexity required by what people want to do need a higher level of abstraction than the basic model of UNIX can provide, and even then, UNIX will take a very long time to actually die - it will just go the same way as DOS, which is still widely used but generally regarded as inferior by most people.
Re:It's not going anywhere.. (Score:2)
Re:Media sensationalism! (Score:2)
True. But unfortunately, nobody takes reports of the Amiga's rebirth seriously anymore either.
Two more weeks! BIG!
---
Re:wow, anouther death of unix (Score:2)
That isn't even close to true. In fact it is a pretty silly statement given that it would be virtually impossible to get all of the major universities to agree on anything let alone something that specific.
I think you would further find that even in universities that have adopted an MS-centric curriculum, it is not pervasive throughout the entire CS program. Any university that purports to offer a well-balanced and well-rounded background for its students would be ill served by making its program so focused on a single company's technology. That is the sort of thing that lower-end institutions such as trade schools and community colleges do, not major universities.
Re:Killing Unix is like killing computers: (Score:2)
Actually, MS-DOS was originally intended to be a clone of CP/M targeted at the 8086/8088 instead of the 8080/Z80. About the only UNIX-like feature it included was hierarchical subdirectories, which arrived in 2.0 (unlike CP/M, which had a flat file structure); the original 1.0 and 1.1 versions of MS-DOS used a flat file system like CP/M.
CP/M, in turn, was originally intended to be a simplified subset clone of RSTS/11, a DEC minicomputer OS for the PDP-11 family, targeted at the 8080/Z80-based microcomputers of the early hobbyist computing era. RSTS and RSX were the DEC predecessors to VMS. VMS, as we know, was primarily architected for many years by the same person who went to Microsoft to design the NT kernel. All of the MS-hype aside, NT is largely a reinvention of Micro-VMS with the Windows GUI and the MS-DOS command interpreter grafted on, and without a lot of the stability and scalability that made a lot of people like VMS (I wasn't one of them, mind you). Yes, for the inevitable Microsoft apologists, that is an oversimplified view.
Yes, each of these took a little different take on Unix, and tried to re-invent the wheel: but the influence of Unix cannot be safely ignored.
UNIX only influenced those other products by being a competitor that they were trying to respond to. Notice that virtually all of the proprietary minicomputer OSes are dead or at least lifeless zombies, despite all of the years of predictions of the doom of UNIX (for mostly the same reasons people predict the demise of Linux). NT/W2K is, in my opinion, the last great hurrah for proprietary OSes. UNIX on the other hand has been much more equipped to change with the times and adapt to new and different purposes. The fundamental difference is that it is built with a different philosophy, one that small is beautiful, and that simple tools which do one thing, and do it well and can be put together to solve more complex problems is a better way to do things than the huge, integrated, monolithic monsters that the proprietary OS world puts out.
Re:Killing Unix is like killing computers: (Score:2)
The AS/400 is probably about the only proprietary-OS mini that hasn't died off, and I think that is partly because it comes from IBM, but also because it never really tried to compete directly with the other minis (despite IBM's intentions). It basically found its own wholly separate niche. While the AS/400 may be bigger and more profitable than Sun, Sun is of course only one of many UNIX midrange vendors; they are nowhere near half of that market, as there is still HP/UX, AIX, Compaq Tru64, SGI Irix, etc. At any rate, while the AS/400 is probably going to continue along as it has been for the foreseeable future, I don't see it suddenly starting to increase its market share or move outside its current niche markets.
Compaq probably makes more on VMS than they'd like to admit, but it is obviously a legacy system that is getting slowly replaced. They aren't pushing it to new customers, and their customers have mostly targeted other platforms as their future directions. I'd classify VMS as one of the 'lifeless zombies'. It is dead, but that doesn't stop it from shuffling about a bit.
You are right that UNIX hasn't won the midrange wars yet, but a lot of the competitors have dropped out, leaving NT/W2K as the last 'great white hope' of the proprietary OS.
Re:Killing Unix is like killing computers: (Score:2)
Its market should be expanding, but I don't see that happening very much, and I live in an area that is about as conducive to the AS/400 as there is (the plant where they are built is only about 200 miles from here, and there are a lot of stodgy insurance and financial companies around). Mostly the newer features they are adding to the AS/400 are just slowing the defection rate in existing AS/400 shops.
Still, most shops get into the AS/400 because of some line of business application that only runs there, or they're a true blue IBM customer.
Bingo. Usually it is something like an accounting system, and often the AS/400 is used only for a single purpose. A lot of the true-Blue IBM shops (tons of those around here) use AS/400's, but these shops often have IBM mainframes as their main back-end system, and often RS/6000's in the middle tier as well.
The other tasks are largely an afterthought.
Yea, and mainly only in smaller shops where they don't have the financial means to support multiple platforms, but they want to add stuff like email, groupware, etc.
I don't think even IBM would consider the 400 a head-to-head competitor to Sun and DECpaq -- that's why they have the RS/6000s.
Well, I think that a few people on the AS/400 team at IBM think that the AS/400 is a direct competitor to Sun, Compaq, etc. Most of the other IBMers I've met have a bit more realistic viewpoint of where the AS/400 is, and where it is going.
Re:Killing Unix is like killing computers: (Score:2)
As I said, I live in the AS/400's back yard, and I'm seeing the opposite happen. Most of the AS/400 shops are hedging their bet by implementing other platforms (AIX/RS6k, other *nix or Wintel) beside the AS/400 for other purposes. Only the most hardcore AS/400 people are putting stuff like web servers, groupware and email on AS/400's. However, few are making that serious of moves toward abandoning the AS/400 for the stuff that it is doing today, so the AS/400 really isn't losing ground too quickly either. It's kinda like where Netware seems to be these days. It isn't making many new converts, but I don't see anyone actually ripping it out either. Most of the people who were talking about it have backed off or at least slowed down their timetables to do so.
OS X = Evolution (no this is not a troll) (Score:2)
(I will state at this point that I'm an HTML and graphics guy; I am not a programmer in any sense of the term, and have no desire to be one in the sense of writing big applications and poring over code for bugs. I will leave that to the experts!)
However, I see Apple's OS X as a *possible* evolution of what UNIX can be: it's got the friendly GUI on top and the creamy fudge of BSD on the bottom for those who want to delve into it. From what I hear, you can code into the BSD and it works well. I could be wrong.
In the same vein, with Linux, UNIX ain't going anywhere! More and more people every day are installing and using and learning it, and there will be more in the future. The open source concept will live on, and I don't think UNIX is going anywhere: it does what it does too well to be replaced with anything else!
Pope
Death of Unix? a.ka. adaptation and re-thinking (Score:2)
I see unix adapting in a few needed ways (and I think it will be led by open source software):
1. Adaptation of the security model; file system ACLs are something that all IT personnel appreciate. In today's enterprise they have become almost necessary for file server environments, and their use is venturing way beyond that. As I tell people @ work, NT has a pretty good security MODEL, but it's not the best implementation. I'm not saying stick NT's model into UNIX, but do what unix and open-source software have always done: improve upon current implementations and develop new ones. This covers not just ACLs but many other things.
2. Fragmentation....Always a hot topic! Linux is starting to see fragmentation whether we like it or not, and this is where I see companies like RedHat as good things. Take KDE and GNOME, for example: they are both wonderful desktops, and the developers are working towards common standards for drag and drop, etc. This is the kind of thing I like to see, but unfortunately the effort hasn't produced tangible results...yet. This is where a standards body would help us all out considerably, because in the end it means more apps and functionality for everyone in the game.
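To make point 1 concrete, here is the traditional permission model next to what an ACL adds. The setfacl lines are a sketch only, since they assume the acl tools and an ACL-capable filesystem, so they're shown commented out:

```shell
# Traditional Unix permissions: one owner, one group, everyone else.
touch report.txt
chmod 640 report.txt
ls -l report.txt | cut -c1-10     # prints "-rw-r-----"

# What an ACL adds (sketch; assumes an ACL-capable filesystem
# and the acl userland tools, so not run here):
#   setfacl -m u:alice:r-- report.txt   # grant one extra user read access
#   getfacl report.txt                  # list all entries on the file
```

With only the classic bits, granting one more user access means juggling group memberships; an ACL entry names the user directly.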
This is just a short version of my $0.02....
-Aaron Dokey
Gainful Employee of Technology
Evolution not Revolution (Score:2)
Unix Is Not An Operating System (Score:2)
Look at GNU/Linux. It isn't really Unix, but it is really close. Aren't there some kind of legal problems with the naming? And besides, GNU and Linux were actually two separate operating systems in development that eventually merged, neither really Unix, but now we are all happy.
Now Apple is putting Unix inside of Mac OS X, though I would say the opposite: Apple is putting MacOS inside of Unix. Apple is known as an innovator in the software industry, one who has seen far. But they have been standing on the shoulders of a giant named UNIX.
And if you stare at DOS hard enough, it starts looking like UNIX (anyone know "TYPE FILE.TXT | MORE" ?) but I won't go there
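Side by side (the file name is a stand-in), the resemblance is hard to miss:

```shell
# DOS:   TYPE FILE.TXT | MORE
# Unix:  cat file.txt | more
# Same idea either way: one program's output feeds the next one's input.
printf 'hello\nworld\n' > file.txt
cat file.txt | head -n 1    # prints "hello"
```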
My point is that you won't be able to look at a new OS and say, hey! that's not Unix (unless you are explicit about it, like GNU). You will just kind of see it and think, hmmm, it's different, but it's cool and it's UNIX.
Who was the guy with the quote from Henry Spencer as his
No replacement (yet) (Score:2)
While I'm unsure about Unix' (Linux') future on the desktop, I'm very confident about the professional part of the computer world.
Re:UNIX's usage (Score:2)
Might have to add 2-5 for the # of S/390s running linux, too
Technologies don't die; they evolve. (Score:2)
Sadly, it seems that things aren't evolving as fast as they could. Linux *is* a Unix rewrite, and it's still not that much better than its foundation. NeXTStep re-implemented Unix a decade ago and did a much better job --so much so that its successor Mac OS X still has a big technological edge...
I wonder if the Linux community can look ahead far enough to stop worrying about backwards compatibility with older Unices and start innovating in the fundamentals of the OS. Security, administration, configuration, maintenance, documentation and quality of service are hopelessly crufty and kludgy in most Unices, and in Linux too.
There was a point that Linux needed to emulate its siblings, to remain relevant and useful. But in the next few years it's quite possible that Linux will become the dominant Unix-clone. Compatibility and tail-light chasing be damned --we need to innovate.
engineers never lie; we just approximate the truth.
Re:Unix as a philosophy and the Legacy Myth (Score:2)
Fundamentally, Unix hasn't changed much since sockets and TCP/IP support were added to 4.2 BSD, because it's a highly optimal solution for running a bunch of different stuff at the same time on one machine. Yes, a few neat modifications like loadable kernel modules have come along, but today's Unix would still be largely familiar to a programmer from 15 years ago. Can the same be said of Windows, or MacOS?
Re:death of unix (Score:2)
PS. The fact that it's still kicking ascii around shows how dated parts of it are becoming. All major OSs (i.e. NT and BeOS) have moved on to unicode.
Re:that's silly (Score:2)
>>> Yes, people ARE tired of the security problems, instability, and price. HOWEVER, they couldn't care less about it being open source; with X and the current state of desktops, Linux is just as bloated as windows; MS releases service packs every few months, Linux releases a new kernel rev every few weeks, and core apps (believe it or not, X and KDE and GNOME and their attendant libs are part of the OS to most people) are updated every few weeks. Finally, W2K does not perform poorly. It takes up space, but if you've got 128 meg of RAM, it is just as fast as KDE and a hair faster than GNOME on my system. And IE is nowhere near as bloated as Netscape. Yes, W2K is pretty bad, but don't lie about it. And Linux isn't exactly the greatest OS ever made either.
Unix isn't going away any time soon. (Score:2)
Mainframes running UTS (mainframe-compatible clones of SVR2 and SVR4) now handle mission-critical functions for many large companies: all the Baby Bells, for instance, do their long distance billing data capture on it, and run their where-are-all-the-wires databases on it. (If it ever went down, all the long distance calls would be free until it was back, which is why uptimes in years are mandatory.) Brokerages support their trading with it (even more $/second if it ever went away). Web sites run on it. (Apache has been there for a while.) And so on. And of course they fixed the Unix clock-rollover bug long ago, so they shouldn't have as many hiccups a few decades down the road when it finally rolls.
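For the curious, the rollover in question is simple to date: a signed 32-bit time_t runs out 2^31 - 1 seconds after the 1970 epoch. A quick check (the -d @SECONDS flag is GNU date, so this assumes GNU coreutils):

```shell
# Largest second count a signed 32-bit time_t can hold:
max=$(awk 'BEGIN { printf "%d", 2^31 - 1 }')
echo "$max"                                 # 2147483647
# GNU date turns that into the rollover moment:
date -u -d "@$max" '+%Y-%m-%d %H:%M:%S'     # 2038-01-19 03:14:07
```

One second past that, a 32-bit counter wraps negative, which is why "fixed long ago" systems widened the type instead.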
Semiconductor design is done with tools that run on Unixes. Some have been ported to Windows to try to take advantage of the cheaper crunch - but not many, and there's little demand for them, since they can't be easily combined into a design flow with scripts. Some of them are now being ported to Linux to achieve the same cost savings. This is easy. (For many, it's just copying the source tree and running "make"; for some it's a little tweaking.) And on Linux you DO have the scripting tools, plug-and-play with Unix networks, and a familiar environment. So this IS being accepted - nay, demanded - by major ASIC design operations.
Billion-dollar companies in trillion-dollar industries are depending on hundreds of large applications written to run on unix. If they ever DO port them to something else, any bets on whether it will be something new, or another flavor of Unix?
(And right now Linux qualifies as a flavor of Unix for this discussion. Windows, NT, and OS2, of course, do not. What a pity.)
Re:monolithic random comments (Score:2)
UNIX is just a method (Score:2)
Since it's so much more versatile than an all-in-one GUI approach (like win9x but not really like the original NT architecture) it's unlikely to die out as a concept even if every single currently existing UNIX maker goes out of business.
Eric
Want to work at Transmeta? Hedgefund.net? Priceline?
The king is dead! Long live the king! (Score:2)
I can run the same scripts under Linux, AIX or FreeBSD, for the most part with very few portability problems. For me, Unix is a set of tools and a lot of leverage. It is the idea that I should be able to carry my tools and data with me until they are no longer necessary rather than making the programs that process them artificially obsolete.
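A sketch of what that portability looks like in practice (the script contents are hypothetical): stick to POSIX sh and the classic tool set, and the same file runs unmodified on all three systems.

```shell
#!/bin/sh
# Sticks to POSIX sh and the classic tools, so the same script
# behaves the same on Linux, AIX or FreeBSD.
os=$(uname -s)
case "$os" in
    Linux)   echo "a Linux box" ;;
    AIX)     echo "an AIX box" ;;
    FreeBSD) echo "a FreeBSD box" ;;
    *)       echo "some other unix: $os" ;;
esac
# Nothing below is OS-specific either:
printf '3\n1\n2\n' | sort -n | tail -n 1    # prints "3"
```

The only branch in the whole script is cosmetic; the actual work is identical everywhere.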
Re:There is no answer. (Score:2)
Embedded UnixWare?!? What time frame?? I didn't hear of such a beastie until 94-95! Granted, that was when I was trying the embedded-microcontroller thing, so I probably was just out of touch.
The "death" of Unix (Score:2)
Re:The Death of Unix (Score:2)
Nothing short of an asteroid collision will prevent unix from surviving.
Sometimes I get the feeling that my Linux box will still be accumulating uptime even after the cockroaches have inherited the Earth...
numb
Annealing OS (Score:2)
Microsoft invested enormous amounts of capital in self-configuration for Windows. As a result, despite complaints to the contrary, Windows _is_ less costly/risky to install/reconfigure with novel hardware. Aside from the encyclopedic knowledge base required, the logic behind this sort of autoconfiguration is fairly impressive. It approaches some of the best induction algorithms ever fielded. But it still is a pain to install new hardware/software components.
What this means is that the OS of the future is going to have to focus a lot more on dynamic autoreconfiguration, with relatively sophisticated "truth maintenance" induction algorithms that draw on, and are sensitive to changes in, an enormous rule base supplied via the net by the vendors of hardware and software components. Further, OS vendors are probably going to have some sort of minimal "boot up" network configuration, similar to the dumb video modes used to get your better video drivers installed.
The next stage beyond that is a very high-level interface specification language that can describe the hardware and software to the OS, which will dynamically re/generate the drivers/libraries for its particular configuration (with appropriate roll-back safeguards). Such systems will eventually even have HotSpot-like dynamic optimizations built in, so they can generate cached code on the fly to the spec of the high-level rules, based on patterns of usage and other dynamic information. Much of the nasty code that goes into autoconfiguration and driver installation will be annealed via dynamic compilation and inference and eventually hardened and optimized, always sensitive to changes in the high-level rules and descriptions.
Re:Big iron matters (Score:2)
I work for a software company that is moving away from the "shrinkwrap" package we've been selling to an ASP/Portal model, and you've stated exactly why.
Instead of supporting a million installations of the software all over the world, on 8 different operating systems on seemingly infinite different system configurations, we will only have one installation, and we will have physical access to it. If something goes wrong, we can get up and go fix it.
Most of today's browsers aren't exactly "thin," but the thin-client metaphor fits -- they may be bloated, but they're ubiquitous. Various unices of various weights can fill just about every niche out there.
While QNX is not exactly Unix, it's growing on a branch pretty close to the tree, and it's the underlying system on all of those i-openers everyone seems to be stocking up on these days. It's also embedded in thousands of things today, and has a toehold that Linux has not yet achieved... but it is still an example of "a" unix running the appliances. It was almost even the base of the next Amiga OS.
Web-based systems are only going to grow in strength and number for the next several years, and the myriad of Unices and their offspring will be morphed to fit into just about every niche.
Distributed OSes (Score:2)
Linux? (Score:2)
Re:I think it's meaningless... (Score:2)
Although I am sure that someone did something with that newfangled thing called curses. With curses and the like you can create a windowing system in a unix environment quite easily. This was a natural outgrowth of programming.
GNU has likewise changed what was Unix, and, despite its acronymic denial, has become Unix. But not the Unix from before.
Other than costing a lot and having some little quirks that are annoying to people who are using linux now, how big a difference is there, really?
The next Unix will not be today's Unix. But it will be Unix!
The unix that I use at home for the most basic things probably has not changed terribly from what a person in earlier times thought of. Although I do rely on various graphical input methods, I could take, say, the base install of debian and have it pass as unix. True, there are some extra bells and whistles, but essentially they remain the same.
Re:empirical evidence (Score:2)
Actually, isn't whatever would be said really meaningless, considering that you are going to go splat anyway?
What I really think is that you would have to have hard evidence that unix was in fact dying. You would also have to make an intellectual leap and pinpoint the exact moment that unix "jumped" and started doing something stupid that was largely irreversible and untreatable. The analogy is flawed and crappy.
You could say that unix running only on big powerful servers is dead. However, something called linux came along, and the definition of a "server" converged largely with what people now use for their standard computing tasks. I would wager that if your machine is really, really good at playing the latest computer games, then it is a good choice to be a server for something relatively normal.
Re:It's not going anywhere.. (Score:2)
No, I mean having, say, an application that could request data from a server and do some little thing. Now take that little chore and replicate it across thousands of clients that could transparently work on all of the machines on a network. Everyone from the secretary to the CEO of the company could have one of these little things on their desktop. Now say I want to figure out something really complex: how many times people have complained about product XYZ, and how that correlated with the stock price over the last 50 years. That computation could be done on some mainframe with a high rate of failure or requiring special attention. However, if you distribute the task to, say, 30,000 clients to work on in their spare time, I would dare say an answer could easily be found within the hour. All the main server (or set of upper-level area servers) would have to do is hand each client its portion of the problem, collect the result it returns, and correlate the final information.
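A crude shell sketch of that fan-out, with background subshells standing in for the thousands of desktop clients (the slice sizes and file names are made up): each client computes its portion on its own, and the server only waits and correlates.

```shell
#!/bin/sh
# Four "clients" each sum one slice of 1..1000; a real system would ship
# the slices over the network instead of forking subshells.
workdir=$(mktemp -d)

for i in 1 2 3 4; do
    (
        lo=$(( (i - 1) * 250 + 1 ))
        hi=$(( i * 250 ))
        seq "$lo" "$hi" | awk '{ s += $1 } END { print s }' > "$workdir/part.$i"
    ) &
done
wait    # the "server" just waits for every client to report back

# Correlate the partial answers into the final one.
awk '{ s += $1 } END { print s }' "$workdir"/part.*   # prints 500500
rm -r "$workdir"
```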
Re:wow, anouther death of unix (Score:2)
> ...selling mainframe ever in initial 6 month sales.
The stats? I would be interested in which one this is, how many people bought it, price, etc.
I predict that 20 years from now, we'll still be hearing how Unix is dying and almost extinct, prolly by the same people who will still be saying the same thing about mainframes.
I had a really interesting talk with one of my professors a couple of days ago and pretty much found that all the major universities are using Windows-type development models for their CS programs. Essentially I was faced with a rather unpleasant concept: I could be forced into buying a new machine just to do standard coding.
What people are saying is that, as a percentage of everyone using computers, unix is decreasing in share because more and more people are entering the fray. I am sure that if you were to look at all the computers out there, unix's share has gone down from its best of times. You can say the same for mainframes. Generally people do what is best for them, and choices start splintering.
From the machine to the network (Score:2)
What you will see in the future is more things like Napster, Quake 3, etc.: things which take advantage of the network but do so by using the power of the PC, not by relying so much on something like a web server. I can definitely see the web shrinking in the near future. Imagine how fast, efficient, manageable, and customizable eBay could become if it were a simple client application with the logo graphics cached, connecting to a distributed server farm, tracking your auctions and pulling data straight from the eBay databases. Think of how efficient Slashdot could be as a distributed client application, relying on the PC to do a lot of the computation like sorting, fetching the slashbox data, and so on...
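The "let the PC do the computation" idea above, in miniature (the headline data is invented): the server ships the raw records once, and every re-sort after that is free, local work on the client instead of another round trip for a re-rendered page.

```shell
#!/bin/sh
# Pretend these records arrived once from the server's database.
cat > /tmp/stories.$$ <<'EOF'
42 The End of Unix?
17 Amiga Dies Again
99 Napster vs. Everyone
EOF

sort -rn /tmp/stories.$$   # client-side "sort by score" view
sort -k2 /tmp/stories.$$   # client-side "sort by title" view
rm /tmp/stories.$$
```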
Esperandi
And hopefully it will be fueled by adware. Programmers get paid, users get to use it for free, and the babies who don't realize that advertising is subsidizing a life they wouldn't be willing to pay for get a few more sharp kicks to the crotch.
Long live Akkadian script! (Score:2)
(The scene - two cutting edge technologists talking about their favoured operating systems circa 2000 years BCE)
Gilgamesh (Bronze caster from the city state of Ur) : I tell you Nergal, this technology you're using, it's going to die out and nobody will use it any more ...
Nergal (Basket maker from the city state of Akkad): Ahh, away with you, everybody uses this technique of making baskets. Besides all the manuals are written in Akkadian, people will still be using Akkadian 5000 years from now!
Hahaha only joking, but hey....
Actually, I heard the US military were looking at devising language/symbols to put on their high-level nuclear waste bunkers, so when humans or whatever stumble across the dumps 10,000 years from now they'll know what's inside is dangerous and should be treated with caution. When you consider that the history of human writing only goes back about 5000 years, this is some task. Can anybody help me track down some of the literature/urls/research about this project? Many thanks.
Why GNU is Unix (though of course it's not...) (Score:2)
This touches on a point that is relevant to most people here: that's why GNU is a unix-like system. At first RMS, what with his love for all things lisp, had thought about making the free OS he was planning a really big lisp environment. But he realized that in order to be a general-purpose system, it would still need to be built on top of a general-purpose OS; so he chose unix. And that's why we have our wonderful unix-like system with Emacs (= a really big lisp environment) running on top of it.
The best way, imho, to make your wonderful ubercool environment is to build it on top of (a subset of) unix. That way you can let unix take care of the mundane things (device drivers and whatnot) that you don't want to, and be quite portable.
Definitely not the end... More like a big chance. (Score:2)
Microsoft still controls a lot of desktop machines, but the networking code in Windows 98 is so broken that people might consider upgrading to Linux or *BSD if they were doing more networking.
Embedded devices, another part of this move, are another big chance for Unix-like systems (primarily Linux and PicoBSD) - I think Linux is in use in more embedded devices than Windows CE already.
Re:Plan 9 (Somewhat tangential) (Score:2)
Unix can't really die... (Score:2)
Really, anyone can make a Unix within a year.
Unix is a set of command line applications.
Is everyone going to suddenly realise that grep and wc were kind of silly and should be gotten rid of?
Unix is organised filesystems.
People are going to decide that placing libraries at random is better?
Unix is standards.
More than just the POSIX standard, there are tons of standards. Even if they change, they'll still be Unix at heart.
Unix is a philosophy.
A pretty good one. Pipes are good. Shell scripts are good. And small programs are less buggy.
Maybe someday people will say, "I only want to deal with the files in my
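The grep-and-wc point in action: two "silly" little programs plus a pipe answer a real question with no new code written (the log contents here are invented):

```shell
#!/bin/sh
# A made-up log file; any text stream works the same way.
printf 'ok\nerror: disk full\nok\nerror: net down\n' > /tmp/demo.log

grep '^error' /tmp/demo.log | wc -l   # counts the two error lines
wc -l < /tmp/demo.log                 # counts all four lines
rm /tmp/demo.log
```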
Un*x is also popular in real-time embedded systems (Score:2)
Re:monolithic random comments (Score:2)
It's been done: Plan 9 [lucent.com]
Re:Define Unix (Score:2)
Re:empirical evidence (Score:2)
A man jumped from the top of a 30-story building. Around the 10th floor, a person called out to him, "Hey, how's it going?", to which he replied, "So far, so good!"
defining unix (Score:2)
Is NetBSD Unix? Is Linux? Solaris? AIX? Minix? GNU?
Sure, an OS can be "certified" Unix or certified "POSIX compliant", but that isn't an end in itself. Unix (however you define it) has evolved through the years, as everyone has already pointed out, but it is also modular (I can replace a proprietary ls with GNU ls) and portable.
Where will Unix be in 10 years? I don't know. But I know where it won't be: lost & forgotten.
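That ls-swapping modularity is mostly just the search path at work. A hedged sketch (the "replacement" here is a fake one-liner, not GNU ls): put a new implementation earlier in $PATH and everything picks it up, no relinking required.

```shell
#!/bin/sh
# Build a stand-in "replacement ls" in a scratch directory.
dir=$(mktemp -d)
cat > "$dir/ls" <<'EOF'
#!/bin/sh
echo "replacement ls"
EOF
chmod +x "$dir/ls"

# env re-resolves the command with the scratch dir first in PATH.
env PATH="$dir:$PATH" ls   # prints: replacement ls
rm -r "$dir"
```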
Re:"Imminent death of the net predicted!" (Score:2)
Yep. And we're all here making our comments. Not on USENET, where there are no moderators.
I download and archive select Usenet groups, and... umm... a lot of them are a total loss. The Linux groups have some of the highest traffic, but also the largest volume of clueless noise.
The real action has shifted to private forums like
The bulk of the Usenet traffic these days is purely binary attachments. The most popular NNTP clients on Windows machines (NewsBin, Pluckitt, etc.) strip off and throw away all 'text' content, keeping only the attachments.
Re:It's not going anywhere.. (Score:2)
Centralization of data is what we're moving to. This is why things like Hotmail are so successful -- easy access to e-mail regardless of where you're connected. I just bought a couple of i-openers (see the Slashdot article a few days ago) that I'm converting to cheap X terminals.
Some attempts to do this on a large scale (WinTerms and "Thin" clients) haven't really caught on, mainly because of the still high cost of the clients and the fact that it's not cost effective to switch. But who's to say that won't change?
Which leaves us with the question: what type of platform do we want supporting the network devices we connect? I don't think I've seen NT or any "recent" OS compete favorably on these grounds, which leaves us with OSes like Unix, still being developed as fast as the technology does.
Of course, that's not to say that other operating systems won't possibly step up and smack Unix out of the arena entirely, but we can't bet on that..
Re:Unix is already doing this. (Score:2)
--
OPEN SOURCE UNIX (Score:3)
to better describe my concept of unix, i have developed a new mathematics which defines unix in terms of "super-time." our beloved unix is embedded in this supertime, which contains no singularities! thus, unix itself is without bound in time and space... it simply always is.
thank you.
Define Unix (Score:3)
--
Re:Unix and Change (Score:3)
> toward Unix every day.
In the world I live in, it's years between NT Releases; months between service packs.
Not that I wouldn't love to see this daily evolution of course...
Media sensationalism! (Score:3)
It's like when Apple was having financial troubles: "Don't buy Apple, what if they go out of business?" Who cares if they have $2 billion in cash reserves? Media sensationalism.
And Amiga. Amiga has died so many times that nobody takes reports of Amiga's death seriously anymore. Perhaps that's because nobody defined "dying": new products are becoming available, a new AmigaOS came out 5 months ago, and so on. So what is "dead"?
Now slashdot: anyone who has spent any time in the industry knows that Unix is the most dominant force holding the whole computing world together, so why pretend to take a question like this seriously? Media sensationalism. That's all.
Until I see a headline like, "Unix is dead!", followed by "a young man named Unix was gunned down", I'll stick to reality.
Now, a more apt question is: is Windows dying? I have some compelling ideas about THAT...
Unix is already doing this. (Score:3)
Sounds a lot like any university's internal LAN. Which probably runs Unix (Solaris at my university).
Unix was already designed to address many of the issues that come up with network computing. I only see a few things that need to be fine-tuned:
Portability of user accounts across the entire network, checking of permissions/licenses for running applications, etc. This is already pretty much here; you just have to know what you're doing to set it up. The challenge is to make sure that all applications on your system work fine for all users, that everyone can do what they're supposed to and has access to what they need, and that nobody can do anything they shouldn't be able to.
If you are a validated user at one university, you would ideally be able to log into another university with guest privileges, and have it recognize you as a specific user ("foo at bar university"). Similarly, I'd like a validated user on my personal LAN to be able to access someone else's service while keeping an individual identity. Or through another network access their "home" network's services with their full privileges. The idea being that identities and permissions carry over robustly and securely over heterogeneous and possibly untrusted networks.
This ties into the whole "the network is the computer" idea. If I could just use the collective computing power of all of my hypothetical LAN terminals to distribute tasks, I might not need a central server at all (assuming that my tasks have low communications overhead). Similarly, it would be nice to be able to farm off tasks to another "friendly" network. Protocols and support for this are in development, but would need to be standardized for true "network computing" to come into its own.
Again - by and large, these are capabilities that already exist, or exist at least in part. They continue to be developed - and chances are, that development is happening under Unix.
UNIX is constantly adapting (Score:3)
Programs I wrote for UNIX 10 years ago still work just fine and take full advantage of the larger memory and faster processors. That is not at all true of systems like Windows.
UNIX is also a particular approach to writing kernels, based around a monolithic, fairly simple privileged core. UNIX kernels are also pretty closely tied to the C language. That's very different from microkernels, Windows, or other systems. And UNIX is a set of conventions for where and how to represent system configuration information, command line programs, etc.
That kind of continuity can't last forever, and eventually people will start using systems that can't realistically be called "UNIX" anymore.
Sooner or later, kernels will have to be written differently and in languages other than C, the file system will change into some different kind of database, etc. People will also build new replacements for system configuration information, the init/login sequence, etc.
I suspect, however, that the transition from UNIX to its successors will be fairly gradual, and that any successors will continue to offer reasonable POSIX implementations for a long time to come.
Plan 9 and Mach are both examples of successors to UNIX that really have some good continuity with it, and that's probably what's in store both commercially and in the open source community.
layers of functionality (Score:3)
Since a unix core install can be made very small, suitable for embedded systems, I don't see a reason to throw away a perfectly good model with a well-understood API and many thousands of man-years of refinement.
Unix is being used successfully on systems from mainframes to wearables, from super graphics boxes to the tiniest pinhead sized embedded systems. What design criteria do you envision that would contraindicate the use of Unix?
Unix History (Score:3)
Montgomery's short "An Introduction to Unix" [unix-wizards.com] points at the Unix system family tree [unix-wizards.com].
That 1997 document does not mention Linux, which grew out of the POSIX definition, System V, NetBSD, and GNU tools (developed on many Unix flavors). The Unix History [faqs.org] segment of the Unix FAQ does mention Linux briefly.
Re:I have SEEN the future (Score:3)
Well, I was a bit surprised by the 5, but it wasn't meant to be a troll either. Damn it, I want Windows to live up to the hype. I'm in a huge NT shop at a large college where desktop security is important. Most of the pieces to make my life easier are there, but I see no light at the end of the tunnel.
Take applications for Windows. Damn they suck. Not the apps, but the design. Microsoft can readily fix this if they got their heads out of their asses and realized that the world isn't about one person/one computer with full control.
How can they fix it? By taking their already existing labeling standards for apps and strengthening them so new apps at least behave properly. Don't follow the rules? Then your app is not "compatible with Windows 2000." But then that would break all of THEIR apps too...
Example:
All of my bitching about NT/2000 comes from actual experience trying to make it work as advertised. When Microsoft's own fucking applications don't follow good design standards, how do they expect others to do so?
Do you realize how long it took me to get that piece of shit IEAK to work properly? First of all, the IEAK book spends about 250 pages talking about how wonderful it is and all you can do with it, and then when it finally gets to implementing policies, it spends exactly 1.5 pages on what all 200 settings do.

During an unattended install, it throws shitloads of stuff into RunOnce keys, but heaven help you if, before the next reboot, you or another program invokes rundll32, because rundll32 triggers all RunOnce keys to be processed immediately, even if they were installed during THIS boot instance. That one killed me for a long time. Then, during a user logon, you have to ensure loadwc.exe can run, and *IT* uses rundll32 to do a lot of the customizations with the policy settings installed. But, get this: rundll32 won't run if it can't, for some unknown reason, have write access to the RunOnce key. But allowing users r/w access to that key violates a KB article saying what a huge security risk it is for users to have r/w access to it. So I have to give admin rights to users whose computers I want to policy-control?! Oh yeah, that makes a lot of sense... :(

And none of this is documented in the IEAK docs. No, just hundreds of pages of marketing fluff. I'm a sysadmin. I buy an administration kit to get technical details, not marketing hype.
None of this actual implementation stuff ever gets mentioned in the press. All the grand claims of how Microsoft makes administration tasks easier are taken at their word, and I wonder if anyone actually tries to use these features. When I hit problems, Deja News and AltaVista searches for similar things usually turn up nothing on these issues.
OK, I'm ranting as usual. I really, really hate platform bigots, UNIX or NT or whatever. I go for the best tool for the job. I just really get tired of doing careful research, picking a Microsoft solution as the best tool, and then finding out I was an idiot for actually trying to implement the solution and expecting it to work.
How many times do I have to be abused before I learn my lesson? :-(
Yes, Linux and UNIX programs have their problems too. I couldn't believe the hassles I had to go through to get Amanda to work with my DG/UX box. sendsize kept silently failing to calculate disk sizes because the fork/exec of runtar was screwed up. But you know, fixing it wasn't a big deal. I had the sources right there, went through the code, and found the problem as it relates to my OS. I fixed it and will send the patches back to the authors.
The difference is, when something breaks in an open source OS, I can always fix it myself. When something breaks in a proprietary OS, I'm shit out of luck and can only hope that the next version fixes it and the upgrade costs are not too prohibitive for my installed base of thousands of desktops...
times are changing (Score:3)
Already we are seeing the notion of a strictly one-dimensional hierarchical file system becoming archaic. Having a system of files is useless if you are so overrun with data that nothing amongst the plethora of files is meaningful. With the network-as-computer approaching, I believe we will shift to systems of "resources". We are already seeing this with distributed computing. URLs locate abstract "resources", whether they are on a local file system or out on the net, whether they are static data or an executable service or agent. We have to move to an
That being said, a lot of Unix is based on the good old file system metaphor. At the time, addressing everything as a file was as novel as addressing everything as a resource (think system components as CORBA objects - check out AllianceOS). Because of the above reasons I think we need to graduate to a more abstract, associative model. Also, very simple security structures like ACLs are showing their age...they do not scale up well. We are seeing new security models, like capability-based systems, where each entity in a system, be it user, or program, has a set of "capabilities" which it can use to interact with other pieces. A higher level of abstraction. I think for Unix to stay in places requiring these features (associative data storage/filtering - desktop, new security paradigms - large networks, network OSs), it will have to change.
Where things like this don't matter one bit (like the mainframe), Unix will continue to reign supreme.
...and here's why. (Score:3)
http://home.xnet.com/~raven/Sysadmin/UNIX.html
says it very well. "People are confusing dying with age," and that brief article has a good idea why Unix will still be around for a long time.
(It was written shortly after Lose95 was released.)
UNIX is the fate we have chosen (Score:3)
Unfortunately, as personal computers became more complex, they also became more unreliable. If Windows 3.0 crashed and you could just hit a button and be back to work in three seconds, then no one would have cared. But you had to sit through an unbearable two minute boot sequence. Networking was messy. Arcane INI files were just as bad as anything from the seventies. As reliability dropped, UNIX began looking more and more attractive. It was still butt ugly, but at least it worked.
By now, we should have something better. We've had an additional ten years to deal with the problem. We should have something very small and very stable and very easy to take care of. A computer should be able to reboot as fast as a calculator. We shouldn't have to deal with driver issues and such as much as we do. But it didn't happen. PCs got faster and more varied, but nothing improved in the reliability or simplicity department. And now, to our horror, UNIX is looking like the simpler alternative. No one would have believed it.
Evolution (Score:3)
I don't think it'll 'die' exactly. It may eventually evolve into something that bears little resemblance to what it is today, has a different name, and has absolutely no code in common with what we know today, but it'll be evolution.
Licenses that make the source available for reuse make this more likely than ever. The open source movement is giving Unix a lot of strength. I'm fairly confident that people will still be typing "ls" to see their files 30 years from now (assuming the keyboard isn't dead by then).
numb
Re:It's not going anywhere.. (Score:3)
and lesser-equipped but ubiquitous terminals to access those resources, but Unix will still be there in some fashion.
I don't see the likelihood of this. All you really have to do is increase the client's ability to work properly and increase its capabilities. For some things (say, tactical nuclear weapons simulations) you may need mainframes, but this is not the norm, nor very supportable. Applications are mostly written for PC-type platforms, considering how much Microsoft has spent convincing people of this.
Who's to say Unix won't be the OS that drives the appliances?
But appliances are just so... well, boring. What would be nice is to have a large mainframe that you could optionally use for massive backups of the target machine (say, copying the entire image of the client in case of power failure and such) while letting the client keep its responsibilities.
Personally, I don't want some rather fiendish god controlling my computing resources at one particular point. It would be much better if someone would write an application API that worked like distributed.net, allowing a complex process to be broken down into many smaller parts and worked on by any number of client machines that could be increased and decreased at will. Add to this the possibility of "relay points" where the data could be copied for a particular portion of the network, in case some machine failed or sent corrupt data.
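The "corrupt data" worry can be handled with plain redundancy, sketched here with local commands standing in for remote clients: issue the same work unit to two clients and accept the answer only when they agree.

```shell
#!/bin/sh
# Two independent "clients" compute the same work unit: the sum of 1..100.
client_a=$(seq 1 100 | awk '{ s += $1 } END { print s }')
client_b=$(seq 1 100 | awk '{ s += $1 } END { print s }')

# The relay point accepts the result only on agreement; otherwise it
# would reissue the unit to a fresh client.
if [ "$client_a" = "$client_b" ]; then
    echo "accepted: $client_a"    # prints: accepted: 5050
else
    echo "mismatch: reissue work unit"
fi
```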
UNIX is far too open... (Score:3)
The trick of UNIX: it has always been available and highly adaptable to different environments. While this was changing in the 1980s (the UNIX wars), RMS, Linus and all of those open source programmers have ensured that UNIX in some variant will always be in use.
If you look at recent corporate inroads, such as IBM-Intel, Philips-TiVo, etc., the market for UNIX-like solutions is actually growing!
In 1984, Kernighan & Pike saw this coming... (Score:3)
In the Epilogue of their book, The UNIX Programming Environment, Brian Kernighan and Rob Pike were not as arrogant to think UNIX would live forever, but did have this to say:
"The principles on which UNIX is based--simplicity of structure, the lack of disproportionate means, building on existing programs rather than recreating, programmability of the command interpreter, a tree-structured file system, and so on--are therefore spreading and displacing the ideas in the monolithic systems that preceded it. The UNIX system can't last forever, but systems that hope to supersede it will have to incorporate many of its fundamental ideas."
So long as this statement holds true, I'll gladly work with any future system which provides this set of core ideas.
Plan 9 (Somewhat tangential) (Score:3)
Plan 9 is UNIX-like, but it treats all system objects like files, including objects that exist across the network. Because of this, it is very easy to distribute the OS across several machines with it being completely transparent to the user. We set Plan 9 up like this in the OS lab at my college a couple of years back; it's very odd.
It probably isn't an OS that will take off by itself, but it's an example of a way in which an OS can be distributed with a reasonable degree of transparency.
The Lessons of History (Score:3)
Unix will adapt and grow- it always has.
The Unix that we use now has little in common with the Unix of Thompson and Ritchie. It has been in a state of continual evolution and will remain thus until ^we^ stop caring about it.
Unix has transitioned from PDP-machine language to portable C, moved from minicomputers to microcomputers (and even to mainframes and PDAs), acquired thousands of tools and roles never dreamed of by its creators. The Unix user today has a choice of command line shells, a choice of GUIs, a choice of vendors, even a choice of fundamental architecture as the file systems and such have evolved quite differently amongst the different Unices.
I think we'll see changes in the coming years, but no greater change than we've seen in the last 30. Unix will continue to evolve until the Unix of our children is as unrecognisable to us as the Unix of our fathers. New hardware and new markets only create new opportunities for Unix to grow; they do ^not^ ring its death-knell.
~wmaheriv
Re:I think it's meaningless... (Score:3)
I think I ran into that at work once... and it pissed me off. Let me explain... although most of our HTTP and DB stuff is served from IBM Big Iron, we do have a few Sun E450s and a couple of Sun E1Ks. The Suns are loaded with Solaris 7, as is to be expected, and most of the operators (myself included) use CDE. So I'm helping someone write some tapes the other day, and the machine had CDE, and so dumb old me (thinking I was in Solaris) started pounding out Unix commands. After nothing worked, I looked at the title bar on the Xterm. It wasn't an Xterm, but a "DECTerm". I looked under the table, and sure enough, there was a DEC Alpha box. Ah.
The point of all that ranting is, yes, I agree, DEC still makes money.
I don't usually go over to that side of the server room, so I looked around a bit after finding the DEC machine, and what I found scared me. A bunch of 10-year-old DEC daisy-wheel printers and reel-tape machines. I had no idea the company was using such old crap. What's even more scary is that the tape machines are even needed: some of our clients refuse to rewrite their media distribution software to accept anything besides large tapes. The more "up-to-date" clients get the same information by FTP, but these old fogeys -- and some of these are household names -- are using completely rotted software.
My horror was complete when I discovered something I never thought anyone from my generation would ever see in use: an NEC Astra machine, from the mid-seventies! Complete with a proprietary (read: not PC-compatible in any fashion) OS, loaded from eight-inch floppies. The machine was used right up to when it died, on January 1, 2000. (70s hardware and software isn't quite as Y2K compliant as today's ;-)
I ran and hid behind the IBM fridge racks and haven't been over to that corner since.
I guess I started ranting again. Let me try and make a point out of all that... uh, VMS still pisses me off, and DEC can go to hell. :-D
Unix and Change (Score:4)
Look at it this way: Be *had* to put a level of Unix compatibility in BeOS, MacOS is now a variant of Unix, and NT evolves more toward Unix every day.
On the other hand, Unix/Linux must be hidden from the user, in the sense that Unix/Linux at the command line or Xlib level just isn't for the end user.
Extremely Reliable Operating System (Score:4)
EROS is hard to describe. It's capability-based and has orthogonal persistence -- and if that doesn't mean anything to you, I'm not going to be able to explain it much better. Check out the EROS project site [eros-os.org] and read the documentation. One thing this means that I can explain, though, is this: "snapshots" are taken of the current state of the system every five minutes. If the power goes out, the system is later restored to the last good snapshot. So you could have a text editor window open, never save your file, PULL THE PLUG on your computer and then plug it back in. Within 30 seconds (or however long your BIOS POST takes), your text editor window is back on the screen, and you've lost no more than five minutes of work.
EROS [eros-os.org] is cool. I think it has potential to be the Next Big Thing. Check it out, download it (it's GPL'ed), play with it. Have fun.
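A very loose shell analogue of the snapshot idea (nothing like EROS's real checkpointing, which captures the entire system state transparently, but it shows the recovery shape): checkpoint on an interval, and a crash costs at most one interval of work.

```shell
#!/bin/sh
# Simulated working state; in EROS the "file" would be the whole system image.
printf 'five minutes of work\n' > /tmp/work.$$

# The periodic snapshot (in real life, taken every five minutes by the OS).
cp /tmp/work.$$ /tmp/work.$$.ckpt

# Simulate the crash: the live state is destroyed...
rm /tmp/work.$$

# ...and recovery restores the last good snapshot.
cp /tmp/work.$$.ckpt /tmp/work.$$
cat /tmp/work.$$   # prints: five minutes of work
rm /tmp/work.$$ /tmp/work.$$.ckpt
```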
-----
The End of Fire? (Score:4)
--
There is no answer. (Score:4)
It was only a few years ago that I was mourning the apparent *nix recession; the only game in town was Xenix/AT&T (and a wee bit of Sun, but not in my neck of the woods), and their products were both languishing and confined to minis. Linux and the *BSDs were in their infancy, not worth a mention outside of academia. Now it has come full circle. People are using *nix [gasp] ON THEIR DESKTOP! I can run *nix on everything from my multi-million-dollar IBM to my $100 garage-sale throwaway. And it is adapting again. Embedded Unix? I would have laughed my ass off if someone had suggested running Unix on a microcontroller only a few years ago.
I kind of suspect that *nix is just too adaptable to die, but it's impossible to say whether or not it will be beaten back onto the mainframes by PalmOS-run PDAs in a decade.
More meaningless tripe (Score:5)
The advent of distributed, collaborative, pure-hype^H^H^H^Hjava, Active System Blaster 2000 will not make Unix obsolete. Even a revolutionary, fully distributed and autonomous network object system still needs to send bytes over the wire, still needs to access system memory, and still needs to accept input and create output. These are the things that Unix provides. This is why Unix will always be around.
I suppose that a newer operating system could supplant Unix. However, I don't see any in the near future. Be has a bright future, because it has networking and other nice capabilities. But Unix has the trump card over BeOS: the idea of users. In a distributed network environment, the user concept becomes much more important. Information, files, interface configuration, and many other things are associated with a person. Those things must be secured from other people, and the other people must be secured from them. Therefore any OS that wishes to supplant Unix will need to supply the same kind of protection for users' information.
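The point about users is concrete, not abstract: on Unix the *kernel*, not the application, decides who may touch a file, based on ownership and permission bits. A minimal sketch of that idea (just an illustration, nothing BeOS- or vendor-specific):

```python
import os
import stat
import tempfile

# Create a file and lock it down to its owner. From here on, the kernel
# enforces this on every open() by every program -- applications don't
# get a vote.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)   # rw-------: owner only

# Inspect the permission bits: can group members or anyone else read it?
mode = os.stat(path).st_mode
others_can_read = bool(mode & (stat.S_IRGRP | stat.S_IROTH))
os.remove(path)

print(others_can_read)
```

An OS without a user concept has no place to hang this check, which is exactly the gap the post is pointing at.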
Cheers,
jwb
monolithic random comments (Score:5)
That's funny, I thought that Unix was based on a monolithic kernel... silly semantics
I would love to see some of the best coders and operating-systems people put together a new OS from scratch using the latest techniques. Ideally this would create an ultra-stable and very modular system. I would happily give up some extra CPU cycles for increased modularity and the ability to easily swap OS components in and out, so that I could customize my OS to the task at hand. I find it rather ridiculous that I run the same OS whether I am playing games, running a web server, or working with Photoshop (etc.). Rather than a generically good OS, I would prefer an OS highly optimized for the task(s) at hand.
How often do I run a game, Photoshop, a compiler, and a web server concurrently on my home box? Give me adaptability and modularity or give me death!
The End of Troll Questions? (Score:5)
It's just as easy to ask: Is this the end of silverware, or the end of fire, or the end of any old thing that's proven to work... Is genetic engineering the end of agriculture? Is organ transplantation the end of death? Is The Bomb the end of War?
Yeah, there are micro-kernel-based OSes out there, like QNX, NeXT, and Hurd... but they're still Unix. The new OS X from Apple is more Unix than its predecessor. NT is more Unix than Win95. And there are new approaches like BeOS.
If one defines Unix in a very constrained way, then Unix has been dead for a long time. When AT&T first released System V and allowed it to mutate, Unix died and was reborn in a variety of ways. Umm, that was what? 1976?
If one defines Unix broadly, as a set of concepts, a layered architecture, levels of abstraction, sets of small uni-purpose tools working together, APIs and things 'grep-like' then guess what? Unix will live forever.
It's a pointless question, not because it's bad, but because it's completely subjective.
Re:monolithic random comments (Score:5)
It is, and the original comment didn't suggest otherwise. Read again:
"other operating systems based on monolithic kernels" implies that Unix is one of a group of operating systems based on monolithic kernels.
But that's not what I really wanted to comment on.
Hrm... read: http://www.gnu.org/software/hurd/hurd.html [gnu.org]
Read: http://www.eros-os.org/ [eros-os.org] Read: http://www.be.com/ [be.com]
Alternatives are out there. You just haven't found them.
-rt
======
Now, I think it would be GOOD to buy FIVE or SIX STUDEBAKERS
and CRUISE for ARTIFICIAL FLAVORING!!
I have SEEN the future (Score:5)
I have seen the future. The future is filled with operating systems that demand that their system binary directories be writable to all, else they fail.
I have seen the future. It has an operating system whose applications, even those written by the OS authors, can ignore the TEMP environment variable, scribble temporary files wherever they want, and fail to operate if they cannot do as they wish.
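For contrast, here is what honoring the environment actually looks like -- a few lines, which makes the refusal to do it all the more damning. (A sketch: Python's `tempfile` module consults `TMPDIR`/`TEMP`/`TMP` for you.)

```python
import os
import tempfile

# The user (or admin) says where temporary files belong...
os.environ["TMPDIR"] = os.getcwd()

# ...and a well-behaved program asks, instead of scribbling wherever it likes.
tempfile.tempdir = None            # clear the cached value so the env is re-read
chosen = tempfile.gettempdir()     # consults TMPDIR, then TEMP, then TMP

with tempfile.NamedTemporaryFile(prefix="demo_", dir=chosen) as tmp:
    print(tmp.name)                # lands in the directory the user asked for
```

That's the whole protocol: read the variable, fall back to a sane default if it's unset. Hardly rocket science.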
I have seen the future. The future is filled with continued support for legacy drive letters and 8.3 file names with rename.ini kludges during installs.
I have seen the future. The future is an operating system where you have to shell out serious dollars for third-party utilities to get around security deficiencies in the design of the OS. After all, why fix that pesky virus problem when so many anti-virus companies would go under without that revenue stream?
I have seen the future. It has operating systems whose file systems don't support deleting a file while letting it live on until the last process holding it open exits. Supporting that would eliminate the need to stash dynamic link libraries in a temporary space and have them "installed" during a reboot. Reboots are good. They clear up sloppy OS design problems.
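The Unix behavior being mourned here is easy to demonstrate on any POSIX system -- unlink a file while a descriptor is still open, and the data survives until the last descriptor closes:

```python
import os
import tempfile

# Open a file and write to it.
fd, path = tempfile.mkstemp()
os.write(fd, b"still here")

# Delete it: the NAME is gone from the directory immediately...
os.unlink(path)
assert not os.path.exists(path)

# ...but the open descriptor keeps the data alive. This is why a running
# program can keep using a library whose file was just "deleted" by an
# upgrade -- no reboot-time rename kludge required.
os.lseek(fd, 0, os.SEEK_SET)
data = os.read(fd, 100)
os.close(fd)   # only now does the kernel actually free the blocks

print(data)
```

The kernel simply reference-counts: a file is removed when both its link count and its open-descriptor count hit zero.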
I have seen the future. The future is filled with grand marketing schemes like "Administration Kits" that promise all kinds of abilities to deploy corporate policy restrictions to users, yet neglect to mention that these policies are applied by a program that has to write to an area of the OS previously recommended to be read-only (because of the security problems it causes when writable), which makes the scheme impossible to work as advertised for any user who does not have full permissions on their workstation.
The future, my friends, is about image and not function. UNIX is ugly. It's doomed.
Or in other words, resistance is futile. At least that is what they want us to believe... :)
It's not going anywhere.. (Score:5)
Who's to say Unix won't be the OS that drives the appliances?