Microsoft

How Do Linux and Windows 2000 Compare? 598

fialar asks: "There seem to be plenty of older web articles comparing Microsoft Windows NT 4.0 with Linux, but there do not seem to be any out there that have fully explored Windows 2000 and do a feature by feature comparison or a chart. Does anyone know where one could find such a beast? John Kirsch's Microsoft Windows NT Server 4.0 versus UNIX is an excellent document, but it hasn't been updated in over a year. I know that Windows 2000 offers many new features over NT 4.0, but not having fully explored it, I don't know what Linux has that is comparable."
This discussion has been archived. No new comments can be posted.

  • In fairness, w2k is not the only operating system that suffers from this. I recently tried to install a recent version of Macster (a Napster clone) onto my Powermac, which runs sys 8.1. The installer forcibly installed CarbonLib and did _something_ (not sure what) that killed the Wacom tablet software. At that point, purely by installing software that added extensions and made mysterious system changes (patching traps, I assume), the machine was rendered unusable: the mouse pointer either twitched violently or the machine basically locked up in a characteristic manner, as if you were holding down the mouse button in the menu bar (Finder wouldn't finish loading). Toast.

    It was so bad that I had to boot the machine with shift held down, getting rid of all extensions and patches. Various attempts at moving stuff around (like removing CarbonLib) weren't effective and it eventually proved necessary to reinstall ALL the third party control panels having to do with the Apple Desktop Bus (ADB- wacom tablet, Gravis joystick). Upon doing this, the machine booted happily again with no other changes, but an attempt to run Macster returned the error that the (still present!) CarbonLib library wasn't available! At this point I ditched the new version of Macster _and_ the CarbonLib extension that had caused so much trouble.

    ...um, so I guess the point is that just installing an application _can't_ necessarily destroy an old MacOS install for good, even when it wreaks total havoc on delicate system files and hoses things - unlike w2k. Ouch. Of course the old MacOS wasn't trying to 'fix itself', unlike w2k.

    'nevermind!' ;)

  • Um.

    One question for you, please.

    WHY should email be executable?

    If you can't answer that you aren't even acknowledging the problem, and that's not good.

    However, things Win2K has that Linux really should have include having to press control-alt-delete to log on (it stops people from putting up fake logon screens)

    No, it doesn't. Under Linux C-A-del has no special meaning. While the kernel understands and traps the key sequence, there's no requirement that init do anything special with it. Most distros set it up to do orderly shutdown, but you could just as easily set it to run /bin/true instead. Not to mention the fact that X takes complete control of the keyboard and can trap out any key sequence you want, including that one. Plus, that only applies to console logins anyway - sure enough, the only type that a standard install of windoze supports, but not at all the only means that Unix supports. So while you could add this feature fairly easily, it isn't really very useful.
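    (For illustration, that behaviour is just one line in /etc/inittab - a minimal sketch, since the exact default varies by distro:

    # typical default: orderly reboot on ctrl-alt-del
    ca::ctrlaltdel:/sbin/shutdown -t3 -r now
    # ...or make the key combination a no-op instead:
    # ca::ctrlaltdel:/bin/true

    followed by 'telinit q' so init rereads the file.)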

    and the hibernate function.

    Linux already supports power management (on peecees anyway) through APM and in 2.4 through ACPI. It works about as well as it does under the microsoft environments, which is to say it depends greatly on what specific hardware you have. Most power management schemes today don't seem to work very well, regardless of OS. And of course, Unix systems are meant to be left on 100% of the time. There just isn't a lot of use for power management in a non-laptop environment. I would suggest, though, that the Lookout! problem you note is most certainly an OS bug - applications should not be affected by power management. If the OS blows it, it should affect all processes, not just one.
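    (For the record, a minimal sketch of the APM route, assuming your BIOS cooperates: build the kernel with CONFIG_APM=y, run the apmd daemon, and then a suspend is just

    # apm -s

    ACPI in 2.4 is newer and, as noted above, even more hardware-dependent.)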

    Set samba up to take encrypted auth, or turn off encrypted passwords on NT and set it to cleartext.

    Everything was tried. It just doesn't work. The only solution is to put in an smbpasswd file that contains all the users with empty passwords. Not very good but the only way to do it. Trust me on this one. We spent DAYS.
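    (For anyone else fighting this, the usual recipe - sketched from memory, so treat it as a starting point rather than gospel - is to turn on encrypted passwords in smb.conf and populate an smbpasswd file:

    # in smb.conf, [global] section
    encrypt passwords = yes
    smb passwd file = /etc/smbpasswd

    # then, for each user
    smbpasswd -a someuser

    The NT registry hack for plaintext passwords is the other direction. Evidently neither worked here.)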

  • No more wondering about different config file syntax (xf86config anyone?). Everything is a file...

    Unfortunately if you unify things in this way, you a) break every existing application, and more importantly b) apply a one-size-fits-all solution. Sorry, not the Unix way. What form do you pick for this? The obvious one is that each file is named for an attribute, and contains the attribute's value. But unfortunately not everything is best expressed as attribute=value.
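    (To make the shape of that proposal concrete - a purely hypothetical layout, paths invented for illustration:

    $ ls /config/httpd
    documentroot  maxclients  port
    $ cat /config/httpd/port
    80

    Fine for scalars, but awkward the moment a setting is a list, a nested structure, or something order-dependent.)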

    The current system already has everything as a file - in a format and in locations that the application(s) using it understand. While I'd agree that having lots of different config file formats is annoying, forcing the solution into kernel space isn't the answer - nor is applying the one-size-fits-all solution, whatever it might be. We don't need more damn pseudo-filesystems; there are way too many already. We've already got stuff like pipefs that is effectively invisible, for example. Try and explain to me how that's any less magical and obscure than the registry. It's not the right solution. Besides, how is it an operating system's responsibility to manage configuration of applications anyway? As far as the OS is concerned, an application consists of one or more completely opaque processes. It doesn't know or care what they do. If you start blurring the lines, you end up with an OS that looks like Microsoft's - and works about as well.

  • Better not tell the guys doing KDE and Gnome that a single centralized place to do administration is a bad idea. Administrators looking after 30,000 desktops can't work any other way. It's not a flawed architecture - it's the ONLY scalable model to move us forward from the each system is an island in the sea of machines.

    Firstly, I should point out that KDE and gnome are both giant leaps in the wrong direction, dead away from what has made Unix so durable. But that's not really the key issue here.

    No one person actually supports 30k desktop systems. Only a small handful of the largest megacorporations even have that many computers. And they have huge teams managing small chunks of those systems, usually in geographically diverse locations. So it's not like there's a giant warehouse somewhere filled bottom to top with desktop computers all run by one caffeine-filled sysadmin. Please. This job is challenging, but it's not that bad.

    There's a difference between having a centralised method of maintaining systems and having a centralised place on each machine where every instance of every application wants to write each user's settings. If the gnome and/or kde people are doing that, then I'm genuinely shocked at their bad judgment and lack of common sense. Systems like kickstart, jumpstart, and roboinst make installation of systems easy. Systems like cron jobs, automated log filters, and global site-specific default configurations make managing systems scalable. But systems like the registry make scalability a pipe dream.

  • What I can't seem to explain to Linux users is that in Windows, you don't NEED any of those, so it makes not one bit of difference if they are not implemented in the same way as in linux.

    In MS-Windows, you don't have them, you mean. I don't know how many times I've tried to do simple things in MS-Windows, been stymied, and had to go the tedious CTRL-click selection route.

    I could very well come back and say: "But Linux sux because it doesn't have a visual file explorer tool!"

    Really, I've not found any tool that comes with MS-Windows that doesn't have an equivalent graphical tool for Linux over this last year. KFM is a damned fine browser, every bit as capable as MS-Explorer.

    Of course, the answer is that it doesn't need it; the command line tools are sufficient. However, I still prefer shift-selecting 100 files and dragging them with the mouse to writing a ten-mile-long command to select those (and only those!) files I want copied.

    What's so hard about

    cp *ego*.c ../ego/src

    In a directory of mixed names in which you only want to move certain types of files, the command line is 100000 times faster than a graphical tool.

    Now, suppose you wish to move all your mp3s, which are scattered all through your various subdirectories, into a common directory. You could go hunting for them with a graphical find tool, and moving them one at a time, or you can just type:

    find . -name \*.mp3 -exec mv {} ~/music \;

    Granted, that requires a bit of knowledge of your tools. But I will pit myself at a command line (which MS-Windows does not even have to any great extent) against someone running a graphical browser, for any but the most trivial file manipulation tasks.

    MS-Windows makes the job prettier. It doesn't make it easier. I know. I've used both extensively.
  • With ext2fs (and nearly all other Unix filesystems I know about) even the smallest file consumes at least 512 bytes of storage, so using them to implement a registry is horrendously inefficient.

    This is why ReiserFS is such a nice idea: it scales down to the smallest files smoothly, so lots of small files can be stored efficiently. I think if something like that gains widespread use, we may see big trees of small files for configuration and other tasks become much more common.
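    (Easy to see for yourself - a rough illustration on an ext2 filesystem with 1K blocks:

    $ echo hi > tiny
    $ du -k tiny
    1       tiny

    Three bytes of content still costs a whole block, plus an inode.)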
    --
  • Okay, I was with you for about the first half of that, but then your advocacy started overshadowing your facts.

    First, I saw Windows 2000 Beta 3 crash on INSTALLATION, something that should never happen. The only time I've seen anything like that on Linux... well, the media was physically corrupted, and it still tried to install. :)

    Linux and Windows can run each other's binaries perfectly; the problem is finding a program that completely emulates or virtualizes the x86. At the moment, I know that VMWare does a good job, and Bochs does too if you can live with how slow it is. Plex86 is in the works, and is showing promise, since it can run DOS now...

    I agree about Netscape, it isn't terribly stable; however, most graphical browsers aren't, for one reason or another, *including* IE, in my experience. But Netscape has the potential to be downright horrible about it. However, X has been pretty good to me, and in my experience I've had much more trouble with gdm. (which is a reason to use xdm...)

    BTW, Linux has great multimedia support. Okay, okay, let me explain. I have a Matrox G400, and the hardware acceleration is *sweet*. Also, my SB Live Value is just excellent, I love the hardware mixing, the multiple DSP's... I installed ALSA and now the MIDI patches ("soundfonts") work too, and they sound good.

    It's all about picking supported hardware, though, which you still have to do for Win2K, as well. My DVD drive isn't supported, because I (a) couldn't find much information about that on the net, and (b) just bought it first, figuring I'd test it out later. I'll probably ask the developers about this, since it works under Windows. I've heard it can be made to work under VMWare too, so all I really need is some debugging info. :)

    Plus support for multi-processing makes it even better. Now, I don't have more than one processor, but I might set up a dual-proc test box if I can ever find an old board for it. However, I've seen it done on Linux, and it is sweet. No paying extra for a different version that just consists of a stupid registry hack, either. It has decent multi-processing support out of the box. And I'd love to see a comparison to Win2K here, since that's one thing that's supposed to be better in 2.4.

    How about that, eh, guys? Something based around Linux Kernel 2.4.0 with a bunch of stable stuff, vs. Win2KSP1, or whatever is current and patched by then. Test multi-processing, test well-supported hardware, RAID, whatever. Just test the hell out of it.

    Being a real system administrator is based on experience. Now, I won't be one, because I'm going to graduate from college with a CSC major, and I'd rather be coding, but just because I *went* to college doesn't mean I've been idle, or don't know my stuff. Maybe not about Win2K, but I haven't really wanted to use it a whole lot. :)

    And no, you don't have to do anything *wrong* for Win2K to crash on you. Sometimes you don't have to do anything. It's better, but it isn't perfect yet. And Microsoft has been that arrogant about it from Day 1; I don't know why they even pretend to have tech support. And how is realplayer crap? Is it just not Windows Media Player? Was it not written for Windows Internals, but instead cross-platform? As for Netscape, I'd rather run Mozilla; IE is not cross-platform, and it shouldn't be integrated into the OS file browser, and it annoys the hell out of me. I haven't tested rendering yet, but I'd want to test two equivalently dated versions. (IE 4, NS4; Current IE, Current Mozilla...)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • The only true way to get an unbiased view of things is to try both, and see for yourself.

    Good point, but given history and the cost to try, a cartoon keeps running through my head:

    Come on Charlie Brown, I won't pull it away this time. Really?! Bill, do you really promise?...

  • It makes perfect sense not to put X on a box that is going to be remotely administered. Remote administration continues to be one of Windows' weaknesses.

    I disagree.

    I've started installing some X apps on remote machines just to ease administration... I can call up the X app from within my session and it is as if the machine is my own.

    Don't get me wrong, I love the CLI... but just as some things are easier with the CLI, some things are just easier from a GUI perspective.

    What I'd like is a good repository of documentation on locking X down securely.
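    (The approach I lean on, sketched here assuming OpenSSH with X11 forwarding enabled: tunnel the X traffic over ssh instead of opening the server up with xhost. Something like

    $ ssh -X admin@remotebox
    remotebox$ linuxconf &

    and the tool's window pops up on the local display over an encrypted channel. linuxconf is just a stand-in for whatever admin app you actually run.)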

  • I wouldn't say that is true at all. I along with those I work with are very harsh critics of Microsoft.

    It's just that we see the real problems, not the FUD that is spread by the Linux zealots.

    Just because we don't agree with the FUD doesn't mean we don't have issues.

    Although to be honest, Windows 2000 has addressed nearly every major issue I had with NT 4.0, including some really annoying UI problems. Proof that Microsoft is listening.

  • Do you people even read documentation? Sheesh.

    Most of your problems are discussed in great detail from a variety of locations. For BIND integration, check Microsoft. For Samba, check samba.org...

    sheesh

    Still trying to figure out why someone thought it was a bright idea to put SQL Server and PDC on the same machine.
  • Hmm, maybe because PCAnywhere isn't a simple app?
    It does a lot of low level stuff to take over your machine.

    It might seem backwards to have more than one server, but most corporations do just that. Whether it be routers, Unix, NT, whatever... we always test a configuration change before making it in production.

  • Because Win32 doesn't have fork(). :)

  • you can still implement transitive trusts with AD if you are a masochist

    Actually, trusts in AD are transitive by default. It's the NT4-style one-way non-transitive trusts that would need to be implemented manually.

    No fair posting "But someone will port NDS...".

    It's been done; NDS 8 can already use Linux as a server platform. I have no idea what Linux client support Novell offers or intends to offer.

    --
    "Where, where is the town? Now, it's nothing but flowers!"

  • Actually, that's required to keep backward compatibility with Linux Samba servers.

    I think you'll find that it is also required for backward-compatibility with all non-Windows 2000 Microsoft operating systems, too. I doubt that retaining compatibility with Samba (which runs on a hell of a lot more operating systems than Linux, remember) was a major consideration when Active Directory was designed. Please don't blame Samba for AD's design.

    The namespace IS flat, because everything is stored in a flat directory for more efficient searching.

    This was done purely for backward compatibility reasons. It is not an efficiency measure; they could easily have keys like "rmalda.sales" and "rmalda.geeks" to avoid collisions for users in the same domain but different containers. In fact, containers in AD are purely eye-candy. They're not security principals; only users and groups are.

    I believe the actual namespace (the GUID) is about 23 trillion entries max.

    An AD domain could only hold significantly fewer than 23 trillion users, even given unlimited memory, disk and network bandwidth. You can't take the largest number AD can use as an index and call that the maximum capacity.

    --
    "Where, where is the town? Now, it's nothing but flowers!"

  • The Win32 API basically is the operating system. So of course you need to access the Operating System for an application to work.

    It is not. What it is is Microsoft's preferred and best-documented interface to NT. Any Win32 API that deals with a kernel object uses the native API (the Nt/Zw calls exported by ntdll) to do the actual dirty work.

    Whether you need to go through the overhead of a windowing system and message dispatch queue just to ping another machine (as the original poster implied) is another story.

    The original poster implied no such thing.

    CSRSS.EXE is currently required to be loaded and running to even use an app that does no GUI stuff at all. This is unnecessary overhead. The poster did not imply that one needs to use the windows system's messaging facilities to do IPC or networking.

    --
    "Where, where is the town? Now, it's nothing but flowers!"

  • > Calls to the windowing system for IPC? What
    > about named pipes? Mailslots? or TCP/IP?
    > Or shared memory?

    All of those features are accessed using calls to the Win32 API (bar TCP/IP, which is accessed via the WinSock libraries). Therefore, the Win32 subsystem has to be running for those facilities to be available to the application programmer.

    --
    "Where, where is the town? Now, it's nothing but flowers!"
    RealPlayer however has repeatedly caused my Win2k to totally lock up, never to recover. I guess what MS always said about 3rd party drivers being at fault actually has an element of truth.

    RealPlayer is an application, not a kernel-level driver. If an application locks your operating system hard, your operating system has a problem.



    --
    "Where, where is the town? Now, it's nothing but flowers!"
  • You have mixed up your terms here.

    If you have a domain called 'slashdot.org' then OUs under this domain will NOT be 'sales.slashdot.org' and 'geeks.slashdot.org'. These would be child domains and then you COULD have two rmaldas.

    According to this book, Windows 2000 Server Architecture and Planning, "OUs are container objects that can be used to organize objects within a domain into logical groups".

    I've been using the incorrect syntax. The two user objects I refer to would have the DNs CN=rmalda,OU=sales,DC=slashdot,DC=org and CN=rmalda,OU=geeks,DC=slashdot,DC=org; however, because of the flat nature of the domain database, they would both map to a UPN of rmalda@slashdot.org, which is where we get into trouble.

    Now, I believe it is possible to have objects with the DNs I have given but to set things up so that they have different UPNs; however, that would cause headaches of its own.

    --
    "Where, where is the town? Now, it's nothing but flowers!"



  • Well, I don't want to start a flame-fest, but I think it's very clear that Windows 2000 is obviously superior. Here's a list of reasons why:

    • The patented "Reboot-safe"(tm) feature! When you make most changes to Windows 2000, the system will force you to reboot. This ensures the changes won't cause any strange problems while the system is still running. On the other hand, Linux almost never requires a reboot after changes, how stable can it possibly be?
    • The friendly "Error-notifier"(tm)! Whenever there is a problem with the system, rather than just crashing, our innovative technology ensures that a friendly blue-coloured error screen is always made available first! (And unknown to many, you can actually customize the colour!) We tested Linux for a year, on the other hand, and didn't once see any friendly error message. Obviously it's not a very user-friendly system.
    • State-of-the-art "Intrusion-avoidance"(tm) features! You're always hearing about how other operating systems are broken into and used to attack yet more systems. Not with Windows 2000! Because of our intricate and sophisticated TCP/IP stack, most attacks will simply disable the system, preventing users from gaining access to data or being able to attack others! See www.windows2000test.com [windows2000test.com] for more information. (if it's up!)

    Well, there's lots more wonderful features in Windows 2000 that you won't find in Linux, but I think I've made my point.

  • I agree totally here.

    It took me roughly a month to install Slackware back in 1995. I didn't have the internet to help me, and I had some pretty unusual hardware, but DOS and Windows didn't have any trouble at all.

    Windows 3.1 was by far more memory-efficient and faster. X was unusable on an unaccelerated 640x480 ISA VGA adapter... unlike Win3.1.

    I did some Fortran programming under the environment, which I found useful... but it was easier to dial into the VAX. The fonts were terrible and printer support was almost nonexistent, so for me it was effectively a bloated programming environment.

    I administered a small dial up ISP for a short period in 1997. They were running NT3.51, and were scrambling to try to put together a decent system for software development... I put the whole operation under Linux. No big deal. Everything was completely free and worked flawlessly.

    Since 1995, I've tried many times to establish a reasonable user environment under Linux. I've tried Gnome and KDE, and they haven't done anything other than to promise me what I had on a 386 running Win3.1... that is to say a reasonable printing architecture, some decent fonts, some standard keystrokes and cut-and-paste. I've tried simplifying things... for some time I ran ICEWM (which I really do like) and then proceeded to try to set up an email client.

    There isn't anything but promises here too. Netscape appears to be the best choice... unless I go back to PINE. So I tried to configure Netscape... it crashes. I try some more, it crashes again... in fact it crashes quite reliably under many different scenarios... none of which are preventable. So I give up. I tried Spruce, which, though promising, was simply incomplete.

    I then try to hear multiple audio streams simultaneously. After installing ALSA, reading many FAQs and getting it running, I learn that the Advanced Linux Sound Architecture doesn't support multiple simultaneous streams... like Win3.1 did. So I installed ESD, then I installed Real Player. After a week of applying various binary patches, reinstalling, reading reams of documentation describing how Real Inc. depended upon undocumented features and broke in V5, I finally got some alpha version of the player working and started listening to some Internet radio broadcasts.

    So I began working on tracking down another email client. Inches from heading back to PINE, reading up on fetchmail and procmail, Real Player freezes.

    Many other experiments revealed that it always would do this some 15 minutes into the broadcast.

    On another foray into the Linux world, I installed Gnomehack... it worked fine for a while, and I reminisced about playing hack for long hours... When I went to restore a game one day, X crashed.

    It was remarkable. I launch the game, Gnomehack and X crash, taking down all my other applications, losing all my data, reliably, reproducibly, and inexplicably. It obviously is a bug in my X server.

    After many forays, and certainly many more to come, I have come to the conclusion that the best server OS for a small company or even for a home user is Linux. It is stable, not price-prohibitive, will teach you a lot about how computers work, and has many other benefits. The best GUI for that operating system is either Windows, MacOS, OS/2 or just about anything else you can think of. Linux handles itself so well in a networked environment that there is no reason to put a keyboard or monitor on the device.

    I can use Pine just as well on my remote linux box as I can locally, and I'll even be able to fill the fonts crisply out to the edge of the screen, and even cut and paste from it.... while listening to radio broadcasts, and chatting on that whore of a program called ICQ... and I will spend less time reinstalling the operating system for the life of the hardware than I will configuring Linux to do the same.

    To Linux's credit, Linux is closer to a good UI than Microsoft is to a robust multiuser operating system.

  • The "What did you do wrong?" attitude is the same defense used by people who claim that they've been running Win'95 for months without rebooting.

    I was installing off a Sony CDU31a on an RLL HDD. That's an XT drive with a proprietary CDROM. Digging up the correct kernel, finding boot parameters, and trying to figure out why I was getting CDR101 read errors while installing took a month.
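    (For the curious, the relevant knob is a kernel boot parameter at the LILO prompt; if memory serves, it was something along the lines of

    boot: linux cdu31a=0x340,0

    i.e. the card's I/O base and IRQ, with an IRQ of 0 meaning "poll it". Figuring out those numbers was half the battle.)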

    Come to think of it, I never did figure out the read errors. The drive worked flawlessly under DOS, Win3.1 and OS/2. I wound up copying packages to my FAT partition to install the base packages. Any time I had to read more than a few megs, the CDROM would throw me errors. Despite all this, it was great for installing individual packages after the base was installed.

    With an IDE HDD and IDE CDROM, it might have taken me 5 hours...



  • In many a third-world country, a copy of a 5-client Win2000 costs almost as much as the annual salary of a general manager in a middle-size company - and we aren't counting the cost of all the apps.

    So, in this context, if we have to do a comparison between Win2000 and open-source OS such as Linux, it'd surely be having a running title such as "To Be A Pirate, Or Not To Be ?"

    And if _that's_ not enough: if you use Win2000 without any legal license - i.e. behave like an eye-patch-wearing pirate - you run the risk of being prosecuted, and the fine would be at least TEN TIMES THE COST OF THE SOFTWARE you pirate.

    That essentially means that, in a third world country, if you run Win2000, you either look at forking out, up front, a sum of money that could pay a general manager for ONE WHOLE YEAR (to buy a legal copy), or you run the risk of having your company computers confiscated (thus disrupting your company's day-to-day business activities), PLUS being fined an amount that could have hired TEN GENERAL MANAGERS FOR A WHOLE YEAR, at least.

    Does _that_ count as a valid yardstick of comparing Win2000 vs. Linux ?

  • "Beware of anybody that claims to be balanced yet fails to admit that (1) Ken Cutler designed the NT kernel architecture and he's an OS god, and (2) Linux is a based on a very good, but very old operating system (UNIX). This does not automatically make NT good and Linux bad. However, most of the good things about NT and the bad things about Linux derive from those facts."

    (1) Dave Cutler left MS in disgust after the 3.51 debacle (video drivers moved into the kernel to gain speed at the very obvious expense of stability).

    (2) The NT kernel is based on the VMS kernel which dates from 1976. And VMS is still light-years in front of both Linux and NT/2000 in terms of scalability, manageability, flexibility, reliability and... *clustering*

    If all the good stuff that the NT team COULD have pinched from VMS had actually found its way into the operating system then (a) NT would cost a lot more and (b) you wouldn't be comparing it with Linux.

    FWIW, most of the similarities between NT and VMS are at an architectural level, in the design of things like the process scheduler and the memory manager.
    --
  • The important question here is, how much RAM did each box have, and what speed? And how loaded was each box besides the test being run?

    These are IMPORTANT points, but EASY to miss. RAM is extremely important for caching when you're manipulating large files.

    --
  • This is one of those things that you (and I and everyone else) just has to accept. You and I know that email and usenet support would be just fine, but management will never believe it. Management needs to pay for things to feel secure about it, and above all, they need to be able to point their fingers at something.

    you're plainly admitting here that your management is just being obtuse and over-conservative, and that you *know* it would work. in other words, it's your management's problem - not linux's problem, and certainly not the problem of every other management on earth. and btw I can point you to more than a few production linux- or bsd-based boxes that don't have a support contract from anyone. guess who's saving money and making smart decisions, and who isn't.

  • How could a PhD in computer science want to be an MCSE????
  • You're right, and while I've only ever thought about it almost this way, your post has prompted a new idea.

    I think it was Miguel de Icaza who said it best - in *nix no one takes the blame for anything. He was talking about a user-friendly / programmer-friendly desktop environment (kernel hackers: "not my problem / fault", X/MIT/XFree: "We do protocols, not our problem", wm programmers: "not my problem", etc, etc).

    And we all accept this, more or less. If Netscape crashes, it's Netscape's fault. If the Gimp crashes, it's the Gimp's fault. If a window manager crashes, it's the wm's fault. And if Linux crashes, it's the kernel hackers' fault.

    However, if anything crashes on Windows, we perhaps unfairly blame Microsoft. Now, it may be true that application crashes are far more likely to down the OS in Windows land than in *nix, but MS is getting better in this respect. We wouldn't shit on Red Hat if a module in their kernel was flaky and crashed, would we?

    I can remember back in the MS-DOS days, when misbehaving TSRs and crappy apps would lock up the system completely - DOS itself never crashed - and I remember Windows 3.1, which itself crashed a lot, and bad apps downed it too. 95 was better, and 98 was better still, and from what I've seen at my new job for a week, w2k is excellent in this respect. *nix is not the best here, and Windows is not the worst - MVS/OS/390 is probably on top, and MacOS on the bottom (of OSs in use today), so everything is relative.

    Yes, I use Win98 on the desktop at home, and will probably use w2k on the desktop at work, where there are w2k servers (primarily because ColdFusion wasn't available for Linux until recently, and because customers have Access DBs). But it's primarily Linux on the servers (and with Friday's install of CF4linux, we may ditch the w2k servers...). At my last job (well, mainly volunteer), it was Solaris and Linux on the servers, and Linux on all but one desktop.

    The Windows/Linux race is each trying to catch up to the other's strong point: Linux to get to the UI of Windows, and Windows to get to the stability of *nix.

    I don't know: what's easier, adding stability to Windows, or adding hardware support and UI to Linux?

  • Ok, here is my experience:

    Open the top of a Compaq Proliant 6500 with HotPlug PCI slots. Insert a dual NIC, close the PCI slot cover. 2000 Advanced Server pops up the new driver window, installs just fine, and brings both ports online. Open Network properties, set the IPs statically, and go. No reboots at all.

    Same process repeated in many other servers, with the only difference being the non HotPlug ones requiring a shutdown to install the card.

    2000 shouldn't need a reboot ever when dealing with the network. If it does, you did something wrong.
  • AFAIK, the software volume management (RAID/ mirroring etc) in Win2K is a (cut down) version of Veritas Volume manager (vxvm). VxVM (via the GUI frontend vmsa) is equally easy under Unix systems, and also has a full command line interface. I'm not sure that the command line functions are there in Win2K.
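    (For comparison, the native Volume Manager CLI on Unix is pretty terse; a sketch, with the disk group name and size invented:

    # vxassist -g datadg make vol01 2g layout=mirror
    # vxprint -g datadg -ht

    The first creates a 2 GB mirrored volume, the second dumps the volume configuration. Whether Win2K exposes an equivalent command line I honestly don't know.)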
    --
    "I am not a nut-bag." Millroy the Magician
  • So far, Win2000 is used only in rabid Windows zealot shops. It has not been put to the test anywhere where the question of using UNIX actually stands.

    And even most of the rabid Windows zealot shops have not gone through a complete server deployment, as quite a lot of things still do not work or outright break.

    Where it is used at all, it is used as a client for now. Especially on a laptop it is better than NT. It still does not even compare to Linux or BSD, but some people are required to use M$ware due to company policy, you know...
  • There are two things to point out about my 650 MHz Win2K Professional system:

    - It's mostly stable; good enough to be called a decent OS.

    - It's about as slow as my 300 MHz Win98 machine at home.

    The way I see it, we've sacrificed speed for stability. Of course I hardly boot into Windows anyway, so none of this really matters to me ;)
  • I've got 256MB ram on my work PC (Win2K, 650) and 128 on my home pc (Win98/RH, Dual 300).
  • As far as servers, at work we run 3x(dual 500 xeon 1GB RAM) one for pdc, bdc, and exchange

    This is just the point: in Windows you have to have a dedicated server for everything. One server for IIS, one for MS SQL, one for Exchange... Blah, blah...

    Unix can handle all of these reliably on one box. And you don't reboot if you want to install a new component/modify a configuration. It just works!

    The time this saves me alone makes it worthwhile. Add the cost of the hardware not required, and even my boss is convinced.
  • I'm guessing you are being sarcastic to the max. I like it. It sounds like you are really playing up win2k as superior, but when you think about it you sound more like a true Linux fan. I like the reboot part. Yeah, rebooting machines makes a system real stable (LOL). Just think about what it does to your uptimes too. Who needs to be up 24x7 these days anyway?
    ~~~~~~~~~~~~~~~~~~~~
    I don't want a lot, I just want it all ;-)
    Flame away, I have a hose!
  • Well, let's add another perspective of this. At General Motors, we've used a traditionally Unix-based CAD/CAM/CAE system known as Unigraphics [ugsolutions.com]

    Recently Unigraphics Solutions [ugsolutions.com] ported UG to Win32. Thus, there has been a recent move, both internally and externally to GM, to move to Windows NT and Windows 2000 as the CAD/CAM/CAE platform of choice.

    Why? #1 reason is that it is supposed to be somehow "cheaper." Sure, an average Unix CAD workstation costs, what? $30-40K (U.S.)? Versus a Wintel CAD workstation of about $10-15K. Sounds good right?

    Wrong. GM is learning the Total Cost of Ownership lesson the hard way. Sure, the workstation is cheaper. Is the software license cheaper? No. Is the cost of a UG designer any cheaper? Of course not. Is the cost of system administration cheaper? Not at first... while NT sysadmins typically make less than their Unix counterparts, until the system is completely migrated over (which will take AT LEAST 3-5 years), GM and its suppliers have to have BOTH types of admins. And they have to support Unix-NT connectivity issues, such as the above-mentioned Samba issues (they call it "CIFS" because they don't want anyone to know they're violating their own systems administration policy by using "freeware"! )

    First off, we're in the middle of porting GM's customizations to Unigraphics (known internally as the PDL, or externally as the "GM Supplier Toolkit"). This has cost HUNDREDS, possibly THOUSANDS of man-hours. And now they're doing pilots. And in the pilots the designers are finding Windows 2000 to be far less stable than Unix (we're talking HP-UX 10.20 and Slowlaris 7 here), having to reboot the stupid things AT LEAST daily. If a guy decides he's going to save once an hour, and 59 minutes into the job he crashes, that's nearly an hour's worth of work he has lost! I've witnessed this phenomenon personally. Windows 2000 crashes more than HP-UX or Slowlaris. I can't REMEMBER the last time any of the Unix boxes were booted in my building (I could always issue an uptime command to find out, I suppose :).

    (BTW--we didn't experience the printing problems with samba, because we are going with Unix servers, running Windows 2000 clients for our pilot...samba seems to work great if it is acting as the PDC).

    Fortunately, there are rumours flying around about a Linux port of unigraphics. Maybe GM management will use the same logic (cheaper is better) and invest in Linux. :)

  • Mate, you should be forced to print this out and shove it up your arse. Let's have a look at this, shall we?

    Badly written software crashes Win2k? Badly written software should crash itself. If I fsck up on FreeBSD I get a core dump I can trawl through at the exact moment of the crash. Nothing else notices. NOTHING. I run through a secure terminal to a remote machine all the time. No problem. I write network software that runs in the background at ~40Mbit/s. No problem. An operating system's job is to protect all the other stuff from fsck ups.

    With the possible exception of Win3.1

    Badly written drivers? Yeah, there's not a lot you can do about this. The stock MS CD driver is what causes our win2k box to bin out.

    All Netscape/Real products? HELLO?? Do you not, maybe, think that these people are competitors to Microsoft? Perhaps, perchance, this is a deliberate ploy... Do you really think Microsoft got to be this much of a monopoly by playing fair?

    And finally, "significant experience with Windows 2000". I saw it was a microsoft.com link and was expecting maybe an MVP? Perhaps one of the development team? At very least a marketroid pretending to be a sysadmin? No. A twat. Playing games. At 45FPS on a GeForce. Using 340 Meg.

    Unfortunately ShootOnSight.com appears to be taken, or the world would be blessed with a new website.

    Seriously. Post it again. The whole post. Put it in the root thread to see what happens.

    Just don't go near any computer networks.
    Dave

    Hang on, is this a troll? Nah.
  • An Ultra-virtualized hell? WTF is one of those then?

    Hell is COM in MFC. Don't believe me? Try it.

    Dave :)

    Mind you, "code for X - envy the dead" still has a certain ring to it.
  • .doc -> StarOffice5.2. Surprisingly good (disclaim: only use windows version).

    Photoshop -> will gimp do it? Otherwise, fair enough.

    VC++ -> Kdevelop rocks. Give it a go.

    Dave :)
  • The only serious thing I have used Win2k for was an MP3 player at work. Bitch to install. Completely failed to recognise an s3 videocard. 95% of ripping software wouldn't run (complaining about ATAPI or something). The network browser thing doesn't appear to browse the lan correctly. It has blue screened twice in a month and then shown some dire warning about how the drive may have shit it.

    Apart from that, yeah, I guess it's pretty.

    For my stuff I use NT4 Workstation and run the excellent Finnish X server on it (http://www.labf.com/), which costs money but is worth every penny. Then real work can take place on three FreeBSD4 boxen.

    And this whole 'crap' software thing, by which I take it you mean stuff that doesn't come with a holographic label and draconian licensing agreement. For gods' sake. The entire mission statement for Win2k is to further leverage the Microsoft monopoly into selling other products - Office, IE, Media Player, IM, some shit we haven't thought of yet... You're just falling for it, man.

    Best tool for the job. Speech-free, beer-free or for pay, just get on with your work.

    Dave :)

  • Hrm.. And let's think where all these not-so-rocket-scientists have gotten us. Wander around Attrition.org and check out the defacement mirrors. Then look at the stats. These stats are the product, not of an insecure OS, but of the bootcamp MCSEs and generally undereducated people running them.

    Both BSD/*nix and NT get compromised. And it's almost always the administrator's fault. Unless some new cracker group has discovered a sploit and you're the guinea pig, every sploit is documented. Every one has either a solution or a workaround. It's in the administrator's hands.

    And we see what happens when we try to say "Oh, we don't need to spend a lot on a sysadmin, we have an easy OS with low TCO." Well, let's take a look at where it got us.
  • I installed Windows 2000 Professional about 2 months ago, and it worked fine for awhile. It was as stable as NT4.0, but recently it's started crapping out on me, just like every other version of Windows. So nothing has changed. And nothing will change as long as they keep building more crap on top of an unstable base.

    In case you're wondering, the only reason I have Win2k installed at all is for 3D Studio, Painter, etc.
    --
  • How do you force non-ACPI on install? I really don't like 6 devices sharing the one interrupt, especially considering one of those devices is the MPG decoder card, and another is the SCSI drive hosting the DVD...
  • "From a user's stand point Linux is now no more difficult to get around in than windows" Oh please! You can't be serious. Talking about FUD. You should spend some time with real users, the type that does not even know how to handle a mouse.

    From the implied strawman that linux users in business are just going to be handed a Slackware 7 CD and be told "go to it", sure. This isn't the case with windows in that market. All computers need initial set-up, and if your linux (or windows) sysadmin isn't smart enough to make a template user system and clone it at setup, then you hired the wrong guy.
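    (Concretely, and just as a sketch: on most distros that "template" is simply /etc/skel - drop your pre-tuned dotfiles, panel config, bookmarks and so on in there, and every account created afterwards inherits them:

    # cp -r /path/to/tuned-dotfiles/. /etc/skel/
    # useradd -m newhire

    The paths are invented; the mechanism is bog-standard.)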

    I challenge anyone to provide evidence that Windows 2K is any more user friendly for what Joe End User needs, which is e-mail and web browsing, and in some cases perhaps one specific other application, than is a cluefully set up Gnome/KDE environment. (No, I don't think they're equivalent either, but I want to avoid making coals for people to walk on -ed)

    ...and if my employees are going to find any games to play on the install, I'd rather it be nethack than Solitaire.

    -- -mrex
  • The way MS sees it, you can hire a very, very expensive six-digit-income BOFH to manage your UNIX/Linux environment. Or you can hire a 24-yr-old MCSE "techie" to manage your Windows network for $40k and save upwards of 60k.

    There are plenty of young Unix/Linux techies also. The six-digit income staff are managing larger facilities -- and if those larger facilities happen to be running Unix, maybe there's a reason for it. I'm also aware of large facilities using MS products, but they require much larger support staff than a Unix facility -- if nothing else, the MS staff are kept busy pointing, clicking, and reloading machines.

  • Less than once a day is almost never? *boggle*

    IHBT? Oops.
  • That wasn't so hard. When people say things like "that's like comparing apples and oranges!" it makes me crazy.

    They're rejecting a simple quality comparison, telling you the difference is instead in kind. Usually a response to an inappropriately simple "which is better?" question.

    By saying "it's like apples and oranges", they express that the answer is largely subject to individual tastes and requirements, as one's preference for apples or oranges would be. It rejects direct and general value comparison, not qualitatively contrasting descriptions.

    This has been a public service announcement from The Straight-Faced Pedant, long may he blather on.

    --------
  • However, it is possible in the NT model to have apps "customize various aspects of [their] behavior" on a per-user basis. Under "winroot, Profiles, username, Application Data" progs can save settings that get merged into the registry (I think) when the user logs in. That being said, almost no windows apps take advantage, possibly for the sake of running on 95/8. As I check my system, only Microsoft, Rational, and MKS put anything there, and I have the whole world installed.

    Try looking under HKEY_CURRENT_USER\ for the user-specific section of the registry. User-specific data storage is a different (and much newer) mechanism, so it hasn't garnered much 3rd party support yet.
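    (For the curious, per-user settings just live under that hive; a purely illustrative .reg fragment, with the vendor and value names invented:

    REGEDIT4

    [HKEY_CURRENT_USER\Software\ExampleVendor\ExampleApp]
    "ToolbarVisible"=dword:00000001

    Import it with regedit and only the current user's profile is touched.)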

    Simon
  • how much easier it is to configure an NT system than, say, Linux

    Please define "easier".

    For _ME_ (and probably a whole lot of people here) "easier" means: simple configuration that works, and consumes little time.

    Now, today, I just spent 3 hours trying to configure a network card under Windows98. (Yes, I know this is not W2K, but the methodology is the same.)

    It consisted of installing the drivers, removing the drivers, tweaking driver configuration options via a GUI, tweaking OS configuration options via a GUI, finding and downloading older versions of the driver (in case there is a problem with the newest rev.) and installing them, lather, rinse, repeat.

    In the end, I gave up - it just couldn't be done.

    Now, configuring this network card under linux consisted of typing the following command:

    # modprobe tulip
    # ifconfig eth0 192.168.30.4
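    Making that permanent isn't much worse - a sketch assuming a Red Hat-style layout (your distro's files may differ):

    # /etc/conf.modules (or /etc/modules.conf)
    alias eth0 tulip

    # /etc/sysconfig/network-scripts/ifcfg-eth0
    DEVICE=eth0
    IPADDR=192.168.30.4
    NETMASK=255.255.255.0
    ONBOOT=yes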

    Now, which was easier? Granted, there was no pretty GUI, but GUI != easy.

    For the uninitiated, a GUI can be easier to navigate, but if the stuff doesn't work the first time, you're just screwed - a command line is EASIER.
  • I don't recall saying he was a friend. :)

    -Restil
  • Both win2k and Linux are operating systems. They provide services and run software. Which one is better depends entirely upon which services you want to provide and what software you want to run.
    In general, if you're setting up a server, go with Linux or *BSD. If you're setting up a desktop/workstation, go with Win2k. Linux and the BSD's are excellent server OS's, win2k is an excellent desktop OS.

    There are of course situations where you'd want a win2k server or a linux/BSD desktop. But you're going to have to know something about all these systems and know what you're trying to accomplish before you can evaluate which one would do the better job in that situation. In short, there are no simple answers except "It depends."

    Someone mentioned not running Netscape on win2k. Netscape is all I run. Has it crashed? Of course. Has it crashed the OS? No. If it did, that would be an indication of a problem in win2k, not Netscape. Everyone knows that Netscape crashes. Why it crashes is another question.

    Lee
  • Comment removed based on user account deletion
  • Bull. They may feel like that when they buy the license, but they quickly find that the phone support is useless and Microsoft is just as hard as everything else. It may have a pretty GUI to point-n-click through, but a fat lot of good that does when clicking Okay causes "VBScript Error 10234: Expected CallScratchMyButtWithAStick and saw CallGagMeWithASpoon at line 143." I did a lot of work with these small companies once upon a time, and I might still be interested in doing it if they were running Linux and I could reach their boxes by modem or SSH in the event of most problems.
  • You're saying that Linux is easier to administer for beginners than NT? In my experience, it's the other way around. I remember spending a dozen frustrating hours vainly trying to get an ssh server working at all on an RH6 box, because of inadequate documentation and cryptic error messages (I later got it working on an RH6.2 system). Ditto for samba.

    And don't even get me started on sendmail: its configuration files look more cryptic than perl. Trying to get things to work properly, I ended up with a hack that accidentally caused cron to send its messages to the admin of my ISP rather than me.

    If w2k has similar problems, that just goes to confirm the theory that All OSes Suck. I'm all for complaining about windows, but I wouldn't hold up linux as a beacon of non-suckiness.

  • Oh, fine.

    s/useless/not usable/g

    For the purposes of this discussion, they're exactly the same. Linux could probably serve most, if not all, of the functions of our SCO server, yeah. I know that, you guys know that, and our support guys probably know that too. But, as hard as it is for you to grasp, newsgroups and mailing lists just don't cut it for "support options" in the Real World, and maybe where you work it's different, but in most places, there is no such thing as a vital system without a support contract.

    I can maintain it, yes. And any yahoo who's hacked together a linux box before could support anything I implement. Hell, if I documented it correctly and had all the right tools installed, my boss could do it. (And she's not even a techie, she's just the treasurer.)

    As I said before, if I could get away with it, I'd have our firewall up and running on one of the spare 486's and an install of BSD in maybe a week. It just won't get past a single level of review without any real support.

    You can sit and whine all day "You can too use linux!" like a 5-year-old who isn't getting things exactly the way he wants. That doesn't change the fact that there are situations where you just plain can't use it, even if -- technically speaking -- it is, indeed, a viable solution.

    P.S. Yeah, I know I haven't really explained *why*. There is no why. If there was, I'd gladly write volumes about it. This is one of those things that you (and I and everyone else) just has to accept. You and I know that email and usenet support would be just fine, but management will never believe it. Management needs to pay for things to feel secure about it, and above all, they need to be able to point their fingers at something.

    Disclaimer: My views are not necessarily the views of my employer. (In fact, it looks like they're the opposite here)

    --

  • Speaking as a die-hard linux fan with a job for the local town government, let me reassure you that, to put it bluntly, you're dead wrong, and the guy you replied to was right on.

    Linux is useless without a support contract. Examples: We're going to put a firewall in shortly to guard our main server from the rest of the WAN (specifically, the high school, but everywhere else too). I'd love to do this with a 486 running linux or BSD, as we can get both easily and/or freely. But if I can't get outside support for it, it'll be *really* hard for me to get that approved. Sure, maybe *I* can support it, but what if it's still there when I leave? Then what?
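    (Not that the technical side is the hard part - a 2.2-era sketch, addresses invented, of roughly what that firewall boils down to:

    # deny forwarding by default, then masquerade the internal net
    ipchains -P forward DENY
    ipchains -A forward -s 192.168.1.0/24 -j MASQ

    The commands aren't the problem; getting them past review without a support contract is.)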

    Or another example: We need to implement some central file storage in my building. There's only 50 or so users, so frankly, an old box running Win95 could do the job if it had a big enough hard drive. We've got plenty of space on the (SCO Unix) server, though, so I figure I'll use that. But will I be doing it with Samba? Nope. Much as I'd love to, I'm going with one of SCO's tools instead. Why? Well, we can get support for that. (Okay, so it makes my job easier too...but my original plan *was* Samba)

    One thing's for sure, though. There will never be an NT box in my building as long as I'm here. I share duties with the town's other sysadmins sometimes, and they all run NT...I'm not gonna deal with their headaches. Whatever NT can do, my SCO box can do better =)

    --

  • I never said it was Linux's problem (or at least, didn't intend to). I said Linux can't be used in this environment.

    As for those production Linux/BSD boxes...mind giving me some company names? I'd be making a hell of a lot more working for them than I do now, and from the sounds of it, I'd like it better.

    --

  • "I use it because it's just good enough" is the excuse people use to run Windows 95. Why bother with an OS that doesn't take full advantage of your hardware? Why pay $200 for a graphics card and be limited to the $170 that your OS uses? SDL isn't comparable to DirectX (I've used both, trust me) (SVGAlib isn't even in the same league given that it doesn't offer hardware acceleration), OpenGL is falling behind Direct3D, OpenAL isn't even ready yet, and nobody uses it yet. Overall, multimedia on Linux is pretty pathetic. I mean you can say, "ok, it's good enough for me," but I judge the quality of something benchmarked against the best product available in that category. Right now, NT is the best product available in the multimedia OS category. Sure you may be able to run MAME in Linux, but the point is that you'll be able to run it FASTER in NT.
  • NT may take longer to boot (not appreciably, though) but it runs faster. DirectX blits faster than SDL, windows scroll more smoothly in NT than in KDE2 or GNOME, menus pop up faster, apps take less time to load (than in KDE2 and GNOME at least; regular X apps are pretty fast) and 3D runs faster.
  • Netscape's code is sh*t on every platform!
  • BTW, Linux has great multimedia support. Okay, okay, let me explain. I have a Matrox G400, and the hardware
    acceleration is *sweet*. Also, my SB Live Value is just excellent, I love the hardware mixing, the multiple DSP's... I
    installed ALSA and now the MIDI patches ("soundfonts") work too, and they sound good.
    >>>>>>
    Wrong. Linux doesn't have DirectX; as such, its multimedia support by definition cannot compare with that of Windows. Not a flame, think about it. Most consumer hardware is designed for DirectX. Aside from idiot companies (ATI!) most manufacturers expose all the hardware's features through DirectX. While alternative APIs can take advantage of these features, rarely-used or difficult-to-implement things always get left behind. Right now, Windows still has the fastest 3D hardware acceleration, and on NVIDIA cards (the only ones that offer competitive Linux drivers) Windows is still a good bit faster than Linux. In the tests where Linux was close to Windows, Linux had an unfair advantage. Apparently, the cool new speed-ups in the Detonator3 drivers were already in the Linux/X4 drivers. Thus, Linux 3D is close, but no cigar. As for sound, Linux is a distant second to Windows. ALSA still doesn't offer as many features as DirectSound, and all the transistors on your EMU10K1 chip used for 3D sound are totally wasted in Linux. So, Linux lacks great 3D acceleration, great audio acceleration, and lacks 3D sound and force-feedback altogether. How can it possibly have "great multimedia support?"

    It's all about picking supported hardware, though, which you still have to do for Win2K, as well. My DVD drive isn't supported, because I (a) couldn't find much information about that on the net, and (b) just bought it first, figuring I'd test it out later. I'll probably ask the developers about this, since it works under Windows. I've heard it can be made to work under VMWare too, so all I really need is some debugging info. :)
    >>>>>>>>
    You have to choose the correct HW on Windows as well, but you've got more to choose from. On Windows, if I want great 3D acceleration, I can pick any number of cards from ATI to Matrox to NVIDIA to 3DFX (ugh!) On Linux, I'm pretty much limited to Matrox and NVIDIA (not 3DFx with X4.0)

    Plus support for multi-processing makes it even better. Now, I don't have more than one processor, but I might set up
    a dual-proc test box if I can ever find an old board for it. However, I've seen it done on Linux, and it is sweet. No paying extra for a different version that just consists of a stupid registry hack, either. It has decent multi-processing support out of the box. And I'd love to see a comparison to Win2K here, since that's one thing that's supposed to be
    better in 2.4.
    >>>>>>>>>
    Huh? Win2K Pro has SMP out of box. There IS no single proc limited version of Win2K.

    How about that, eh, guys? Something based around Linux Kernel 2.4.0 with a bunch of stable stuff, vs. Win2KSP1, or whatever is current and patched by then. Test multi-processing, test well-supported hardware, RAID, whatever. Just test the hell out of it.
    >>>>>>
    Windows would win a lot of the tests. The benefits of Linux aren't so much in raw benchmarks but in overall quality. Sure, Windows may win total TCP/IP throughput scores, but it will probably crash under high load. Sure, the FS may be able to transfer more data through the system (though I don't think NTFS is faster than ext2), but will it do that consistently, or in spurts? Also, anything that taps DirectX will totally blow Linux away, since a DirectX application can for the most part be considered separate from the underlying OS. (Especially since Win2K allows DirectX direct hardware access.)
  • First, one question: do you think BeOS has great multimedia support? Heck, it doesn't even play
    DVD's! ;)
    >>>>
    I never said it did. I don't have a DVD drive, so what do I care? When I go trumpeting BeOS as the be-all, end-all multimedia OS, then take me to task for it. However, in Be's defense, I have to say the MediaKit blows away anything I've ever seen in the multimedia arena.

    Windows and Linux and MacOS all have SDL.
    >>>>>
    SDL is just a wrapper.

    It supports DirectX on Windows, and DGA on X,
    which is the equivalent.
    >>>>
    DGA on X is equivalent to DirectX? Where's the 3D Audio API in DGA? Or maybe you mean DGA is comparable to DirectDraw? (Which still isn't totally true, show me the DGA API that allows me to change the screen-res.) DirectDraw is just a small part of DirectX.

    The tests I saw for 3D acceleration didn't differ by much; it's way fast for me, even for Q3A. My speakers can't even take advantage of 3D sound decently, and I think I saw patches for all that stuff, but I really don't care yet. However, Linux *does* have great 3D
    acceleration, and the audio stuff is in hardware! Maybe our definitions of great differ, but the actual
    performance I'm seeing is pretty impressive. And I'm not saddened that I don't have 'force feedback' on my Gravis Gamepad; I don't know of any games that support it, and I thought it was a dumb idea on the PlayStation!
    >>>>
    Your argument boils down to "I see no need for it, so it's not important."

    Fact> Linux 3D is slower than WindowsNT 3D. The tests showed it close (but significantly slower at high res) but they were skewed due to driver differences (I'm talking about the NVIDIA drivers.)

    Fact> My speakers, and many other people's DO support 3D sound well. For these people, Linux multi-media is sub-par. Fire up Half-Life with an A3D card. It totally kicks ass and the 3D sound adds a lot to the game.

    Fact> Half the hardware on my sound card goes wasted when I'm in Linux.

    Fact> If you're impressed with Linux audio, take a look at DirectSound. You'd faint.

    Fact> Force feedback is not a dumb idea. The mere fact that you're comparing it to the PS (which is just a vibrator) shows you've never used it. For certain games (racing games) it's awesome. And a LOT of games support force feedback.

    Yeah, Windows has more hardware support; that's because Microsoft doesn't have to write *ALL*
    the drivers. This situation is changing of course, and I like being able to poke around with the
    source code, but it'll take a while for this one to change--the corporate culture can be pretty
    entrenched about these things.
    >>>>>
    Corporate culture aside, MS still does have more hardware support. Excuses are excuses, and drivers are actual stuff you can base an argument on.

    There are many different limited versions of Win2K. I don't care if it supports 2 processors out of
    the box, or what the configuration-of-the-week is; the bottom line is, Microsoft will always sell you
    the same product for much more by just making a minor registry tweak so you get the "new
    features", and I'm fundamentally opposed to that, because it's stupid; just as stupid as CPUs and
    overclocking, nowadays.
    >>>>>>>>>>.
    Overclocking is not stupid if the consumer is the one doing it. Getting 100MHz for free by doing a little jumper manipulation? Adjusting voltages to get that perfect stability plateau? Hacking at its finest!

    Well, I'd like to see the results of the tests before I draw my conclusions; you may be right. But if I
    did the tests, and one platform consistently crashed under certain conditions, I'd note that and put
    it in my review; that's NOT a feature.
    >>>>>>>>>
    Windows is unstable compared to Linux. True. However, WindowsNT easily has a week or two of uptime, and for most gamers or multimedia people that's enough. They reboot their machines every night (unless they're doing a render or something, and WindowsNT can usually handle that). They don't install weird software. (Hell, RealPlayer is flaky on BeOS and Linux too; it's not a Windows-specific problem.) For the average gamer, crashing is not an issue. A game being playable at one res setting higher on one OS (generally, you can play Quake on Windows at one res higher than you can in Linux) IS an issue.

    Another benefit of Linux is the multitude of configuration options. Given the time and resources,
    I'd love to just benchmark Linux against itself! That is to say, configure one box with a standard
    kernel, OSS, ext2, XFree86 3.3.6, and a couple of IDE drives, and then configure another box with
    an optimized kernel, ALSA, reiserfs, XFree86 4.0.1, software RAID... well, in any case benchmark
    all the components against each other, and find out what the fastest, most stable Linux
    configuration is for a given hardware configuration. That isn't as straightforward in NT, because
    there aren't that many configuration options, and many of them aren't obvious or readily available.
    >>>>>
    Huh?
    A) DGA is nothing compared to DirectDraw.
    B) Given good driver writers (and Matrox ain't one of them), a Windows driver is rock solid. My system has an NVIDIA card and NT 4. I started out with the Detonator drivers months ago and it has yet to crash. In fact, my life in Windows-land has been pretty happy. If your Windows system is all buggy and crashy, get some NVIDIA and Creative Labs hardware, an Intel, Asus, or Abit mobo, and a K7 or Intel chip, and everything should be peachy. The idea is to buy quality hardware from quality manufacturers. If you do, you enjoy not only good stability, but the increased performance of drivers that are "closer to the metal."
    ---
  • Maybe because the only decent OS for 3D is Windows? Maybe because he does Windows development. Maybe he is a DTP guy and needs Photoshop and Quark XPress (though you'd probably use a Mac in that case). Maybe he likes having a stable web browser. Maybe he is an AOL user? Being smart does not mean hating Windows. If you're smart, you use what suits you best, not what you need to feel elite. Linux is good, but there are many tasks for which Windows is just better suited, and to think otherwise is just deluding yourself.
  • by Chuck Milam ( 1998 ) on Monday September 04, 2000 @05:36PM (#806522) Homepage
    For me, nothing beats the Linux community's best support model: free E-Mail lists. Have a problem? Ask a question on the right list, and odds are you'll get several good responses in just a few moments that start out

    "No problem. I just worked through this problem last week...here's what I did..."

    Beats paid, per-incident-charge phone-support monkeys hands down every time.
  • by Sneakums ( 2534 ) on Monday September 04, 2000 @09:32AM (#806523)
    In particular, W2K finally has real directory services (which Linux lacks)

    Windows 2000's Active Directory is a candy-coated shell over a reworking of the old NT4-style domain system.

    An example:

    Suppose you create an AD domain called "slashdot.org". You then create two OUs, so you have sales.slashdot.org and geeks.slashdot.org. Being a hardcore geek, Rob Malda will of course be rmalda in the OU geeks.slashdot.org. His username in the underlying domain database will also be rmalda.

    When Richard Malda joins sales, of course you will want to add a user rmalda to sales.slashdot.org. But you can't. Why? Because even if you have objects in separate OUs, their names must still be unique in the domain, because the underlying domain database has a flat namespace.

    --
    "Where, where is the town? Now, it's nothing but flowers!"

  • by jjr ( 6873 ) on Monday September 04, 2000 @08:48AM (#806524) Homepage
    I would have to admit that 2000 is an improvement over NT 4. The answer is still: it depends.
    What are you doing with the system?
    Do you have any applications that you can only get on NT?
    Do you have the staff to support the OS of your choice?
    Add your question here.

    NT, Linux, BSD, Solaris, MacOS: these are only tools. Remember, it's not a war of OSes here; it's about getting the job done the best way possible.

  • by jetson123 ( 13128 ) on Monday September 04, 2000 @09:27AM (#806525)
    I don't know about others, but I prefer Linux because it does less and uses "old" technologies like text mode interfaces. I also prefer Linux because its APIs change less quickly than Windows. The changes from NT 4.0 to Windows 2000 generally make the situation worse, not better.

    Besides that, I have run Windows 2000 and Linux side-by-side for a few months. Windows 2000 is a bit more stable than Windows NT 4.0, has more up-to-date Win32 APIs, runs better on laptops, and the UI is a bit more consistent. The server edition comes with more software. Other than that, I think most users won't see a lot of difference.

    The biggest thing about Windows 2000 from my point of view is the Active Directory stuff, and that's an unmitigated disaster: not only does it play havoc with mixed UNIX/Windows installations, I think its directory model is poorly suited to non-hierarchical management structures.

    Don't underestimate the marketing value of Windows, though: on the surface, it looks like a coherent solution of integrated technologies that address most of a business's needs. It's only after a company has committed to it that they discover that actually deploying and maintaining it probably requires a bigger hodgepodge of local hacks and third party tools than Linux would, and at a much higher cost. Let's hope that there will be more Linux and BSD distributions that target the Microsoft client and server market more directly. In particular, on the server side, something like RedHat isn't streamlined enough yet to have the same appeal as Windows 2000 to non-technical business folks.

  • by wilkinsm ( 13507 ) on Monday September 04, 2000 @02:50PM (#806526)
    From a developer POV, I believe Windows still has a strong win in many areas. If one sets aside the Windows registry (I still prefer using .INIs) and everything a year or less old (Kerberos support, AD, Message Queues, etc.), then you still have one pervasive platform:

    DirectX - Love it or hate it, it does the job. While some areas (like DirectInput and DirectMusic) are still queasy at best, you can't beat DirectDraw's flexibility.

    UI - The 2D GUI is pretty hot. Font smoothing and color management. Solid control designs and IMEs. Try and write a multilingual application and see how far you get. Ever try and copy and paste multibyte characters in X? Oops.

    COM - I've tried working with CORBA, and so far it can't cut it. COM is an incredible piece of engineering, and it shows. If I had one wish for Unix, it would be a COM implementation that could rival Windows. COM+ looks good too. Too bad it uses the registry - most of the time. We shall see what happens with SOAP.

    MS Office - Sorry, but it's got to be said. Nothing beats Office - yet. Why does Office work so well? COM. It will take many man-hours to raise KOffice/OpenParts to that level. Well, if we had COM for Unix, and a lightweight VM, then maybe we could 'borrow' some of the more interesting pieces...

    Yeah, the rest of Windows is crap, but you get what you pay for - DirectX, the UI, COM, and Office. Everything else is just one of those four things. Oh yeah, and IE thrown in just for fun.

  • by pointwood ( 14018 ) <jramskov@ g m a i l . com> on Monday September 04, 2000 @11:36AM (#806527) Homepage
    According to the latest Netcraft report, Hotmail.com now runs Windows 2000:

    " Hotmail Windows 2000 migration completes without incident The migration of the www.hotmail.com front end from FreeBSD to Windows 2000 seems to be complete with all recent requests from the site served from Windows 2000 machines and no evidence of any FreeBSD/Apache machines remaining in the load balancing pool. Microsoft will be pleased with this as the migration was completed inless than a month, without any reports of service disruption, and the site has previously been a beacon for open source evangelism."
  • by Master Switch ( 15115 ) on Monday September 04, 2000 @09:13AM (#806528) Homepage
    All you are going to hear here is hype, and bible
    thumping. The only true way to get an unbiased
    view of things is to try both, and see for yourself.
  • by jilles ( 20976 ) on Monday September 04, 2000 @01:08PM (#806529) Homepage
    Look, I run Linux at home. Nothing is easy on this OS; from getting the fucking printer to work to getting the fucking wheel on my mouse to function properly, everything requires browsing tons of badly written HOWTOs.

    Windows can be a bitch too, but at least some of it works out of the box.

    "From a user's stand point Linux is now no more difficult to get around in than windows"

    Oh please! You can't be serious. Talking about FUD. You should spend some time with real users, the type that does not even know how to handle a mouse.

    "Sorry bud, the days of shrink wraped software are numbered."

    Guess what: Linux is being shrink-wrapped as we speak. It's not done yet; it will probably take another few years. Or do you really think Joe Average will be compiling his kernel soon?
  • by jilles ( 20976 ) on Monday September 04, 2000 @09:25AM (#806530) Homepage
    And let's not forget that it's peanuts compared to the money you lose on support staff. Better to spend some extra money on software than $100K/year on a good system administrator.

    Companies don't mind spending $30K or so on a good server, especially if it comes with user-friendly software and good support. Linux is free but useless without a good support contract. Of course such support is available, at roughly the same price as for commercial software. The impact of license fees can be neglected when you bring in support costs and staff costs.

    Especially for small businesses, it is not affordable to have a knowledgeable sysadmin around. They have to put up with less educated sysadmins and therefore have to invest in usable software instead.

    Windows 2000 is ideal for these kinds of companies. You don't need a rocket scientist to operate it, and it supports a lot of stuff out of the box, most of which is easy to configure. If you have educated staff, though, Linux/Unix is the best way to go.
  • by TheDullBlade ( 28998 ) on Monday September 04, 2000 @10:01AM (#806531)
    Of course, trying it yourself is always the best way to learn about anything. Ignore the millions of hours of collective experience out there; if you spend a few hours with the products, you'll learn much more about subtle incompatibilities and transient, but catastrophic, bugs.

    It's also much more economical for you to duplicate all your services, train all your personnel in both systems, and see for yourself, rather than asking some questions and hearing what other people have to say about it.

    And, of course, it's totally worth buying as many copies of W2K, and the applications you intend to run on it, as you need to test them.

    Therefore, I obviously also must heartily recommend that you go out and try both yourself. It's not like you can save lots of time, effort, and money just by asking people who already know.

    --------
  • by mouseman ( 54425 ) on Monday September 04, 2000 @10:31AM (#806532) Homepage
    When people say things like "that's like comparing apples and oranges!" it makes me crazy.
    Then you might enjoy this 1995 AIR article, Apples and Oranges: A Comparison [inno-vet.com]
  • by be-fan ( 61476 ) on Monday September 04, 2000 @01:11PM (#806533)
    You're missing several points in your oversimplification of OSs to include only Windows and UNIX.

    There was still some DOS functionality as of NT 4.0, but I believe it was mostly removed for Win2000.
    >>>>>>
    There is no DOS functionality in NT. All DOS programs are run in a virtual machine. That virtual machine is more or less unchanged in Win2K.

    NT is also a microkernel, which means it naturally has some extra overhead in it that Linux's monolithic
    kernel does not. The still-mostly-vapor GNU HURD is also a microkernel. If done properly, the extra
    overhead isn't that much. The question is, did Microsoft do it properly? I don't really know.
    >>>>>
    Yes, MS did it properly. The things holding back NT don't tend to be core-system related at all. It's all the stuff MS added on top that sucks. NT4, for example, is a good bit faster than Linux for most desktop-type operations. However, when you look at Windows 2000 with all the crap they added (Active Desktop and all), you notice it's much slower. NT doesn't suffer so much from core-system bloat and bugs, but from stuff-added-on-top bloat and bugs.

    In any case, it's much easier for hackers to create a stable and speedy core system, while difficult for them to make a good UI.
    >>>>>>>
    Just plain wrong. BeOS: fast, stable, good UI. QNX: fast, stable, decent UI. There are a whole bunch of systems out there that are fast and stable and have good UIs to boot. Even Quartz seems to be pretty fast (in so far as a DPDF system can be).

    On the other hand, it's far easier for a corporate project to make a reasonably good UI, but
    difficult to make a stable and speedy core system.
    >>>>>
    Again, not true. QNX is probably more stable than Linux, and faster to boot (at least according to those who've used it). Again, BeOS is managed by a corporation, and stability and speed aren't exactly high on Be users' lists of complaints.
  • by yoel ( 63192 ) on Monday September 04, 2000 @08:14PM (#806534)
    Okay...fair enough. It doesn't seem at all unreasonable that an 8-way Intel box can beat a Sun or HP box for speed on a CPU-intensive task. I just don't see what the OS has to do with it, really. Beyond handling SMP, the OS's job, in this case, is really just to get the hell out of the way. More relevant considerations might be: is it stable? Does it play well with others? Can I administer it remotely? Sure, W2K comes with a telnet server built in. But Windows isn't and has never been command-line oriented. When I can add a new vhost to IIS via the command line, I'll be impressed.
  • by tytso ( 63275 ) on Monday September 04, 2000 @08:41AM (#806535) Homepage

    It's well written, but will it change any IT manager's mind? A lot of the reason people choose NT as their server is that they're used to Windows as their desktop, so they understand "how to drive it". People who aren't familiar with Unix will find setting up a Linux box with Apache more intimidating than simply clicking a few buttons in NT. And, of course, these folks also don't know what they're missing in terms of reliability. We can try to tell them all this, of course --- and we should continue the efforts to do so. But ultimately, they need to experience Linux/Unix's reliability before they really get it. This is why efforts to retake the desktop are so important, in the long run. We need to make sure that it's not only the elite technologists who can set up a web server or a print server. We need to be able to make it easy even for an MCSE to do it.....

  • by jonnythan ( 79727 ) on Monday September 04, 2000 @10:37AM (#806536)
    Well, the question is..how do we say which tastes better? This apple or this orange? Or _this_ apple or _this_ orange?

    An apple has white flesh with less juice. That's nice. Is extra juice good? How much is too much? Which is better - white flesh that's slightly chalky and uniform, or a thick orange flesh divided into sections?

    THAT'S why we say "that's like comparing apples to oranges." You can compare apples to apples, saying "this one isn't as chalky, and chalkiness is bad" or "this one is juicier, and juice is good in an apple." You can't say "this apple is chalkier than this orange, and chalkiness is always bad, so the orange is a superior fruit."

    Apples and oranges are both fruit; Linux and Windows 2000 are both operating systems. We can compare feature sets, but we absolutely cannot conclude that one is superior to the other because they _are_ as different as apples and oranges on the inside and outside.
  • by Floyd Turbo ( 84609 ) on Tuesday September 05, 2000 @02:22AM (#806537) Journal
    you're dead wrong, and the guy you replied to was right on. Linux is useless without a support contract.

    No. Your evidence does not support your argument.

    Your first point is that your PHB won't approve use of Linux without a support contract. That goes to show that your boss is an idiot, but it says nothing about the utility of Linux.

    Your other point is (or at least appears to be) that the system won't work if the only Linux-knowledgeable employee leaves. That also doesn't show that Linux is "useless without a support contract". A boss less idiotic than yours would insist that the system is documented and handed over properly in the event that you depart.

    I sympathize with anyone who works for a PHB, but "Linux is useless without a support contract" remains complete BS that shouldn't be seen outside the M$ FUD file whence it came.
  • by Money__ ( 87045 ) on Monday September 04, 2000 @10:51AM (#806538)
    Does a company who cares so very little about security belong in your server room?
  • by yzquxnet ( 133355 ) on Monday September 04, 2000 @12:56PM (#806539) Homepage
    Anyone admining a server ought to have enough skills to handle a command line or they need a new job.

    Couldn't agree with you more. In fact, recently in one of my networking classes, one smart-ass tried to 'out-wit' the instructor going over command-line info. (This was a Novell Server class.)

    To the best of my recollection, this is what he said: "Why are we even learning about these archaic, DOS-like commands? Isn't DOS, like, dead? Why would anybody want to learn DOS commands when we can do everything in a GUI? This is pointless." To that comment, I let out an audible chuckle. The instructor quickly countered by asking him to write a Novell login script, or a batch file for the server.

    The really scary part was that roughly 3/4ths of the class agreed with the nut-case. I mean I'm going to have to work with these people who have this point-and-click-can-do-everything mentality.

    Yikes

    YZ

  • by Gothmolly ( 148874 ) on Monday September 04, 2000 @11:05AM (#806540)
    I'm a networking guy, so network setup and configuration plays a big part of my vote for an OS.
    In Linux, to add an Ethernet card and assign it an IP address, you either build the driver into the kernel or load it as a module, then:
    • insmod
    • ifconfig eth0 ip.address up
    Because Linux doesn't care about PnP-esque things, you can then move that NIC anywhere in the system and it will still work.
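    For the curious, here is a minimal sketch of that sequence on a modular kernel of that era. The module name (3c59x, for a 3Com 905-series card) and the addresses are just example values I've chosen for illustration; substitute your own driver and IP:

        # load the NIC driver as a module (3c59x is only an example)
        insmod 3c59x
        # assign an example address and bring the interface up
        ifconfig eth0 192.168.1.10 netmask 255.255.255.0 up
        # optionally point the default route at an example gateway
        route add default gw 192.168.1.1

    No reboot required, and if the alias is listed in /etc/conf.modules (as on Red Hat 6.x) the driver comes back automatically on the next boot as well.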
    If you install a NIC in Win2000 (Professional),
    • Win2000 will find it and prompt for the driver.
    • Then you reboot, assign it an IP address, etc., then reboot again.

    But, since WinX tracks PCI devices, if you MOVE that NIC, it suddenly gets ugly. You have to re-add the driver and re-configure the card, with the appropriate reboots. Then you get a message like "The IP address you assigned conflicts with the IP address assigned to another card. If that card is ever reinserted, a conflict will occur. Do you want to continue?" So Win2000 has some sort of configuration memory, and it's waiting for the NIC to "come back" into the old PCI slot, at which point it will "remember" the old configuration.

    This is all a pain, IMO. I prefer Linux because the OS is almost completely decoupled from the daemons (services) you're running, so that if you need to upgrade your SMB and NMB servers or their configurations, you simply restart them. IANAWin2000 Server guy, but I can't imagine that you can simply change your workgroup name, WINS server info, or heck, the actual SMB server code, without a reboot. The same goes for Apache/IIS.
    Under WinX, all the system services are too integrated with the system itself. No wonder that my Win2000 Professional system is using 60MB at startup, without any apps running. Linux provides more of a "base platform" to run stuff ON, while Win2000 seems to assimilate your environment and daemons into one sort of ueber-OS.
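    To make the earlier point about restarting daemons concrete, here is a sketch of a Samba reconfiguration on a Red Hat-style init layout; the paths are assumptions and vary by distro:

        # edit the workgroup, WINS server, shares, etc.
        vi /etc/smb.conf
        # restart only smbd/nmbd; the rest of the system keeps running
        /etc/rc.d/init.d/smb restart

    The OS itself never notices, and the same pattern (edit the config, restart the service) applies to Apache, sendmail, and the rest.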

    That all being said, it's a wonderful improvement over NT Workstation - USB support, IRQ sharing, multiple monitors, FAT32 support, while still retaining the NT-style security and full 32-bitness.

  • by kruczkowski ( 160872 ) on Monday September 04, 2000 @09:37AM (#806541) Homepage
    I have Win2K on my laptop, an IBM ThinkPad 600; it runs very nicely. As for servers, at work we run three boxes (each a dual 500 MHz Xeon with 1 GB RAM), one each for the PDC, BDC, and Exchange, all running NT 4 with SP5. It is amazing that they run: no crashes. But I am scared as hell to touch them. I was going to upgrade to SP6, but then I said NO. Good thing I didn't; who knows if they would survive? One thing I don't like about NT 4 is all the silly permission crap. You have a permission for every damn thing, and to get something to work you have to play with it for an entire day just to get your FTP or WWW server up with authentication. That's why I don't like NT. Yes - I am an MCSE.....
  • by zlite ( 199781 ) on Monday September 04, 2000 @08:38AM (#806542)
    I'm sure there are other differences, but that one sort of jumps out at ya...
  • by shin0r ( 208259 ) on Monday September 04, 2000 @02:25PM (#806543) Homepage

    I've used 2K Pro since about mid-January 2000. I've also used (in the last year or so) RH 6.1, SuSE 6.3, Caldera "e-desktop" (is this a *real* Linux distro?) and TurboLinux 6.0. Here are my observations and conclusions.

    1:) 2K seems to crash more often than any of the nix distros I have installed, though it doesn't take out the whole OS, which is a refreshing change in an M$ environment. This may well have a lot to do with all the crap I install, though.

    2:) If my house experiences a power cut, and my UPS fails, 2K can be rebooted without any heartache. IME, Linux (any distro) tends to fall on its arse. This will be disputed I am sure, but as I said, its all IME.

    3:) When I download a file, I want to click it, and it installs. I don't want to have to type "gcc" this or "tar -xvf" that, then make, and so on (see the sketch after this list). Especially not when I come home from the pub.

    4:) As a server OS, yes, I see the advantage of a nix distro. It is efficient and stable, and will run on most "old" hardware without too much trouble. However, as a workstation environment, I prefer 2K, tbh. It "feels" softer, more malleable. I *know* it isn't in real terms, but no matter how much I tweak GNOME or KDE, the "feel" isn't quite there.

    5:) It's nice when my box stays up for months at a time. But as a workstation environment, it's not critical to be honest. 2K on this box stays up for weeks at a time without hassle, and that satisfies my needs. I guess if I was running a leet 0-day juarez ftp, I would want the box to be up for years on end, but i'm not.

    6:) Quake 3 Arena runs better under 2K (with the latest Voodoo drivers) than it does on my nix distros. Perhaps it's me being lame, but that's what I have observed.

    7:) Having grown up to use the paradigm of the win(32) environment, 2K feels natural and familiar to use. This, I should imagine, is part of the reason many sysadmins choose the win* route over *nix. It's comfortable, point-and-click computing.
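    (Regarding point 3 above, the from-source install routine being complained about usually looks something like the following sketch; the package name is made up purely for illustration.)

        tar -xzvf someapp-1.0.tar.gz    # unpack the source
        cd someapp-1.0
        ./configure                     # generate a Makefile for this system
        make                            # compile
        make install                    # install, usually as root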

    I suppose a direct comparison between the two OSes is a bit ambiguous; it depends on what you use it for (or what your users demand) in a real-life situation. Also, are we comparing 2K Advanced Server as a web server against nix/Apache, or 2K Pro against, say, Red Hat 6.2 as a desktop OS? Both are scalable, to a degree, and both depend on how *you* set them up with regard to stability and security.

    I have no loyalties to either camp. If it's not broken, don't break it, I reckon, which is why I will stick to 2K as a workstation environment. Cheers :::: /////NOMEX flame retardant posting pants \\\\\ = ON

  • by mayneMC ( 229642 ) on Monday September 04, 2000 @08:10PM (#806544)
    I have been using Linux since '96 and love it. Just recently I bought Win2K, with some extra cash, for my wife to have something to use at home that she can understand (she hates Linux, but she loves fetchmail ;O)). Well, I got it home and started the install. It was brainless. No problems. So far so good. (Oh, I guess you wonder about my home system: dual Celeron 366 (overclocked to 450), 160 MB RAM, 8.4 GB HD, Voodoo3 3000, BP6 motherboard.) After the install I was surprised to find that a stripped-down version of IIS comes with it (FTP, SMTP, etc.). I then configured it for my home network (PMFirewall/MASQ/PORTFW/etc. running on RH 6.2 and various other Linux workstations). I got it to work with Samba; at this point I am hating myself for having pleasant feelings for M$. I grabbed some games I run on my work laptop (Win98) and installed them, thinking that this would BSOD it; they worked just fine? I have had the system running for the past 5 days with no problems, yet. My wife has been checking her e-mail (manually ;O)), surfing the internet, and doing various other tasks just fine. I took my Linux Q3A and used the hack from Loki's website and got it to run on Win2K just fine. The graphics were ten times better than in Linux on the same machine (Linux needs better drivers). I hate to say it, guys, but I am very impressed, so far.

    Bottom line: Win2K is kinda OK. Hate the price.

    Still love Linux and all of it's beautiful free complexities. Linux is still a better tool by a million miles.

    --mayneMC
  • .
    they're used to windows as their desktop, so they understand "how to drive it".

    I have historically always used Windows with a C:\bin\ directory full of Unix-flavor utils. Recently, I switched over to Linux on my desktop for one reason: my servers were running it, and I wanted to be immersed in the same environment.

    Ironically, one thing that may really help to "sell" Linux as a server OS will be Linux as a desktop OS. Once I started using Linux on my desktop, I was freed from having to be so cautious about what I did to the configuration (when you're dealing with a $15k to $80k Solaris or AIX server with mission-critical data, you can't muck about "just to see what happens").

    The vast majority of people who are NT sysadmins (in my experience, and not *all* fit into this category) are actually just Power Users who are admining a small LAN in an office with fewer than 30 people. Get those Power Users to use Linux, and NT's market share will fall faster than Novell's did... but they are only going to use Linux when it can do "neat things" on their desktop.

    I'm not sure who will wind up being more influential in getting Linux into businesses - Dell or Loki.

    --
    Evan

  • by gothic ( 64149 ) on Monday September 04, 2000 @10:43AM (#806546)
    Here's the setup:
    RedHat 6.0 on a Sparc 10 (that's a single 50 MHz(?) processor) with 64 megs of RAM.
    Win2k Adv Server on a Compaq dual P-Pro 166 with 128 megs of RAM.

    On the Win2k machine, we *only* do web. Therefore there is no sharing; F&P is removed, Active Directory is removed, and 99% of the sites have FrontPage installed. We also have PHP installed for one website, and Perl available to all. There is a total of about 50 sites, all very small: 2-3 pages, 1 to 5 pics.

    On the Linux machine, we do web, mail, shells, and other management stuff. There are about a hundred personal websites, and about 75 more commercial sites. None of the sites on here are small, per se. For example, we host the official Camaro SS website, some government websites, and many normal business sites. Most sites are 20+ pages, with over 30 pics to play with. To me, that's a decent size, maybe not to everyone. PHP and Perl are, of course, available and widely used.

    On to performance:

    How about initial connect time? The Win2k box literally takes up to 3 seconds to start sending you data, while the Linux box takes 1-2 seconds (mail beats up this little machine =]).

    Response time, post initial connection: the Win2k box and the Linux box are both pretty quick, usually next to instant, though the Win box seems to take slightly more time.

    Stability: Not bad at all. We had initial problems with the Win2k box, but that was from Active Directory and that 50+ IP bug. We removed AD and have the latest patches/updates, and now the box is firm. The Linux box is the same: all the latest RH updates, and it never goes down, nor has it had real problems.

    Annoyances in Win2k: Not having a decent way to admin FP-enabled webs without the FrontPage program installed, or using the silly CLI util. MMC likes to quit responding and lock. Easy to fix, but really annoying.

    Annoyances in Linux: Er..Uhm..*thinks*..I kinda like Linux, no complaints.. =] Then again, it'd be *really* annoying if you didn't like the CLI.

    Conclusion: If we hadn't gotten Win2kAS for free, I wouldn't have considered running it. After using it and learning about it, I still wouldn't consider running it if I had to buy it. I like the low overhead Linux can offer, and I feel it offers better bang for the lack-of-a-buck. I can't comment on how Win2k would be on a nice, fast, expensive machine, but ISPs aren't the best place to go for the latest and greatest machines. I'll stick with my Linux machines, thank you. Win2k was interesting to play with, though, but so is everything new. =]
  • by Floyd Turbo ( 84609 ) on Monday September 04, 2000 @12:39PM (#806547) Journal
    Linux is free but useless without a good support contract

    Excuse me, but this statement is absolute BS. (And I won't even get started on how nonsense like this gets moderated as "informative".)

    I've never paid a penny on Linux support, much less "100K$/year" to hire a sysadmin. The few problems I had that couldn't be solved by RTFM'ing and checking HOWTOs were quickly fixed by asking questions of Linux users, whether in person, on newsgroups or in IRC.

    "Total cost of ownership" for a small network providing basic (and a few other) services over the net and a LAN: ZERO.

    Cost if I'd had to pay M$ prices, plus hire some MCSE to figure it out for me: several thousand dollars more.
  • by account_deleted ( 4530225 ) on Monday September 04, 2000 @09:04AM (#806548)
    Comment removed based on user account deletion
  • by gnugnugnu ( 178215 ) on Monday September 04, 2000 @08:48AM (#806549) Homepage
    I hope we can manage some sensible discussion on this. Don't be surprised if the posts contain a Linux bias (but you would not have posted on Slashdot if you did not expect as much).

    I can only say I have crashed Windows 2000 several times, but Microsoft has implemented the "not my fault" system so prevalent in Linux.

    W2K still has lots of program crashes (Netscape, for example); they just don't take the whole operating system with them, and so it's not Windows' fault. You get to blame the specific program, much the way you hear Linux users complain about X or Netscape but rarely blame the OS.

    Win2K adds all the annoying advertising and stupid, frilly, waste-of-space animations and effects of Win98 and Office 2000 (thankfully they can be turned off, but I can't seem to change the defaults).

    RealPlayer, however, has repeatedly caused my Win2K to totally lock up, never to recover. I guess what MS always said about third-party drivers being at fault actually has an element of truth.

    I have not checked out the game support yet.
  • by Anonymous Coward on Monday September 04, 2000 @09:25AM (#806550)
    And to people saying Windows 2000 crashes for them, you're either doing something wrong, or you're installing "crap"

    Why would anyone want to use an OS where you shouldn't install software because of fear of crashing the OS?
  • by The Man ( 684 ) on Monday September 04, 2000 @09:32AM (#806551) Homepage
    If you want to make a reasonable comparison you need to look at the fundamentals not just the IT-mangler-friendly checkmark sheets. I'm not going to make any direct comparisons with Linux in particular, but rather with Unix in general for the situation interesting to me. Specifically, an academic environment with about 80 workstations, a handful of servers, and about 1500 users.

    The single biggest headache regarding w2k is that its multiuser capabilities have not advanced one iota since DOS 1.0. While Terminal Server is included, and provides the necessary core functionality to allow multiple users on a single server, thus earning the check mark, the actual implementation is a nightmare. Why? Simple - the evil all-consuming Registry. Everything wants to touch it, but if you allow it then you lose 100% of whatever security you may have had. Security vs functionality is a traditional tradeoff, but this is insane. You can have a little of either, but none of the other. The simple fact is that Microsoft has no concept of how to design a multiuser system. Instead of allowing each user to customize various aspects of application behaviour with small text files in their home directory, much system behaviour is controlled instead by a single central repository. Fundamentally flawed design, plain and simple.

    The 40 systems in our lab that don't run Unix converted from using NT server to w2k server over the summer break. It's been nothing short of a nightmare. Half the applications used are either broken or spew errors. Our beautiful unified dos/unix print quota system broke because w2k refuses to authenticate for samba (as usual, another release from Microsoft containing enough changes to intentionally break competitors' products). Active Directory trashes our DNS zone files, making them unmaintainable and routinely breaking mail and NFS. The list of problems goes on and on...

    Microsoft has conclusively demonstrated that the only sane upgrade from NT4 is Unix. Don't buy the hype. w2k may crash less than its predecessors, but the headaches involved with it are no less numerous or severe. If you don't like Linux, use one of the BSD flavours. Microsoft is just not an option.

  • Not really.

    Venus is the second planet from the Sun and the sixth largest. It orbits at 0.72 AU and is 12,100 km in diameter. It has a mass of 4.9 x 10^24 kg. It has a day that lasts 243 Earth days. The average temperature is about 740K. Venus' surface looks a lot like the American Midwest. Most of the planet is covered in lava flows. Venus has no satellites.

    Mars is the fourth planet from the Sun and the seventh largest. It orbits at 1.5 AU and is 6,800 km in diameter. It has a mass of 6.4 x 10^23 kg. The average temperature is -55C, but it gets as high as 27C (80F!) during summer. Its surface area is approximately the same as the land area on Earth. Short of our own planet, Mars has the most interesting terrain of any of our planets. There is excellent evidence that there was, at one point, water on Mars. It has ice caps at either end of the planet, made of carbon dioxide. It has two satellites, Deimos and Phobos.

    That wasn't so hard. When people say things like "that's like comparing apples and oranges!" it makes me crazy. Apples are red, and about 90% of the size of an orange. They're covered with a thin red (or green or yellow) skin, and have white flesh. Oranges are covered with a thick skin, orange in color, and have orange flesh that's divided into sections. An orange generally contains more juice than an apple.

    See? That worked out, too. Now, let's see if we can get a decent comparison of Linux and Win2K.

    -Waldo


    -------------------
  • by afc ( 12569 ) on Monday September 04, 2000 @09:40AM (#806553) Homepage
    All this talk of support costs making the TCO (argh! I hate having to resort to marketing jargon) of a Linux solution being higher than that of a similar Windows one is ingenious, but patently false and bogus.

    Believe it or not, Windows 2000 also needs skilled administrators. Believe it or not, Linux (or Un*x in general) sysadmins are not rocket scientists. Believe it or not, both the Windows and the Linux sysadmin earn more or less the same amount of money (close enough as to be irrelevant for a large company).

    So, in the end what's left is marketing perception, or that warm, fuzzy feeling inside that some IT managers get from having all their IT solutions coming from a single vendor (be it Microsoft, Sun, IBM or Unisys). It's not economical, but rather psychological. And it is this perception that Linux companies have to tackle in order to gain marketshare. And believe it, that's what they're doing, albeit with very small strides.
    --

  • by Restil ( 31903 ) on Monday September 04, 2000 @09:24AM (#806554) Homepage
    This may be a bit off topic, but it does fall into the windows vs. linux debate as far as ease of use is considered.

    An acquaintance of mine is starting an online e-commerce site and decided to use Win2K over Linux since "it's a naturally graphical environment and therefore he can use it, whereas he simply CAN'T use anything with a text interface". Of course, Linux has GUI capabilities, but let's assume for a moment that it didn't.

    For an entire month, he spent every spare waking moment trying to get several e-commerce packages working on Win2K, spending many hundreds of dollars in the process and many hours on the phone with tech support. He even got so desperate that he came to my house and banged on my door at 3 am screaming for help because he couldn't get it working (I couldn't either, for that matter, but I wasn't foolish enough to spend a month trying).

    What I find somewhat depressing in this regard is that he had also obtained a Linux-based e-commerce package, managed to install Red Hat all by himself, install the software, and get it working without ANY problems, but chose instead to run Win2K because he didn't know enough about Linux and didn't want to spend the time learning it.

    I find it quite humorous that he could have learned quite a lot in those 30 days. I really think this is the mentality that pervades many people in a position to choose between the two.
    Pity.

    -Restil
  • by rabababoa ( 86240 ) on Monday September 04, 2000 @09:17AM (#806555)
    I have worked with both VERY extensively, from playing with every gadget Win2K Server offers to making my own Linux distribution for a standalone product (to be revealed in the future).

    Windows is meant to be pretty. It accomplishes that. And it's meant to use **the right software**. I run Windows 2000 Professional on my home machine, and have since the early beta days, and it HAS NOT CRASHED. You have to treat it properly, i.e., not install shit software. When you take the NT kernel and play the game the way it likes, you will be successful.

    I guess it's possible to say roughly the same thing for Linux, except that things are more clear-cut. Linux/BSD (don't forget about BSD) can do everything that Win2K can do, with the exception of running Windows binaries perfectly (by perfectly I mean executing the code as it was meant by the developer).

    Personally, my servers are Linux and BSD, and my workstations are Win2K. It's all about the sysadmin creating a solution to meld the two, which I have found to be extraordinarily easy and fruitful. Why not Linux/X on the workstations? Why the hell would that be a good idea? X crashes, Netscape crashes. In Win2K, using it for over a year, my Explorer has NOT crashed, and IE has NOT crashed. I've used both extensively, and it's much better to pay for Win2K ($150 for an OEM client). It's chump change compared to how much you will save in support costs.

    BTW, Windows 2000 has great multimedia support. Plus support for dual processors makes it even better.

    To recap, I am basically saying that when you have a REAL system administrator (NOT an MCSE, NOT certified, hell, no college), someone that knows things inside and out and can get things done, either solution works. It's all about the needs, and what OS third-party applications are made to run on.

    And to people saying Windows 2000 crashes for them, you're either doing something wrong, or you're installing "crap" (i.e., Netscape, RealPlayer, etc.). Yes, there's no point in Netscape when IE renders better, renders faster, is built for the OS, and does NOT crash in Win2K. The only reason to run Netscape is to show your support for it, and it's too bad nobody cares anymore :)
  • by Fervent ( 178271 ) on Monday September 04, 2000 @02:35PM (#806556)
    This was posted by an AC under one of my posts, and I thought it deserved more of a spotlight, so here it is again:

    I agree - I am nothing but impressed with W2K Advanced Server. I have extensive experience with Solaris/HP/AIX in a production environment, and we have been playing with W2K recently.

    I decided to port some production Solaris code to W32 to do some 'real world' tests. These apps take large (12+ GB) files from a mainframe and process them for a data warehouse. They are C++ programs that do file processing --- file in (read) --- manipulate the data (process) --- and file out (write) (the port was simple - no code changes). They are very processor intensive (not so much disk).

    We have an 8-way P800 (for W2K AS), a 24-way E10000, and several K-class HPs. I moved the data files to the W2K box and ran the fileproc app 8 times at low priority, each instance working on a different file at the same time. This used 100% of the box. Because it was running at low priority, all other box functions worked beautifully. You could not tell they were running from a system perspective. So far, just like HP-UX or Solaris from a scheduling perspective. A single-file test on NT4 Server has the same result - the scheduling piece is not new...

    The good news is that the W2K box ran all 8 programs in 2 hr 47 min. The same result took 5 hr 56 min on the E10000 (6 months old - 100% of 8 procs) and 14 hr 45 min on HP-UX 11 on a 6-way K. Couldn't test the AIX box :( I am predicting the results would be similar to the K.

    Pretty telling - we are now in the process of moving all our mainframe file manipulation software and reporting to W32. The current plan is to sell the 10000 and buy another 8 way or possibly Datacenter. The money saved on hardware is incredible.

    BTW - for fun I tried this on a Linux box (RH 6.1), but the SMP and filesystem problems (it can't handle files that big) prevented any sort of real tests from working. It did work on a 2-way box, but the results were uninspiring (one program took 4:23) on a cut 2 GB file. Same code - all optimized for the platform and processor.

    And for those that haven't used Winders since 3.1 (most of Slashdot) - I can have a terminal from anywhere in the world to do remote admin. All the arguments are gone, guys - I seriously believe the days of UNIX are numbered. Please check your own facts - ours are strong enough to phase Unix out of our shops in the next 12 months.
