Businesses The Almighty Buck Technology

Does IT Matter? 363

Posted by Cliff
from the is-the-sky-blue? dept.
geoff313 asks: "I'm sure many of you are aware of the uproar over Nicholas Carr's article 'IT Doesn't Matter', published in the Harvard Business Review back in May. While many big names in the IT world have already responded to Carr's article (Ballmer has declared it 'Hogwash' and Fiorina has pronounced it 'Dead Wrong'), Carr debated vendor executives Monday at the Comdex trade show, proving that the issues he raised are still resonating through the industry. Do you feel that corporate IT budgets should focus on cutting-edge technology to best serve their customers' needs, or should they focus on shoring up what they have now in order to maximize its usefulness to the customer? Some background can be found from the Washington Post, InfoWorld, and ZDNet, as well as at Nicholas Carr's site."

"For those of you unfamiliar with his philosophy, it can be summed up pretty thoroughly by his statement 'Follow, don't lead,' arguing that the huge advances in the IT industry over the last two decades have erased the strategic advantage to be had by corporations staying at the cutting edge of technology. In short, he advises that 'executives need to shift their attention from IT opportunities to IT risks - from offense to defense.' Of course the head honchos at IBM and Microsoft disagreed with him, citing Wal-Mart's use of RFID tags to keep track of inventory and other forward-thinking IT decisions as a refutation of his thesis.

What I am interested in is the opinion of those in the trenches of the IT war."

This discussion has been archived. No new comments can be posted.

  • Ballmer (Score:5, Funny)

    by glassesmonkey (684291) * on Wednesday November 19, 2003 @07:41PM (#7516163) Homepage Journal
    Developers.. Developers.. Developers.. Developers.. (Thanks Steve for the millions of smiles)
  • Just do it . . . (Score:5, Insightful)

    by bob_calder (673103) on Wednesday November 19, 2003 @07:43PM (#7516178) Homepage Journal
    Maybe people should concentrate on doing what they really have to do, and do it well. If it happens to use a computer, fine. Clay tablets might work just as well for some applications.
    • Losing strategy (Score:5, Insightful)

      by t0ny (590331) on Wednesday November 19, 2003 @08:43PM (#7516611)
      Personally, I've always viewed reactionary, defensive strategies as losing strategies. Only by having an offensive (no pun intended), proactive mindset will people generally succeed. Intelligence and creativity aren't defensive traits.
      • I intended a Zen (Score:5, Insightful)

        by bob_calder (673103) on Wednesday November 19, 2003 @09:15PM (#7516785) Homepage Journal
        approach to the issue. Neither aggression nor defense. Just do what needs to be done. I can't tell you how much the misuse of analogy in my industry affects me. People don't just go wrong. They do it spectacularly by thinking that life is an analog of *insert the name of a sport here*. Obviously things in life are similar, but they are separate and should stay that way. Men and women, stuff like that. :-)
        • by jedidiah (1196)
          If you can have a better mousetrap built, then you better do it before your competitors do. That's really all there is to it. Some of the examples cited (such as Walmart) are quite good examples of companies seeking out this "better mousetrap" and using it to great competitive advantage.

          • by 0x0d0a (568518) on Thursday November 20, 2003 @12:30AM (#7517692) Journal
            If you can have a better mousetrap built, then you better do it before your competitors do.

            I disagree.

            This may hold water if you're a mousetrap manufacturer. If it's your core product at stake, sure. But IT is a support system, almost always a cost center. The question is whether the latest-and-greatest from HP, Sun, and Oracle is really worth latest-and-greatest prices. Is having a product that's two years newer really going to save your company as much as you're dropping on it?

            The huge gains that came from computers were when, say, Levi decided to computerize inventory management. Now it's computerized. They don't have to worry about warehousing junk where it isn't needed or paper mistakes costing them huge sums of money. Are they going to save more money by using a new HP system with WhizBang-3d-with-sound GUI analysis instead of IBM's old system? Maybe some, but nothing like the gains that have already been realized.

            Furthermore, even if the IT spending is worthwhile, spending it on *products* may not be a good idea. If a CIO decides to drop five million dollars on software licenses, that's 100 man years of IT work that he's just exchanged for latest-and-greatest. If he does this every five years, he's losing 20 support personnel. Twenty people can get an awful lot of work done.
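            The trade-off described above can be made explicit. The roughly $50,000 fully-loaded cost per IT man-year is an assumed figure (the post never states the rate), but it is the one that makes the poster's numbers work out:

```python
# Back-of-envelope math for the licenses-vs-headcount trade-off above.
# The $50k fully-loaded cost per IT man-year is an assumption, not a
# figure given in the original comment.
license_spend = 5_000_000        # dollars dropped on software licenses
cost_per_man_year = 50_000       # assumed cost of one support person per year
refresh_cycle_years = 5          # the CIO repeats the purchase every 5 years

man_years = license_spend / cost_per_man_year   # IT work traded away
headcount = man_years / refresh_cycle_years     # ongoing staff equivalent

print(man_years)   # 100.0 man-years
print(headcount)   # 20.0 people
```

            At a different assumed salary the headcount shifts proportionally, but the point stands: a recurring license bill is an ongoing staffing decision in disguise.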

            Walmart moving to RFID tags could save a *lot* of money, because they eliminate a lot of human work. However, there are two reasons jumping ahead like this ain't necessarily a great idea. First, RFID is a potentially big money saver, but advances like it also don't come along very often. Second, a lot of IT purchases and decisions turn out to be *bad* moves. If you let your competitors lead, and let them soak up the costs of development, only copying them when they do something that works well, yes, you lose a bit of lead time. You might have to absorb some losses. But you also gain a lot of money, and have time to let competing vendors enter the market space.

            This is one reason why a lot of successful big companies are pretty conservative. Microsoft doesn't actually try very many new things for a tech company (and when it does, it tends not to do very well). Microsoft does *much* better by sitting around, waiting for masses of tiny companies to try various things, and then buying the one or two that succeed. Sure, they have to pay a hefty price for the one, but they let hordes of VCs fund the development and testing, rather than having to do it themselves. Most of Microsoft's primary products were originally developed by other companies that were then acquired.
            • by squaretorus (459130) on Thursday November 20, 2003 @03:39AM (#7518194) Homepage Journal
              All IT students should be forced to work in retail for a year, manufacturing for a year, and in business administration positions for a year before touching a keyboard. They'll see the latest systems from HP / Oracle etc... being used in the workplace to marginally improve processes while the wastage still just keeps on coming.

              But IT is a support system

              I would argue against this basic assumption of yours. I want IT to replace the majority of the management tier within Walmart etc... because the majority of the lower management tier in Walmart etc... are absolutely shit at their jobs.

              Example? Throughout my 4 years at university I worked part time (full time out of term) for a major supermarket. Every Christmas for four years I spent a majority of my time in the store apologising for the lack of milk, beer and bread. I had access to central inventory systems and could see vast quantities of the stuff sitting in the distribution centres, and I could see our bakery continuing to work an 8-hour day when we could have sold a 16-hour day's worth of bread.

              When approached, management delivered a condescending 'it's more complex than that, kid' or a more honest 'I don't want to be stuck with any of it cluttering my warehouse after the new year'. They would rather sell 50% LESS than have a 1% stock holding after the peak season. And they were happy to piss off their customers at Christmas. Shit at their jobs.

              How many indicators does a decent IT system need to do a better job? The checkouts tell you that the last beer went through at 2PM every day, that the last bread went through at 11AM. The stockholding is shown as 0 for > 25% of the week.
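              The indicators described above amount to a very simple rule. A sketch, with hypothetical field names and an assumed 10 PM closing time (neither comes from the post):

```python
from datetime import time

# Assumed store closing time; not stated in the original comment.
STORE_CLOSES = time(22, 0)

def likely_stockout(last_sale: time, zero_stock_fraction: float,
                    early_cutoff_hours: float = 4.0) -> bool:
    """Flag a product as under-ordered if sales stop well before closing,
    or the system reports zero stock for a large share of the week."""
    hours_before_close = (STORE_CLOSES.hour + STORE_CLOSES.minute / 60) \
                       - (last_sale.hour + last_sale.minute / 60)
    return hours_before_close >= early_cutoff_hours or zero_stock_fraction > 0.25

# The poster's examples: beer sells out at 2 PM, bread at 11 AM.
print(likely_stockout(time(14, 0), 0.30))   # beer  -> True
print(likely_stockout(time(11, 0), 0.40))   # bread -> True
print(likely_stockout(time(21, 30), 0.00))  # healthy line -> False
```

              Two fields the checkouts already record are enough to flag the problem; acting on the flag is the management problem the thread goes on to discuss.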

              Some minor systems are in place to handle these things - but they are crude. (Example - a few years back my local store in Scotland took delivery of 3000 England replica football shirts: 28 sold, and the rest were returned.)

              By solving some of these issues you can kill off a huge number of overpaid, underqualified, underskilled smug shits and have a BETTER company. IF the IT graduates get the idea. So - 3 years in the workplace should be mandatory prior to an IT degree of any kind. Unless you're planning to become a bearded AI researcher, in which case please yourself!
              • by TheLink (130905) on Thursday November 20, 2003 @09:49AM (#7519591) Journal
                But that's a management problem, not an IT problem.

                As you said, the info is all there. But people are just refusing to act on it, perhaps because the perceived danger/punishment for holding 1% excess stock is considered worse than the reward for selling more. They may be rewarded more for having no excess stock. Which indicates a management issue.

                It doesn't matter if a ship has tons of sophisticated systems, if the captain overrides all of them and slams into an iceberg.

                On the flip side, pilots have done amazing things despite sophisticated systems failing.

                Look at HP, just a single person (Carly) can do more damage to HP than tons of IT can ever fix or prevent.

                So while IT matters, for most organisations it really doesn't matter that much, nor should it. Accounting and Finance probably matters more. Good bosses and leaders matter far more.
            • by Bi()hazard (323405)
              I disagree with your analysis of Microsoft. They do try to innovate, and they don't just sit around waiting for the next great mousetrap to become a success. They build the better mousetrap before their competitors do because they buy companies out before their products are fully developed. Microsoft still leads in market penetration with their products because they do the second half of the development.

              The VC funded startups come up with a superior core trap mechanism, but it's Microsoft that turns it in
    • This also extends to the lunacy of upgrading software products/programming languages often.

      How many people seriously need something like J2EE or .NET for software development of their data processing systems?

      How much gain is there really to be made from switching a system from ASP to ASP.NET? OK, there are advantages, but disadvantages too. Rewriting means shifting your skill base - forcing people to learn new stuff, and they lose the experience. After 10 years of COBOL, I used to be able to virtually code

  • by Matey-O (518004) * <michaeljohnmiller@mSPAMsSPAMnSPAM.com> on Wednesday November 19, 2003 @07:44PM (#7516186) Homepage Journal
    Is now into department website activity usage and intrusion detection... There wasn't a whole lotta that going on in 1997, and your business is pretty hosed without SOME attention being paid to security inside and outside your business walls.
  • IT... (Score:5, Insightful)

    by QueenOfSwords (179856) on Wednesday November 19, 2003 @07:46PM (#7516198) Homepage
    Is the business equivalent of Perl. It Makes Stuff Work. Worrying about IT isn't the right approach. Businesses should decide what they actually need to stay competitive.. and deploy that using only what IT infrastructure they need. IT's a means to an end. It DOES matter, but it's wrong to view it as an end in itself (and hence, an 'issue').
    • Re:IT... (Score:5, Insightful)

      by Anonymous Brave Guy (457657) on Wednesday November 19, 2003 @08:07PM (#7516388)

      Absolutely agreed. My first reaction was that the answer to this:

      Do you feel that corporate IT budgets should be focusing on cutting edge technology to best serve its customer's needs

      was this:

      No, corporate IT budgets should be focussing on technology to best serve the corporation's needs.

      Whether or not it's cutting edge is irrelevant; what matters is whether it does the necessary job. Whether it serves the customer's needs is irrelevant (to the corporation) too: there are lots of in-house needs that can be helped with good use of IT, and serving the customer's needs is certainly good business sense, but in business you do it for that reason, and not as an end in itself.

      • Whether or not it's cutting edge is irrelevant; what matters is whether it does the necessary job. Whether it serves the customer's needs is irrelevant (to the corporation) too: there are lots of in-house needs that can be helped with good use of IT, and serving the customer's needs is certainly good business sense, but in business you do it for that reason, and not as an end in itself.

        Well the problem with that statement is that it can make your company stagnant. For example, about 10 years ago the comp

    • Re:IT... (Score:4, Funny)

      by Anonymous Coward on Wednesday November 19, 2003 @09:09PM (#7516759)
      I say fuck IT.
    • Re:IT... (Score:3, Interesting)

      by sowellfan (583448)
      I'm an engineer (I design air conditioning systems), and as far as the manufacturers I deal with are concerned, this article is on the money. We've got an entire wall of catalogs of assorted types of equipment, and most of them almost never get updated. Even the common brands like Carrier & Trane that have dedicated reps in our area fall behind. I tend to specify the equipment that I can get the best support for, which includes giving me access to the newest catalogs, the most updated product selection
  • Sounds like sound bytes for neophytes; commerce drives the IT pony, always has, always will. Perhaps to the chagrin of Messrs. Ballmer & co, who'd like to think that they drive the 'supply & demand' pony.
  • Does IT Matter? (Score:2, Insightful)

    by Anonymous Coward
    It does not, if you can do without these:
    1. Send e-mail and instantly communicate with IM services
    2. Pay bills and manage finances online
    3. Get lots of information about anything you can think of within seconds.
    4. Manage every aspect of your life (jobs, health, you name it) with the help of technology.
    • Re:Does IT Matter? (Score:3, Insightful)

      by ccoakley (128878)
      Please, read the article. He is saying that IT doesn't matter in the same way that the rail or electric industries don't matter. You pay your electric bills to keep the electricity flowing, but most companies should not be investing assloads of cash into, say, alternative energy sources. Instead, when an alternative energy source comes along that is better than what you've got, switch.

      As a company, I do not need to invest heavily to get any of the advantages you list. They are a commodity... they don't mat
    • Re:Does IT Matter? (Score:3, Interesting)

      by michael_cain (66650)

      Let's consider each of these in a business context, which is the article's setting.

      Send e-mail and instantly communicate with IM services

      No question that this is useful, although I have to admit that there are some downsides to having senior management with e-mail at their disposal -- since they can send a message "instantly", they expect an answer "instantly", even when that answer requires some days of effort to obtain. But to the article's point, you DON'T need the latest and greatest hardware to

  • If the next president believes in a real high-speed network, I.T. will become more important, because everything, including HDTV and phone service, will go through that one fiber.
  • by drinkypoo (153816) <martin.espinoza@gmail.com> on Wednesday November 19, 2003 @07:47PM (#7516211) Homepage Journal

    When the question is whether to boldly lead, or cleverly follow, the answer is always both. You lead where you can, where you have opportunities to, because your IT department taking some initiative in expansion means that you can grow the business above it. You have resources, products, and customers, and IT sits in between all three of them to some degree, and makes them possible, just as your maintenance department does. After all it's kind of hard to have meetings if the lights are off, right? And it's hard to do business when you can't get to your databases, or if your customers don't know about your products, or whatever else that isn't possible without IT.

    The solution is always to strike a proper balance between expansion and consolidation in all of your departments, lest they grow too large and consume too much of your resources, or fail to grow enough to keep up with the rest of the company. It doesn't matter if we're talking about IT or R&D.

    • The reason IT is in and not out is a company's need to outcompete its rivals. IT is the cutting-edge capability in competition right now. As basic as it sounds, the fear of another company coming up with a competitive edge beyond your own will drive the need for IT until a greater edge is found. Until something replaces IT as that edge, IT will continue to be king.
  • RFID tags (Score:3, Insightful)

    by henryhbk (645948) on Wednesday November 19, 2003 @07:47PM (#7516214) Homepage
    The "huge advance" of RFID tags has yet to demonstrate a large competitive advantage. Although the presumed benefits of tighter inventory tracking should result in some cost savings, it has yet to be shown that it will either revolutionize Wal-Mart (I mean, how inefficient is their current UPC laser scanner tracking?) or lower costs to the consumer. You can get a lot of mileage out of a high-school student at minimum wage with an Intermec scanner... This harkens back to the debate of fancy tape robots vs. high-school students to flip tapes... (the students tended to jam less often, but could get hung over)
    • Re:RFID tags (Score:3, Interesting)

      RFID might work at UPS. I used to load trucks there while in school.

      Every box gets scanned coming into circulation, entering the warehouse, being loaded into feeder trucks, coming out of feeder (semi) trucks, going into delivery trucks, and then when delivered by the brown-shorts.

      Every time the boxes get scanned (at each event listed above), it is by some sucker in the Teamster's union. Think Jimmy Hoffa. These guys make upwards of $9.50/hr, and get health/dental insurance.

      UPS will develop their RFID
    • Re:RFID tags (Score:2, Interesting)

      by Cramer (69040)
      Odd, I've never seen a robot drop, mis-feed, or jam a tape. And they cost far less, and work far more hours than a high-school kid.

      UPC codes take some work to scan. A smudge or bend makes it hard to read. And for self-checkout, it's much easier for people to put something through a hoop than it is to get them to find and align the UPC code for scanning. Don't laugh... I've seen people too dumb to scan their own items. (Personally, I'm too fast for the self-scanner. Gimme the real register.)
  • by crmartin (98227) on Wednesday November 19, 2003 @07:48PM (#7516216)
    Like most of these things, the answer to the question is not "yes" or "no". Having the best new technology doesn't matter: lots of companies are still running happily on one variant or another of the IBM 360 architecture.

    What does matter is that some business models that work don't work unless you have the right (new, or new-ish) technology: you can't have an Amazon.com without advanced web systems, or you can't have it feasibly and cost-effectively.

    On the other hand, having a new 20-inch iMac on every desktop doesn't much matter. (Drat.)

    The trick with IT -- and about everything else in business -- is to really figure out what does matter to the business, and to work your ass off optimizing that thing that matters.
    • by Anonymous Brave Guy (457657) on Wednesday November 19, 2003 @08:18PM (#7516458)
      On the other hand, having a new 20-inch iMac on every desktop doesn't much matter. (Drat.)

      I realise that your statement was somewhat in jest, but actually I don't think that's true a lot of the time. If this is what you mean by "new technology" (as opposed to things like "web services", XML, .NET, etc.) then there are clear benefits.

      Firstly, users with big or multiple monitors are often measurably more productive when using a computer all day. A colleague at work has just got a second monitor. It's just an old but serviceable 17" box, but it makes him more efficient, and he loves it.

      And that, of course, is a second good reason to spend that little extra on the hardware people use all day: it has a morale-boosting effect. Employers that treat their staff well get treated well in return.

      And of course, Macs are vastly superior to Windoze boxes anyway. <ducks> :-)

  • by downix (84795) on Wednesday November 19, 2003 @07:49PM (#7516230) Homepage
    In the end, it is all about consumer needs. The consumer needs more, newer, faster, better.

    Using an example I saw given, the best-selling toys today are cars, much the same as in the 1940's. The difference is in what these cars can do. Take the top-notch must-have car from 1949: some metal pushcar contraption. The hot cars this year are high-end RC machines with more computing power than what launched men to the moon.

    IT is more important than ever, even as its importance slowly vanishes and becomes part of the general background noise. The more important it gets, the less noticeable it is.
    • Are you sure? (Score:5, Insightful)

      by Anonymous Brave Guy (457657) on Wednesday November 19, 2003 @08:22PM (#7516484)
      The consumer needs more, newer, faster, better.

      But do they? Would a secretary typing up her letters be any less productive using Word 2000, running on Windows 2000, on a PIII/500, than she is using Word 2002, running on Windows XP, on a PIV/1.6GHz?

      Sometimes upgrades have definite value; see my earlier comment in this thread about monitors. Other times, they make no real difference at all, and it's just a numbers game, where the prize is... nothing.

      Today, as always, most of the serious work is done on older, tried-and-tested systems. The users of the most recent toys are either the few who genuinely do require state-of-the-art power and/or technology to do their work, or those who like to be on the bleeding edge just because.

  • IT or Engineering (Score:2, Insightful)

    by Cramer (69040)
    RFID tags aren't, IMO, "IT". That was an engineering gig, maybe spawned from an IT problem (how to better manage inventory and warehouses). How are these people defining "IT"? Anything that deals with computers and/or technology?
  • Does it matter yes (Score:5, Interesting)

    by cluge (114877) on Wednesday November 19, 2003 @07:51PM (#7516244) Homepage

    The IT budget has to be looked at exactly the same way as any other departmental budget: what does your company get for the money invested? If you're eBay, the money may be well spent in IT. If you're Bricks and Mortar Inc., you may wish to invest in other areas. It all depends. Only an analytical, ruthless, pencil-to-paper approach will tell you that.

    Unfortunately, too many executives, scared of their own ineptness when it comes to IT, think that a big IT budget and a smart (insert favorite IT stereotype here) are going to make them a million bucks. Feast your eyes on the dot-bomb wasteland, ladies and gentlemen.

    In the end it is the talent of the people who make it work that will be the deciding factor, as long as they were hired after a very careful and down-to-earth review of what was needed. There is no substitute for hard work and good analysis.
  • True but.... (Score:5, Informative)

    by Fnkmaster (89084) * on Wednesday November 19, 2003 @07:51PM (#7516252)
    A big part of the reason software lets businesses down is that they are often paranoically afraid of change at the middle management layers (pardon, but I fucking hate the word IT, and I find it devoid of meaning so I'll stick to terms that mean something to me).


    Basically, companies don't want to change the way their fundamental "business processes" work even when these "processes" don't make any sense. So if you take the same old inefficient way of doing things, and make software to facilitate it, you're still doing it inefficiently. Especially when requirements for "visionary" systems get bogged down with specification by committee - everybody wants to make sure that their department or group level jobs are represented and that nobody designs them out of the picture. Even if a top-level executive recognizes that the way things work is too costly and generally sucks, if lots of mid-tier shitheads play the bureaucracy card and bog a system down until it's in le toilette, well, no surprise when the software you end up with is no better than the way you do things now.


    It also doesn't help that "IT" is the result of years and years of evolution and almost NOBODY in the business IT world is sufficiently bright to take the big picture, generalize about it, and create a logical, functioning infrastructure to replace it. No, the people who are smart enough to do this generally work for tech-focused companies in more interesting jobs where there are tiers upon tiers of bureaucratic wretchedness breaking everything down.

  • by scarpa (105251) on Wednesday November 19, 2003 @07:52PM (#7516253) Homepage
    I work for a local government agency and I see firsthand how the promise of IT is a double-edged sword.

    In my department we recently replaced 75 green-screen terminals. Many, many people were happy to see this happen, but in reality most of the new PCs are simply running terminal emulators and are glorified dumb terminals.

    So on the face of it, we didn't really do anything but spend a lot of money and make everything prettier... ON THE SURFACE

    However, now that the infrastructure is in place, we can begin to really look forward. We are now considering projects that have the promise of eliminating hours of unnecessary work each day and of making public information much more accessible both online and at local kiosks, just to name a couple.

    The key is that you can't just implement new technology for technology's sake, which was kinda what the whole "bubble" was all about. You have to take a long-term view of how and why you will leverage that technology going forward. May seem obvious to us, but not to all.
    • Wait until people discover the new 'terminals' are temperamental, flaky substitutes for those 'green screens' that you could turn on, like a shredder, the telephone, or an electric stapler, and just use for years at a time with only routine maintenance.
      • Wait until people discover the new 'terminals' are temperamental, flaky substitutes for those 'green screens' that you could turn on, like a shredder, the telephone, or an electric stapler, and just use for years at a time, with only routine maintenance.

        Why would they care? From the perspective of the organization the technology cost nothing, and from the point of view of the employees a crashed system means the afternoon off.

        This is a government department, remember.
  • Already the case? (Score:2, Insightful)

    by BlueEar (550461)

    I've worked both in academia and in industry. From my perspective, industry is very slow to use cutting-edge research. The stuff I treated as pretty routine in academia is considered cutting edge in industry. The industry is much more interested in massive projects involving well tested technologies than in, what is the domain of universities, small projects with both high risk and high intellectual value. And while some companies, such as IBM, have a significant research budget, this does not app

    • The industry is much more interested in massive projects involving well tested technologies than in, what is the domain of universities, small projects with both high risk and high intellectual value.

      And what, pray tell, is a "high risk" project in the context of the typical student's comfortable life making out and smoking pot at a university?

  • Two edged sword... (Score:5, Insightful)

    by Anonymous Coward on Wednesday November 19, 2003 @07:52PM (#7516259)
    Shoring up what you already have is always a good idea - but should you be doing it? Firefighting is the most non-productive thing an IT department can do, yet it is always required to a degree, whether it be battling the latest worm because of a flawed IT policy or helping Jane Doe with her print problem. Ongoing shoring up is part of IT, but many companies I've seen go through vast periods of cutbacks and inactivity, then somehow fixate on how one new system will fix all flaws. IT doesn't work like that.

    Then on the other hand, you have cutting-edge technologies. Yes, they can help you out if you have a problem that they solve, but there's no point trying to find a problem for them to solve just because they're there. I know one company that ripped up a perfectly good CRM system built in-house so they could access the database using web services. Totally pointless. Yet I know another company that has rolled out an intranet and built a document repository, and that has garnered much more immediate results.

    So, my answer is a straight 50:50. Firefight, but implement policies that make your job easier as you do so, so you can reduce overall costs; and only implement newer systems if they are required, and even then, don't be blinkered by the latest technologies. Sure, it may be cool, but early adopters always bear the price, and not necessarily the fruits.

    The thing is, some of these points are common sense, some need time, and in business you can be guaranteed that people lack both.
    • by IM6100 (692796)
      Firefighting is the most non-productive thing an IT department can do,

      Management has a way of sorting the operations of their business into two categories: One category is people who do the work to produce product, sell product, promote product, etc. The second category is the people who support those people in producing, selling, and promoting product.

      The efforts and expenses put toward the first category of people is money that earns a direct return to the business. The resources allocated to the se
  • IT doesn't matter. (Score:4, Insightful)

    by Jellybob (597204) on Wednesday November 19, 2003 @07:55PM (#7516283) Journal
    Doing the job your organisation is meant to do does.

    I work at a charity where our primary aim is to help people get back into work after long term unemployment. As a means to this end, we make extensive use of IT.

    We have an Exchange server (save the flames). Does that matter? No! What matters is that we have a way of knowing when we're able to make appointments for people; it just so happens the best way we have of doing that is using Exchange.

    We also run an online centre, where people can come and use the internet for free, and get training in how to use computers. The fact that we have 20 internet-connected computers doesn't matter - it's the fact that people who wouldn't otherwise have jobs now do, partly thanks to the computers they had access to.

    It's all a matter of perspective, IT is just another tool in the box of things that allow you to get the job done. In the same box for us comes knowledge of writing CVs, and being able to relate to people.
  • Right on (Score:5, Insightful)

    by mveloso (325617) on Wednesday November 19, 2003 @07:55PM (#7516284)
    Anyone who knows what they're doing will tell you that IT matters only in the sense that it enables good processes. Your IT is a tool that needs to be backed by processes and people.

    Wal-mart might have realtime inventory statistics across the world, but the reason they have that is because they know what to do with that information. If you gave that capability to Kmart executives, they wouldn't have any idea what to do with it.

    The problem with IT, though, is that KMart might actually buy a system that can give them realtime inventory, then not use it. Whoops, there goes tens of millions of dollars.

    IT doesn't matter because everyone can do it now. Can anyone on /. not figure out how to build an iTunes music store from a technical perspective? Does anyone here not know how to create a scalable mail system? That knowledge (or know-how) is commodity knowledge now.

    So no, IT doesn't matter, or it matters - the way electricity matters.
    • Re:Right on (Score:3, Funny)

      by plierhead (570797)
      IT doesn't matter because everyone can do it now. Can anyone on /. not figure out how to build an iTunes music store from a technical perspective? Does anyone here not know how to create a scalable mail system? That knowledge (or know-how) is commodity knowledge now.

      I bet most /. ers could knock up an iTunes store all right. But I'll bet .01% could actually build a scalable, well-managed, backed up version that you would bet your business on.

      • by edremy (36408)
        Funny? Insightful is much closer. I'm a crappy programmer and I could knock out something like the server side of iTunes in a day or two. But it wouldn't scale, it wouldn't be maintainable, the recovery plan would be shite, etc.

        Doing the details right is hard, hard work. Witness both MS and Linux to see this: MS can't get the security thing down and Linux still fails at the ease of use thing, despite a lot of smart people working on both.

  • by curunir (98273) * on Wednesday November 19, 2003 @07:55PM (#7516287) Homepage Journal
    Will employees really want to work for a company that doesn't stay current with technology? I know I would be worried if I felt like my skillset was aging and I would be a less attractive hire to new employers.

    I've met a lot of people who got into this industry because they enjoyed the "playful" nature of their work. Without the latest "toys" to play with, many IT workers won't enjoy their work.
  • by MurrayTodd (92102) on Wednesday November 19, 2003 @07:55PM (#7516289) Homepage
    I worked a few years in the IT department of a "Fortune 50" drug company. I cannot begin to tell you how many hundreds of thousands of dollars were thrown around for silly and stupid reasons, mostly so Pointy-Haired Bosses could play "buzzword bingo" in order to sound important and get promotions.

    I, on the other hand, worked in the trenches and off everyone's radar. I set up a Linux server (I could arguably claim to have started the Linux movement at this place), and as I learned about a new interesting technology - mostly database and web stuff - I would ponder whether I could build something that would make IT's job easier. Over three major projects I estimate I saved at least half a million dollars in labor by leveraging "new technology" to improve operations.

    Now back to the question: what do we mean when we talk about being "offensive or defensive"? If offensive/proactive means implementing a new technology because the buzzword is hot, piss off and stop wasting money. If it means keeping a few bright people on the cutting edge, investigating whether new technologies can improve overall corporate efficiency, then by all means YES.

    If it means investing zillions of dollars for the eventual Longhorn update and all the new applications that are upgraded to .Net when all the business needs is email and word processing, I still think W2K is sufficient.

  • Ja Nie (Score:5, Funny)

    by smchris (464899) on Wednesday November 19, 2003 @07:56PM (#7516308)

    Someday, the people who know how to use computers will rule over those who don't. And there will be a special name for them: secretaries.

    --Dilbert (as if anybody here didn't know that)
  • by halo8 (445515) on Wednesday November 19, 2003 @07:57PM (#7516309)
    In reality, this is all part of what IBM's On Demand motto is all about:

    increasing the USABILITY of what you already have - tie all your databases, CRM, and everything together... basically middleware.

    I think it's safe to say that everyone has the hardware; what they need to be defensive and to utilize it is the software (Linux, DB2, WebSphere or Tivoli).

    MS and HPQ disagree because they want you to upgrade and they want to sell you that upgrade. IBM makes its money on services.
    • All very true, except for the word defensive. Being smart and aggressive about using the resources you have is hardly "defensive". Maybe you mean defending your wallet against yet more big-ticket items in order to get a minuscule return on the investment? ;)

      Businesses today aren't saying "We got burned"; they're saying "We invested, we got a bit of payback, so how do we actually get the huge payback we were promised here?". This is Microsoft's biggest problem - not being able to come up with that one idea that
  • And even if we win, if we win, HAH! Even if we play so far above our heads that our noses bleed for a week to ten days; even if God in Heaven above points his hand at our side of the field; even if every man woman and child joined hands together and prayed for us to win, it just wouldn't matter because all the really good looking girls would still go out with the guys from Mohawk because they've got all the money! It just doesn't matter if we win or if we lose.

    IT JUST DOESN'T MATTER!

    Everyone: IT JUST DOES

  • by stienman (51024) <adavis AT ubasics DOT com> on Wednesday November 19, 2003 @07:58PM (#7516325) Homepage Journal
    Eventually IT will become a simple, cheap, commodity service. All the work that can be performed elsewhere, such as tech support, manufacturing, designing, etc. will be farmed out to other countries. The only work performed here will be replacing bad computers. Computers will become like cell phones and other embedded devices. Bad ones will be thrown away or sent away to be repaired. Eventually saying "I work on computers" will be equivalent to saying "I clean houses." It isn't a bad thing, but it isn't the innovative, problem-solving work most of us really enjoy.

    So what's to happen to us geeks? Many will go into design, project management, and liaison roles. Many will continue to work for a long time in interoperability. Those with PhDs will file patents so companies that don't actually produce anything can make money. Lots will support other growing fields that need custom work, such as bioelectronic technology, nanotechnology, and those other 'pie in the sky' technologies.

    Many will go into programming and hope they can sell their vision/idea to the few major content providers - who'll take it and have it developed further by programmers in Lower Slobbovia.

    But that's still another 10-20 years out.

    -Adam
    • Lies.

      Computers are orders of magnitudes more complex than a dusty, cluttered house.

      Computers are small and hard to visualize. It takes time to understand the ways they work.

      I believe anyone can be trained to fix, design, program computers. Much like anyone can learn painting techniques. But it takes a degree of insight and craftiness to do good things with a brush. Not everyone can do that.

  • I thoroughly disagree with Carr. There is still PLENTY of opportunity for IT to lead, just look at Homeland Security and TIA. IT has just barely begun to bring us the Big Brother that Orwell promised us. Any smart organization, commercial or public, should be pushing the limits of what IT can do today to bring on the oppressive surveillance society!
  • RFID (Score:2, Insightful)

    by nanowyatt (196190)
    It seems that RFID is a pretty clear refutation of the thesis. RFID will slash inventory costs while hopefully increasing accuracy. And RFID is clearly information technology.

    RFID will also make some tech dreams closer to reality, eg a fridge that knows what's inside and what needs replacing.

    I visited the Stop and Shop with the "Shopping Buddy" that was /.ed a few weeks ago and I think that is another IT that is making a difference. That has the potential to shrink both the labor needed at a supermarket
  • It is what runs a large part of the business world's back rooms.

    However, cutting-edge IT technology doesn't matter.

    Most companies are 5+ years behind... and are quite content. If it gets the job done and they can still find someone to support it, they have no need to change.
  • Asking if IT matters is kind of like asking if your pancreas matters. Most people aren't even aware of the function of the pancreas, aside from it being an internal organ that somehow helps the body, like the spleen. But take that pancreas away and boy, do you get someone's attention quickly.

  • by 23 (68042) on Wednesday November 19, 2003 @08:03PM (#7516360)
    What he says is, that the whole of IT is becoming a commodity, just like electricity. Having it is essential, but it doesn't give you a strategic advantage in business, since others have (to have) it also.

    I actually think he's right. IBM, e.g., effectively commoditized (if that's a word) PCs by opening up their standards years ago, with MS having the complementary product "OS/Office" that made them super-rich. Consider this: having Win+MSOffice (please, no religious zealotry... :) ) might have given you an advantage 10 yrs ago, if you were one of the few who had it and could reorganize your business processes to be much more efficient using it. Nowadays, everybody has it and needs it, so you lose that advantage.

    This guy Carr just generalizes that to the whole of IT, including the "new" stuff like the net. Beats me why IBM is crying foul, since they are running this huge PR campaign of "IT as a utility", which is exactly that.

    just my 2 cents

  • Shoring up the walls (Score:3, Interesting)

    by snowlick (536497) on Wednesday November 19, 2003 @08:05PM (#7516371) Homepage
    Carr is right about the ubiquity of IT. Everyone has it, so by and large it's not really a selling point by itself. He's also right that it's really important to shift focus from buying into new ideas to making sure the old ones work.

    However, a critical component of the advancement of IT is the "new idea"(surprise). Computer science is still expanding and changing from day to day. As we all know, most ideas are way ahead of their time as far as computing power goes. We always seem to be playing catch-up with our theories. Components get faster and cheaper, and we're continually discovering new and better ways to utilize them to do what we need to do. Take the current boom in wireless technologies as an example. It will change the way a lot of companies do business. To survive and moreover to compete these companies must be able to adapt to new technologies. Of course not all businesses will have to ride that bleeding edge but the effects will trickle down.

    The bar is still being raised. I can see a leveling off happening in the future, but as the price of hardware continues to drop we can be sure that IT will still be relevant as newly affordable/feasible ideas come to light.

  • Absolutely, IT doesn't matter. IT got hyped up to the nines, I was all excited for ITs release, whatever IT was. "Revolutionize an industry", they claimed.

    And IT turned out to be a damn scooter... hmph.

  • profitability. (Score:2, Insightful)

    I don't understand why it has to be one way or the other.

    I work in a small graphics/prepress shop, and if something works, we keep it until it no longer works.

    When a new problem comes up, we see if our existing architecture will solve it, if not, then we start researching the newest choices out there.

    For example, our server was a used Sun, which we picked up from an imploded dot-bomb (god bless San Francisco's used equipment market). It works fine for our 30 or so people; however, we need a high powere

  • by OldCrasher (254629)
    Years ago the NME opined of the band Wild Horses: "While we have electricity, we will have bands like this!" Today we have the Harvard Business School, and while it exists we will have, err, gentlemen like Carr.

    IT does matter. It will continue to matter until some far more advanced concept sweeps it aside, just as the computer finally nudged Caxton's press to one side in the last decade.

    The example of using RFID tags at Walmart is actually proving the point that IT does matter. Walmart is one of the m
    • I think you're on to something. I think all of us could point to a successful company that owes its success to its IT implementation; Dell and Wal~Mart are both excellent examples of why IT matters. Both utilized many technologies to significantly reduce their costs and destroy their competitors. However, I doubt that on a weighted basis, K-Mart and HP (or Compaq or any of the grocery stores Wal~Mart is now beating) spent less on IT than the successful companies did. My personal favorite is Milken, the
  • Easy test (Score:5, Insightful)

    by EvilStein (414640) <.spam. .at. .pbp.net.> on Wednesday November 19, 2003 @08:18PM (#7516456) Homepage
    Unplug a whole bunch of shit. Watch the chaos.

    THEN ask people if IT matters. :P
  • IT veteran (Score:3, Interesting)

    by Usquebaugh (230216) on Wednesday November 19, 2003 @08:20PM (#7516466)
    I've worked in the trenches for almost 20yrs. I have to say he hit the nail on the head.

    Why should a business spend a shedload of money to gain no advantage? They shouldn't; they should look to buy IT as a utility or infrastructure.

    Why is Open Source booming? A lot of developers/managers realise that all they need is software that does the job; anything extra is superfluous. They'll gladly help/pay to write the software as long as they only pay once.

    The first business that adopts the less is more approach to software will dramatically reduce IT costs. 90% functionality is more than enough for anybody :-)

    List the applications a business needs and then see if they are available. The race is on: which Open Source projects are going to be the 800lb gorillas?

    My list

    Low level PL C/C++
    Business PL (Java/Perl/Python)
    OS Linux (Debian)
    Desktop (KDE,Gnome)
    Web Server Apache
    Browser Mozilla
    Office Suite Open Office
    Database SAP DB
    Accounts Package (Gnu Cash)
    ERP Compiere
    CRM
    BI

    You'll notice there are no commercial products. Business will require open source in the future - why pay for upgrades when you can get them at low cost?

    A closed-source company will be required to defend itself with IP law. Expect to see more and more patent wars waged, rather like the pharma companies. A patent is a license to print money, but when it expires so does your money. Generic drugs = generic software.

    The question is do you want to work in IT when your only job is gluing other peoples code together? If not you'd better start thinking about which project you wish to work on.

    You think vertical markets are going to help you? Think again. How many forms of banking are there? How many types of insurance? Etc, etc. The first big OS project in these markets will probably never be overtaken.

    IT will only ever be a large expense to those companies that can derive profit from that expense.
    • Re:IT veteran (Score:2, Insightful)

      by 9Nails (634052)
      Open source doesn't have phone tech support! That's its shortcoming. It's a wonderful idea, but if you don't staff a guru in your office, it is harder to implement because the resource pool is shallow. Especially to someone who is just diving in. And even if you're a developer, open source only means that it is (sometimes) free code. It doesn't mean you can hack the code and make Widgets 2.0 power your Shopping Cart needs for Web Server 3.5...
  • by scorp1us (235526) on Wednesday November 19, 2003 @08:27PM (#7516513) Journal
    Technology is supposed to enable us to do more. As long as it does that, tech is a success. The hardware companies will convince you that that means more MHz and more disk.

    In reality though, if you can put to use that PIII-700 to do something productive, then it is a success.

    For an analogy consider a Cray vs a TRS-80. A Cray running a bubble sort will be beaten by a TRS-80 running QuickSort for just a few thousand elements. The same is true for tech in general. Work smart, not hard.

    There are times when raw MHz are needed: real-time requirements, or cases where lag creates some kind of penalty (if it takes 20 mins to get an answer back, you'll be more selective in your questions, whereas if it took milliseconds, you'd take time to ask more creative questions - this was the purpose behind Beowulf clusters).

    And again, we see work smart not hard. Put those PIII-700 to work as a cluster, working smart, not hard.

    Better processes are key. Brute force allows you to compensate for lack of a good process, but you pay a premium.

    I see all too often PDAs being used instead of note pads. PDAs may be status symbols and nifty, but I can put notes in and read notes back off a pad of paper faster than the fastest PDA user.
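    The Cray-vs-TRS-80 analogy above is just asymptotic complexity: bubble sort does O(n^2) comparisons while quicksort averages O(n log n), so a large constant-factor hardware edge still loses once n grows enough. A minimal, illustrative sketch (not from the original comment; it counts comparisons only and ignores per-operation speed):

```python
import random

def bubble_sort_comparisons(items):
    """Bubble-sort a copy of items; return the number of comparisons made."""
    a = list(items)
    comparisons = 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return comparisons

def quicksort_comparisons(items):
    """Simple quicksort over distinct values; return an approximate comparison count."""
    if len(items) <= 1:
        return 0
    pivot = items[len(items) // 2]
    less = [x for x in items if x < pivot]
    greater = [x for x in items if x > pivot]
    # roughly two comparisons per element at this level (the < scan and the > scan)
    return 2 * len(items) + quicksort_comparisons(less) + quicksort_comparisons(greater)

random.seed(42)
data = random.sample(range(1_000_000), 2000)  # 2000 distinct values

slow = bubble_sort_comparisons(data)  # exactly n*(n-1)/2 = 1,999,000 comparisons
fast = quicksort_comparisons(data)    # on the order of 2*n*log2(n), tens of thousands
print(f"bubble/quicksort comparison ratio: {slow / fast:.0f}x")
```

    Whether the slower machine really wins at "a few thousand elements" depends on how big the hardware speed gap is, but the ratio above grows without bound as n does, which is the real point behind "work smart, not hard".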

  • by YouHaveSnail (202852) on Wednesday November 19, 2003 @08:27PM (#7516518)
    We all need to stop thinking of IT as a field apart from the rest of the world, and start looking at the world to see how it can be made better. Sometimes IT will be part of that solution. Sometimes not. Sometimes the solution will be to remove (gasp!) high-tech "solutions" that failed to deliver.

    Most of the IT companies spent the last 25 years convincing companies and consumers to buy desktop computers, laptop computers, servers, and software in order to boost productivity. Plenty of low tech jobs were replaced by a bunch of high tech solutions and a smaller number of high tech jobs. Companies now routinely process zillions of transactions very quickly using very few workers. It may be that productivity can be increased even further in this way, but the huge gains of the last two decades are probably tapped out. IT may not be the motivation for the next huge runup in stock prices, but saying that it doesn't matter is like saying that mutual funds don't matter in the market. They may have fallen out of favor, but they're still a huge force to be reckoned with.

    I'd like to see IT applied in ways that really make our world better, instead of (just) more efficient. IT has long promised to improve health care, and has largely delivered on that, but it has also led in part to the increased cost of health care. Let's use it to drive down costs. IT has made a lot of things much more convenient, but at the cost of privacy. Let's use IT to protect privacy and better control our own information.

    There are a million directions that IT can go in, and thanks to massive parallelism in our society it can go in those directions all at once. Let's get on it.
  • by randall_burns (108052) <randall_burns.hotmail@com> on Wednesday November 19, 2003 @08:30PM (#7516524)
    In the film Tucker: The Man and His Dream, an auto exec testified, "You never innovate until your competitors force you to." Now consider what has happened to the US auto industry. If the US doesn't get guys like that out of positions of leadership, the US will go the way of the US auto industry.
  • Once we define what IT is, then we would have a better chance of finding out whether it matters or not. It would be easier to put IT in its proper place of context. Until then, the eternal question of whether IT matters will remain unanswered...
  • Jobs (Score:3, Insightful)

    by mrnick (108356) on Wednesday November 19, 2003 @08:33PM (#7516543) Homepage
    My main concern with the IT market is jobs. We tried "damn the torpedoes, full steam ahead" and look where it got us: me, along with a bunch of qualified IT people, searching with not much luck for a job - any job. That's why I am focusing my job search on IT positions in non-IT companies, because I believe that IT as a product alone has failed.
  • by swordgeek (112599) on Wednesday November 19, 2003 @08:35PM (#7516554) Journal
    Steve Ballmer says it matters?
    Carly "Jet Babe" Fiorina says it matters?

    All we need is Darl McBride to join with these two twits, and we've got a quorum of incorrect opinions!

    (Well come on, it's not like they've got anything ELSE right so far)
  • It's a good point (Score:3, Insightful)

    by rstultz (146201) on Wednesday November 19, 2003 @08:37PM (#7516568) Homepage
    I just took over IT at the mid-sized advertising agency I work for. The previous guy had entirely subscribed to the theory of "New. Bigger. Better." If I could get back the last 2 years' budget, I'd be in heaven. He bought because "new technology means advancement."

    Whereas I've been purchasing based on what technology is actually going to improve our production, which surprisingly isn't much. We've got piles of dual G4s that were top of the line last year, and I'm purchasing eMacs, because with the money I save (and no actual difference for what my employees do with the machine), I can invest in better networking, and invest in people (yay, raises!).

    For some reason people think that the latest upgrade will always increase productivity. We have machines that have been around for the last 3 years doing nothing, and I now have our back-end MySQL databases running off of them, have our web-server running off of them, using them as file-servers.

    And it makes a real difference in my budget, when I can make do with current equipment, it gives me a lot more room for expansion, compared to the practice of replacing non-obsolete equipment every other year.

    I think corporate America wastes more technology by not utilizing it to its fullest (which sometimes means having decade old equipment) than it could possibly realize.

    Ryan Stultz
  • Does IT matter? LOL, well, everybody thought it did, but apparently it doesn't........oooo....they're NOT talking about the segway?

  • Whether a company has enough IT or needs more depends on whether the IT staff can convince the business people of the ROI of IT projects. The prior ROI of recent projects will determine how much cred the IT staff have with the business side. The projected future ROI of proposed IT projects will determine what the business should do, assuming they take the advice of IT department. There are four combinations for the prior ROI of IT and the future ROI of IT. They are:

    1. High prior ROI, high future ROI:
  • Bang on! (Score:4, Insightful)

    by Ridgelift (228977) on Wednesday November 19, 2003 @08:45PM (#7516620)
    Carr countered, saying that he is not advocating complacency but skepticism. "Companies shouldn't be complacent; the word I would use is skeptical," he said.

    Bang on! I've often referred to Information Technology as "Data Technology". The main difference is the ability to act on data. If you can make better decisions based on processing data, it becomes information.

    Too much of Information Technology is wasted on learning tools. New versions of software are released, which precipitates a purchase of books, seminars and (more importantly) time.

    Linux appeals to me because it uses a fairly static set of tools, which can be combined to solve problems. It isn't as pretty as Windows, but the time I've invested in learning tools well has paid off in less time learning stuff I don't need (TMTOWTDI). I focus more on my boring old ASCII files with the business information I need and less on figuring out why my latest version of Word is causing a GPF in a newly reloaded version of Windows XP that can't open the Word doc that I typed up last week, which precipitated the reload in the first place because Word locked up when I was trying to extract information from the data I'd collected... ...you get the idea.
  • Case Study (Score:3, Informative)

    by Brandybuck (704397) on Wednesday November 19, 2003 @08:46PM (#7516627) Homepage Journal
    My company would make a good case study.

    Three years ago we had a Solaris network, Sun workstations, Netscape mail and calendar server, etc. There were five sysadmins for 1200 employees. An "Intro to UNIX" class was held twice a year, and an "Advanced UNIX" class once a year. All in all, it was a traditional, stable, robust, and boring infrastructure.

    Then we got bought out by a huge multinational. We went Microsoft-only. We're now on a Windows network, Win2K and WinXP workstations, Exchange server, etc. We have 20 MCSE's for 1000 employees. Introduction to Windows classes are held quarterly, with additional classes in Word, Access, PowerPoint, etc. Our network is now unstable, frequently down, and very exciting. We have to reboot our workstations to apply patches about twice a week.

    If it ain't broke, don't fix it!
  • by RiffRafff (234408) on Wednesday November 19, 2003 @09:00PM (#7516701) Homepage
    As a metrologist, I am acutely aware that my job is not a "value added" function. I work for an aerospace company that is one of the 30 companies whose stocks make up the Dow Jones Industrial Average. What I do, day in and day out, does not help sell product. What I mean is, consumers expect their avionics to meet spec. They do not expect to pay extra to have the test equipment that was used to align these devices calibrated; they take that for granted. As such, it adds no saleable quality to the end product. If aerospace companies could do without a calibration department, they would certainly do so. Luckily (from my point of view) certain agencies DO understand the importance of having measurements based on traceable national and international standards (like the FAA, and ISO). And hopefully it will help make the difference between your airliner landing on that runway in the fog, and touching down in the swamp just to the left. But still, my job function is considered "indirect" and does not help sell the product.

    I see IT as being in the same boat... companies NEED an effective IT department to stay competitive, but consumers are not willing to pay extra for it. It is a foregone conclusion by consumers that efficient companies have an effective IT infrastructure.

    Like calibration, IT is not likely to be missed until the effects of its disappearance are noticed.

  • by Slime-dogg (120473) on Wednesday November 19, 2003 @10:20PM (#7517088) Journal

    I can say that IT isn't necessary, but it sure makes things more convenient. There are a number of businesses that exist without computers; they are mainly retail shops and the like (lemonade stands, cash/check-only antiques, yada). For IT to actually be necessary would mean that you cannot do business at all without it.

    If we didn't have the internet, we'd resort to telephones and fax machines for long-distance communication. Snail mail is another option, although not as fast. Databases are a far more convenient form of file cabinets full of binders, but for some reason most accounting departments keep the paper along with the electronic.

    I'm not particularly worried about losing my job in IT, or afraid of someone calling my job unnecessary. I don't think my job is necessary. The whole point of my job is to ease the burden of my co-workers, by making their payrolls go faster and easier, by eliminating as much paper from the interoffice ordering and communication, or by providing support for co-workers when Outlook is barfing. All of it has functioned without computers before, it could very well do it without them again. Perhaps they wouldn't be quite so efficient, but that wouldn't hinder the actual function of the business.

    The only places where IT matters are in those businesses that have bet the bank on IT. MS, IBM, and HP are all places that look at such a paper as detrimental to their position. If people in business realize that innovation does not necessarily mean upgrade, but also includes better internal programming and process auditing, all of those big tech companies will take a hit in sales.

    Look at the years after the bubble burst, for instance. The business community proved how unnecessary IT really was, which my brethren are still very sore about. Businesses found that they needed to focus on cost cutting and efficiency, both things that didn't need bleeding-edge hardware and platforms to accomplish. Cost cutting came about through the massive release of IT workers, through limiting the spending on new servers and pipes, and through reorganization. Efficiency came when reorganization forced workers to do their jobs better, and not be so distracted by the nerf balls and ping-pong tables.

    Granted, many excellent workers were cut, and many poor workers were kept. This is the nature of the upper-level-management beast. Eventually, those people will get rolled out of their jobs, and those positions will be re-filled by the competent ones. There will not be so many positions, though, mostly because the focus of IT will be on maintaining regular operations and on optimizing current applications. There will be little room for creativity, but it can be sneaked in in the name of "easing the burden of co-workers."

    IT has lost its glamour, and everyone (including the deluded IT guys) has finally realized that IT hasn't changed much since the '70s and '80s. The mid-to-late '90s were an enigma, perhaps the actual recognition of this strange section of business, but it was just everything blown out of proportion in the end.

    IT really isn't as necessary as IBM, MS, and HP would like people to think. These businesses have excelled recently at creating extra expense in business, just so that they can show how it could be cut with bleeding-edge technology. The stuff is nice to drool over, but technology that is two to ten years old can still fulfill that role, and the cost has already been paid. IT's role is now to be intelligent with the data, to work with it efficiently, and to maximize the effectiveness of this hardware.

  • True and False (Score:3, Insightful)

    by samantha (68231) * on Wednesday November 19, 2003 @10:30PM (#7517123) Homepage
    It is true that a lot of the IT budget is wasted. It is true that most companies spend more than they should have to, especially on software. On the other hand I have been in many shops whose software developers were wasting a fair amount of time using obsolete and too slow machines. But I think much of the problem even for developers is not the hardware but the over-inflated price of inadequate software.

    I especially find it very sad that so many shops are wed to overpriced MS products even when perfectly workable OS alternatives exist. Microsoft seeks and always will seek to draw the most dollars for the least real innovation and benefit. The way Windows works requires local copies of many megabytes of software when most of that software could have its components shared and brought in as needed over a fast LAN. Users should not be blamed because the dominant player makes a defective file-sharing OS!

    Real competitive advantage will come with real software advantage, not bloated safe-buy hype. This hype is seen clearly in enterprise software also. J2EE has become a buzzword without really delivering any competitive advantage that I have witnessed. A few large players get tremendous press and trust and rake in big bucks when the innovation, if any, is most likely in the little-known new offerings from relatively unknown individuals and companies. The trick is finding these true innovations and using them in a safe and competent way. This cannot be done easily, if at all, if the source is closed.
  • Irony... (Score:5, Insightful)

    by SmurfButcher Bob (313810) on Wednesday November 19, 2003 @10:49PM (#7517226) Journal
    This story, right next to 500,000 jobs lost.

    A few years ago, there was this huge .com era. And most of us laughed at it, because it was stupid, pointless, and irrelevant. The bulk of the things developed were, to be blunt, total hogwash. Period.

    The hogwash wasn't just limited to .coms, however. I remember almost spitting my coffee when I hit a headline, "New 3 Dimensional Database" in some trade rag. And yes, it was just a new buzzword for crap we've been doing in FoxBase, Foxpro, DBase, C, APPLE BASIC, and even COBOL for years. Yep, real new. Had a big price tag on it, too. And people bought it.

    Then, "Client Server" was also a big buzzword. Yep, real new. Uh...

    Of course, "client server" got worn out, and people hated it. So, they invented a new word - "Thin Client". THAT was NEW! In fact, if you got one REVOLUTIONARY enough, it'd even have VT100 emulation! Or Wyse50! Oh, hell, that's not new at all. ...and so on. Morons selling the same old trash, with a newer and bigger buzzword. Idiots buying it, because they're too lazy to learn, over their head, or just plain useless.

    Then the website craze began. Morons charging $500/hr to "write html code", and be "html coders". Colleges actually offer degrees in this crap today. Students actually major in it. "Yes, I have a degree in Notepad, with a minor in writing code in HTML". Uh huh. Could you mail me that MAKE file for that new webpage? The linker is puking on mine...

    So, the product industry is full of useless junk, and they repackage that exact same junk every xxx months with a new name (q.v. PocketPC 2002, vs Windows Mobile 2003. Or, MS Word 95, 97, 98, 99, 2000, 2001, 2002, XP, ETC, ET AL. Or, HP's new marketing campaign. Perfect example.)

    And, the consumer (corporate) market is full of useless dolts, who buy the new versions thinking it'll deliver the product they were hoping to get, 12 versions ago when they originally bought it. And meanwhile, they didn't actually *need* it in the first place, because it didn't solve any real problems or enable anything new.

    And the reason the consumer (corporate) market is full of these dolts, is because most of them actually THINK that the annotation of text with little bracket signs is "coding". They're clueless, intellectually lazy, and they're only in it because "Mom said it'd be a great career!"

    They think the definition of an expert is someone who knows one more buzzword than you. In other words, they're suckers.

    Eventually, they get fired / downsized / put out of business. And, they flood the market with all of these credentials, and they DEVALUE those credentials because they themselves are too stupid or lazy to fulfill the roles those credentials allow. "PhD for L1 Tech Support?" "Well, the guy with the 4 BS degrees couldn't handle HTML Programming. I think we need an expert in Notepad this time."

    So, the market gets pissed off, and tries to normalize itself. We don't need an upgrade every week... we need a toaster, that does exactly what we need to fulfill our business requirements. Period. Once we get it, it should last for decades without being touched.

    I've still got a pair of '286s floating around my shop, crunching away at whatever... because they do the job, and that's all they do, and there is NO POINT in changing them. And when someone discovers them, they stare in horror. "Why don't you upgrade them!!?" Uh, that 286 is a telnet interface to a black box that has exactly one 300 baud serial interface. The reason it's a 286 is because I couldn't find an XT with a working floppy.

    And oddly, usually they "get it" at that point... but they don't like it. They'd literally drop $1500 for a loaded 4Ghz WinXP Pro box with 20 gigs of ram, 100 gigs of drive space... to do nothing more than trap a TCP packet, and pump it out a serial port at 300 baud. Most of you reading this are probably thinking the same thing... "Jesus, dump that piece of $#%^".
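    That 286's whole job (trap a TCP connection, pump the bytes out a 300 baud serial line) is small enough to sketch in a few lines of Python on a POSIX box. This is a hypothetical sketch, not the poster's actual setup: the device path, the listen port, and the CR-termination in `frame_for_serial` are all invented assumptions about what the black box wants.

```python
import os
import socket
import sys
import termios

def frame_for_serial(data: bytes) -> bytes:
    """Normalize line endings: assume the black box wants CR-terminated lines."""
    return data.replace(b"\r\n", b"\r").replace(b"\n", b"\r")

def open_serial_300(path: str) -> int:
    """Open `path` as a raw 300 baud, 8N1 serial port (POSIX termios)."""
    fd = os.open(path, os.O_RDWR | os.O_NOCTTY)
    attrs = termios.tcgetattr(fd)
    attrs[2] = termios.CS8 | termios.CREAD | termios.CLOCAL  # cflag: 8 data bits, local line
    attrs[0] = attrs[1] = attrs[3] = 0                       # raw: no input/output/local processing
    attrs[4] = attrs[5] = termios.B300                       # ispeed / ospeed: 300 baud
    termios.tcsetattr(fd, termios.TCSANOW, attrs)
    return fd

def bridge(listen_port: int, serial_path: str) -> None:
    """Accept one TCP connection at a time; pump its bytes out the serial port."""
    fd = open_serial_300(serial_path)
    srv = socket.create_server(("", listen_port))
    while True:
        conn, _addr = srv.accept()
        with conn:
            while chunk := conn.recv(4096):
                os.write(fd, frame_for_serial(chunk))

if __name__ == "__main__" and len(sys.argv) >= 3:
    bridge(int(sys.argv[1]), sys.argv[2])  # e.g. python bridge.py 2300 /dev/ttyS0
```

    Point a telnet client at the bridge host's port and the bytes land on the 300 baud line, which is all that 286 was ever doing.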
  • by Proudrooster (580120) on Wednesday November 19, 2003 @11:16PM (#7517363) Homepage
    I was actually very pissed off about this today. I call it the widgetizing of America. Everyone and every job is neatly classified and categorized so that it can be easily outsourced or "right-shored". "Right shore" is the new buzzword for shipping jobs to Asia/India. Anyone in high school or college should really be thinking hard about two things: 1) voting and 2) what job they can prepare for that can't be "right-shored".

    IT, manufacturing, technical design work, medical diagnostic work, financial analysis, textbook proofing, medical research... and according to CEOs, all this stuff is a commodity.

    What's really happening is that unfree trade is killing America. As long as overcompensated CEOs can achieve their quarterly earnings targets, they couldn't care less what long-term damage they do to the country or the companies they are running. This is nothing more than blind, stupid greed. The only question is "How hard is America going to crash?" I wonder how many people will lose their jobs, pensions, and healthcare this year as our CEOs continue to commoditize, outsource, and offshore with the blessing of our politicians (and ultimately consumers).

    It's tragic that a country founded on the pioneer spirit, a hard work ethic, and innovation is now dismissing everything that made it great, calling it a commodity and then getting the service performed by what amounts to slave labor. I am personally trying to break my addiction to cheap foreign goods, but can't seem to find anything made here. Recently I found a manufacturer called Jaton [jaton.com] that makes Nvidia video cards in California. I was so impressed I bought 4 of them.
  • Cutting edge? (Score:3, Insightful)

    by mnmn (145599) on Thursday November 20, 2003 @12:43AM (#7517733) Homepage
    ...shoring up what they have now in order to maximize its usefulness...

    I would shore up the usefulness of my company's IT infrastructure rather than slap another requisition on the table for a brand new server. This is a Microsoft-based network, with the network and servers under considerable pressure from the users. A few years ago I would've tried to push management for better servers and gigabit Ethernet, but one should see such problems as challenges to one's skills.

    Smarter placement of switches, inter-switch links moved to gigabit, tuning the MS SQL server, replacing the WinProxy firewall with OpenBSD, getting a Cisco 1700, moving services to makeshift servers: stuff like that will allow a company to wait another year before investing in cutting edge. It will also require some creativity and effort on the part of the IT guys.
  • The way forward... (Score:3, Informative)

    by Yonder Way (603108) on Thursday November 20, 2003 @05:54AM (#7518509)
    ...is sometimes the way back.

    X terminals were a great idea, but at a time when machines and network infrastructure were too slow to support them. They have pretty much gone away.

    Today your average desktop class machine is really enough to support several dozen regular business users.

    Add openMOSIX [sourceforge.net] into the mix, and one virtual machine made up of a small handful of real machines can suddenly support hundreds of users' desktops. New machines can easily be rotated into the cluster (live) while old machines are rotated out when they become obsolete.

    On the actual desk itself, something like a VNC terminal appliance is all one needs. Lifespan of one of these units is several times what a PC would last.

    A sysadmin with 300 users is now really supporting only one workstation (whose processes are being migrated to maybe a dozen or two other workstations who have direct access to the master node's file system).

    This isn't pie in the sky. It's based on very old ideas re-applied using new technologies that weren't available when the ideas were first tried. It actually works very well using the hardware and software available to us today.

    I have to laugh when my users think that what I'm doing is bleeding edge. This is old school UNIX administration.
  • by aphor (99965) on Thursday November 20, 2003 @11:03AM (#7520305) Journal

    While most people thought it was magic unleashed on the world, I saw the Internet as a developed captive military technology released into the commercial sector. It was like a grant from the DoD. Is there more where that came from?

    If you can sit back and receive defense technology and all you have to do is product development, then of course it makes no sense to burn cash on long-term R&D. Many economists and political scientists think that R&D is more efficient and effective in the public sector anyway. R&D in the private sector yields patents, which end up limiting the economic impact of the technology developed.

    I can swallow Carr's ideas if he means companies should pool their resources into a creative commons for serious R&D, and then everyone can share the intellectual harvest. I can't swallow his recommendations if he means that advancing IT is too risky or otherwise unprofitable for anyone to bother with.

    Ironically, IT has no value when you don't know how it reduces waste and generates economic benefits. If you do what Carr recommends, sooner or later you will think yourself right through the "IT is pointless" argument, because you will understand the needs and know that your idea, coming from practical business-operations knowledge, is a good IT risk.

    I'm not saying I agree with Carr, but I wholeheartedly agree with him promoting this kind of debate! Also, are the people in commercial sector IT qualified to distinguish between a good idea and a bad one? I still believe we have a serious shortage of smart people. I think we should slow down IT to the rate at which smart people can deliver good IT.

  • by voodoo1man (594237) on Thursday November 20, 2003 @11:11AM (#7520384)
    And here's why. I think the whole reason the article was written was because "IT" never really mattered - it was crushed by snake-oil salesmen and charlatans on both sides of the fence before it ever got off the ground. Just look at what happened with the AI boom and bust of the 80s - it's almost a direct parallel of the dot com spike.

    Hopefully, presidents and CEOs everywhere will take this article seriously, stop buying shit from Microsoft, Oracle, etc., and lay off as many "IT" middle managers and system babysitters as they can (friendly casualties will be inevitable, but they always are!). The throngs of chair-warmers streaming into university CS faculties will finally stop (because it sure as hell hasn't slowed down enough!). While companies everywhere start putting real solutions to work and getting back the levels of database stability and utility others have enjoyed since the 70s, real CS research will resume at universities (hopefully the Homeland Security Department will channel some of its money into this and not into the pockets of private "IT" solutions firms), and Free Software will become the dominant licensing paradigm. Because suddenly you can't milk money out of them anymore, people will abandon languages and tools that were hot in the 70s, and the state of the art of the 80s will be reborn and flourish. Hell, maybe even AI will resurface (it sure seems to be doing so now, with natural-language search problems getting so much attention).

    This is when the real productivity gains start. Businesses adopting these technologies will once again possess competitive advantage over those that don't. As more and more people start using it, overzealous reporters start hyping it, and all sorts of carpetbaggers leech on. The bubble inflates under their hot wind, and subsequently bursts. Most everybody realizes this was all a scam, someone or another finally gets the idea that "technology doesn't matter," everyone who was doing the real work is out of a job, and the unchangeable circle of market life goes on spinning!

"Consider a spherical bear, in simple harmonic motion..." -- Professor in the UCB physics department

Working...