The Almighty Buck

Is Today's IT an Undervalued Asset? 672

mwillems asks: "I work in the technology industry, as a CTO. What I have increasingly seen in the last year, both in North America and Europe, is that IT has ceased to be a valid way to spend corporate money. IT spending used to be looked at as a way to gain competitive advantages. Since the .com bust, the argument I hear everywhere is 'IT has now been proven to be a waste of money'. At many companies it is now easier to get a corporate account at a strip club than a new PC, or a budget to develop a much-needed corporate app. If any spending is done it is on hardware - at least that is 'real'. Do Slashdot readers recognise that? Are there going to be many techies left ten years from now? What can we do to keep the spirit of innovation alive while this 'IT is bad' era lasts, and how can we make it end? And, how do you prove the value of IT? This is not as simple as it seems. Try it with a spreadsheet, as your typical CTO has to do, every day." How do you feel about the cost benefits of IT? Is it worth what your company spends on it, especially if the advantages can't be reduced to a simple dollars-and-cents figure?
  • The way I see it.. (Score:5, Insightful)

    by Xerithane ( 13482 ) <xerithane.nerdfarm@org> on Monday August 12, 2002 @05:32PM (#4057011) Homepage Journal
    Most people in the tech industry are going to fade out, leaving the majority of workers those who were around before the .com boom. That means bigger salaries and more work, instead of the bloated staff you saw in a lot of IT departments during the .com boom. Personally, I'm glad to see it. I know plenty of people who shouldn't be in the IT field. Luckily, those are the ones finding other professions or reverting to their previous professions.
    • by God! Awful ( 181117 ) on Monday August 12, 2002 @05:48PM (#4057169) Journal

      Most people in the tech industry are going to fade out, leaving the majority of workers those who were around before the .com boom.

      "Bigger salaries and more work"? Puh-leez!! Since when has the collapse of an industry caused salaries to go up? The whole reason why unqualified people flocked to IT in the first place was the high salaries. The high salaries were the reason why companies pressured the US government to relax immigration controls. Frankly, I think we will see smaller salaries and more work as proprietary programmers struggle to compete with open source.

      -a
      • Since when has the collapse of an industry caused salaries to go up?

        These things go in cycles. If there's enough public disinterest caused by the "tech bust" that university enrollments go down past a certain point, it's only a matter of time before demand once again exceeds supply. This can happen in any industry. Remember, certain kinds of tech jobs are difficult, and there's many a person who can't stand being a knowledge worker behind a desk all day, thinking about problems and solving them. If the pay goes too low, interest will be lost disproportionately fast, because many kinds of IT jobs simply aren't very fun.

        C//

        • These things go in cycles. If there's enough public disinterest caused by the "tech bust" that university enrollments go down past a certain point, it's only a matter of time before demand once again exceeds supply.

          Theoretically true, but I doubt that argument applies in this case. The industry crashed by such an order of magnitude that there will be ex-tech workers ready to jump on the bandwagon for years. And there are so many more kids growing up with computers these days that it is very unlikely that university enrollment in related fields will wane.

          Remember, certain kinds of tech jobs are difficult, and there's many a person who can't stand being a knowledge worker behind a desk all day, thinking about problems and solving them. If the pay goes too low, interest will be lost disproportionately fast, because many kinds of IT jobs simply aren't very fun.

          Oh, I have no doubt that there will be some high-paying sysadmin jobs available in the future, but I don't see many high-paying developer jobs opening up. Much of the most demanding, coolest development will be on free software projects. Adapting an existing app to a specific situation is way less fun (and takes less skill and fewer man-hours) than writing something completely new. Hence, there will be fewer high paying development jobs.

          -a
          • The industry crashed by such an order of magnitude that there will be ex-tech workers ready to jump on the bandwagon for years.

            There's a name for a technical worker who's been out of work for years: "plumber". :) Seriously, though, have you looked at history? There was a tech bubble in the early 90's. When we were hiring, we'd get 50 _qualified_ resumes for just one opening. It was phenomenal. The people we could have hired then were 3 times better than most of the "wanna code Java web pages, dude!" guys that were later the bread and butter of our interviewees.

            Anyway, the ultimate long term consequence of "a computer in every electronic device, home appliance, and automobile" is a lot of computers, dude. Moreover, corporations needed automation then and still do. Enormous amounts of data pass through corporations.

            C//
      • by Skyshadow ( 508 ) on Monday August 12, 2002 @06:02PM (#4057284) Homepage
        Smart companies will find good people and pay whatever it takes to keep them.

        Stupid companies will offer low pay and deal with the people they attract with that, who will then go out and make boneheaded decisions and toss the whole company into well-deserved chaos.

        Now, I agree that the average salary has gone down; however, that's more from there now being a reasonable pool of people to choose from than from 1% unemployment.

      • "Bigger salaries and more work"? Puh-leez!! Since when has the collapse of an industry caused salaries to go up? The whole reason why unqualified people flocked to IT in the first place was the high salaries. The high salaries were the reason why companies pressured the US government to relax immigration controls. Frankly, I think we will see smaller salaries and more work as proprietary programmers struggle to compete with open source.


        Why would people take more pay cuts? I'm in a secured position, for the most part. The only way I can go is up. In 5 years, you better expect me to want a lot more money. The IT field is a high paying field. End of story.

        As for your statement, "Since when has the collapse of an industry caused salaries to go up?" - that is full of troll-fodder. First, the industry has not collapsed. It has merely gone on a much-needed diet. Since the creation of the Ford automobile, there have been 15,000 failed automobile manufacturers. You think when it was a boom people were getting paid a lot to design cars? Yeah, they were. Then it started to decline, and now automobile designers are again making very good salaries, and most of them have been in the business for a very long time. Same thing in the corporate world. And saying that open source software is going to compete with developers is just utter I've-never-worked-in-a-large-company-bull-shit. In the company I work at, 90% of the server software (Excluding Oracle and WebLogic) we use is open source. Same with other companies I've worked at, for the most part. You know what keeps me employed? I know a lot about open source software. I know how to code to it, and expand on tools that are already available to better serve the corporation. That's why I still have a job. Most open source software is not ready to be used in an enterprise environment. However, with a few code modifications, and some clever front end GUI applications, they become very usable. I'll never worry about being put out of a job because of open source software. Purely because you need to have a coder in house to understand it.

        • Why would people take more pay cuts?

          Because they will get laid off otherwise? Because unemployed IT guys are willing to work for less?

          I'm in a secured position, for the most part. The only way I can go is up. In 5 years, you better expect me to want a lot more money. The IT field is a high paying field. End of story.

          Wishing *will* make it so. Wishing *will* make it so.

          Since the creation of the Ford automobile, there have been 15,000 failed automobile manufacturers. You think when it was a boom people were getting paid a lot to design cars? Yeah, they were. Then it started to decline, and now automobile designers are again making very good salaries, and most of them have been in the business for a very long time.

          I have no knowledge of the car industry with which to dispute your comment. However, you haven't really made any attempt to explain why the industry goes up and down, just that salaries track the market.

          And saying that open source software is going to compete with developers is just utter I've-never-worked-in-a-large-company-bull-shit.

          I work for a large company.

          In the company I work at, 90% of the server software (Excluding Oracle and WebLogic) we use is open source.

          And I work on a product that is 90% open source and developed using 100% open source tools.

          Most open source software is not ready to be used in an enterprise environment.

          I disagree. We're not gullible enough to spend $50 per seat (or $1000s in developer-hours) on an improved product when we can get the stock version for free.

          However, with a few code modifications, and some clever front end GUI applications, they become very usable.

          I make some productivity tools and a few people use them, but if it was the kind of thing everyone wanted, it would already be in the free version. Some people want a CLI, others a GUI, others a TUI. You can't please everyone.

          I'll never worry about being put out of a job because of open source software. Purely because you need to have a coder in house to understand it.

          I guess because open documentation isn't particularly glamorous work.

          -a
      • by br00tus ( 528477 )
        I agree; the ITAA has very consciously been changing laws regarding H-1Bs, the FLSA, section 1706 and so forth for years. Most programmers and admins have never heard of these laws (ignorance is bliss), but they are one reason why wages will decrease: the ITAA was and is changing laws with very little organized resistance from IT workers. Many of them are like the top parent poster, who is happy that positions have dried up and companies are laying people off. They seem to have this idea that there is a skill line which they are on the right side of, and which protects their salary. That doesn't explain why IT salary surveys for the past two years have shown all salaries coming down - factoring in inflation, it looks even worse.

        Salaries and wages will be coming down for a long time, something which was planned by the ITAA (financed by Microsoft, IBM, Intel etc.), which has been spreading millions around Washington for years. In most professional industries - doctors, lawyers, dentists, whatever - you have skilled workers, who work many hours, but who are still concerned enough about the profession to form the AMA, ADA, ABA and so forth. These "super genius" programmers think that economics doesn't affect them. How many dentists say "I'm the best dentist there is, I don't care about what the HMOs are doing"? These people are displaying a lack of professionalism and calling it professionalism. And they always get modded up high.

        Anyhow, these people exist, but the important thing is that people who are more clued in exist too. The best thing to do in these dark times, where the ITAA is very powerful and organizations like the IEEE are beyond hope, is to get into contact with one another (i.e., the clued-in people talk to one another) and go from there. I think the Programmers Guild is a good organization, and there are some interesting Usenet newsgroups, although we probably need one moderated by one of us. Since this directly involves my lifestyle I have put a lot of thought into it; this is not esoteric to me, it is very important, as it severely affects my life for years to come. Here is my web page [geocities.com] on various IT work things which hopefully will help point people in a positive direction to do something constructive.

    • by diverman ( 55324 ) on Monday August 12, 2002 @06:26PM (#4057450)
      Well, while I agree that it would be nice to see those who caused the .com bloat go, I don't think that things will happen the way you see it... not without other effects.

      I think that one effect of having a bloated industry of underqualified individuals, is that those who ARE qualified are being lost in the shuffle. Many of those that came into the industry because of the money, with business degrees, a few tech courses, and little to no real experience also had one thing going for them... something taught in business schools. That is "Networking". No, not packet or switch networks... but people networking. They cling to people they know, thus needing to rely on their own abilities less. Unfortunately, that "networking" mentality is something that many qualified IT professionals tried to avoid being dragged into.

      I am always trying to keep an eye out for the friends I have that I would really like to see remain in the industry. Not just because they are friends, but because they are the type of professional that I think can contribute positively to the industry, with their experience and their potential.

      So, while I agree that those .com'ers need to go... let's not forget that that shouldn't include people who were validly entering the industry at that time, regardless of the boom. Many people who were caught up in the mess chose their majors before the boom occurred, or before they were aware of such things. They may be a .com generation, but they aren't necessarily the ones that made the Net go to hell.

      So, in conclusion... I know plenty of people who should be in the IT field. Unluckily, many of those are also ones finding other professions.

      Just my $0.02...

      -Alex
  • re: (Score:5, Funny)

    by vanlomez ( 600422 ) on Monday August 12, 2002 @05:32PM (#4057015)
    "...At many companies it is now easier to get a corporate account at a strip club than a new PC"

    where are YOU working? :)
    • Accounts at strip clubs and such are a common thing at most companies and even in government. Ask accounting. From experience I know that the state of Florida had regular lunch meetings at Hooters on the taxpayer's bill. Far more common are executive clubs, tables at fine restaurants and the like. Same basic thing, though - bonding among executives to work as a team.

      Note that I'm not really saying that it's right or wrong; it certainly encourages the glass ceiling for women. It's just a fact of business.

      --
      Evan (no reference)

  • by Anonymous Coward on Monday August 12, 2002 @05:33PM (#4057025)
    Ignorant computer folk that went into IT because IT was a big $$$ maker. Now there is such a flood of morons out there that companies are afraid to even look for good talent.

    Simple. Concise. Correct.
    • Simple. Concise. Correct.

      This sounds like one of those phrases that ought to be followed by, "Pick two."

    • I feel so strongly about this, I'm even willing to burn Karma for it (not that Karma really matters anyway). I agree with you completely. "IT" has come to be the banner of the lazy, ignorant masses who thought, "Hey, I don't need to get a college degree or learn any real skills! I am a h4X0r, and I can administer a Linux server, so I will just set up www.burninvestmentcapitallikeitsafossilfuel.com and I will become an internet millionaire overnight. Just look at Bill Gates, Larry Ellison and Michael Dell -- all high-tech college dropouts who together are worth more than many small nations." (Hint: for every one of them, there are about 100,000 college drop-outs who really did get nowhere, just like everybody said they would.)

      The whole premise was so clearly flawed from the beginning that the only reason it ever caught steam was that the venture capitalists were even bigger morons than the .com kiddies. They saw "high-tech", heard all of the nifty buzz words, and thought that somehow more money would magically turn money-burning websites into huge cash cows. When reality hit home and everybody realized what we all knew all along -- that you need not only money, but an actual business plan to make money -- the .com bubble suddenly burst and all the morons were out big time.

      Now all those pimply-faced kids who were running around with their noses so high in the air that they couldn't see solid ground for the clouds are out of work, and somehow think that their previous ability to burn investment capital qualifies them to work in the real world, so they're flooding companies with their resumes as "IT professionals," but their employment history reads like a listing from f&@*edcompany.com, so nobody wants them.

      I, for one, couldn't be happier. While they were all congratulating themselves on how 31337 they were, I was studying my arse off trying to get an EE degree with a wife and a kid in tow, and now that the bubble has burst, I'm within spitting distance of my degree and I already have a stable, well-paying job in an industry that has grown quite steadily for a very, very long time (defense). So, excuse me while I gloat and tell all you .com bums that if you ever want any respect from the business world again, get your noses out of the air, admit you don't have any marketable skills, get off of your lazy butts, get back to school, learn some real skills, and I'll see you in the real world a few years from now when you're done.
    • by fwr ( 69372 ) on Monday August 12, 2002 @06:33PM (#4057511)
      I'm not sure what type of business you are in, but we are definitely looking for good talent (drop me a line if you are an experienced Cisco router expert that can manage large WAN environments for a "five 9's" type service - NY/North NJ area). I don't think companies are afraid to look. I think they are fed up that there seems to be a lack of good talent out there, even though everyone is complaining about IT layoffs. If they are afraid of anything, it is wasting their time sifting through all the morons to find the good talent (but that's HR's job anyway). You either get the hopeful person that isn't really qualified and tells the truth, or the imposter that basically lies about their abilities and you have to end up terminating them anyway. Find me some good talent and I'll find you some good jobs. Note that I'm not a manager; rather I'm a techie myself that gets to / has to "approve" a candidate before they are hired. If they can't get past interviews with fellow techies then there's not much chance that management will even entertain a relationship.

      You know - that's what Slashdot really needs. A Slashdot job section instead of just a URL to OSDN Job Seeker. Maybe people that want to comment on a story could click a check mark or something that says either they have a position open that may be relevant to the story or are seeking a position that is relevant to the story, and not have to make a blatant troll for talent/jobs. Something integrated into the system. Possibly another rating like Karma but for technical knowledge. That way when a techie makes an informative comment people can increase their "tech score" or decrease it if they make a technical blunder. Or maybe not...
  • Security (Score:2, Insightful)

    by HerrGlock ( 141750 )
    Hopefully the management will realize that having 10 minimum-wage types working on their systems is not the way to go. They will then pool that money and hire one person who is worth a darn to do network security.

    That is where they need to spend money, not on sub-vice-assistant-coffee-boys in charge of creamer for the network.

    DanH
    • But as things get more stable and programmers learn how to avoid things like buffer overruns, who's going to need to keep somebody on staff for network security?
  • No, OVERVALUED (Score:5, Insightful)

    by Ars-Fartsica ( 166957 ) on Monday August 12, 2002 @05:34PM (#4057037)
    We have successfully put to rest the cult of the superstar CEO; now it is time to put to bed the myth/cult of IT.

    Most firms now realize that they can survive another year without upgrading their router or servers, which were either so expensive originally that they simply must sit in the rack room longer, or are "good enough" even if they aren't the latest model.

    Software is a whole other story. Most companies realize now that upgrades are a scam.

    On top of all of this, many buyers realize that the latest tech will simply make them part of a large beta testing mob, whereas their old tech is now largely debugged and productive. Certainly MS users understand this.

    • Re:No, OVERVALUED (Score:2, Insightful)

      by daemones ( 188271 )
      Timing the jump from version to version, however, is important. Given their way, MS might make getting anything other than the newest version of their software patently impossible. Legally impossible too, if they can buy enough senators.
    • Re:No, OVERVALUED (Score:2, Insightful)

      by SpamJunkie ( 557825 )
      Superstar CEOs haven't gone away! We've just realised that they are more rare than we thought. Look at Steve Jobs; he's a superstar CEO if ever there was one. Gates would be too, if he were still CEO.

      IT is the same. You can't just say it's overvalued. It isn't that simple. Your point that upgrades are a scam simply suggests that commercial software is overvalued. With OSS software you aren't part of the beta testing mob unless you want to be, by downloading the unstable branch.
      • Yes, Gates and Jobs are exceptional, but as a general rule, corporations have edged back from ridiculous expectations for their CEOs. Certainly in Silicon Valley during the goldrush times, almost all CEOs were seen as walking on water by virtue of their title.

        I'm not talking about the exceptions (Gates, Jobs, Buffett, Weill), I'm talking about the other ten thousand.

    • Re:No, OVERVALUED (Score:2, Interesting)

      by Enzondio ( 110173 )
      Software is a whole other story. Most companies realize now that upgrades are a scam.

      I'm not sure this is accurate, I think in many cases management is afraid of getting behind the times. This is especially true in companies who deliver computer/Internet related products. Our CTO is constantly trying to keep our company on the bleeding edge of technology (sometimes trying a bit too hard if you ask me). Many times this is simply so we can impress a client by saying that we're working with the latest greatest stuff.

    • Re:No, OVERVALUED (Score:5, Insightful)

      by Otter ( 3800 ) on Monday August 12, 2002 @05:46PM (#4057147) Journal
      Yeah, the whole tone of the question is indicative of what you're talking about.

      Can IT be used to do things that will be productive? Sure! Can IT spending help your bottom line? Definitely. But those gains come from coming up with a good application, not from implementing an XML-based intranet system and having money magically fall out of the sky.

      Asking "What is the value of IT?" reminds me of the admins we always have to deal with who think the fundamental activity here is having a network and computers, and that all those things everyone tries to do with their computers are just irritating distractions from idiot lusers.
    • I'm not saying that what you're saying isn't true. I just don't believe that it has anything to do with anything. If there is ANY lesson to learn from the last few years, it is that people do not operate (either personally or professionally) in anything approaching a rational way. People operate on the principles of greed and fear. Right now we are in the fear state. Sometime later we will be in the greed state again. I always snort derisively when stock market "analysts" say that investors have returned to "value" investing (not that the analysts themselves had anything to do with people doing "non-value" investing...).
    • This is absolutely true. Businesses don't need better technology, unless their business IS technology. An average firm only needs one or a few big databases, a network of pcs, office tools, e-mail, a corporate web site, and other stuff like acrobat.
      Win2k or NT running on Pentium IIs is all anyone in a basic office really needs. The only use for higher technology is technological work. Scientists, programmers, animators, etc.
      Finally they have realized that faster computers don't increase productivity or profits. It takes "joe" the same amount of time to prepare his presentation whether he's got 400MHz or 2GHz. Powerpoint 97 isn't more or less productive than the newest one (XP?).
      When upgrades become necessary, which is about every 3 years I'd say (my current pc has lasted me about 3 years, and just now I'm finally unable to do things I want to do), that is when more spending will go to IT. Other than that, the only IT spending will be limited to salaries for the employees needed to support the existing system.
      If a company is spending more than that on IT, they are wasting their money, and it looks like they finally realize it.
    • Certainly MS users understand this.

      sure - but isn't it much more convenient to blame the economy, or piracy, than to blame one's own crappy products or policies?
    • No, Undervalued (Score:2, Insightful)

      by ergo98 ( 9391 )
      One mistake that people often make when looking at industry is presuming that there is some "stopping point" at which the company has been tech enabled: How utterly insane. Business is competition, and if you're in a billion dollar field and spending $20 million on a new database system will allow you to proactively respond to your customers quicker, gaining more marketshare, then it's likely worth it. If you replaced a database server with a new, super hyper database server that gives you a CRM system that allows you to capitalize on every call, and keep customer satisfaction at its pinnacle, then not only is it a good investment, competitively it is likely crucial for you to survive as a company. There are countless examples like this where competition is the driving force behind technology: Sure, righteously deride technology in a luddite fashion, but remember your words fondly when you're in the unemployment line.

      Economic slowdowns cause a cessation in spending, and the reality is that often the spending reduction turns out to be disastrous for those companies that fall behind.
      • Well, the general point is that people weren't waiting for technology that would fruitfully improve profits (no one is disputing the value in that)... it's the issue of people buying whatever was being pitched to them because it was new.

        As for your example of CRM systems, there are dozens of companies who have blown huge budgets trying to make flaky enterprise apps work. It's often a sucker's game.

    • Re:No, OVERVALUED (Score:3, Insightful)

      by stwrtpj ( 518864 )

      My post delves a bit into corporate politics and I'm risking an offtopic mod, but what the hell ...

      Most firms now realize that they can survive another year without upgrading their router or servers, which were either so expensive originally that they simply must sit in the rack room longer, or are "good enough" even if they aren't the latest model.

      This is a true enough statement, but the problem that I have seen several times in the industry is that some companies take IT cost-cutting to a ridiculous extent. I have worked at two other companies before my present job, and in talking with others that stayed on, in each case the company followed these actions:

      1. Get into a budget crunch
      2. Look for ways to cut budget, finally notice IT budget bloat
      3. Some brilliant manager who is not even in the IT sector says "Why do we even need to keep this in-house? Let's out-source!"
      4. Company outsources IT department
      5. Budget problem solved, managers pat themselves on the back.
      6. Meanwhile, the rest of the company is now stuck with the bozos from the outsourcing company, a contract that went to the lowest bidder, and as they say, you get what you pay for.
      7. Productivity and morale nosedive and the managers never make the connection, instead blaming it on factors that are not even related, and they wonder why people keep leaving the company in droves.

      The point of this is that, yes, some IT budgets are horrendously inflated, but many companies wind up throwing out the baby with the bathwater. Trimming the IT budget intelligently takes time and effort, and a lot of upper management does not want to deal with this. They have a "just get it done" mentality, and get kudos for the money they save each year and not necessarily for making smart decisions about how they got the budget down.

      "You reached your goal of slashing costs by 25%? Excellent! Here's your bonus"

      Meanwhile, it takes the average developer about twice as long to get someone to do something simple like set up a database or add a new user account.

  • IT is overrated (Score:3, Insightful)

    by adamiis111 ( 525750 ) <adam@varudNETBSD.com minus bsd> on Monday August 12, 2002 @05:34PM (#4057038)
    I love working in IT, but let's face it, this is just like any other department in a company. Many of us have seen the total waste of $$$ that an IT manager will sell to the higher-ups - sometimes just to work with new technology, etc. The fact of the matter is that at a typical company, the IT budget is not seen with an eye on monetary rewards. That has changed recently. Business rules state that if a secretary does something well for 30k a year, don't spend 200k to eliminate his/her position, as it is not cost effective (even 100k is too much because it probably doesn't include maintenance costs and the cost of changing business rules (which is much more expensive for software than a secretary)).
  • Innovation? (Score:5, Insightful)

    by kin_korn_karn ( 466864 ) on Monday August 12, 2002 @05:35PM (#4057042) Homepage
    What can we do to keep the spirit of innovation alive while this 'IT is bad' era lasts


    This is easy - innovate. Don't just buy new hardware and upgrade software, do something that IMPROVES life at the office. There's more to IT than scalable switches and making sure that you can ping the server. Come up with new applications of the technology and make yourselves valuable.

    Create your own value, the rest of us have to do it.
    • Agreed (Score:5, Funny)

      by Christopher Bibbs ( 14 ) on Monday August 12, 2002 @05:41PM (#4057092) Homepage Journal
      I work in software development and in speaking to our customers (yes, I actually talk to my users directly so I know what they want/need) many of them are working on much better and more useful applications than they were two years ago.

      Fewer online phone directories, more online report generation from divergent systems. Moving data from paper booklets to online is cute, but what does it save? Create real-time reports using data from different systems (internal and external) and suddenly you have something that makes life much better.
    • There's more to IT than scalable switches and making sure that you can ping the server. Come up with new applications of the technology and make yourselves valuable.

      I personally don't think that should be called IT. IT - Information Technology - is making the data flow. It's not designing new ways to use networks, or creating innovative applications. Instead it's building the same old database application with different field names, or configuring a router with a wizard. To me, this is just the image that things called IT have put out. I won't call myself an IT worker because of this. I don't do any of those things, and that is what people who write paychecks see when you say "IT".

      Those things that are called "IT" really are essentially dead from the view of the person asking this question, because the work is 99% done. There aren't any innovative things you can do with a database that are essential to corporate activity. Most places have systems in place that work, and don't need to waste money on new crap. Companies that are thinking this way are right. Too much money was wasted on buying infrastructure over and over again. Let's wait 10 years before we do that again.

      In the meantime, there is plenty of money to be made in the market if you can think of new, innovative things, and are capable of designing high performance technology. Hopefully the people who were going along on the IT ride for the extra cash will be weeded out, and progress can be made again without wasting money on salaries for people with barely enough knowledge to get in the door. If you want in, and I shouldn't have to tell you this I suppose, the first thing you have to do is forget about "IT" and calling yourself an "IT" worker. Be an engineer. The buzzwords matter when you have to convince somebody who doesn't really understand what you do to write you a big fat check every week.
  • Cause and Effect (Score:2, Insightful)

    by Fehson ( 579442 )
    This is the outcome of the insane IT boom of the 90's. I see many companies that purchased false promises and have nothing to show for it now. The company I work for has boxes full of software from various vendors, most of which doesn't do what it claims, or does it so poorly it's not worth running.

    I don't think the attitude we're seeing is the death of IT, but rather the maturing of it. There is now a need to justify purchases. The "we need it cause it's cool" attitude is long gone. People now look at a product and ask, how will this help me?

    Innovation hasn't slowed down; false promises are no longer considered innovation. I can envision the next few years being some of the most innovative we've seen in quite a while. The smoke has cleared and the mirrors are gone; 'tis time to do real work.
    • I agree, and would even put forward the hypothesis that this is a sign that the bust is getting closer to ending. There will still be many shocks in IT, but this, along with other signs, points to irrational apathy towards IT. Another sign will be ignoring all benefits, even if the project would have positive value to a disinterested observer.
      At that point those who invest in the good projects can gain an advantage in the short term, and this advantage, as well as the associated momentum, will spark the beginning of the recovery.
  • The circle of life (Score:2, Interesting)

    by Wrexs0ul ( 515885 )
    This is just a low cycle. Because IT as it is today is such a new field, this is the first low end of a new business structure, one that'll rebound in the coming months/years. There are still contracts out there to be had and still companies willing to pay good money for them; you just have to look harder and be more innovative to outlive, outlast, and outplay the other IT survivors until life gets easy again.

    -Matt
  • you want the truth? (Score:5, Interesting)

    by EnderWiggnz ( 39214 ) on Monday August 12, 2002 @05:41PM (#4057083)
    the truth is - nope, IT spending in the vast majority of companies out there should be drastically reduced.

    I know of a NE Power Company that spends 10k/year/employee on IT expenses.

    Which is insane. They aren't a software company, they aren't a development company; these expenses are a pure expense that generates no revenue.

    none. nada, zilch.

    how can you justify paying an HS graduate with a "certification" that tells people to reboot their machine as a fix for everything double what you would pay a marketing person with a college degree?

    you can't - there is no justification. Then when you consider perpetual hardware upgrades and software licensing, you get an even worse picture.

    Look at it this way - if you spend $1500 on a home appliance, like a fridge, washing machine, how long do you expect it to last? 15 years? 20? more?

    and you want businesses, who aren't in the computer industry, to buy new equipment every 2-3 years? not gonna happen.

    then there's MS's latest yearly tax. if MS had had it ready 3 years ago, it would have worked, but not now, no way - businesses don't care what they run, they just want to keep their expenses down.

  • New PC's (Score:5, Insightful)

    by captain_craptacular ( 580116 ) on Monday August 12, 2002 @05:41PM (#4057084)
    This may not be a popular opinion but I really don't think the majority of corporate users NEED a new PC very often. I'm a full-time software developer and I'm perfectly happy with my "ancient" PII 400... Granted there are always exceptions to the rule, but for the most part I think new PC purchases should be scrutinized.
    • Granted there are always exceptions to the rule
      Video editing, DTP, graphics etc.

      However, most of the time a new PC has to be an overpowered box - when was the last time you saw a PII 400 around? The slowest boxes I've seen are Durons, at around 1100MHz.

      If a new employee comes to town, or a machine breaks (or even part of a machine), a new box is needed.
    • Out of curiosity, what is it that you develop?

      Personally, I'm happy with my PIII800/256 at work, but only because the compile times give me plenty of time to read /.

  • by sterno ( 16320 ) on Monday August 12, 2002 @05:41PM (#4057086) Homepage
    The late 90's: All Tech Spending is Good
    The early 00's: All Tech Spending is Bad

    These are oversimplistic reactions to the problems that came from tech spending in the 90's. Many people were creating products that were full of pizazz but didn't work for crap, and people bought them because they thought technology was their salvation. Well guess what, technology isn't a magic pill, and anybody who claims ANYTHING is a magic pill should be taken out back and shot.

    So now today, everybody is gun-shy and overcautious. A company gets burned in the 90's converting their billing system to some flaky electronic system that has cost more money to keep together than the old system did. Today they get the choice of buying yet another new system, taking the same risks again, or sticking with the known quantity. At this point, with money tight, few are willing to take that risk to get it right the second time because they can't afford to get burned this time.

    Over the next few years as a recovery slowly works its way into the system, some people will feel that they can take some risks again. Those flaky systems will have long since been purged from the software gene pool and there will be good products that people will be able to trust. We'll actually begin to see those efficiency gains that were supposed to happen during the 90's hype and the world of IT will be back in business.

    Until then, batten down the hatches and hang on tight :)
  • I'm the head of IT for a small Financial Management company in the midwest. There are 23 computers on my network connecting to two Win2000 servers for print and file sharing. I see 99% of my entire IT budget being poured into Microsoft.

    Look at every new machine purchased. How much of that cost goes directly to Microsoft for their operating systems and, even worse, their Office suite? There was an article here dealing with it, but since it's almost time to go home I'm not going to take the time to look it up exactly. It said almost 80% of new computer costs go to licensing your Microsoft apps. And what's really sad is we are stuck on Windows. All our mutual funds send us their prospectuses in some crappy VB program or their brochures in Word format.

    So don't fool yourself with where your IT costs are going. They're going to Microsoft and the trash for all the IT people who sit and read Slashdot all day when they should be working ;)
  • ah, the fickle... (Score:2, Interesting)

    by macsox ( 236590 )
    this, unfortunately, is the flip side to the internet-bubble coin.

    remember how, in middle school, people followed the trends initiated by the cool kids? same thing works in business. in the late 90s, people thought 'hey -- the joneses are buying nobusinessmodel.com, so i will'. now people think 'wow -- i lost my shirt on dot-coms -- computers are a waste, as that fellow in usa today pointed out.'

    wired magazine had an article two months ago that pointed out that all tech developments went through a curve -- early adopters got people excited about a tech, but then excitement waned because people couldn't see the use. as applications became apparent, adoption and excitement picked back up. technology in general is going through the same trend.
  • Ever heard of that? I know far too many IS projects that have costs, but no way to quantify the benefits. There is one very simple thing to understand here: costs come out of IT, benefits come out of the customer department. If you can't get the customer to step up to the plate and provide a believable justification, followed up by demonstrated results, you're sunk.

    The next layer here is time frame. A project that starts paying for itself (i.e. is deployed) three months or less from inception gives everyone a benchmark for judging whether the customer's estimate of benefits was accurate, as well as IS's ability to deliver those benefits. A project that won't be deployed for three years is, frankly, silly. It's an exercise in blind faith, not an exercise in rational development strategy. If your development methodology delivers things infrequently in large lumps, you've got real problems. If it allows you to break large projects down into two or three month chunks that can be deployed to start getting a return on that investment, then you've got something.

    The last layer is risk. Those massive three year projects have unacceptable business risks. If you break them into two or three month deployables, you've limited the risk your customer (internal or external) faces.

    John Roth
  • by Anonymous Coward
    I work for a VBC that's known as one of the best-managed companies in America. We love companies with the attitude you describe. We call them "also-rans". Each year our businesses are given a target cost they must remove using IT projects. Spending on the projects comes from projected savings. The business owns the budget, but has to spend it through a centralized IT group. In our business, the number was $270 M (our revenues are about $10 B). Miss the number one year and the VPs don't get bonuses; miss it twice and the VPs don't have jobs. The trick is measuring the benefits, and auditing after project implementation to make sure they aren't playing games with the numbers. Does it work? This year, we expect to save [eweek.com] $1.6 Billion (across all businesses) while increasing IT spending 12%.
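    A minimal sketch of the kind of target-versus-audited-savings check described (my own illustration, not from the post; the $270 M target is the only figure taken from above, the project names and per-project numbers are hypothetical):

        # Hypothetical check of audited savings against a cost-removal target.
        # Only the $270M target comes from the comment above; the rest is made up.

        def savings_gap(target, audited):
            """Positive means the target was beaten, negative means it was missed."""
            return audited - target

        audited_by_project = {                 # post-implementation audited savings (hypothetical)
            "supply-chain rework": 120e6,
            "billing consolidation": 95e6,
            "legacy app retirement": 40e6,
        }

        target = 270e6                         # the $270 M cost-removal target cited above
        audited = sum(audited_by_project.values())
        print(f"audited: ${audited/1e6:.0f}M, gap vs target: ${savings_gap(target, audited)/1e6:+.0f}M")
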
  • one bean, or two? (Score:2, Informative)

    by mojogojo ( 577421 )

    However, alternatively, I've been thinking upper managers aren't worth their salary, given the overall burden they place on IT (i.e. regular staff meetings and various time-wasting activities that eat time which could be devoted to streamlining the business... or handling a data conversion from a merger, etc).

    And besides, what value do you assign to the cost savings from automating processes so individual peons don't have to press buttons and screw up data? One bean, or two? When I convert a customer, order, and trouble ticket history database from one company to ours... and it allows a single support person to intelligently answer a new customer's question quickly - is that worth one, two, or three piles of beans?

    I hate justifying my existence, and generally never have to when I deliver on projects that simplify processes and do things quicker than would have to be done by hand...

    I'm not sure, but I really don't think the sky is falling either...

  • IT is OUT (Score:2, Insightful)

    by godemon ( 599146 )
    Let's face it, the networks are in place, the companies have a computer (no matter how old) on every desk, and they've already run their network lines throughout - and if they haven't yet, then they aren't going to.

    The age of innovation and upgrading is over; the workers of IT have built a solid IT foundation, and now that it's constructed, the budget is cut to simple maintenance. Of course the IT dept isn't disappearing, it's just no longer expanding.
  • I don't know too much about valuation -- I'm just a software engineer. What I do know is that I'm vastly underpaid for my position, even in today's economy. I also know that while most of my colleagues use lots of worthless IDEs that cost $400 a pop and run them on machines that cost $4000 or so, I'm sitting over here working twice as fast on the same PII 366 that they issued to me back in 1999, LONG before the bubble burst, and y'know, I've never needed an upgrade because vi and jdb just aren't that intensive.

    I've noticed that as my colleagues' machines have been upgraded, their code has gotten slower and slower. Maybe your admin assistant needs a GHz machine to run your finance spreadsheet, but I know a good developer doesn't need the $4000 machine to hack code. I also know that our new enterprise Solaris box isn't as fast as the old dual-processor Linux box that we used to use.

    Okay, now that I'm going to lose my job for whining about corporate decision making... everyone have a nice day.

    -brian

  • Is today's IT undervalued? To a small degree, yes - a little bit of an overreaction to yesterday's IT being overvalued. Spending probably should be reined in compared to previous years, more justification for purchases and projects offered, etc.

    People should realize that the IT of a couple of years ago was inflated just like the stock prices, and that today is more normal than then. Apologies for the bad news.
  • I think our company (I work for a college in the UK) doesn't quite realise the importance of IT in their establishment. I think the problem, at least from the point of view at my place, is that people just do not realise how much computers have taken over the 'behind the scenes' aspect of things. Computers are responsible for our critical finance data, running our telephone system (Cisco IP Telephony), running staff payroll - the list is endless. Yet people still see network backbones as a few pieces of old BNC strung across the back of a wall into a small repeater under the desk... what they don't see is the miles and miles of UTP and fiber we have linking the various blocks and departments together, not including the various switches and routers connecting networks.

    Is it because IT systems have become so reliable, and so transparent to the average user, that they don't give it any thought any more? We keep pondering flicking the power switch on the core switch one day, just to make people suddenly realise how much their IT network means to them... I'd give it about 1 minute before we got hassle from senior management asking when the network was due back up.
    • by NineNine ( 235196 ) on Monday August 12, 2002 @06:02PM (#4057286)
      What about all of the power to power your precious machines? You ever thought about that? I'd guess that 80% of all companies today could *get by* without a computer system, just as they did years and years ago. I'd guess that 0% could get by without power. Yet, you don't hear the electrical engineers saying "I bet they'd appreciate me if I shut down this transformer". Please. Grow up.
  • Tangible benefits? (Score:2, Interesting)

    by cheezehead ( 167366 )
    ...especially if the advantages can't be reduced to a simple dollars-and-cents figure?

    This hits the nail on the head. I make software for a living, and I can't imagine how I ever did my work without the Internet. E-mail, Web access, it all saves me quite a bit of time, and therefore the company money, but I would have a hard time trying to quantify these savings in dollars and cents.

    I don't think IT departments will die. Virus checking, bug fixes, etc., it'll all still be necessary. People are getting more and more dependent on IT technology (wireless e-mail, web access). There is no way we are going to go back to snail mail and typewriters.

    Now, delaying buying the latest and greatest hardware, and the latest and greatest version of MS Office, that I can see happening...
  • Just a theory:

    Could it be that management has finally caught on to the trick of buying hardware and software that requires constant expert maintenance? As an IT dept. head, wouldn't you rather buy stuff that you know would keep those under you busy and employed? You sure wouldn't want to be the one that made the decision to buy hardware/software that made the IT dept. unneeded.

    (I'm not saying this happens often, but it does happen. Ever had to try to convince some MCSE to let you buy the right tool for the job, instead of the electronic version of the spork?)

    Unfortunately, thanks to M$ licensing schemes, hardware that's outdated 3 minutes after the box is opened, hardware that is underdesigned and runs hot, vulnerable software, etc., IT is a money pit. This is especially true in those organizations where some PHB just has to have the latest/greatest as determined by some marketing wiz.

    It's a pendulum that has swung way too far one way, and will swing far the other way. Hopefully it will settle in a comfortable position soon.

  • In any era of diminished confidence, there will be upheaval--this one is no different. If we're lucky ('we' meaning people in software and hardware development), the number of positions will shrink, but the workload and pay will increase.

    I'll go on the unpopular edge and say I think IT has been overhyped, personally, as if it were fundamental to all industry. I think it's important, but certainly not as much as the true business. IT is infrastructure and support. All it can do is act as a lubricant for other processes, not create on its own. To have a massive IT department with huge budgets, particularly if the company is not in an information-dependent business, is like paying structural engineers 6-figure salaries to make sure the building is still level.

    Someday, when software and hardware become truly stable and less bug-ridden, IT departments may just go the way of the contract skilled labor. Until then, we have dedicated personnel and associated costs.

    One of the major issues with having in-house people for IT is that they look for work to keep themselves on payroll. It's natural. The next big thing is XYZ, and we need to have it! That typically spells investment in time and money, but not necessarily a justified need being filled. I've seen it done in past companies many times, and thought nothing of it because we had money to pay for it. Not anymore. For instance, I got 3 different machine upgrades in one year because our IT department kept raising the bar on machines and didn't want to keep older ones around, because it was more work for them to maintain different configurations. It caused more upheaval for the whole company than not doing any upgrades at all, and kept them mighty busy doing conversions. We had 6 people on staff to serve less than 80 employees. Tail wagging the dog syndrome.
  • Nice question (Score:5, Insightful)

    by daviskw ( 32827 ) on Monday August 12, 2002 @05:47PM (#4057164)
    IT is a value-added resource in most companies but, sadly, in most companies it really doesn't directly contribute to the bottom line of profits vs. losses. IT's value is in making the employees' lives easier without intruding on the day-to-day operations of the company. This tends to be a cyclical trend based on two factors. The first is arrogance and the second is repentance.

    The arrogance factor is what drove IT spending a couple of years ago. In essence, it is drawn from the idea that, for the vast majority of corporate America, IT organizations have tended to view themselves as being "The Reason for all Existence." CIOs, and the organizations they represent, develop an exaggerated opinion of their place in the world. The inevitable happens when the CEO realizes that spending a third of the total corporate budget on new computers still means he has to use Microsoft Office.

    The repentance factor kicks in after the arrogance factor has dissipated and IT spending has flushed itself down the toilet. Computers start breaking and the two guys who program in COBOL have either retired or died. The peasants rise up in arms and the CEO takes notice, realizing that just maybe he needs to up the dollar count before he drives his company out of business.

    These two cycles make up the Hebrew Cycle of Corporate Management, or HCCM for short. This is named after the relationship that God's chosen people have developed with God.

    In a couple of years, when processes start breaking and computers get older, causing more downtime than otherwise necessary, the trend will turn around.

  • We're not done bursting the bubble yet. On a fundamental level, I believe in the value of computing and information technology. Any money and effort spent there is bound to reward any large company. The problem is that the effort/money isn't spent well. We've been in a long era (going back much further than .com) where IT is 90% bullshit and 10% reality. It's time to drop the excess weight and get back to doing things.

    For any technical job title out there, be it "Java Developer", "Web Guru", "Network Designer", "Oracle DBA", or any of the other millions, I firmly believe that a large chunk of the people holding a given title (75%, higher in some areas) aren't worth their salaries. They should be making $30,000 for the amount of real benefit they are providing, not $90,000.

    Companies need to get real. They need to split IT HR money into two basic categories, and spend it wisely: Production Support (mostly human macro tasks, shouldn't pay all that well) and R&D (coders, testers, app designers, systems/network engineers - should pay well, but you should have about 10% of the staff you have now, and they should all be skilled and worth their money).

    With a clear vision and none of the bullshit overspending fluff this industry continues to see, I bet the average IT dept could run on 20-30% of their normal budgets.
  • That's funny, I just heard in class today that customers are telling us [ibm.com] that while they recognize that B2C is pretty much dead, B2B and infrastructure update plans are still going forward full force. Bandwidth upgrades, SAN strategies, moving legacy apps to the Intranet -- they're all continuing to move ahead, even in the current economic conditions.

    Sure the .bomb was an interesting picture show, but more importantly, I think it helped companies realize that you really *can* dramatically alter -- and improve -- the way you do business, with the right technology. I'm not talking about betting your business on ad revenue, either.
  • I do not believe that any well-argued IT purchase will be left undone if it makes business sense. The difference is that before the "fall" everything could make business sense; even those ideas that you yourself considered somewhat insane had some potential, because there was always someone who thought they could make some easy, quick money with them. Many did. Also, at that point, companies had extra fat to spend on those adventures. And even to buy the world's smallest, fastest and sexiest laptop for every nerd so that they can watch pron.

    Now, it is just not considered magic anymore (like in: add sulphurous ash, ginseng and garlic and dot com) but is on the same footing as any other investment, which is just where it should be.

  • This post, combined with the story yesterday about IT workers heading for restaurant jobs, points out how important it is to be consistent.

    The companies that are doing the best in the current .com slump are the ones which kept their bearings and didn't overspend. Likewise, heading too far in the non-IT direction will end up destroying a good number of companies which find out, only too late, that they can't keep up with their competition. Consistency and keeping to a logical business plan is the key to a successful business. Always.

    I've heard this story in the comparison of Wal-Mart vs. K-Mart....one spends consistently on IT projects that result in overall savings (i.e. not boondoggle projects), the other simply flaps with the breeze. K-Mart is now trying to catch up with inventory/tracking/pricing/shipping software and improvements that take years to implement correctly. They are also on the brink of insolvency.....not exactly the time/place to be making lots of rushed and critical decisions....

    Perhaps there will be a downtime for IT, but it will come back as soon as things start to break around the office....
  • Waste of money? (Score:5, Interesting)

    by ocbwilg ( 259828 ) on Monday August 12, 2002 @05:51PM (#4057192)
    And, how do you prove the value of IT? This is not as simple as it seems. Try it with a spreadsheet: as your typical CTO has to do so, every day."

    Screw that spreadsheet nonsense. If I EVER hear a calculator monkey where I work say something as half-assed as:

    'IT has now been proven to be a waste of money'

    I'll be headed straight for the wiring closets and pulling the plugs on all the switches and routers I can find. Shortly thereafter I'm sure he'd figure out that IT actually does have value, though he may still be hard-pressed to quantify it.

    The real problem with IT is that we were promising people the wrong thing. We promised them that it would make workers more efficient, allowing us to get the same amount of work done in less time. What really ended up happening is that now we get several times as much work done in the same amount of time. We don't work shorter hours, but we do get more done. That's a good thing.

    The company that I work for has done several projects for businesses and government agencies that seemed prohibitively expensive at first, but usually ended up paying for themselves in savings after 6 to 12 months. We've done computerized inventory and supply chain projects and tied it all together with wireless PDAs, resulting in a faster and more accurate accounting of inventory, reduced labor costs, and the near total elimination of paper documentation that required costly and inefficient storage solutions.

    It can be somewhat difficult to understand, so I can see where someone might deceive themselves into thinking that IT is a waste of money. The value is much easier to see when a specific task is being moved to a computerized system. But honestly, I have to think that someone who sees IT as a waste of money is either a) not using it properly, b) paying far too much for it, or c) not really thinking about it.
  • Companies buy crap that they don't need, don't use, is unnecessarily complex, or that can't work in that environment.

    They get suckered by sweet sales talk, then complain when it does not work.

    If you shoot yourself in the foot because you don't know how to aim, don't complain about the quality of the gun.

    We need a return to K.I.S.S. and a more level-headed evaluation and planning approach.

    I don't know what the answer is. BS experts are just too damned good at what they do. I just read an article in the WSJ about how some sales experts are having a substance injected into their face which reduces certain facial expressions by deadening nerves. IOW, they have cosmetic surgery to hide their lies.

    Get the Goddamned suits out of my face, and I can build pretty good systems that are relatively simple and maintenance-friendly, that solve real problems, and that I will happily stand behind. However, when the suits start sticking their grimy fingers into the mess, then all bets are off.

    True geeks solve problems, true PHBs manipulate problems. Control over key decisions tends to gravitate toward the manipulators for some reason.
  • Will remain so (Score:3, Insightful)

    by jag164 ( 309858 ) on Monday August 12, 2002 @05:52PM (#4057207)
    Fortunately (unfortunately for some) this will be the trend for a few more years while the untalented IT workers fade away. Before the boom, IT and IS had qualified workers. Then the web took off and my grandmother became a VB programmer b/c she's one of the few Americans who can program her VCR. This led to calling HTML monkeys 'programmers'. Then came the era of dot-bombs and an increase in failed IT/IS projects.

    Of course companies are tentative about dumping money into IT, because the money still has a better chance of winding up in /dev/null than of being productive. Basically the same reason why the majority of Americans are afraid to invest money in the market today.

    In a few more years, when the unqualified IT/IS staff go back to ringing up Big Macs(tm), faith in IT/IS will return to normal. In the meantime, if you are good, just hang on and do your best.

    Also, as we get a few years older, more and more people (employers, co-workers, and ourselves) will understand the role of IT and our field will be better defined...thus better 'trusted'.
  • IT Needs weeding (Score:3, Insightful)

    by corwinss ( 523546 ) on Monday August 12, 2002 @05:52PM (#4057208)
    There are many problems that have erupted in IT since the .com boom. These may or may not have been around before, but they are definitely known now.

    One of the biggest problems I see is that there are many managers of IT departments who are just that - managers. Think of the pointy-haired boss in Dilbert. Just because someone is a manager doesn't mean they know the first thing about IT. I'm not advocating promoting your average IT nerd in their place, but there are always a few people who have both management skills and IT knowledge. A good manager passes things off to the big bosses as good ideas. If he (or she) understands what he's working on, then they will probably be good ideas. If he doesn't, then they will be things that look good on paper and get him more funding. Never mind that it aggravates the people below him. It makes him look good, and gives him more money to spend on his desktop toys.
    I have seen this problem in action. It's always fun to get a blank look when you try to explain the simplest of tasks to these people. It's like trying to explain matrix algebra to a 3rd grader, only with less chance of success.

    Another problem I've seen is that, in the name of saving money, people buy inferior products. Some manufacturers are more reliable than others. Before ordering 100 systems from a company because it's "cheap", it might be a good idea to order 5 of them and test them for a month or two, and see how well they perform. Maybe even order 5 from another company to compare them to. Also, ask the people who regularly maintain the systems which kind they have the most problems with. It might be a good idea to get their advice on who to order systems from. Then you will avoid problems like the one I have seen recently, which involves losing more than 1 computer a week to hardware failure. These computers are not even a year old, and still under warranty, but it still causes problems when you have 4 more break before the first warranty part gets there.

    Hopefully, the cutting of funding to IT departments will drive off people who are "in it for the money", like these managers without IT skills, and also will cause people to take more care when selecting computers.
  • by Xzisted ( 559004 ) on Monday August 12, 2002 @05:55PM (#4057226) Homepage
    I run the IT dept. at my company. I am the IT Director, the Systems Admin, and the Network Engineer. It is a small company of 30-35 people. We spend more money on IT than anything else except salary compensation for our employees. When I have to justify something like a computer or peripherals...I usually do it by simply explaining that we are spending $100k on an employee in salary and benefits, and that he has to have an effective working environment in order to be productive. We can't just give a programmer some 3-year-old used PC and expect the same level of productivity from him as from one with a new PC and an ergonomic mouse/keyboard and a nice monitor. Now I'm not talking top-of-the-line stuff like an Aeron chair and a computer with a GeForce4 Ti4600 card...that's just plain ridiculous. But actual investment in hardware and infrastructure can VISUALLY be seen benefiting the users.
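
    As a rough illustration of that kind of justification (the numbers below are purely hypothetical assumptions, not anyone's actual budget; this is a sketch of the arithmetic, nothing more):

      # Back-of-the-envelope payback estimate for a workstation refresh.
      # Every figure below is an illustrative assumption.
      annual_cost_of_employee = 100_000      # salary + benefits, as in the example above
      working_hours_per_year = 2_000         # roughly 50 weeks * 40 hours
      hourly_cost = annual_cost_of_employee / working_hours_per_year   # $50/hour

      new_workstation_cost = 1_500           # assumed price of a decent PC, monitor, peripherals
      minutes_saved_per_day = 10             # assumed gain from not fighting old hardware
      working_days_per_year = 250

      annual_savings = (minutes_saved_per_day / 60) * hourly_cost * working_days_per_year
      payback_months = new_workstation_cost / (annual_savings / 12)
      print(f"Pays for itself in about {payback_months:.1f} months")   # ~8.6 months

    Even with conservative assumptions like those, decent hardware for a well-paid employee tends to pay for itself well inside a year.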

    On another note, since I am the only person in our IT dept. at the current time, I have been able to keep costs down in other areas of my dept. I don't have to pay for training for any other IT employees or for more computers for them. The fact that I have kept my dept. streamlined and directly on task for its purpose has garnered me a lot of faith and responsibility from the higher-ups, which means more freedom with the budget.

    IT shops that just spend, show poor price/performance, and hence have trouble getting things done are a symptom of some really ineffective people in the IT field. I'm sorry, but a degree from DeVry's is not going to get you a job working for me (I am looking to hire someone soon to alleviate some of the upcoming strain on my time). I have been in this field since I was 15, working for Ericsson during high school as an asst. network admin. I did this because I loved the work, not just because it paid well. If an IT person can't show me that they not only know computers but also understand the underlying purpose of an IT dept. (which is generally to help the company get its work done), then they will be ushered right out of my office and back onto the street.
  • Maybe these companies have realized the power of Open Source. The disproportionate amounts spent on hardware and software are a telltale sign of Free Software infiltrating the IT industry. This is a Good Thing for Free Software. If this is not a sign of increased use of Free Software, it will definitely be incentive for IT workers to use Linux and other Free Software instead of commercially licensed software.
  • <BILE>
    'IT' may be a bit undervalued at the moment (that is to be expected after a long period of overvaluation), but not by much. In general, IT departments are dominated by badly trained monkeys who either couldn't make it in real technical fields, or wanted to make lots of money without doing any real work. (Actually, there is a third category: the megalomaniacs looking for a way to lord their power over the poor unfortunates who have to work with them.) It is awfully hard to undervalue such folk.

    The simple fact of the matter is that companies, even today, spend far too much money and effort on supporting their information systems. Some of this is due to the promulgation of inappropriate solutions (read: M$ bloatware). Some is due to the rank incompetence of the run-of-the-mill 'IT' worker (read: MC*E-bearing monkey). Obviously, a large part can be laid at the feet of CTOs and CIOs who don't put proper effort into either hiring or purchasing decisions.

    So, the party's over and the bean counters are no longer willing to squander lots of money on faster machines to run more bloated software just to tickle some geek-wannabe's thirst for chrome. Boo-flippin-hoo! Good for the bean counters! Putting 800MHz PIIIs on a secretary's desk just to type memos and emails was ridiculous in the first place. Now, anyone who recommended such silliness should be thankful they haven't been tossed out on their ear.

    As for how this will affect 'techies': don't make me laugh! 'IT' folk are not, in general, 'techies', they are administrative personnel. The real techies are off doing technical work and are in little or no danger from the 'IT' downturn. In fact, if the 'IT' downturn gets rid of the more obnoxious admins, all the better for the techs who don't have to deal with their crap.
    </BILE>

  • Too many high-level managers I've met miss the real key to an effective IT organization: the people.

    Think about the "bad" IT spending (both in terms of dollars spent and dollars not spent costing productivity) you've seen in the last five years. Here are some of mine: A main file server that keeps filling up. An email server that can handle 8x the level of traffic it will ever see. An IP conference system brought in-house to serve 2-3 conference calls a day. Running fiber to the desktop at great expense to serve people whose primary application is email. Bringing in four different version control systems for one development effort. Replacing one outdated computer at a time, so that suddenly you're supporting a hundred slightly different machines.

    I could go on, but you get the point. All of these are examples of "bad" IT spending that would have been prevented if you'd had good people and good managers, people who understand that the first rule of IT is to know why you need a given thing.

    Now, it's hard to get people who are smart without being arrogant, careful without being overly conservative, etc. Moreover, even if you assemble a good team, it can be expensive to keep it together. The gains of a good group of people are realized only in the long term, and this is why so many otherwise intelligent businesses have incompetent people wasting money.

  • Just shows you that people who aren't actively involved in IT would rather believe it's some magic that is always there, ya know, like air travel or the sun rising.

    Best way to prove the need for IT? Let the hackers and script kiddies run wild and then see what happens when this supernatural force of IT crumbles around them. Who is needed then =]

    (Don't mind me I'm going insane)
  • by Spencerian ( 465343 ) on Monday August 12, 2002 @05:57PM (#4057258) Homepage Journal
    This was going to happen since nothing can stay badly broken forever, not even Microsoft Windows.

    The success of IT rested on three assumptions:

    1) The Internet was a cash cow that needed only to be milked.
    2) Microsoft Windows was the key to all things in the computer world.
    3) IT staffing is always needed to service the legions of PCs in business.

    But each of these failed to pan out, for logical reasons. The Internet was a cash strategy, but it was abused by stupid people putting money into businesses with no business plan and no real product--dot-coms. Bye-bye, they said to their money. Screwed up the stock market, that.

    Microsoft Windows was indeed the way to all things computer-related, from apps to training. And quite a few businesses contracted with "kitchen-sink" computer service companies who could buy, service, or administrate all kinds of PCs (unless you're Mac OS or Linux--that's another sad story in most locations). And training would guarantee most everyone with certification the chance to submit their resumes.

    But this business was based on the fact that Microsoft Windows was ALWAYS in need of maintenance and companies would ALWAYS upgrade their systems for the "latest and greatest."

    Enter Windows 2000--the first Windows OS whose stability and performance claims were justified. Microsoft built this OS with greater strengths as word spread of a newcomer that was free and just as stable: Linux.

    As budgets tightened, managers again asked the budget questions, but weren't accepting the usual answers. "Why do we need to upgrade?" IT managers were able to answer firmly in the past that these upgrades would improve performance, or administration. But managers knew, now, from personal experience that their computer running Windows 98 or 2000 was just fine, and didn't want their copies of Office 2000 messed with for now.

    As the IT monies dried up, IT managers (and contractor companies) tightened their belts and downsized, kicking some experienced techs, and quite a few inexperienced (but certified!) techs, to the curb. Windows didn't need armies to support it any longer. Servers didn't either--a few new technologies consolidated some sysadmin functions.

    And now we're back to the availability of techs and sysadmins with real experience, talent, and diversity. You could be a Windows NT admin, but you may also know Linux. No longer was there room for "computer religion." You might do Mac desktops, but also know PC desktops. It's a screwy kind of Darwinism (no pun intended for the OS X folks), but the competition between the stable UNIX operating systems and all things Microsoft has brought a new (or rediscovered?) dawn to the personal computing world: the generally stable computer.

    Are techs still needed? Sure. However, if all you have is a bunch of certificates behind you and little real experience, those papers plus 50 cents are probably worth a cup of coffee at McDonald's.
  • by t0qer ( 230538 ) on Monday August 12, 2002 @05:59PM (#4057264) Homepage Journal
    Man, I love to spin a good yarn.

    The year was 1998. I had just scored a job as an IT guy for a small Silicon Valley company that was going to revolutionize the world by building a NetMeeting for radiologists.

    Well, somewhere along the line, my paranoid, overly Mormon CTO began to think I was Satan and thusly ordered that all my root privileges be taken from all servers.

    I'm not going to go into too much detail about why, we'll just leave it at he was a lunatic. I could no longer add users to the mail system, apply patches or do anything a person in my job would normally do, so I just sat there browsing the web all day. Surfing the web and getting paid is pretty fun to say the least.

    This psychotic CTO thought it would be a good idea to put the burden of sysadmin'ing on the coders beneath him. That lasted about 2 weeks.

    Secretary calls, "I forgot my mail password"
    me, "Sorry but since I had my admin rights taken away I can no longer fix those problems"
    CEO calls, "I just got an email saying 1 million porn spams for dildos just passed through our unprotected SMTP server is this true?"
    Me, "Sorry sir, but without access to the logs I cannot verify this, here talk to the coder kendyl put in charge of that"

    Where everyone was used to issues being resolved in 10 to 15 minutes with one phone call, it now turned into a trapeze act of phone calls trying to track down which coder was in charge of what system. It prevented me from doing my job, it made the coders' jobs harder from fielding stupid questions, and the CEO was very pissed off about the whole thing. Coders were wasting up to 2 hours a day each dealing with stupid network shit.

    Well, eventually the CTO was fired for being a stark raving lunatic. The coders that held allegiance to him blamed me for his downfall. One in particular would do shit like run a Samba server to fuck with my PDC; due to oslevel=1000000 my NT box would never become PDC.

    The company brought in a new group to rewrite the product from scratch, and they brought with them a very wise admin named Ed Goldthorpe (If his resume ever crosses your desk, hire him, he's worth whatever he's asking) Ed slowly but surely got the coders to co-operate with him and got the network turned around in about 3 months. We had VPN, started running qmail, and basically everything was good.

    I sort of faded into the background from then on. I still fielded support calls from our socal office and the one I worked out of.

    The office moved 2 hours away from my house, where before it had been only 15 minutes. I put up with the 4 hour commutes by spending less time in the office. Eventually the company threatened to put me on hourly, I told them to fuck off and went to find another job. Maybe i'll write about the next job if we get an on topic story for it.

    So, going back to the point I'm trying to make: most of these companies that are ditching the full-time IT staff and doubling the load on their engineers will feel the burn in about 6 months. They will realize that an engineer pestered all day by idiots who can't change a font isn't a happy engineer.

    IT acts as that buffer to keep the animals from eating the engineers/coders alive. When you throw away IT, you'll be losing a few good engineers too. It happened at my company, and it will happen at yours.
  • "How do you feel about the cost benefits of IT?"

    as compared to what...

    A customer of mine had to capture a massive amount of data. Estimates showed it would have taken a 2-person team 700 days to finish the job. Instead, we thought about the problem a little, and devised an automated solution that took a single person 5 days to implement, and then 3 days to process. So instead of 1400 person-days, it took 8.

    I guess if companies DON'T want that kind of efficiency, they can ditch their IT departments. Just don't whine when the companies that exploit IT to be more efficient kick the shit out of the IT-less ones.
  • While I intended my last post as a joke, I do have a real opinion on the matter.

    I believe this is an example of an "us vs. them" mentality. Non-IT workers have unions and lobby groups crying for their every little thing, while IT workers don't really have such things. So of course those without representation get shit on.

    just my thoughts

  • Perhaps the CFO would like to go back to the days of hiring a room full of clerks with mechanical calculators and handwritten ledgers to balance the books?

    The only people who think they don't need to invest in an IT infrastructure these days are idiots who take for granted all the positive influences computing has had on business in the past 50 years.
  • Either simulate or cause a disaster. Also simulate that there are no IT people in the organization.

    Hmmmm, should this simulation be scheduled?

    Now ask the people who would normally be in charge of IT (usually a CFO, CIO, or COO) at the director/exec level to fix the problem properly.

    The checkbook for contractor/vendor/consulting dollars will probably fly open.

    IT is the nervous system of any company. You know you've got one somewhere, but unless you get hurt in a major way, you'll never know it's there.

  • 1) Having them say it "isn't my job to do..." Bullshit. You're paid to support my machine. You can set boundaries (don't install chat software, for example) but when I need a tool to get a job done, or I need a machine running, damn it I need it.

    2) Being told "I'll get on it soon," then waiting weeks for the solution. Hey, I'll do it myself. But if I have to do it myself, you're not doing your job (and I'm probably not doing my job).

    3) Trying to help out by "fixing" working systems, especially during crunch times. Our IT guy decided to upgrade the Linux kernel on a working laptop the day before a major demo. The kernel didn't work, and the collective blood pressure in the lab went up quite a bit.

    4) Being a tactless prick. "Why are you using X? Only halfwits use X! Use Y instead." I use X because it works. Now fuck off.

    I could go on, but for the most part it seems like IT people get in my way more than they help.

    hrm
  • by imadork ( 226897 ) on Monday August 12, 2002 @06:14PM (#4057383) Homepage
    Of course IT is overrated... would you pay that much for a scooter? [google.com]
  • by teamhasnoi ( 554944 ) <teamhasnoi AT yahoo DOT com> on Monday August 12, 2002 @06:19PM (#4057417) Journal
    With IT going for $3000+, I think IT is far *overvalued*. I mean, c'mon! It's a SCOOTER!
  • IT Undervalued? (Score:3, Insightful)

    by 4of12 ( 97621 ) on Monday August 12, 2002 @06:21PM (#4057429) Homepage Journal

    Only at some companies.

    I'm guessing that some of the big decision makers got burned by some bad decisions during the heyday of the .com boom.

    You have to admit they have a point. They were sold on something which

    • they know absolutely nothing about (they have an MBA, not a BS in CS)
    • which turned out to be a dud.
    Why should they believe the next hyped up set of buzzwords coming from the IT community? (Certainly they should be skeptical of the same vendors that sold them that previous pig in a poke, whoever they happen to be - hopefully not you!)

    So first dispel any illusions that every new buzzword technology is a good thing.

    Also, gain some credibility with those skeptics by validating their skepticism where it was well-founded. Yes, we sunk thousands of dollars into that supposed cure-all and it was nothing but headaches. It was a mistake and you're right to call it a mistake. But also point out where things went right, or perhaps unexpectedly right (e.g., Joe put in an open source proxy server that was the bee's knees).

    If a vendor comes painting a picture, demand references to current users, and then dig down to the worker bee level in that organization to see if things really are working. Why dig? There's probably plenty of upper level folks in the showcase example company that want to look as if they made a good decision to go with vendor Y and technology Z. The CIO doesn't want to look bad to the other C level people, so definitely dig down. I can't tell you how much money has been poured down holes as a result of an uninformed decision coming down from on high, where there is too much insulation from everyday reality of things like hung servers.

    You need to back things up with solid arguments showing non-IT folks how introducing some technology helps their business' bottom line.

    A worthy competitor that has implemented technology X where you can show it has had a beneficial effect is one good argument. Another argument is a detailed analysis of a small low-budget prototype roll-out: e.g., we created an XML-based mechanism for tech-friendly Salesman Fred to access the manufacturing database so he could tell a customer how much lead time to expect. Etc.
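
    For what it's worth, a prototype like that doesn't have to be much more than a query plus a little XML serialization. Here is a minimal Python sketch of what such a lead-time lookup could look like; the table name, columns, and SQLite stand-in database are all assumptions for illustration, not the actual mechanism described above:

      # Minimal sketch: expose manufacturing lead times as XML for a sales tool.
      # Table/column names and the SQLite stand-in are illustrative assumptions.
      import sqlite3
      import xml.etree.ElementTree as ET

      def lead_time_xml(db_path, part_number):
          conn = sqlite3.connect(db_path)
          try:
              row = conn.execute(
                  "SELECT on_hand, days_to_build FROM manufacturing_status "
                  "WHERE part_number = ?", (part_number,)).fetchone()
          finally:
              conn.close()

          root = ET.Element("leadTime", attrib={"part": part_number})
          if row is None:
              ET.SubElement(root, "status").text = "unknown part"
          else:
              on_hand, days_to_build = row
              ET.SubElement(root, "onHand").text = str(on_hand)
              # In stock: quote shipping time only; otherwise add the build time.
              ET.SubElement(root, "estimatedDays").text = str(2 if on_hand > 0 else 2 + days_to_build)
          return ET.tostring(root, encoding="unicode")

      # Example: print(lead_time_xml("manufacturing.db", "WIDGET-42"))

    The point isn't the code; it's that a roll-out this small is cheap enough to measure before anyone commits real money.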

    In the big overall scheme of things I've heard an argument made, and I believe most of it, that the unexpected growth in productivity over the past 15 years or so has been largely due to the adoption of IT. (Some growth is due to improved business processes, but I would argue that many of those processes wouldn't be possible 20 years ago given the technology of that day.) If you believe that, then stopping all further investment in IT will likely lead to a stagnation in productivity growth and profitability.

    Nothing's ever that simple, of course, and there's no iron-clad arguments for adopting new technologies. There's risk, no two ways about it. But taking the risk earlier than others leads to more substantial rewards, if you can afford the investment.

  • it spending (Score:5, Insightful)

    by psin psycle ( 118560 ) <psinpsycle.yahoo@com> on Monday August 12, 2002 @06:22PM (#4057430) Homepage
    There are two types of IT spending.
    1. Infrastructure Maintenance and Development.

      This one is pretty obvious - you have an IT network, you have to keep it running. You have existing software that is providing some sort of value, you must keep it running. Maintenance means replacing broken/failing equipment; it does not mean constantly upgrading to the latest gadgets. This type of IT spending is currently way too high. NT 4.0 is good enough. Office 97 is good enough. There is no compelling need to upgrade to newer versions. When we upgraded from Office 95 to Office 2000, the only reason to upgrade was compatibility issues. And this only affected about 1% of the office! This is not money well spent.

    2. Re-engineering Processes

      This is a bit more complicated. Actually it's a lot more complicated. Most businesses have been doing things one way since the beginning of time. When asked why they do something the answer is usually "that's how we've always done it."

      Spend a few days looking around your corporation. Maybe you can apprentice with some of the secretaries/assistants for a few days. Learn their processes. Find out why they do things... question everything and look for redundancies. It probably won't take you long to find cases where people are re-keying information that has already been entered into a PC once. This type of work adds no value - if there is a significant amount of time spent doing this you can easily build a case to correct it. Start a project to extract the data from its original home and put it in a format that your clients will be able to use (a minimal sketch of that kind of glue follows after this list). These are the easy projects to get approval for - low political risk, high payout.

      Another way to find potential IT projects is to spend some time trying to trace processes from the original entry into the company to the final delivery to the customer. Create a complete process map. Find out what the original input is, and what the eventual output is supposed to be. As an outsider you should be able to look at the process and see major areas that could be improved. Anything that doesn't add value to the final product (in the eyes of the customer) probably does not need to be done.

      Focus on areas where a customer request passes between many different people before it is filled. There are usually ways to improve or eliminate hand off time. Possibly there are many 'specialists' involved in a process when really they need one generalist and an expert system. These types of projects are difficult because many business units will need to cooperate to accomplish an improvement - but at the same time this is where the highest return can be.
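
    To make the re-keying point concrete, here is the minimal sketch promised above: pull the data from its original home and reshape it into whatever layout the downstream group needs, so nobody types it in twice. The file names, column names, and the assumption that the source system can export CSV are all hypothetical:

      # Minimal sketch: convert a source system's export into the format another
      # group needs, eliminating duplicate data entry. All names are illustrative.
      import csv

      def convert_orders(source_path, dest_path):
          """Read the source export and write the layout the client group uses."""
          converted = 0
          with open(source_path, newline="") as src, open(dest_path, "w", newline="") as dst:
              reader = csv.DictReader(src)
              writer = csv.DictWriter(dst, fieldnames=["order_id", "customer", "total"])
              writer.writeheader()
              for row in reader:
                  writer.writerow({
                      "order_id": row["OrderNumber"],
                      "customer": row["CustomerName"],
                      # Source stores cents as an integer; the client wants dollars.
                      "total": "%.2f" % (int(row["TotalCents"]) / 100),
                  })
                  converted += 1
          return converted

      # Example: convert_orders("orders_export.csv", "orders_for_billing.csv")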

  • by Nonesuch ( 90847 ) on Monday August 12, 2002 @06:33PM (#4057499) Homepage Journal
    A major gripe I have seen at a number of large corporations is that the in-house "IT" groups (web development, server administration, software engineering,etc) become:
    1. Greedy.
    2. Lazy.
    3. Incompetent.
    Greedy. I constantly see internal web development groups quote even a tiny, simple web site as dozens of hours and thousands of dollars, a price that would have been outrageous even in the pre-dot-bomb days. Then they have the nerve to say "Why do you care how much we bill? It's all internal chargebacks, so it's really just 'play money'!"

    Lazy. All too often, in order to complete a project on time, I end up building and maintaining my own servers instead of handing off server installs and maintenance to the in-house "server management group". Why should the internal sysadmins be pro-active when there is no penalty for slow response time, no competition for customers, and when they know that by doing nothing, the most demanding customers will eventually just go away and solve their own problems?

    Incompetent. As firms cut down on staff and cut out the perks, their most qualified web developers and sysadmins are recruited by headhunters or flee to better, more stable positions as each round of downsizing takes its toll on morale. In the end, with very few exceptions, the only staff who remain are those not talented enough (or too apathetic) to move on to a better job.

    In my experience, in many larger organizations, IT staff might once have been an undervalued asset, but in the past year, most of the valuable staff have fled for greener pastures.

  • by bwt ( 68845 ) on Monday August 12, 2002 @06:33PM (#4057500)
    "IT" always represented this flashy thing where the company was paying a bazillion dollars to move everything to platform X with some flashy vendor and new "in" technology. This type of IT is dead. It's dead because people started to see it as an end in iteself, and somebody finally asked "where's the bang for the buck in our core business operations?"

    I've always felt that simply polishing the hell out of your internal apps is the best way to spend money on IT. It's pretty easy in most business systems to find ways to take business functions that take a minute or two but are done by several hundred / thousand people a day (or more) and reduce them by 50-90% time-wise. If those take a month or two each of programmer time, that is big-time ROI.

    Let's say the programmer makes double what the typical business process user costs. If it takes the programmer two months to do a project and the net result is that a business function/transaction occurring 480 times a day is cut from 90 seconds to 30 seconds, then the project pays for itself in four months. That kind of work isn't sexy, but it sure does pay for itself, especially when departments can delay hiring more people because their existing folks are more productive.
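
    Spelled out with the same illustrative numbers (nothing measured here, just the assumptions already stated):

      # Payback arithmetic for the example above; all figures are assumptions.
      seconds_saved_per_transaction = 90 - 30          # cut from 90s to 30s
      transactions_per_day = 480
      hours_saved_per_day = seconds_saved_per_transaction * transactions_per_day / 3600   # 8 hours

      programmer_to_user_cost_ratio = 2                # programmer costs double a business user
      programmer_days_spent = 2 * 21                   # two months of working days
      project_cost_in_user_hours = programmer_days_spent * 8 * programmer_to_user_cost_ratio

      payback_days = project_cost_in_user_hours / hours_saved_per_day
      print(f"Payback in about {payback_days:.0f} working days (~{payback_days / 21:.1f} months)")

    Eight user-hours saved per day against roughly 670 user-hour-equivalents of project cost works out to about four months, exactly the kind of unglamorous win described above.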

    There are a lot of crappy apps out there that waste user/customer time, especially because IT managers were hell-bent to shove new apps out so they could claim victory in the time-to-delivery game. The whole IT industry needs to step back and focus back on the end user experience and business fundamentals: eliminate waste in core business processes.
  • by Infonaut ( 96956 ) <infonaut@gmail.com> on Monday August 12, 2002 @06:43PM (#4057555) Homepage Journal
    Here's why: The desire to "innovate" purely for the sake of innovation is something that all geeks innately love. It's not something that everyone else innately loves. In fact, "innovation" often gets in the way of business, because the actual human beings who have to make use of the innovative new systems have enough to pack into their work day already.

    IT has a reputation in corporate America as the most unresponsive and least human-centered department in any organization. Here are the stereotypes I've encountered:

    1) IT people are more interested in their machines than in helping me do my job.

    2) IT people have no understanding of what I do on a daily basis.

    3) IT people are penny-wise and pound foolish. They won't pay $200 so I can have a Zip drive that will allow me to take my work home, but they'll spend $1.5M on a VPN that will take a year and a half to implement and won't work properly when finished.

    I've been on both sides of the fence, serving as IT support and being one of those people griping at poor IT support. It seems to me that if more IT departments thought of themselves as enablers rather than as an end to themselves, they'd receive much more respect.

    Want to see a good example of how it works in a good support organization (and IT is always support)? Go to your nearest Air Force base and talk to the pilots and crew chiefs. Sure, the pilots get all the glory, because missions are oriented around flying the aircraft and hitting the target. But the crew chiefs are given tremendous respect, because they are responsible for making sure the aircraft fly properly. They understand and take pride in their role.

    Many IT folks seem to have the opinion that they're smarter than the people they serve. They may be smarter, but that doesn't change the fact that people above them in the organization have to make the truly difficult decisions about hiring and firing, where to spend money, how to stay competitive, and so on. It's not that IT decisions aren't difficult, but in any organization, the more important the decisions you make, the bigger your salary.

    If more IT departments realized that they actually are part of a larger team supporting the same goal, and took off their wizard's hats, they might find a lot more acceptance on a human level.

    That's where IT folks commonly fall flat on their faces. They don't realize that business people make decisions based almost exclusively on human factors, only secondarily on money, and a distant third on technical factors.

    IT departments that grasp the human factors, take care of the other people in the company, and bend over backwards to help people go about their daily tasks are far more likely to get the money they need to conduct glamorous "innovative" projects.

  • by Herkum01 ( 592704 ) on Monday August 12, 2002 @07:00PM (#4057654)

    The biggest reason that IT spending gets out of control and produces no real results is because managers just don't understand the technology or their technical people enough to know what they are doing, and are unwilling to admit their mistakes, letting the problem get worse and worse.

    An example here at my current job is working with the monitoring department. They decided to use Netview and Robomon to monitor Windows servers. They hired two contractors who had no experience with either product. The manager who hired them had no experience with those products, project management, or even basic programming skills. She simply trusted the judgement of the people who were under her.

    They could not get Netview working, they could not get Robomon working, and their usual complaint was to blame it on the system owners. When the system did work, they had done such a poor job of writing monitoring scripts that it was not uncommon for our department to see about 50 to 100 error messages a second scrolling through the events window.

    After a while upper management broke down and hired an outside contractor to tell them what was wrong. I talked with her and she said it was the contractors that they hired. I asked, are you going to tell management that? She said no, they did not want to hear that they had made a bad decision, so she was going to give them what they wanted: saying the contractors may need some training.

    Finally they decided that the Robomon software they had purchased was the problem. So the contractors, who by this time, being so effective, had been hired as employees, decided they wanted HP Openview, never mind that they had not worked with that either. They wrote a report detailing their methodology for determining the best monitoring software and listed HP Openview at the top of the list. I took a look at their report. It seems they made up a number and then assigned it to each of the packages that they were supposed to review. Of course Openview came out at the top.

    So now, after two years, they still have not been able to configure an effective monitoring tool but lets look at the total costs.

    1. Netview Upgrade $50,000
    2. Robomon Licensing $1,000/server * 200 servers
    3. Outside Contractor for 3 weeks, $40,000
    4. HP Openview License $250,000
    5. Openview Server Licensing $1,000/server * 235 servers
    6. 2 Top of the scale employees, $70,000/each * 2 years

    $1.055 million in support and software, and I don't even know about the hardware. The manager had been told that there were problems with her employees, but she brushes them off and always backs them up. So is it any wonder, with this attitude about technology, that people don't want to put more into that bucket!
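
    For anyone checking the math, the line items above really do land at about that figure (treat the numbers as the rough estimates they are):

      # Totaling the line items listed above (software and staff only; hardware unknown).
      costs = {
          "Netview upgrade": 50_000,
          "Robomon licensing (200 servers @ $1,000)": 200 * 1_000,
          "Outside contractor, 3 weeks": 40_000,
          "HP Openview license": 250_000,
          "Openview server licensing (235 servers @ $1,000)": 235 * 1_000,
          "Two top-of-scale employees @ $70,000, 2 years": 2 * 70_000 * 2,
      }
      print(f"Total: ${sum(costs.values()):,}")   # $1,055,000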

  • IT inflation (Score:3, Insightful)

    by starX ( 306011 ) on Monday August 12, 2002 @07:10PM (#4057714) Homepage
    One of the reasons for this bust is that people were spending money on IT stupidly. I'm sure we all know a few people who have insisted that they need a new computer because their old Pentium is "impairing their productivity" when all they really do is word process. And let us not forget this sub-moronic idea that just because M$ comes out with a new version of Office, you need to have it in order to keep up. The result is that companies have way more features in-house than they can typically use, which translates into wasted money.

    Now tell me who out there is naive enough to think that the people running the show are smart enough to learn that IT is a worthwhile investment as long as it is well thought out and carefully implemented? Reactionary attitudes tend to be the norm in just about everything everywhere. Right now the pendulum is in the process of swinging back towards a cut-corners mentality, which is good to a certain degree.

    "We can't install the latest release of Windows/Office on our old k6-2? Why can't we just use the old version?" That's intelligent thought, and as techs, it should be up to us to answer that question. I truly believe there will come a time when we are no longer in a recssion, and invester confidence has returned, and when that time comes, the people who approve budgets might be willing to listen to and consider your answers.

    Until that time comes, you're just going to have to accentuate the negative. If you need to develop some app, but there is no budget, then make sure you accurately predict how, in the long run, not developing it will actually cost your company money. When enough techs are proved right often enough, the pendulum will start to swing back the other way. Of course this gives us all an even mightier responsibility: to learn from the lessons of the past 6 years and NOT try to solve every problem with something newer and "better."
  • by parabyte ( 61793 ) on Monday August 12, 2002 @07:20PM (#4057786) Homepage
    In early times, you had to make your own PCB design for your machine, solder it together, write your own kernel, write some libraries, and put together almost any app within a reasonable time.

    In the late eighties, you still had to write almost every application yourself if you required something like a workflow and wanted the system to be used by mere mortals.

    Today, one person alone can set up an integrated office system with document management, customer database, accounting, reporting, e-mail access, firewalls, printing, backup, telephone integration, banking, video and audio editing, ftp and webserver, dns, dhcp, wavelan, gigabit backbone, raids, computing clusters and mobile vpn access within weeks, and several hundred thousand apps are available for free or very little money. You can easily afford a database or an office suite worth a few hundred million dollars of development effort, and get millions of man-months of work delivered on a $50 Linux distribution, including the source code. Or if you buy a Mac or a PC today, you can actually do useful things without even installing additional software - remember what you could do with an Apple II out of the box?

    So today, if you want to roll your own stuff, or even spend a few days on improving your customer database access, you need many thousand customers to justify even using a real database instead of MS Outlook contacts or a simple spreadsheet.

    Trying to build a new 3D engine, a web browser, a database engine or a new GUI library is almost insane from a business point of view, so the deeper you descend into the swamps of IT development, the better your justification has to be for shelling out that money and taking the risk of failure.

    I try to ride Moore's law and aim for something unique that does not exist so far because it was impossible or too hard to do in the past. I try to stay current by spending a lot of time hands-on with new technology, and I steadily improve my and my team's knowledge and skills, but I admit: it is increasingly harder to find and exploit those niches where you have both fun and profit.

    OTOH, there will always be the Linus way: Build something for the sheer fun and knowledge, and the worst case is that you are happier and smarter afterwards.

    And if you don't mind to listen to someone who has been around for a while and covered some distance: Never do things for the only purpose of profit. It will minimize your chances, but even if you succeed, you will not be any happier than today. And probably have fewer friends.

    parabyte

  • by buss_error ( 142273 ) on Monday August 12, 2002 @09:09PM (#4058553) Homepage Journal
    Since the .com bust, the arguments I hear everywhere is 'IT has now been proven to be a waste of money'

    If this were true, please explain:

    why we are using word processors instead of typewriters or movable type presses.

    Why are spreadsheets needed? We could use register paper instead of an expensive computer.

    Index cards are cheaper than databases. Let's go pull the plug on that expensive DB server.

    Customers and vendors don't really need to do stuff with our web site, they can call in to our customer service lines. Oh, we'll need more bodies in customer service...

    Who needs e-mail? Snail mail is fine for what we do...

    Why should we search the web for the best prices, just order catalogs once a year and go to the public library more often to do research.

    Now that we've deflated the hype around computers, lets talk about telephones, fax machines, pagers, and cell phones, and why we don't need them anymore.

    After that, if they back off, then ask a simple easy question: Do you think any of that stuff runs itself?

    Seriously, if anyone said that IT has been proven to be a waste of money, I'd look for an ulterior motive. Fast.

    Now, if they mean that a lot of people went overboard, well, I don't think I could argue against them there. One only needs to look at Darwin Awards [darwinawards.com] to see that a lot of people do go overboard... and kill themselves doing it. The trick of it is not to be a lemming.
