geoff313 asks:
"I'm sure many of you are aware of the uproar over Nicholas Carr's article 'IT
Doesn't Matter' which was published in the Harvard Business Review, back in May. While many big names in the IT world have responded already to Carr's article (Ballmer has declared it 'Hogwash'
and Fiorina has pronounced it 'Dead Wrong'),
Carr debated vendor executives Monday at the Comdex trade show, proving that the issues he raised are still resonating through the industry. Do you feel that corporate IT budgets
should focus on cutting-edge technology to best serve their customers' needs, or should they focus on shoring up what they have now in order to maximize its usefulness to the customer? Some background can be found from the Washington Post,
InfoWorld,
and ZDNet, as well as at Nicholas Carr's site."
"For those of you unfamiliar with his philosophy, it can be summed up pretty thoroughly by his statement 'Follow, don't lead,' arguing that the huge advances in the IT industry over the last two decades have erased the strategic advantage corporations once gained by staying at the cutting edge of technology. In short, he advises that 'executives need to shift their attention from IT opportunities to IT risks - from offense
to defense.' Of course the head honchos at IBM and Microsoft disagreed with him, citing Wal-Mart's use of RFID tags to keep track of inventory and other forward-thinking IT decisions as a refutation of his thesis.
What I am interested in is the opinion of those in the trenches of the IT war."
Just do it . . . (Score:5, Insightful)
Products dont matter (Score:0, Insightful)
Just look at the advances over the past few years in toys: are they really all that fundamentally different? I mean, cards, cars, and dolls are still the top sellers. Why bother getting a new line of products for the holiday season? It's not like there's anything new there.
I think I've pointed out the logical flaw sufficiently.
IT... (Score:5, Insightful)
Does IT Matter? (Score:2, Insightful)
1. Send e-mail and instantly communicate with IM services
2. Pay bills and manage finances online
3. Get lots of information about anything you can think of within seconds.
4. Manage every aspect of your life (jobs, health, you name it) with the help of technology.
I.T. will matter in the next election (Score:2, Insightful)
RFID tags (Score:3, Insightful)
IT doesn't matter -- but not being a moron matters (Score:5, Insightful)
What does matter is that some business models that work don't work unless you have the right (new, or new-ish) technology: you can't have an Amazon.com without advanced web systems, or you can't have it feasibly and cost-effectively.
On the other hand, having a new 20-inch iMac on every desktop doesn't much matter. (Drat.)
The trick with IT -- and about everything else in business -- is to really figure out what does matter to the business, and to work your ass off optimizing that thing that matters.
Computer Science? (Score:1, Insightful)
Egads, no one gets it (Score:5, Insightful)
Using an example I saw given, the best-selling toys today are cars, much the same as in the 1940's. The difference is in what these cars can do. Take the top-notch must-have car from 1949: some metal pushcar contraption. The hot cars this year are high-end RC machines with more computing power than the systems that launched men to the moon.
IT is more important than ever, even as its importance slowly vanishes and becomes part of the general background noise. The more important it gets, the less noticeable it is.
IT or Engineering (Score:2, Insightful)
View from a government agency (Score:5, Insightful)
In my department we recently replaced 75 green-screen terminals. Many, many people were happy to see this happen, but in reality most of the new PCs are simply running terminal emulators and are glorified dumb terminals.
So on the face of it, we didn't really do anything but spend a lot of money and make everything prettier... ON THE SURFACE
However, now that the infrastructure is in place, we can begin to really look forward. We are now considering projects that promise to eliminate hours of unnecessary work each day and to make public information much more accessible both online and at local kiosks, just to name a couple.
The key is that you can't just implement new technology for technology's sake, which was kinda what the whole "bubble" was all about. You have to take a long-term view of how and why you will leverage that technology going forward. May seem obvious to us, but not to all.
Already the case? (Score:2, Insightful)
I've worked both in academia and in industry. From my perspective, industry is very slow to adopt cutting-edge research. The stuff I treated as pretty routine in academia is considered cutting edge in industry. Industry is much more interested in massive projects involving well-tested technologies than in the small projects with both high risk and high intellectual value that are the domain of universities. And while some companies, such as IBM, have a significant research budget, this does not apply to many other high-tech leaders.
Two edged sword... (Score:5, Insightful)
Then on the other hand, you have cutting-edge technologies. Yes, they can help you out if you have a problem that they solve, but there's no point trying to find a problem for them to solve just because they're there. I know one company that ripped up a perfectly good CRM system built in house so they could access the database using web services. Totally pointless. Yet I know another company that rolled out an intranet, built a document repository, and garnered much more immediate results.
So, my answer is a straight 50:50. Firefight, but implement policies that make your job easier as you do so, so you can reduce overall costs. Only implement newer systems if they are required, and even then, don't be blinkered by the latest technologies. Sure, it may be cool, but early adopters always pay the price and don't necessarily reap the fruits.
The thing is, some of these points are common sense, some need time, and in business you can be guaranteed that people lack both.
IT doesn't matter. (Score:4, Insightful)
I work at a charity where our primary aim is to help people get back into work after long term unemployment. As a means to this end, we make extensive use of IT.
We have an Exchange server (save the flames); does that matter? No! What matters is that we have a way of knowing when we're able to make appointments for our clients; it just so happens the best way we have of doing that is using Exchange.
We also run an online centre, where people can come and use the internet for free, and get training in how to use computers. The fact that we have 20 internet-connected computers doesn't matter - what matters is that people have jobs who otherwise wouldn't, partly thanks to the computers they had access to.
It's all a matter of perspective, IT is just another tool in the box of things that allow you to get the job done. In the same box for us comes knowledge of writing CVs, and being able to relate to people.
Right on (Score:5, Insightful)
Wal-mart might have realtime inventory statistics across the world, but the reason they have that is because they know what to do with that information. If you gave that capability to Kmart executives, they wouldn't have any idea what to do with it.
The problem with IT, though, is that KMart might actually buy a system that can give them realtime inventory, then not use it. Whoops, there goes tens of millions of dollars.
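The Wal-Mart/Kmart contrast boils down to one step: real-time counts are worthless until a rule acts on them. A toy Python sketch of that missing step (the SKUs, counts, and reorder threshold are all invented for illustration):

```python
# Toy sketch: turn raw real-time inventory counts into an action list.
# SKU names, quantities, and the threshold are hypothetical.

def reorder_list(stock, threshold=20):
    """Return SKUs whose on-hand count has fallen below the threshold."""
    return sorted(sku for sku, qty in stock.items() if qty < threshold)

live_counts = {"diapers": 12, "batteries": 95, "cola": 7, "socks": 40}
print(reorder_list(live_counts))  # the counts mean nothing without this step
```

Buying the system gets you `live_counts`; the tens of millions are wasted if nobody ever writes (or reads) the `reorder_list` part.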
IT doesn't matter because everyone can do it now. Can anyone on
So no, IT doesn't matter, or it matters - the way electricity matters.
isn't this what "On Demand" is? (Score:3, Insightful)
increasing the USABILITY of what you already have, tying all your databases, CRM, and everything together... basically middleware
I think it's safe to say that everyone has the hardware... what they need to be defensive and to utilize it is the software (Linux, DB2, WebSphere or Tivoli)
MS and HPQ disagree because they want you to upgrade and they want to sell you that upgrade... IBM makes their money on services.
IT doesn't matter - but not how you think... (Score:5, Insightful)
So what's to happen to us geeks? Many will go into design, project management, and liaison roles. Many will continue to work for a long time in interoperability. Those with PhDs will generate patents so companies that don't actually produce anything can make money. Lots will support other growing fields that need custom work, such as bioelectronic technology, nanotechnology, and those other 'pie in the sky' technologies.
Many will go into programming and hope they can sell their vision/idea to the few major content providers - who'll take it and have it developed further by programmers in Lower Slobbovia.
But it's still another 10-20 years along.
-Adam
RFID (Score:2, Insightful)
RFID will also make some tech dreams closer to reality, e.g. a fridge that knows what's inside and what needs replacing.
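That fridge dream reduces to set arithmetic over tag reads: compare what the reader currently sees against what the household expects to stock. A hypothetical Python sketch (the tag IDs, product map, and expected stock are all invented):

```python
# Hypothetical RFID-fridge sketch: each product carries a unique tag ID;
# the shopping list is whatever we expect to stock but the reader can't see.
PRODUCT_BY_TAG = {"tag-001": "milk", "tag-002": "eggs", "tag-003": "butter"}
EXPECTED_STOCK = {"milk", "eggs", "butter", "orange juice"}

def shopping_list(tags_seen):
    """Map visible tags to products, then diff against expected stock."""
    present = {PRODUCT_BY_TAG[t] for t in tags_seen if t in PRODUCT_BY_TAG}
    return sorted(EXPECTED_STOCK - present)

print(shopping_list({"tag-001", "tag-003"}))  # milk and butter are inside
```

The hard part in practice isn't this logic; it's cheap, reliable tag reads, which is exactly what the RFID push is about.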
I visited the Stop and Shop with the "Shopping Buddy" that was
All of the RFID worship is meant to provide a counterpoint to the idea that IT doesn't matter. RFID matters and RFID is IT. IT still matters.
Do you value your organs? (Score:2, Insightful)
Think of it this way (Score:1, Insightful)
And why does this happen? Because companies like MS & Apple want you to keep buying new computers. They bloat their software. You need ever more to do the same thing. A modern P4 system with WinXP gives a slower user experience to do the _exact_ same things as my 1995 486 system & its software did.
But the largest blame of all rests with us programmers and techy types. Most of us, when it comes to this sort of thing, are morons. I don't say this very often about people, but it is very true. We upgrade all the damn time for no good reason. We bloat our code horribly because we don't care a _damn_ how it runs on any computer more than 2 years old. So we've got something on the order of half a _billion_ 'obsolete' computers sitting in attics around the world. All because we can't be bothered to write good code, and we're pathetic in wanting to be 'cool' in having the latest GHz hardware all the time.
Please don't give me that rubbish about how you need all this RAM and MHz for modern systems; it just isn't true. Take a look at any Acorn / RISC OS computer for proof.
actually, he's correct (Score:5, Insightful)
I actually think he's right. IBM, e.g., effectively commoditized (if that's a word) PCs by opening up their standards years ago, with MS having the complementary product "OS/Office" that made them super-rich. Consider this: having Win+MSOffice (please, no religious zealotry... :) ) might have given you an advantage 10 yrs ago, if you were one of few and could reorganize your business processes to be much more efficient using it. Nowadays, everybody has it and needs it, so you lose that advantage.
This guy Carr just generalizes that to the whole of IT, including the "new" stuff like the net. Beats me why IBM is crying foul, since they are running this huge PR campaign of "IT as a utility" which is exactly that.
just my 2 cents
Re:IT... (Score:5, Insightful)
Absolutely agreed. My first reaction was that the answer to this:
was this:
Whether or not it's cutting edge is irrelevant; what matters is whether it does the necessary job. Whether it serves the customer's needs is irrelevant (to the corporation) too: there are lots of in-house needs that can be helped with good use of IT, and serving the customer's needs is certainly good business sense, but in business you do it for that reason, and not as an end in itself.
Re:The question is when will IT blow itself out? (Score:5, Insightful)
profitability. (Score:2, Insightful)
I work in a small graphics/prepress shop, and if something works, we keep it until it no longer works.
When a new problem comes up, we see if our existing architecture will solve it, if not, then we start researching the newest choices out there.
For example, our server was a used Sun, which we picked up from an imploded dot-bomb (god bless San Francisco's used equipment market). It works fine for our 30 or so people. However, we need a high-powered RIP to deal with all the various PostScript that comes through here, as quickly as possible. So we spent an assload of time researching the various RIPs and bought the latest and greatest of the brand we chose. Runs fine on that poor ol' used Sun.
Granted this is a simple example, but, if it ain't broke why fix it?
Re:Customers and budgets (Score:3, Insightful)
Clients ask for things and you have to deliver what they want on their timetable, using whatever tech they want. Sometimes you have to save them from their irrational selves, in the process.
I do things for clients the way they want them, that I don't think make sense a fair amount of the time. I've certainly built applications and databases in ways I wouldn't do them for myself.
I, in turn, get to make vendors who do work for me (on behalf of my clients) do it on my timetable using the tech I want.
It's a nice happy circle. As long as you don't let your clients get you to do too much for nothing, who cares if the tech you used is cutting edge or not.
Easy test (Score:5, Insightful)
THEN ask people if IT matters.
Why extra inches *really* matter :-) (Score:4, Insightful)
I realise that your statement was somewhat in jest, but actually I don't think that's true a lot of the time. If this is what you mean by "new technology" (as opposed to things like "web services", XML, .NET, etc.) then there are clear benefits.
Firstly, users with big or multiple monitors are often measurably more productive when using a computer all day. A colleague at work has just got a second monitor. It's just an old but serviceable 17" box, but it makes him more efficient, and he loves it.
And that, of course, is a second good reason to spend that little extra on the hardware people use all day: it has a morale-boosting effect. Employers that treat their staff well get treated well in return.
And of course, Macs are vastly superior to Windoze boxes anyway. <ducks> :-)
Are you sure? (Score:5, Insightful)
But do they? Would a secretary typing up her letters be any less productive using Word 2000, running on Windows 2000, on a PIII/500, than she is using Word 2002, running on Windows XP, on a PIV/1.6GHz?
Sometimes upgrades have definite value; see my earlier comment in this thread about monitors. Other times, they make no real difference at all, and it's just a numbers game, where the prize is... nothing.
Today, as always, most of the serious work is done on older, tried-and-tested systems. The users of the most recent toys are either the few who genuinely do require state-of-the-art power and/or technology to do their work, or those who like to be on the bleeding edge just because.
Re:Two edged sword... (Score:2, Insightful)
Management has a way of sorting the operations of their business into two categories: One category is people who do the work to produce product, sell product, promote product, etc. The second category is the people who support those people in producing, selling, and promoting product.
The efforts and expenses put toward the first category of people are money that earns a direct return to the business. The resources allocated to the second group are a drain on the business.
When things go wrong in the information flow that the people in the first category need to do their work, the second category of people do some of the only work that justifies the business keeping them employed.
It's hard to see how you can consider this work 'the least productive work' when it's actually the only work IT does that isn't money down a sinkhole.
Ask your local PHB. The above info isn't obscure, it's how things are seen.
IT is a ubiquitous part of a larger world (Score:3, Insightful)
Most of the IT companies spent the last 25 years convincing companies and consumers to buy desktop computers, laptop computers, servers, and software in order to boost productivity. Plenty of low tech jobs were replaced by a bunch of high tech solutions and a smaller number of high tech jobs. Companies now routinely process zillions of transactions very quickly using very few workers. It may be that productivity can be increased even further in this way, but the huge gains of the last two decades are probably tapped out. IT may not be the motivation for the next huge runup in stock prices, but saying that it doesn't matter is like saying that mutual funds don't matter in the market. They may have fallen out of favor, but they're still a huge force to be reckoned with.
I'd like to see IT applied in ways that really make our world better, instead of (just) more efficient. IT has long promised to improve health care, and has largely delivered on that, but it has also led in part to the increased cost of health care. Let's use it to drive down costs. IT has made a lot of things much more convenient, but at the cost of privacy. Let's use IT to protect privacy and better control our own information.
There are a million directions that IT can go in, and thanks to massive parallelism in our society it can go in those directions all at once. Let's get on it.
Lessons from Tucker (Score:3, Insightful)
Jobs (Score:3, Insightful)
Re:IT veteran (Score:2, Insightful)
It's a good point (Score:3, Insightful)
Whereas I've been purchasing based on what technology is actually going to improve our production, which surprisingly isn't much. We've got piles of Dual G4s that were top of the line last year, and I'm purchasing eMacs, because with the money I save (and no actual difference for what my employees do with the machine), I can invest in better networking and invest in people (yay, raises!).
For some reason people think that the latest upgrade will always increase productivity. We have machines that had been sitting around for the last 3 years doing nothing, and I now have our back-end MySQL databases, our web server, and our file servers running off of them.
And it makes a real difference in my budget, when I can make do with current equipment, it gives me a lot more room for expansion, compared to the practice of replacing non-obsolete equipment every other year.
I think corporate America wastes more technology by not utilizing it to its fullest (which sometimes means having decade old equipment) than it could possibly realize.
Ryan Stultz
Re:The answer is both, duh. (Score:3, Insightful)
Losing strategy (Score:5, Insightful)
Bang on! (Score:4, Insightful)
Bang on! I've often referred to Information Technology as "Data Technology". The main difference is the ability to act on data. If you can make better decisions based on processing data, it becomes information.
Too much of Information Technology is wasted on learning tools. New versions of software are released, which precipitates a purchase of books, seminars and (more importantly) time.
Linux appeals to me because it uses a fairly static set of tools, which can be combined to solve problems. It isn't as pretty as Windows, but the time I've invested in learning tools well has paid off in less time learning stuff I don't need (TMTOWTDI). I focus more on my boring old ASCII files with the business information I need and less on figuring out why my latest version of Word is causing a GPF in a newly reloaded version of Windows XP that can't open the Word doc that I typed up last week, which precipitated the reload in the first place because Word locked up when I was trying to extract information from the data I'd collected...
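The "boring old ASCII files" workflow the poster favors really can be this small: a stable plain-text format plus a few lines of a stable tool turns collected data into a business answer. A sketch in Python (the file layout - date, client, amount - and the figures are invented for illustration):

```python
# Sketch of the plain-ASCII workflow: a stable record layout, trivial tooling.
# The date/client/amount layout and the numbers are made up.
RECORDS = """\
2003-11-01 AcmeCorp 1200.50
2003-11-03 AcmeCorp 300.00
2003-11-05 BetaLLC  875.25
"""

totals = {}
for line in RECORDS.splitlines():
    date, client, amount = line.split()  # whitespace-delimited fields
    totals[client] = totals.get(client, 0.0) + float(amount)

print(totals)  # data becomes information: revenue per client
```

Nothing here breaks when the OS is reloaded or the word processor is upgraded, which is exactly the point being made.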
I intended a Zen (Score:5, Insightful)
'Decision Makers' for IT (Score:2, Insightful)
costs have come way down for IT (Score:1, Insightful)
Re:Does IT Matter? (Score:3, Insightful)
As a company, I do not need to invest heavily to get any of the advantages you list. They are a commodity... they don't matter. I am not going to invest a quarter of my R&D budget to improve my email service.
This isn't saying that there isn't a market for companies that DO invest heavily in IT. Certainly there are companies that invest heavily in improving other commodities. The point is that not every company needs to do this. During the 90's, there was the impression that you had to invest in building new technology, even if your business was making wicker baskets. The article argues that a basket making company should not invest heavily in IT. It should outsource IT and pay the dues it needs to keep its systems running.
OK, now that I have had a rather lengthy RTFA, let me give my views. I agree that basket-making companies should not invest too heavily in IT. However, I think that there are still enough aspects to IT that make IT still matter. Most businesses suffer from a lot of redundancy that could be automated, and that redundancy differs from business to business. In other words, there is an opportunity for that business to use its IT budget to save money, but that opening is not worth someone else building the infrastructure for it. Businesses must invest in themselves, and IT still promises to have one of the highest returns on investment for most businesses. Of course, the basket company should set that as a second priority to investing in baskets, but IT is still important - it can yield a competitive advantage.
I may be feeding the "IT doesn't matter" argument - I am not as convinced that other companies wouldn't be able to be followers and achieve the automation cheaper than the company that made the initial investment. However, that is what intellectual property laws may prove beneficial for. Then companies still have an incentive to be first. However, I am sure that at some point the patent office will get tired of patent applications claiming "A device to automate X", where X is the business process of the month.
Re:I intended a Zen (Score:3, Insightful)
As one who works in IT... (Score:5, Insightful)
I can say that IT isn't necessary, but it sure makes things more convenient. There are a number of businesses that exist without computers; they are mainly retail shops and the like (lemonade stands, cash/check-only antiques, yada). For IT to actually be necessary would mean that you cannot do business at all without it.
If we didn't have the internet, we'd resort to telephones and fax machines for long-distance communication. Snail mail is another option, although not as fast. Databases are a far more convenient form of file cabinets full of binders, but for some reason most accounting departments keep the paper along with the electronic.
I'm not particularly worried about losing my job in IT, or afraid of someone calling my job unnecessary. I don't think my job is necessary. The whole point of my job is to ease the burden of my co-workers, by making their payrolls go faster and easier, by eliminating as much paper from the interoffice ordering and communication, or by providing support for co-workers when Outlook is barfing. All of it has functioned without computers before, it could very well do it without them again. Perhaps they wouldn't be quite so efficient, but that wouldn't hinder the actual function of the business.
The only places where IT matters are in those businesses that have bet the bank on IT. MS, IBM, and HP are all places that look at such a paper as detrimental to their position. If people in business realize that innovation does not necessarily mean upgrade, but also includes better internal programming and process auditing, all of those big tech companies will take a hit in sales.
Look at the years after the bubble burst, for instance. The business community proved how unnecessary IT really was, which my brethren are still very sore about. Businesses found that they needed to focus on cost cutting and efficiency, both of which were things that didn't need bleeding-edge hardware and platforms to accomplish. Cost cutting came into being through the massive release of IT workers, through limiting the spending on new servers and pipes, and through reorganization. Efficiency came when reorganization forced workers to do their jobs better, and not be so distracted by the nerf balls and ping-pong tables.
Granted, many excellent workers were cut, and many poor workers were kept. This is the nature of the upper-level management beast. Eventually, those people will get rolled out of their jobs, and those positions will be re-filled by the competent ones. There will not be so many positions, though, mostly because the focus of IT will be on maintaining regular operations and on optimizing current applications. There will be little room for creativity, but it can be sneaked in in the name of "easing the burden of co-workers."
IT has lost its glamour, and everyone (including the deluded IT guys) has finally realized that IT hasn't changed much since the 70's and 80's. The mid-to-late 90's were an enigma, perhaps the actual recognition of this strange section of business, but it was just everything blown out of proportion in the end.
IT really isn't as necessary as IBM, MS, and HP would like people to think. These businesses have excelled recently in creating extra expense in business, just so that they can show how it could be cut with bleeding-edge technology. The stuff is nice to drool over, but technology that is two to ten years old can still fulfill that role, and its cost has already been paid. IT's role is now to be intelligent with the data, to work with it efficiently, and to maximize the effectiveness of this hardware.
True and false (Score:3, Insightful)
I especially find it very sad that so many shops are wed to overpriced MS products even when perfectly workable OS alternatives exist. Microsoft seeks, and always will seek, to draw the most dollars for the least real innovation and benefit. The way Windows works requires local copies of many megabytes of software, when most of that software could have its components shared and brought in as needed over a fast LAN. Users should not be blamed because the dominant player makes a defective file-sharing OS!
Real competitive advantage will come with real software advantage, not bloated safe-buy hype. This hype is seen clearly in enterprise software too. J2EE has become a buzzword without really delivering any competitive advantage that I have witnessed. A few large players get tremendous press and trust and rake in big bucks, when the innovation, if any, is most likely in the little-known new offerings from relatively unknown individuals and companies. The trick is finding these true innovations and using them in a safe and competent way. This cannot be done easily, if at all, if the source is closed.
Re:Are you sure? (Score:1, Insightful)
Modern graphical word processors have done their duty and lowered training costs and the salaries of secretarial help. In fact, they've almost eliminated "secretaries" altogether -- most managers type their own memos, and "assistants" usually have more business-oriented job duties.
So, even though MS Word is maligned to hell around here, it's a great example of how advances in IT can help business.
Irony... (Score:5, Insightful)
A few years ago, there was this huge
The hogwash wasn't just limited to
Then, "Client Server" was also a big buzzword. Yep, real new. Uh...
Of course, "client server" got worn out, and people hated it. So, they invented a new word - "Thin Client". THAT was NEW! In fact, if you got one REVOLUTIONARY enough, it'd even have VT100 emulation! Or Wyse50! Oh, hell, that's not new at all.
Then the website craze began. Morons charging $500/hr to "write html code", and be "html coders". Colleges actually offer degrees in this crap today. Students actually major in it. "Yes, I have a degree in Notepad, with a minor in writing code in HTML". Uh huh. Could you mail me that MAKE file for that new webpage? The linker is puking on mine...
So, the product industry is full of useless junk, and they repackage that exact same junk every xxx months with a new name (q.v. PocketPC 2002, vs Windows Mobile 2003. Or, MS Word 95, 97, 98, 99, 2000, 2001, 2002, XP, ETC, ET AL. Or, HP's new marketing campaign. Perfect example.)
And, the consumer (corporate) market is full of useless dolts, who buy the new versions thinking it'll deliver the product they were hoping to get, 12 versions ago when they originally bought it. And meanwhile, they didn't actually *need* it in the first place, because it didn't solve any real problems or enable anything new.
And the reason the consumer (corporate) market is full of these dolts, is because most of them actually THINK that the annotation of text with little bracket signs is "coding". They're clueless, intellectually lazy, and they're only in it because "Mom said it'd be a great career!"
They think the definition of an expert is someone who knows one more buzzword than you. In other words, they're suckers.
Eventually, they get fired / downsized / put out of business. And, they flood the market with all of these credentials, and they DEVALUE those credentials because they themselves are too stupid or lazy to fulfill the roles those credentials allow. "PhD for L1 Tech Support?" "Well, the guy with the 4 BS degrees couldn't handle HTML Programming. I think we need an expert in Notepad this time."
So, the market gets pissed off, and tries to normalize itself. We don't need an upgrade every week... we need a toaster, that does exactly what we need to fulfill our business requirements. Period. Once we get it, it should last for decades without being touched.
I've still got a pair of '286s floating around my shop, crunching away at whatever... because they do the job, and that's all they do, and there is NO POINT in changing them. And when someone discovers them, they stare in horror. "Why don't you upgrade them!!?" Uh, that 286 is a telnet interface to a black box that has exactly one 300 baud serial interface. The reason it's a 286 is because I couldn't find an XT with a working floppy.
And oddly, usually they "get it" at that point... but they don't like it. They'd literally drop $1500 for a loaded 4GHz WinXP Pro box with 20 gigs of RAM, 100 gigs of drive space... to do nothing more than trap a TCP packet and pump it out a serial port at 300 baud. Most of you reading this are probably thinking the same thing... "Jesus, dump that piece of $#%^".
Re:I intended a Zen (Score:2, Insightful)
Re:RFID tags (Score:1, Insightful)
Everyone is a widget, nothing matters anymore (Score:3, Insightful)
IT, manufacturing, technical design work, medical diagnostic work, financial analysis, text book proofing, medical research
What's really happening is that UnFree Trade is killing America. As long as overcompensated CEOs can achieve their quarterly earnings targets, they couldn't care less what long-term damage they do to the country or companies they are running. This is nothing more than blind, stupid greed. The only question is "How hard is America going to crash?" I wonder how many people will lose their jobs, pensions, and healthcare this year as our CEOs continue to commoditize, outsource, and offshore with the blessing of our politicians (and ultimately consumers).
It's tragic that a country founded on the pioneer spirit, hard work ethic, and innovation is now dismissing everything that made it great, calling it a commodity and then getting the service performed by what amounts to slave labor. I am personally trying to break my addiction to cheap foreign goods, but can't seem to find anything made here. Recently I found a manufacturer called Jaton [jaton.com] that makes nvidia video cards in California. I was so impressed I bought 4 of them.
Re:I intended a Zen (Score:2, Insightful)
Granted, research and development are key to a company's success. And yes, I have typed on an IBM Selectric. BUT... sometimes CIOs embrace pretty toys for the sake of having pretty toys. There's a reason buzzwords are buzzwords: some managers breathe them like they were air. Often useful tools come out that can make an admin's job much easier. Other times it's "It's a cool web interface to a database, and we all know that the future is the internet, and databases are good"... But what does the tool do??? I dunno... but it sure does sound cool.
Re:Are you sure? (Score:5, Insightful)
I see busy, well-paid professionals doing stupid clerical shit like making their own travel reservations and doing their own expense reports. A couple of jobs ago, I maintained the fucking phone list in between looking after the servers. My boss at the time spent at least 1 day/month locked in his office, doing expense reports.
Back in the day, the secretary did that junk, and I did the technical stuff, i.e. the tasks they'd hired me to do.
Call it progress if you like, I think it's another case of "penny wise and pound foolish".
ROI, TCO, IT - does not matter (Score:2, Insightful)
A business sells stuff in the hope of turning a profit.
In the course of doing business, the people in the company have to do certain things - like advertise, talk to customers, talk to suppliers, build and service the products, keep the books up to date, etc.
Each of these activities is a royal pain in the ass, and every business is swamped with problems and inefficiencies in all of these areas.
All the business owner wants to know is - 'Hey, IT dude, these are my biggest headaches today - how much time and money do we need to spend to make these headaches go away'.
More often than not, the IT dude will propose some electronic solution to remove the day-to-day headaches.
I really think that the best 'IT strategy' for any business is to simply treat each real-life headache as a separate issue, and knock em on the head one after the other, as quickly as possible.
I would advise any non-IT management types to be extremely suspicious of any IT person who proposes some grand unified vision of the future, which would be delivered at some unknown time, and would be guaranteed to solve all their headaches - both current and unseen headaches to come. They are basically saying - Hand me an open cheque book, and continue to be plagued with your current set of problems for an indefinite period, whilst we go away and develop this miracle cure for you - here are the ROI projections that justify the expense.
Now, the problem for the (sane) IT provider is to be able to implement the quick fix to solve today's problems without creating a hopeless morass of incompatible solutions that need to be integrated tomorrow. Whatever solution you build today needs to be open enough for you to either plug into or extend tomorrow. This is only remotely possible using a completely OpenSource infrastructure that you control. Trying to do this with any collection of 3rd party proprietary products is just suicidal - it will simply never work. Even if it does manage to hang together today (by whatever miracle), you can bet that future strategic moves from the 3rd party proprietary vendors are going to rip your solution apart, and create much bigger problems tomorrow than the original problems you are addressing today.
We have already seen a thousand examples of internal IT departments which are nothing more than marketing arms for a proprietary vendor - basically a parasitic unit whose loyalty lies not with the business that pays their wages, but with some foreign organisation that is fleecing the business in licence fees. Such IT departments need to be revealed for what they really are - and then cast out from their long-suffering hosts, just like one would remove a bloated tapeworm from the bowels of a suffering dog.
An IT department that is working FOR the business that employs it should have a TODO list that covers today's problems, and be working on providing fast solutions to those problems. The IT department should be paying for all hardware upgrades and licences out of its own fixed budget. Once these things come at the expense of their own wages, you will be amazed at how little 3rd party software is really 'needed' after all, or be astounded to find out how much life is left in that 'old' server that was installed 2 years ago.
Having said all that, it should be recognised that the IT department is not the holder of Corporate IP. Any code developed by the IT department (in order to solve today's problems for the business) should be allowed to flow back into the larger system as GPL'ed code, simply because that is the way that IT works. The business is there to make and sell widgets - let IT get on with its job of serving the business in GPL'ed peace.
As an analogy - if your plumbing broke in the office, and the company maintenance dude knocked together some intriguing
Re:I intended a Zen (Score:5, Insightful)
I disagree.
This may hold water if you're a mousetrap manufacturer. If it's your core product at stake, sure. But IT is a support system, almost always a cost center. The question is whether the latest-and-greatest from HP, Sun, and Oracle is really worth latest-and-greatest prices. Is having a product that's two years newer going to save your company how much you're dropping on it?
The huge gains that came from computers were when, say, Levi decided to computerize inventory management. Now it's computerized. They don't have to worry about warehousing junk where it isn't needed or paper mistakes costing them huge sums of money. Are they going to save more money by using a new HP system with WhizBang-3d-with-sound GUI analysis instead of IBM's old system? Maybe some, but nothing like the gains that have already been realized.
Furthermore, even if the IT spending is worthwhile, spending it on *products* may not be a good idea. If a CIO decides to drop five million dollars on software licenses, that's 100 man years of IT work that he's just exchanged for latest-and-greatest. If he does this every five years, he's losing 20 support personnel. Twenty people can get an awful lot of work done.
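The trade-off above is easy to sanity-check. The "$50k per person-year" figure is my assumption (the comment only gives the $5 million and the "100 man years"); here's the arithmetic as a minimal sketch:

```python
# Hypothetical numbers illustrating the parent comment's trade-off.
# The per-person-year cost is an assumption implied by "$5M = 100 man years".
license_spend = 5_000_000        # one-time spend on software licenses ($)
cost_per_person_year = 50_000    # assumed cost of one IT person-year ($)
refresh_cycle_years = 5          # how often the spend repeats

person_years = license_spend / cost_per_person_year   # 100.0 man-years
fte_equivalent = person_years / refresh_cycle_years   # 20.0 full-time staff

print(f"{person_years:.0f} person-years, or {fte_equivalent:.0f} FTEs per cycle")
```

In other words, at those (assumed) rates, a recurring $5M license bill every five years is the same money as twenty permanent support staff.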
Walmart moving to RFID tags could save a *lot* of money, because they eliminate a lot of human work. However, there are two reasons jumping ahead like this ain't necessarily a great idea. First, RFID is a potentially big money saver, but advances like it also don't come along very often. Second, a lot of IT purchases and decisions turn out to be *bad* moves. If you let your competitors lead, and let them soak up the costs of development, only copying them when they do something that works well, yes, you lose a bit of lead time. You might have to absorb some losses. But you also gain a lot of money, and have time to let competing vendors enter the market space.
This is one reason why a lot of successful big companies are pretty conservative. Microsoft doesn't actually try very many new things for a tech company (and when it does, it tends to not do very well). Microsoft does *much* better by sitting around, waiting for masses of tiny companies to try various things, and then buying the one or two that succeed. Sure, they have to pay a hefty price for the one, but they let hordes of VCs fund the development and testing, rather than having to do it themselves. Most of Microsoft's primary products were originally developed by other companies that were then acquired.
Cutting edge? (Score:3, Insightful)
I would shore up the usefulness of my company's IT infrastructure rather than slap another requisition on the table for a brand new server. This is a Microsoft-based network, with the network and servers under considerable pressure from the users. A few years ago I would've tried to push management for better servers and gigabit ethernet, but one should see such problems as challenges to one's skills.
Smarter placement of switches, moving inter-switch links to gigabit, tuning the MS SQL server, replacing the WinProxy firewall with OpenBSD, getting a Cisco 1700, moving services to makeshift servers - stuff like that will let a company wait another year before investing in cutting edge. It will also require some creativity and effort on behalf of the IT guys.
Re:I intended a Zen (Score:4, Insightful)
But IT is a support system
I would argue against this basic assumption of yours. I want IT to replace the majority of the management tier within Walmart etc... because the majority of the lower management tier in Walmart etc... are absolutely shit at their jobs.
Example? Throughout my 4 years at university I worked part time (full time out of term) for a major supermarket. Every Christmas for four years I spent most of my time in the store apologising for the lack of milk, beer and bread. I had access to central inventory systems and could see vast quantities of the stuff sitting in the distribution centres, and I could see our bakery continuing to work an 8-hour day when we could have sold a 16-hour day's worth of bread.
When approached, management delivered a condescending 'it's more complex than that, kid' or a more honest 'I don't want to be stuck with any of it cluttering my warehouse after the new year'. They would rather sell 50% LESS than have a 1% stock holding after the peak season. Also, they were happy to piss off their customers at Christmas. Shit at their jobs.
How many indicators does a decent IT system need to do a better job? The checkouts tell you that the last beer went through at 2PM every day, that the last bread went through at 11AM. The stockholding is shown as 0 for > 25% of the week.
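The "last beer at 2 PM" signal really is trivial to act on mechanically. As a minimal sketch (item names, times, and the 6 PM cutoff are all invented for illustration, not taken from any real till system):

```python
from datetime import time

# Crude stockout heuristic: if an item's last checkout scan of the day is
# well before closing, the shelf probably emptied - exactly the indicator
# described above. All data here is hypothetical.
EARLY_CUTOFF = time(18, 0)  # last sale before 6 PM suggests a stockout

last_scan_of_day = {
    "beer": time(14, 0),    # "the last beer went through at 2 PM"
    "bread": time(11, 0),   # "the last bread went through at 11 AM"
    "crisps": time(21, 45), # sold right up to closing - probably fine
}

likely_stockouts = sorted(
    item for item, t in last_scan_of_day.items() if t < EARLY_CUTOFF
)
print(likely_stockouts)  # ['beer', 'bread']
```

A real system would also look at the "stockholding shown as 0" signal and the day-of-week pattern, but even this one-liner flags the beer and the bread.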
Some minor systems are in place to handle these things - but they are crude. (Example - a few years back my local store in Scotland took delivery of 3000 England replica football shirts; 28 sold, the rest were returned.)
By solving some of these issues you can kill off a huge number of overpaid, underqualified, underskilled smug shits and have a BETTER company - IF the IT graduates get the idea. So: 3 years in the workplace should be mandatory prior to an IT degree of any kind. Unless you're planning to become a bearded AI researcher, in which case please yourself!
Re:IT... What is strategy? (Score:1, Insightful)
What Mr Carr says (rightly, in my opinion) is that IT is not strategic for a company. Why? Maybe WalMart can use RFID to keep a better inventory, BUT any other company can do the same, with the same gains. That's why it's not strategic (in a Porter sense, this means: IT doesn't differentiate you from your competitors).
Of course commodity IT (commercial or open source packages) is useful, but replicable. On the other hand, in-house developments with proprietary information that won't be shared with competitors (for example Google's search algorithms) are VERY strategic.
Public software packages, hardware, and consultant-based customization can be copied by any company, and hence are irrelevant from the strategic point of view (I repeat, that doesn't mean they are not useful).
Re:I intended a Zen (Score:3, Insightful)
The VC-funded startups come up with a superior core trap mechanism, but it's Microsoft that turns it into a viable product. This conversion of prototype to finished product is itself a form of innovation - a form Microsoft is good at, given time. That's how MS is effectively first to market even though other people create the original ideas. The early versions aren't really at the market yet.
For a concrete example of how MS doesn't succeed, look at Oracle. MS didn't buy out the database market in the beginning, and competitors paved the way. Once MS decided to enter, it was too late - the competitors didn't need MS. As a result MS failed to capture the market and missed out on a lot of money. Their bought-out core products were all purchased before the other company got big. Microsoft does take risks, and not all of their buyouts pay off. They're just big enough to ignore the losses and wait for the good opportunities to pay for the bad.
You do have a valid point in that many companies do not need to lead the way in innovation. The question of "Does IT matter?" is actually very industry specific. A doctrine that brings riches in one field could be suicidal in another. Many industries - especially stable blue-chip types - don't need to change the world on a regular basis. That CIO you mentioned who spends $5 million on new software might not need to if he works for an old-school company with a stable product line. Why then does he? It might not be pure incompetence. He could very likely be getting kickbacks from the software company. Corporate bribery in the form of luxury vacations and entertainment is responsible for most deals these days. Or, he may be afraid of the 20 people he could've hired. 20 people can get a lot of work done, but 20 ambitious people can also rise into a lot of executive positions! Our CIO is one because he's a good executive - he knows how to slit throats and get himself promoted. The people who are good at getting promoted are the ones who end up at the top, so defensive political tactics rule the day, and the company is milked like a cow. The CIO who bought the new software gets a trip to Tahiti, 20 fewer people trying to steal his job, and a big, headline-grabbing decision that allows him to rest on his laurels for a while and hire a new young, promiscuous secretary to sexually harass. That bastard. I hate him and his groping. It's a well-known fact that cocaine usage among the business elite exceeds that of the general population by an order of magnitude. With all of this in consideration, it's clear why companies in traditional, mature industries do not use the same tactics that apply to more fast-paced fields.
In summary, the author of the article under discussion is asking a meaningless question in "Does IT Matter?". It's like asking, "Why is the sky blue?" when you're talking about both Earth and Mars. The question only applies to some companies, so any possible answers will be both right and wrong. Chances are, the guy is being so general and provocative because he just wants to stir up controversy and debate. That could be a very good thing for all industries in the end, and the only way to get controversy is to have two sides worth arguing.
Re:IT... (Score:1, Insightful)
Re:Just do it . . . (Score:3, Insightful)
How many people seriously need something like J2EE or .NET for software development of their data processing systems?
How much gain is there really to be made from switching a system from ASP to ASP.NET? OK, there are advantages, but disadvantages too. Rewriting means shifting your skill base - forcing people to learn new stuff, and they lose the experience. After 10 years of COBOL, I could virtually code blindfolded or drunk, because I had so much familiarity and had made so many mistakes that I'd learnt the best way of doing things.
Nowadays, companies dump stuff so fast that they never gain that software maturity. And if it's not the software, it's the other stuff around it. Methodologies, inventions, whatever.
Companies get hooked on the fashions that bog down productivity. OO, web services, browser-based systems, XML - all have their uses. But I see people overusing them and overcomplicating their systems.
If I was running a business, I'd pick something in the software world that is long-term, proven and changes little - C++ or COBOL. Run it on something that works AND will not be subject to the whims of one manufacturer (say Linux or Unix). And try to get programmers to think more about the business than about what they can stuff on their CVs.
Why mod this funny? (Score:3, Insightful)
Doing the details right is hard, hard work. Witness both MS and Linux to see this: MS can't get the security thing down and Linux still fails at the ease of use thing, despite a lot of smart people working on both.
Re:I intended a Zen (Score:4, Insightful)
As you said, the info etc. is all there. But people are refusing to act on it, perhaps because the perceived danger/punishment for holding 1% excess stock is considered worse than the reward for selling more. They may simply be rewarded more for having no excess stock. Which indicates a management issue.
It doesn't matter if a ship has tons of sophisticated systems, if the captain overrides all of them and slams into an iceberg.
On the flipside pilots of planes have done amazing things despite sophisticated systems failing.
Look at HP: just a single person (Carly) can do more damage to HP than tons of IT can ever fix or prevent.
So while IT matters, for most organisations it really doesn't matter that much, nor should it. Accounting and Finance probably matters more. Good bosses and leaders matter far more.
Re:I intended a Zen (Score:2, Insightful)
I would say, doing their jobs. If that 1% overstock left over after Christmas eats 90% of gross margins, then it makes total sense to avoid holding that stock afterwards. I'm suspicious of your 50% sales increase number to begin with. I imagine you were sensitized by the entire ordeal because, being a front-line manager, you were the one the irate customers were coming to with their complaints.
Re:IT doesn't matter - but not how you think... (Score:3, Insightful)
Computers are orders of magnitude more complex than a dusty, cluttered house.
Computers are small; they are hard to visualize. It takes time to understand the ways they work.
I believe anyone can be trained to fix, design, program computers. Much like anyone can learn painting techniques. But it takes a degree of insight and craftiness to do good things with a brush. Not everyone can do that.