Technology

Is Client/Server Really Dead?

the-empty-string asks: "Technology fads come and go, but sometimes they do leave behind real systems supporting real business processes. There was a time when 'client/server' was all the rage, and today there are thousands of such systems still in use, happily serving HR departments, providing inventory management, or tracking complex production processes. These days, after 'reusable components', 'three-tier', 'J2EE', and other resume-enhancing keywords, the magic phrase is 'Web services'. Consequently, many companies think they must scrape their existing client/server applications, in order to "move them to the Web". While the advantages of exposing functionality to the outside world are beyond debate, does this mean perfectly good and working applications must be abandoned only because they are client/server, or do they still have a useful role to play? Also, what is the migration strategy you would recommend to your boss or your customer, when these systems have to be replaced no matter what?"
  • by sully67 ( 550574 ) on Saturday November 23, 2002 @09:09AM (#4738079) Homepage
    One of the major problems we've found is that the next version of Windows can break the clients of client/server applications. We currently have a custom application written in Oracle Forms in which printing is unreliable and crashes the client.
    With a browser-based solution, you can assume the client end will keep working across OS versions, as long as the platform has a suitable web browser.
    You then just need to make sure the server works properly and generates standards-based output that any browser can render. Do that and you've eliminated a whole class of client headaches.
    • Um, which browser?:
      1. IE (which version?)
      2. Netscape (which version?)
      3. Mozilla (not the same as Netscape and also with version problems)
      4. Opera
      5. ...
      You have to QA against each; then there is the Java runtime, then the OS and the OS version. This, I guarantee you, isn't nice.

      And then the browser may also be used for general Internet access, so it requires a quick mandatory update whenever the latest security hole turns up. Your only real hope is to separate the version of the browser used for application access (and thus QAed) from the version used for general-purpose work.

      • The OS version is relevant mainly to Win32, where each application subtly upgrades the OS.
      • > You have to QA against each

        heh :) tell that to my company: our expense reporting, and now our time sheets, are all on-line; except they use so many frotzing ActiveX controls that they only work with IE 5.5 (not 6, mind you...)

        the idea that this kind of corner-cutting now will just lead to more work later, when IE 5.5 goes away for good, just doesn't seem to occur to anyone...
        • I'm curious as to why the time sheets need ActiveX?

          This is exactly the kind of relatively dumb form-filling exercise that is ideal for the web. I have seen some lovely solutions, even freeware ones, where most of the work is done in PHP back on the server and the browser is largely version-agnostic.

          Incidentally, the paranoia we went through testing our stuff was justified. Some idiot calling the API directly managed to exchange price and quantity (they had switched the checks off as well). If the price is 5000 and the quantity is 10, selling 5000 at 10 is going to do bad things (and wipe about 100 million Euros off the market).
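          To make that concrete, here is a minimal sketch of the kind of server-side sanity check involved (all names and limits are invented, and a price band is just one plausible rule, not what the exchange actually ran). The point is that the check runs on the server, where a caller hitting the API directly cannot switch it off:

            // Hypothetical sketch: names, limits, and the band rule are invented.
            public class OrderValidator {
                public static void validate(double price, long quantity,
                                            double lastTradedPrice) {
                    if (price <= 0 || quantity <= 0)
                        throw new IllegalArgumentException(
                            "price and quantity must be positive");
                    // A price far outside the last trade suggests swapped fields or
                    // a fat-fingered entry: reject it and make a human look,
                    // instead of letting the order move the market.
                    if (price < lastTradedPrice * 0.5 || price > lastTradedPrice * 2.0)
                        throw new IllegalArgumentException("price " + price
                            + " implausible against last trade " + lastTradedPrice);
                }
            }

          In the incident above, an order priced at 10 against a last trade near 5000 would have been rejected before it ever reached the book.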

  • by Eagle7 ( 111475 ) on Saturday November 23, 2002 @09:15AM (#4738085) Homepage
    C'mon... anyone who can pull their head out of their buzzword-infested ass knows the answer to this - of course client/server isn't going to die. For one, just look at applications like games, IRC, etc., to grab a few from the Internet realm.

    I think this question really just shows the limitation of thinking in terms of technology "brand names" - i.e. client/server, web services, etc. Because what are web services? Well, they are services provided by a server to a web (HTTP) client. Sure, they all conform to a certain general format, and they are platform-independent, but is this so different from the multitude of 3270-based VM applications that so many places used (and still use)? There is still a workstation client, and there is still a central server and database repository.

    In fact, as we get to more complicated web pages (XForms, etc.) and more streamlined client machines, we are ironically regressing back toward the days of 3270 dumb terminals that connected en masse to big 360 and 370 mainframes. Because is there really that much difference between a company intranet today and their VM system a decade ago? Aside from usability and flexibility, not a whole hell of a lot.

    So I argue that client/server hasn't gone anywhere. Yeah, perhaps traditional client/server applications (which lack flexibility and are more costly - in terms of development and platform requirements - to deploy) are falling by the wayside, but they are being replaced by new client/server systems that use a web client rather than a custom-built DOS/Windows/UNIX/VM application.

    So, I reiterate: get your head out of your buzzword-infested ass, and look at technology for what it is, not just the name that people attach to it. Until individual workstations are powerful and reliable enough, the network between them is fast and flexible enough, and some other system for keeping data permission-based comes along, we're going to need servers. And with servers will come clients, in whatever form they arrive.
    • by vsync64 ( 155958 ) <vsync@quadium.net> on Saturday November 23, 2002 @09:54AM (#4738145) Homepage
      In fact, as we get to more complicated web pages (XForms, etc.) and more streamlined client machines, we are ironically regressing back toward the days of 3270 dumb terminals that connected en masse to big 360 and 370 mainframes.

      I agree with the main idea of your post, but you've made an error here which obscures its correctness. The 3270 is not a dumb terminal; in fact, this is what distinguishes it from the VT series and the ANSI X3.64 standard.

      The 3270 is a smart terminal with support for forms-based input. The server specifies the types and locations of the required fields, and the terminal draws them, accepts input, and does basic verification, batch-submitting the entire form when complete. Typing lag, therefore, doesn't exist (this fact saved me from going completely insane when Office Depot couldn't keep its network running and throughput dropped to ~300bps).

      So yes, HTML viewers and 3270 terminals are very much alike, and share many features, drawbacks, and programming issues.
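      For anyone who never used one, here is a toy model of that forms mechanism (field layout and names are invented; real 3270 data streams are far more involved). The terminal enforces the field rules locally, and nothing crosses the wire until the whole screen is submitted:

        // Toy model of a block-mode (3270-style) form; everything here is invented.
        public class BlockModeForm {
            static class Field {
                final String name; final int length; final boolean numeric;
                String value = "";
                Field(String name, int length, boolean numeric) {
                    this.name = name; this.length = length; this.numeric = numeric;
                }
                // The terminal itself enforces these rules, with no server round trip.
                boolean accepts(String input) {
                    if (input.length() > length) return false;
                    if (numeric)
                        for (int i = 0; i < input.length(); i++)
                            if (!Character.isDigit(input.charAt(i))) return false;
                    return true;
                }
            }

            public static void main(String[] args) {
                Field account = new Field("ACCOUNT", 8, true);
                Field name = new Field("NAME", 20, false);
                if (account.accepts("12345678")) account.value = "12345678";
                if (name.accepts("SMITH, J")) name.value = "SMITH, J";
                // Batch submit: the whole screen goes back in one round trip.
                System.out.println("SUBMIT ACCOUNT=" + account.value + " NAME=" + name.value);
            }
        }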

      • Yes... thanks for the clarification. "Dumb" was perhaps the wrong term... but I meant "dumb" in terms of its need to connect to a server to accomplish anything besides terminal configuration. And in that sense, they are dumb (as would be a terminal that just has an HTTP client). Heck, the 520 (540?) series of VT terminals could split the screen, have multiple connections, etc., etc... but they're still dumb. I don't think that just because 3270s can process more complicated input and handle simple verification and form functionality, that moves them out of the dumb arena - even if they are near/at the top.
        • And in that sense, they are dumb (as would be a terminal that just has an HTTP client).

          I'd like to see something like this. Perhaps something with a 512MB flash drive, a bunch of RAM, and USB, VGA, and audio ports, running Mozilla?

          I liked the JavaStation...

    • >Because is there really that much difference
      >between a company Intranet today and thier VM
      >system a decade ago?

      Yeah, the mainframe was a lot more reliable.
  • by anonymous cupboard ( 446159 ) on Saturday November 23, 2002 @09:16AM (#4738089)
    I was working at a financial exchange whose application used 3- to 4-tier client/server technology. The client can't be moved to the web because performance would suck big time. Some of the clerical back-office operations could in theory be moved to the web, but when someone looked at doing this, performance sucked too.

    Some exchange members may effectively put a web server at tier 3 (the member interconnectivity server) and a browser at tier 4, but that is their problem, not the exchange's.

    If we were writing the application again, maybe we would move the non-performance-critical stuff off to the web. The front-end is already written in Java, but a mixed architecture would be very complicated (how do you support a web server tied intimately to the exchange interconnection software?).

    The main win that a web-based solution would bring is that it would make a client much easier to update. At the moment, the exchange has to coordinate the rolling out of new releases over 550 organisations with many thousands of workstations. Painful, eh?

    And then on the browser side, which do you support? If you work at the sharp end, there are many incompatibility issues. If you roll out your own tested/debugged browser, you may as well roll out a dedicated client.

    For use as dedicated application clients, most browsers are awfully fat (big and slow, with unwanted functionality), and the extras they bring make them harder to support when some bozo calls up to say that their trades are being garbled or lost.

  • Consequently, many companies think they must scrape their existing client/server applications

    I don't understand how scraping them would help anything.
    • I'm sure it was a typo in the article, but the first step to doing the smart client for many companies was to use "screen scraping." A terminal emulator back-end would parse the output originally intended to show up on a character-cell terminal, digest it into a form some local peecee program could use, then spoof terminal keyboard activity to get the results back into the old mainframe hiding in the background.
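      As a rough sketch of that idea (the field positions and names here are invented), a scraper treats the captured 24x80 screen as a character grid and pulls fields out of fixed coordinates:

        // Sketch of screen scraping; field coordinates are invented for the demo.
        public class ScreenScraper {
            private final String[] screen; // 24 rows of 80 columns each

            public ScreenScraper(String[] screen) { this.screen = screen; }

            // Fields on a block-mode screen live at fixed coordinates,
            // so extraction is just substring arithmetic.
            public String field(int row, int col, int len) {
                return screen[row].substring(col, col + len).trim();
            }

            public static void main(String[] args) {
                String[] capture = new String[24];
                java.util.Arrays.fill(capture, pad(""));
                capture[3] = pad("CUSTOMER: SMITH, J          BALANCE: 00012750");
                ScreenScraper s = new ScreenScraper(capture);
                System.out.println(s.field(3, 10, 18)); // prints "SMITH, J"
                System.out.println(s.field(3, 37, 8));  // prints "00012750"
            }

            private static String pad(String line) {
                StringBuffer b = new StringBuffer(line);
                while (b.length() < 80) b.append(' ');
                return b.toString();
            }
        }

      Going the other way - spoofing keystrokes back at the host - is the uglier half of the job, since the scraper has to navigate the old application screen by screen.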
  • by Lord Sauron ( 551055 ) on Saturday November 23, 2002 @09:22AM (#4738099)
    Technically it's the same thing. A browser is the client and the web server, well, the server.

    A better question would be if proprietary client-server solutions, that require proprietary clients are doomed.

    The web services model has a client already installed in most OSes and runs on most platforms, so it has a clear advantage.
    • At their core, Web browsers are more like dumb terminals than intelligent clients. Even with stuff like javascript you're still pretty much at what used to be called the "smart terminal" level of functionality.

      Once you add something like Java, or embed browser code into some other program, yeah, the distinction evaporates.
    • by Anonymous Brave Guy ( 457657 ) on Saturday November 23, 2002 @11:17AM (#4738406)
      The web services model has a client already installed in most OSes and runs on most platforms, so it has a clear advantage.

      It does?

      Web services != web browsing.

      The only connection, aside from a convenient and hypeworthy similarity in the name, is that both use HTTP as the protocol.

      As others have noted, web services are just another version of client/server, which happen to use things like HTTP for the communications.

  • by Mark Wilkinson ( 20656 ) on Saturday November 23, 2002 @09:41AM (#4738129) Homepage
    To me there's very little difference architecture-wise between client/server, n-tier, J2EE, web-services and so on. It all comes down to taking a single logical application and splitting it into chunks that can be reused separately and that can sit on different machines. The interesting thing is where you actually decide to split the application, and I think this is where client/server went wrong.

    For the most part, client/server basically meant taking an application and putting the database on a central server, using ODBC, the Oracle client, or whatever to transport SQL over the network. It's the logical progression when you're used to writing VB applications against an Access database and you want to make your existing model work for lots of users.

    What we're now seeing, though, is that splitting an application at the point where the business logic talks to the database is one of the worst places to do it, architecturally speaking. You finish up with your business logic bound to the presentation in a way that makes it difficult to reuse, and eventually you have multiple implementations of the same piece of business logic with a single database underneath it, and a whole bunch of inconsistency.

    So J2EE, web services and so on are really about trying to make it easy for developers to split their applications into chunks somewhere between the presentation and the business logic, rather than between the business logic and the database.

    But this new technology doesn't necessarily mean that people will understand the architecture any better. Just wait long enough and someone will miss the point completely and produce a piece of software that repackages ODBC, JDBC or something similar as a web service.

    Are client/server applications dead? No. But you do need to have a migration path that allows you to extract the business logic as distinct software components so that you can re-use them in new applications. Whether you use web services or J2EE to help you do that is of little consequence. You could just write good libraries instead, as long as you get the conceptual architecture right.

    A good migration path is one which will allow you to incrementally move business logic from your existing client/server front-end into a reusable middle-tier. Don't try to rip complex applications out and replace them wholesale - it's much more risky.
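    As a hypothetical illustration of that reusable middle tier (every name below is invented), the idea is that a business rule lives in exactly one plain component, which the old fat client, a servlet, or a web service can all call:

      import java.math.BigDecimal;

      // Invented example of business logic extracted from the client tier.
      public interface PricingService {
          BigDecimal discountedTotal(BigDecimal listTotal, int customerTier);
      }

      class StandardPricingService implements PricingService {
          public BigDecimal discountedTotal(BigDecimal listTotal, int customerTier) {
              // The rule that used to be copy-pasted into every front-end:
              BigDecimal rate;
              switch (customerTier) {
                  case 1:  rate = new BigDecimal("0.90"); break; // best customers: 10% off
                  case 2:  rate = new BigDecimal("0.95"); break; // 5% off
                  default: rate = new BigDecimal("1");           // list price
              }
              return listTotal.multiply(rate);
          }

          public static void main(String[] args) {
              PricingService pricing = new StandardPricingService();
              System.out.println(pricing.discountedTotal(new BigDecimal("100.00"), 1)); // 90.0000
          }
      }

    Whether that component is then exposed through J2EE, a web service, or a plain library is, as the parent says, of little consequence.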
  • And the most portable ones. If a webapp developer follows the standards, then his application will work on most (if not all) graphical browsers (sorry, I'm leaving lynx and the like out here). So the client just needs to be a compliant browser, independent of platform (software and hardware). Current and new technologies make almost any kind of user interface and functionality possible in a web-based application. Using PHP, XHTML, JavaScript, et al., there's a whole range of rich functions, database connectivity, printable document generation (PS/PDF), etc. And with the upcoming adoption of XForms, things are just about to get better. My advice: just follow the standards, the open standards. Leave out Flash or any other kind of malicious, closed, evil web gadget.
  • I feel for you (Score:4, Insightful)

    by photon317 ( 208409 ) on Saturday November 23, 2002 @10:16AM (#4738171)

    You're lost in a blizzard of buzzwords with no meaning, and acronyms for acronyms of buzzwords with no meaning. You need a few weeks' vacation in a log cabin in the woods of Ohio, with some deep technical books that they don't sell at Barnes & Noble, to get your feet grounded again. Your question is irrelevant. The real question you should be asking is, "Why would I ponder such a question?"
    • Re:I feel for you (Score:1, Redundant)

      by Eagle7 ( 111475 )
      Amen, brother.
    • by Hubert_Shrump ( 256081 ) <cobranet@@@gmail...com> on Saturday November 23, 2002 @02:03PM (#4739063) Journal
      You are in an open field west of a big white house with a boarded front door.
      There is a small mailbox here.
      >_


    • Go further. (Score:2, Insightful)

      You need a few weeks' vacation in a log cabin in the woods of Ohio, with some deep technical books that they don't sell at Barnes & Noble, to get your feet grounded again.

      Heck, go further. Go on vacation and REALLY get your head out of the clouds.

      Take a REAL vacation with family & friends. No technical books, no computers, no shop talk; just cabin life, hiking, reading novels, fishing, playing, drinking, eating, sex, whatever.

      Get far, far outside the box, where you can see the big picture and gain a real perspective on your life/project/job/whatever.

      Then come back, with the perspective of an outsider, but still with enough knowledge to make an informed decision.

      You might be surprised by the ideas that come into your head after a real vacation.

    • Flamebait? Moron mods at it again.
  • by spike666 ( 170947 ) on Saturday November 23, 2002 @10:58AM (#4738333) Journal
    ...or at least one of the points of web services?

    web services give anything the ability to call them: 'client' programs, or web page generators. the idea is to allow a single backend that multiple front ends can utilize.

    so no, i think client/server is dead. i think it is actually now becoming client/web service.

  • by Anonymous Brave Guy ( 457657 ) on Saturday November 23, 2002 @11:20AM (#4738424)
    While the advantages of exposing functionality to the outside world are beyond debate...

    Um... No. The advantages of exposing systems currently implemented in some in-house client-server way to the outside world are far from proven in most cases.

    Exposing things to the outside world guarantees only one thing in itself: you will be subject to more security vulnerabilities than you ever had before.

  • by ka9dgx ( 72702 ) on Saturday November 23, 2002 @11:46AM (#4738514) Homepage Journal
    SQL, one of the original client-server protocols, is alive and well... buzzwords change, but the concept (shipping less data across a network) will always have value.

    --Mike--

      SQL, one of the original client-server protocols, is alive and well... buzzwords change, but the concept (shipping less data across a network) will always have value.

      SQL is a language, not a protocol. You are probably thinking of SQL*Net and TNS, both of which happily run on TCP/IP networks (and AppleTalk and IPX/SPX, and DECnet and plenty more).
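      Either way, the "shipping less data" point is easy to make concrete. In this sketch (the connection details, table, and column names are all made up, and driver registration is elided), the database does the aggregation, so a single row crosses the network instead of thousands:

        import java.sql.*;

        // Invented example: let the server aggregate instead of the client.
        public class AggregateExample {
            public static void main(String[] args) throws Exception {
                Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:orcl", "scott", "tiger");
                try {
                    Statement stmt = conn.createStatement();
                    // Exactly one row comes back over the wire...
                    ResultSet rs = stmt.executeQuery(
                        "SELECT SUM(price * quantity) FROM order_lines" +
                        " WHERE trade_date = TRUNC(SYSDATE)");
                    if (rs.next())
                        System.out.println("Today's turnover: " + rs.getBigDecimal(1));
                    // ...instead of "SELECT price, quantity FROM order_lines ...",
                    // which would ship every matching row to the client just to add them up.
                } finally {
                    conn.close();
                }
            }
        }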
  • Uhh, No. (Score:3, Funny)

    by smoondog ( 85133 ) on Saturday November 23, 2002 @12:00PM (#4738562)
    Last time I checked /. was a server....

  • by schmaltz ( 70977 ) on Saturday November 23, 2002 @12:54PM (#4738760)
    User interfaces sometimes get downgraded when confined to the limited set of GUI widgets available in HTML. At the very least, the round trip and render time required to regenerate a screen or page, even when only one item changes, is distracting to the user and wastes their time.

    Where applications need to be broadly distributed - extranet and internet sites, for example - HTTP/HTML can be appropriate. But for internal applications, for performance, convenience, and sophistication of the UI, a compiled application running on your local desktop - and accessing a central server for shared data - is still gonna be best, imo.
    • N-tier does not imply HTTP/HTML. It can, but there are other ways as well.

      Further, ActiveX, Java, Macromedia Flash, etc. were all developed to help provide a feature-rich UI in a web-server-hosted application.

      The point is, you have many options. It's not just web versus client/server.
  • The term 'web services' still describes software with a client/server architecture. The distinction is that the client is the common web browser.

    Clients can now be built using common open standards, instead of as platform-specific native applications.

    Many older client/server systems can be 'resurrected' by developing a web user interface using web protocols: HTTP/S plus HTML, Java applets, and/or ActiveX.
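    As a sketch of that kind of resurrection (LegacyInventory below is a stand-in for whatever client library the old fat client used - it is not a real API, and is stubbed out so the example is self-contained), a thin servlet can put a browser face on the existing back-end:

      import java.io.*;
      import javax.servlet.http.*;

      // Thin web front-end over an imagined legacy client/server back-end.
      public class StockLookupServlet extends HttpServlet {
          protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                  throws IOException {
              String partNo = req.getParameter("part");
              int onHand = LegacyInventory.quantityOnHand(partNo);
              resp.setContentType("text/html");
              PrintWriter out = resp.getWriter();
              out.println("<html><body>Part " + partNo + ": " + onHand
                  + " in stock</body></html>");
          }
      }

      // Stand-in for the proprietary client library the fat client already uses.
      class LegacyInventory {
          static int quantityOnHand(String partNo) {
              return 42; // placeholder value
          }
      }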

  • I don't think client/server architecture is going anywhere anytime soon. If nothing else, there are still a lot of things that you simply can't do with a web-based solution that you can with a dedicated client. One of the big drawbacks of web-based services is their post/reply communication: you can't get a constant stream of information. The widget options on the web are also not nearly as dynamic as what you can get in a dedicated app.

    Not to say that the web does not have its place; it most assuredly does. It's just not quite ready to completely replace dedicated clients yet.
  • does this mean perfectly good and working applications must be abandoned only because they are client/server, or do they still have a useful role to play?
    Depends on where you work and how much influence you have. If management where I work had its way, yeah, we would get rid of our client/server apps that work perfectly well and go to the newest buzzword-compliant kid on the block. (Mind you, we would also go without a pilot or any load testing, because that's all unnecessary, you see, according to management.) I still think you use the best tool for the job, regardless. The end user doesn't give a crap how he gets his info, as long as it's fast and reliable. If you can fit into that model, then more power to a client/server app.

    Of course, they also don't give a crap about scalability, but that's because they don't have to deal with it either.
  • I don't get it. Having just read through the posts so far, I can't help noticing that:

    They are mostly on-topic.

    There is a relatively low occurrence of lame wisecracks.

    Most of the posters seem to have read the content of the article before posting. If this keeps up, it threatens the very existence of the /. culture!

  • Buzzword Bullshit (Score:3, Interesting)

    by renehollan ( 138013 ) <[rhollan] [at] [clearwire.net]> on Saturday November 23, 2002 @03:03PM (#4739313) Homepage Journal
    Ya know, that's all it is: bullshit, crapped out by marketing types to sell the "next greatest thing".

    Certainly, there are a variety of ways to distribute an application once you have real compute power at the human end of things, but they're all variations on a distributed-processing theme, conveniently placing a "tier" at an architectural point with clean, simple, and few interactions with components at another tier. Hence three-tier and n-tier applications: they're just particular distributed "sweet spots", particularly appropriate for certain large classes of applications.

    In the old days, if we even considered such distributed systems (requiring a PC on a desk instead of just a terminal), we coded all the protocols, marshalling schemes, and so forth ourselves -- it was an in-house thing. So, yeah, all such distributed applications look the same from that perspective, the way that all large C or C++ applications "look the same" if well designed.

    But, as particular distributed mechanisms meet large "sweet spots", such as a web browser/server split, they're pushed as the definitive "answer" to man-machine interaction.

    And, just as quickly, or shortly thereafter, the shortcomings of a particular popular distributed architecture become apparent, and the next "model" is pushed as the definitive "answer". The fact that it is tuned to a different class of problems is generally lost on those flogging it, and on those buying it. The rest of us just kind of look at it and say, "yeah, so?" -- after filtering through the buzzword muck.

    Now it is true that, as client-server models (and all of these really are just that) mature and are pushed into use, they are strained to the limits of their scalability, and two-tier architectures give way to three-tier architectures, and so on. So the new tunings offered by the "latest" architectural split do serve to solve new problems, but that in no way invalidates the fact that the "old" architectures did a splendid job of solving the "old" problems.

    The frustrating thing, from the perspective of someone like me, having worked with all sorts of distributed systems for close to 20 years, is the notion that one such architecture is so different from another that the "old skills" are now obsolete: client-server techniques supposedly do not transfer to three-tier architectures, which do not transfer to, what's the buzzword?, oh yeah, "Web Services".

    While the interfaces are new, and the implementations of the various components need to be picked up, a seasoned architect will look at them and say, "yup, that should scale the way we need". However, there is this notion that this cannot happen quickly, and that new "experts" need to be hired to replace the "old" experts.

    Funny, most places don't get rid of their "if statement" C++ programmers when they need "while" statement-, or golly gee, "function call"-programmers. Understanding of standards particular to a given architectural model may be important, but it's such a small part of rolling out a working system, that any competent software engineer can deal with them.

    Don't make the mistake a former employer of mine made: contracting out some servlet code to "Java experts"... who had no clue about threading because, while they had "learned Java", they knew nothing about multitasking. Hint: those who understand the latter generally adapt to a new language faster than those who know a single language adapt to anything beyond basic computer concepts. And so it is with buzzwords designed to obscure the obvious.

  • by sheldon ( 2322 ) on Saturday November 23, 2002 @04:05PM (#4739559)
    Client/Server or 2-tier implied you had a client which interacted directly with the database.

    N-tier development means you have a minimum of three tiers: Client, Logic, and Database. The Logic tier can be broken into further parts, such as business or data-access layers. N-tier development does not imply a web server. It can involve one, but you can also build a "fat client" using .NET, ActiveX, Java, etc. which communicates directly with the application server.

    One of the advantages of n-tier development is the abstraction between UI and logic. This allows for somewhat more rapid development on larger projects as you can divide the work up between several groups.

    But the chief advantage is one of security. For a client/server app to work, the user needs direct access to the database. With n-tier you can authenticate/authorize at the application layer, which gives you finer granularity of control than the table layer allows. With the client/server model you need to grant the user rights to an entire table, but with n-tier your app server has rights to the table, while the user may only have rights to the individual records that they should be able to see.

    In a larger enterprise you also run into issues where, once people have direct access to the database, they start running ad-hoc queries from reporting apps. It's nice to be flexible and convenient this way, but in larger environments, untuned queries can consume a large amount of resources. So instead you keep the users out of the OLTP environment and replicate the data to a data-warehousing environment that they can use for reporting. You could possibly handle this with training, but then you're betting that all your employees are smart enough to comply. Better to just design the system to prevent potential issues.

    Anyway, there are a lot of advantages to this model, and there are books written on the topic. I've just pointed out a few obvious ones based on my experience.
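    A bare-bones sketch of that record-level check (all names here are invented, and a HashMap stands in for a real data-access tier):

      import java.util.*;

      // Invented example of per-record authorization in an application tier.
      public class OrderService {
          static class Order {
              final long id; final String owner; final double total;
              Order(long id, String owner, double total) {
                  this.id = id; this.owner = owner; this.total = total;
              }
          }

          private final Map orders = new HashMap(); // maps Long -> Order

          public void save(Order o) { orders.put(new Long(o.id), o); }

          // The app server holds the table-level database rights; end users
          // never touch the database. Row-level authorization happens here.
          public Order getOrder(String caller, long orderId) {
              Order o = (Order) orders.get(new Long(orderId));
              if (o == null || !o.owner.equals(caller))
                  throw new SecurityException("user " + caller
                      + " may not read order " + orderId);
              return o;
          }

          public static void main(String[] args) {
              OrderService svc = new OrderService();
              svc.save(new Order(1, "alice", 99.50));
              System.out.println(svc.getOrder("alice", 1).total); // 99.5
              svc.getOrder("bob", 1); // throws SecurityException
          }
      }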
  • The client/server relationship is one that existed long before computers, and it was inevitable that computers would come to be used in this manner too. Just think about it: do we not engage in client/server activities daily? There are televisions, radios, newspapers, teachers/students in a classroom environment, etc., etc. Furthermore, the models I mentioned only apply to the propagation of information, just as computer programs do; for the distribution of other things, there are many, many more examples to be found in the real world.
  • Honestly... (Score:1, Flamebait)

    by cr@ckwhore ( 165454 )
    Honestly, this is the dumbest question I've heard in a while... but whatever.

    Here's an analogy...

    Just as TCP is built on top of IP, "n-tier" is built on top of client/server. No matter how many "tiers" you have, there's always one tier that is the client (aka the user interface) and the rest are servers. It doesn't matter that one server might do X and another Y... they are still servers communicating with clients, and with each other.

  • by sigwinch ( 115375 ) on Sunday November 24, 2002 @12:22AM (#4741423) Homepage
    This reminds me of a comment [kuro5hin.org] on Kuro5hin:
    Client Schmient

    The whole client/server concept is so '01. We abandoned it in favor of something we call "(n-1) tier". No clients at all, just multiple levels of server. We don't even expose an interface, most of the time.

  • .....about five years into my career, client/server was the in thing. It was meant to replace mainframes, reduce costs, reduce manpower, and solve world hunger. Well, it didn't do any of these. Such systems cost more and need more manpower.

    Now web services are being touted as the silver bullet. It's all bunk and marketing. Not to say web services don't have a place; they do. They just ain't a complete solution.

    The sooner we come full circle and get back to terminals attached to the central server the better we'll all be.

    But gotta go - I've got to debug this AS/400<->NT<->CE database integrity problem, then I can get started on the CE GUI lock-up, and lastly solve the backing up of two central servers and many remote PCs. Client/server keeps me employed.

    The question you should be posing is 'why do users always throw out systems?' I have no answer to this. The lemming syndrome springs to mind, as does sheer stupidity.
