Anti-Technology Technologies?

shanen writes "A story from the NYTimes about metering internet traffic caught my eye. I thought the exchange of information over the Internet was supposed to be a good thing? Couldn't we use technology more constructively? For example, if there is too much network traffic for video and radio channels, why don't we offset with the increased use of P2P technologies like BitTorrent? Why don't we use wireless networks to reduce the traffic on the wired infrastructure? Such technologies often have highly desirable properties. For example, BitTorrent is excellent for rapidly increasing the availability of popular files while automatically balancing the network traffic, since the faster and closer connections will automatically wind up being favored. Instead, we have an increasing trend for anti-technology technologies and twisted narrow economic solutions such as those discussed in the NYTimes article, and attempts to restrict the disruptive communications technologies. You may remember how FM radio was delayed for years; part of the security requirements of a major company includes anti-P2P software, as well as locking down the wireless communications extremely tightly — but there are still gaps for the bad guys, while the main victims are the legitimate users of these technologies. Can you think of other examples? Do you have constructive solutions?"
This discussion has been archived. No new comments can be posted.

  • Control (Score:5, Informative)

    by Xiph ( 723935 ) on Sunday June 15, 2008 @08:53AM (#23799551)
    It's a matter of balancing control against efficiency.

    Understanding the workings of an entire swarm is not easy.

    With a swarm it is harder to differentiate for "elite" customers who pay to get that extra bandwidth.
    Where you are in the swarm will matter just as much as which connection you're paying for.
    • by Anonymous Coward on Sunday June 15, 2008 @09:10AM (#23799631)
      Bittorrent is a major part of the problem because it attempts to utilise 100% of the available bandwidth at the client end. If every user used bittorrent, then the ISPs would have to supply 1:1 bandwidth (instead of overselling as they do at the moment), thus dramatically forcing the price up for every user.
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        ... thus dramaticly forcing the price up for every user.
        That's what they say...

        Bittorrent is a major part of the problem because it attempts to utilise 100% of the available bandwidth
        Bittorrent is a protocol; what you describe is the default configuration of popular Bittorrent clients, which can be configured otherwise.
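
        A client-side cap is also easy to picture. Here is a minimal token-bucket sketch in Python (all names and numbers hypothetical, not any real client's code) of what those upload-limit settings boil down to:

            import time

            class TokenBucket:
                """Cap average upload rate at `rate` bytes/sec, with bursts up to
                `burst` bytes. Assumes each send is no larger than `burst`."""
                def __init__(self, rate, burst):
                    self.rate = rate                  # refill rate, bytes per second
                    self.burst = burst                # bucket capacity, bytes
                    self.tokens = burst
                    self.last = time.monotonic()

                def consume(self, nbytes):
                    """Block until `nbytes` may be sent, then spend the tokens."""
                    while True:
                        now = time.monotonic()
                        self.tokens = min(self.burst,
                                          self.tokens + (now - self.last) * self.rate)
                        self.last = now
                        if self.tokens >= nbytes:
                            self.tokens -= nbytes
                            return
                        time.sleep((nbytes - self.tokens) / self.rate)

            # Throttle uploads to 30 KB/s no matter how much bandwidth is available:
            bucket = TokenBucket(rate=30 * 1024, burst=64 * 1024)
            # for piece in outgoing_pieces:
            #     bucket.consume(len(piece))
            #     sock.send(piece)
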
        • Re: (Score:2, Insightful)

          by bane2571 ( 1024309 )
          "Default Configuration" is for all intents and purposes the same as protocol standard for a significant portion of the population.
          • While to some extent we hopefully recognize some value in human creativity, such that copyrights and other IP deserve to be respected for an appropriate LIMITED time, excess greed is trying to create "Artificial Scarcity". ALWAYS a bad thing.
      • by Dan541 ( 1032000 ) on Sunday June 15, 2008 @09:52AM (#23799891) Homepage
        Consumers using what they paid for!!!!!

        Oh no we can't have that.

        ~Dan
        • by homer_s ( 799572 )
          Customers having to pay for what they use!!! OMG!!!
          • Re: (Score:2, Insightful)

            by rocketPack ( 1255456 )

            ISPs promising what they can actually deliver!!! ZOMG!!11one!11oneoneone!!!!11!111one

            In the corporate world, this shit doesn't fly. You get less for more money, but it's guaranteed. What if ISPs just sold us connections that they could actually deliver, instead of jacking up the numbers to look good?

            This issue can be argued from many angles, and I think it's pointless to throw mud back and forth -- the article asked for CONSTRUCTIVE suggestions, and I see neither of you have provided one. Let's stop reha

            • Well, for starters, this would mean that ISPs would likely have to dramatically raise prices, maybe more than 10x the current price. The whole reason we have multi-megabit internet for $30 is because they oversell. The other option would be to reduce bandwidth by as much as 10x. (10:1 is the usual overcommit ratio). Then, there would be a lot of capacity going unused, because most people aren't utilizing even 10% of their bandwidth, especially over the long haul.

              This would lead to decreased energy effi
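
              A quick back-of-the-envelope in Python, using the 10:1 overcommit figure above (all numbers assumed for illustration):

                  # Why 1:1 provisioning scales the price roughly linearly.
                  subscribers = 1000
                  advertised_mbps = 15    # per subscriber
                  overcommit = 10         # the usual 10:1 provisioning ratio
                  price = 30              # $/month under today's overselling

                  oversold_backhaul = subscribers * advertised_mbps / overcommit   # 1,500 Mbps
                  dedicated_backhaul = subscribers * advertised_mbps               # 15,000 Mbps

                  # Holding cost per provisioned Mbps constant, a true 1:1 guarantee
                  # multiplies the backhaul bill, and hence the price, by the ratio:
                  print(f"oversold:  {oversold_backhaul:.0f} Mbps backhaul, ~${price}/mo")
                  print(f"dedicated: {dedicated_backhaul:.0f} Mbps backhaul, ~${price * overcommit}/mo")
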
              • Re: (Score:3, Interesting)

                by L4t3r4lu5 ( 1216702 )
                There is only one solution, and it's extremely simple...

                Everyone is first sold a standard cable connection, unless they state specifically that they require otherwise, and can demonstrate or explain why. Other than that, after the first 3 months of a new contract with an ISP, your usage statistics are provided for you to view, and the cable company shows you the list of connections available to you, with the one you would most likely benefit from using being highlighted. Users of 10% would be recommended a
          • by shanen ( 462549 )
            I don't understand the excessive focus on BitTorrent, but this is more of a barter situation. We can both be downloading the same file from various other sources and from each other, and both of us (and the rest of the network) are all benefiting at the same time. The long-distance data traffic is automatically reduced, because the local connections are faster precisely because they are local.

            One case in point is the Ubuntu distributions, where their servers basically go nuts when a major release goes out.
        • But that's the problem! No-one would ever use what they paid for, since they weren't actually sold the bandwidth that they were told they were sold on paper.
      • by Anpheus ( 908711 ) on Sunday June 15, 2008 @10:13AM (#23800031)
        How? There isn't enough content to run BitTorrent maxing out my connection 24/7. I'd have to buy a new hard drive every day to do that. Can you propose to me any way of actually utilizing my connection 24/7 with BitTorrent, maintaining a seed ratio and not clogging my hard disks (because I'd need to buy a NAS in short order.)

        I think the result would be significantly lower than 100%. For one thing, 100% of people will never use any one technology. For another, even those who do can't possibly saturate their connection 100% of the time unless they're on dialup. I have fifteen megabit cable with a realized throughput of around 13000 kbps to the continental US, and can easily get 1.6-1.7 megabytes per second on downloads. Even at just 1 MB/sec, I have to buy another 80GB hard disk a day to fill this line. Heck, I'd run out of content I'd even want to download.
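
        The disk arithmetic is easy to check in Python (rates taken from the post above, everything else assumed):

            # A steady 1 MB/s download, around the clock:
            seconds_per_day = 24 * 60 * 60                        # 86,400
            rate_mb_per_s = 1.0                                   # the low-end estimate above

            gb_per_day = rate_mb_per_s * seconds_per_day / 1024   # ~84 GB/day
            print(f"{gb_per_day:.0f} GB per day")                 # roughly one 80GB disk daily
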
        • by Wildclaw ( 15718 )
          It is like what you see in the other slashdot ISP/internet discussions.

          One person suddenly says, "If you actually use your X mbit/s download, how much do you think it would cost the ISP?". That is of course ridiculous, because who actually uses their download 24/7?

          There are three things that limit most heavy users (compulsive hoarders excepted):

          * Consumption Time - You only have X amount free time.
          * Quality Material - There is only so much good material.
          * Upload - Heavy users often run servers or upload to p
          • One person suddenly says, "If you actually use your X mbit/s download, how much do you think it would cost the ISP?". That is of course ridicioulus, because who actually use their download 24/7?

            Using a connection 24x7 is easy, and filling both the upload & download streams is plenty easy.

            Offer seeds of the popular linux distros and the latest helix iso.
            Put up a Tor router
            Or maybe a game server or irc relay

            Just three examples off the top of my head that could fill a pipe 24x7. :)

            Gimme the bandwidth, I'll use
            • by Wildclaw ( 15718 ) on Sunday June 15, 2008 @06:05PM (#23803815)
              You just gave 3 examples of how to fill your upload. None of those will fill your download. The one coming closest would be the Tor router, which would use about as much download as upload (leaving maybe 7/8 of the download unused on an ordinary residential connection). If you are on fiber with equal download/upload, you are the exception, and my original post was obviously not directed at such connections.

              My statement stands. You have to try extremely hard to get even close to filling your download on an ordinary residential connection. Even if you go for something like downloading Blu-ray ISOs, you still have to find a place where you can get them without trading your upload bandwidth for it. There is probably some theoretical case where it is true, but not in practice.
              • Aye, those examples would fill the upload side before saturating the download side. Filling the download side is easy, the only limitation is disk space. Drive space is plenty cheap; newegg has a 500GB external drive for $105 and an internal 750GB for $120. Either would be plenty of space to rotate media thru. The average US broadband connection would take a couple of months to fill those with near 24x7 downloading.

                Queue up your downloads and away you go; use nntp if you don't want to bother with "tradi
        • I have to buy another 80GB hard disk a day to fill this line

          You know hard drives are rewritable? After watching that HD version of %latest_crappy_movie% you don't actually have to keep it on your hard drive for the rest of your life.
          • Re: (Score:3, Interesting)

            by shanen ( 462549 )
            Actually, the way I imagine it is that there would be a pool of movies available locally, perhaps within a large urban wireless network. When you wanted a new movie, you would (perhaps automatically) delete some older movie (that you hadn't watched in a long time) from your disk, and download it from other people who already had copies of that movie. However, before you deleted your old movies, they would have already been copied somewhere else. (It would also be beneficial to have a regional metric of movi
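
            That eviction policy is essentially an LRU cache keyed on when a movie was last watched; a minimal sketch in Python (all names hypothetical):

                from collections import OrderedDict

                class MoviePool:
                    """Evict the least-recently-watched movie when local disk runs out."""
                    def __init__(self, capacity):
                        self.capacity = capacity     # max movies held locally
                        self.pool = OrderedDict()    # title -> file path, stalest first

                    def watch(self, title):
                        if title in self.pool:
                            self.pool.move_to_end(title)   # watching refreshes recency

                    def add(self, title, path):
                        if len(self.pool) >= self.capacity:
                            # Drop the stalest title; in the scheme above, peers would
                            # confirm another copy exists elsewhere before this delete.
                            self.pool.popitem(last=False)
                        self.pool[title] = path
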
      • The simpler solution would be to oversell some more!
      • by Hojima ( 1228978 )
        Or reducing the ungodly amount of profits they receive. The cost of bandwidth is already a hell of a lot more than it should be, not to mention you don't even get what you pay for. And hosting servers can be a pain in the ass, not to mention more prone to security attacks. If a small segment of a p2p network fails, the whole system remains intact. It's robust and need based, which is exactly what we need. And to top things off, p2p prevents bot activity by making it more difficult to forge IP addresses.
    • Re: (Score:2, Insightful)

      by PPH ( 736903 )

      With a swarm it is harder to differentiate for "elite" customers who pay to get that extra bandwidth.

      But does this justify delaying its introduction? Must we wait for any new technology until someone figures out how to squeeze every last dollar out of the rich folks?

      This is the whole point of technological advancement: To provide goods and services of higher quality for lower price than what was available in the past. If someone happens to be making a living providing the low quality, expensive crap to a small market niche and some innovation undercuts their business model, that's just tough.

      There appe

  • by Anonymous Coward
    USENET is still the best solution.
    • by Omestes ( 471991 )
      Erm... Read the story just above this one. Usenet fails to think of the children, therefore no Usenet for you.

      Also, net neutrality is about the "average" user, and I really doubt that the "average" user can differentiate between Google Groups and plain ol' Usenet. I'm guessing that maybe 1% of modern traffic is Usenet.

      Oh and, to fulfill the cliche requirement for all comments, "the first rule of Usenet is?"
  • Seems like monitoring would either cause an additional choke point or add more traffic. Neither option seems like it helps...
    • OK, your geek credentials are revoked too. Go read up on routers and gateways and promiscuous packet sniffing.
  • by SilentChris ( 452960 ) on Sunday June 15, 2008 @09:02AM (#23799583) Homepage
    In reference to the bandwidth limiting efforts in particular, just because there may be a way to offset technical problems with good technology (e.g. Bittorrent for video/audio) doesn't mean it makes business sense. For an ISP, it may be more economical to simply limit the bandwidth of users (which is easy) than figure out what is really a fairly difficult problem. If:

    What we're making now - Cost to implement bandwidth controls - Loss of customers that get ticked off

    is greater than

    What we're making now - Cost to implement good technology that handles bandwidth more efficiently

    most companies are going to choose the former. It makes more business sense.
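
    The comparison is easy to make concrete; a minimal sketch in Python, with entirely made-up dollar figures:

        revenue = 10_000_000        # what we're making now, $/yr (assumed)

        cost_caps = 200_000         # bandwidth caps: cheap to deploy
        churn_loss = 500_000        # customers who leave in disgust
        cost_good_tech = 3_000_000  # engineering a smarter network

        option_caps = revenue - cost_caps - churn_loss   # $9.3M
        option_tech = revenue - cost_good_tech           # $7.0M

        print("limit the users" if option_caps > option_tech else "build better tech")
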

    I'm reminded of a passage in "Becoming a Technical Leader" (great book btw - a commenter on Slashdot mentioned it). Anyway, it's about making the transition from techie to management, and analyzing the differences in thought processes. The author tells a story where a company was designing a system, and the requirements were "Make sure it can recover from one error per day" (or something similar). Anyway, the technical people involved with the project thought it would be better if they could get it to "Make sure it can recover from any error, ever, immediately", as they thought it was a more interesting technical problem. Turns out it cost the company something like $4 million, and in the end they had something that a) the customer didn't really need and b) they basically couldn't sell to anyone else. The moral of the story is that just because there are interesting technical problems doesn't mean that solving them makes good business sense.
    • I agree with you that that equation seems to be the way many companies are deciding their technological investments. If you think about it, however, you might notice the problem with that equation:

      It does not take into account the effect improved technology will have on future markets. Successful businesses focus just as much on the future as the present. Sure the present is important, you botch that and whatever your future plans are, they're worthless. On the other hand it is idiotic to ignore the future. Succe
  • Is this what AT&T stands for: Anti-Technology Technologies... Interesting.

    Only reading the first few lines of TFA (which I suppose is more than some people would). But it seems that this Internet metering stuff is the same as what has always been in NZ -- 5GB monthly bandwidth, +$10 for an extra 5GB, etc... Until about 1~2 years ago we had 1GB limits and a shittier overage 'consequence': go over and pay 1c/MB, or get speed-capped back to ~56kbps (we were already on 256kbps~2Mbps DSL). Then we increase
  • by Anonymous Coward on Sunday June 15, 2008 @09:08AM (#23799609)

    use wireless networks to reduce the traffic on the wired infrastructure
    Wireless networks are useful when there is no wired infrastructure, but if you have a wired network, it is orders of magnitude faster than the wireless option, especially where congestion is a problem. Using wireless to offload traffic from the wired network is like walking to avoid traffic jams.

    BitTorrent is excellent for rapidly increasing the availability of popular files while automatically balancing the network traffic
    BitTorrent (and P2P in general) is a kludge. Multicasting is a solution. BitTorrent is an inefficient protocol (from a whole network load point of view.) It bounces the same data around the net in unicasts. The swarm control overhead is bigger than it has to be because with slow upstreams you need more peers for acceptable download speeds.

    It is a case of technology being held back by non-technical reasons, but please look beyond popular technologies when you make an assessment about desirable technologies.
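
    To make the load comparison concrete, here is an idealized model in Python (all numbers assumed):

        # One 4 Mbps stream to 10,000 simultaneous viewers.
        viewers = 10_000
        stream_mbps = 4

        unicast_load = viewers * stream_mbps   # 40,000 Mbps: every viewer gets a private copy
        multicast_load = stream_mbps           # 4 Mbps per link: routers duplicate at branches

        # BitTorrent lands in between: no single source bears the load, but the
        # same bytes still cross the backbone many times as unicast transfers.
        print(unicast_load, multicast_load)
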
    • Re: (Score:3, Interesting)

      by Wildclaw ( 15718 )

      BitTorrent (and P2P in general) is a kludge. Multicasting is a solution. BitTorrent is an inefficient protocol (from a whole network load point of view.) It bounces the same data around the net in unicasts.

      Only when it comes to incredibly popular files. However, most torrents have maybe a few hundred peers, or up to a few thousand, spread over a huge part of the earth. Multicasting does little good in such a situation.

      Multicasting is basically about taking you back to the old paradigm where everyone watches the same thing at the same time. (Ok, you can save things so people can watch them later, but it is still the old paradigm.)

      Bittorrent could probably benefit some from pairing peers that are locally close to eac

    • Using wireless to offload traffic from the wired network is like walking to avoid traffic jams.
      Thanks to the price of gas, people have been driving less and walking more. People will switch technologies whenever the existing ones hit a threshold of impracticality.
  • Simply stated, it's not in management's best interest to think beyond the next fiscal year or two. Massive rollouts of new technology or widening network backbones are simply not cost-effective in the short run.

    The other side of things is that bandwidth usage isn't a constant-- much like TV, there's a definite 'prime time' when the networks are under heavy load, and laying new cable or provisioning new wireless devices just to cover those periods is not cost-effective.

    There's also the real cost of bandwidt

    • Actually, that is not quite right.

      Widening backbones and massive rollouts are not cost effective in the short run or the long run. Just ask Verizon about FiOS, assuming they will actually tell the truth.

      Smaller rollouts and increasing the backbone in small increments are cost effective, and that is what is happening. The problem is the usage is increasing faster than the network can be grown in a cost effective manner.
    • by Entropius ( 188861 ) on Sunday June 15, 2008 @10:05AM (#23799973)
      I read an article by a high-ranking Toyota exec in the New Yorker about how, in contrast to American companies, Japanese companies *do* think ten or twenty years in advance. He made the point that they didn't introduce hybrid cars in order to sell to hippies in ca. 2000; they introduced them because, come 2010 or 2015, gas is going to be expensive enough that lots of people are going to want them... and they wanted a mature product -- both from an engineering and from a brand standpoint -- ready to go.

      Now, in 2008, Priuses (and Corollas and Yarises) are common on the road in my city, while many of the short-sighted US manufacturers are trying to retool from building 18 mpg SUV's.

      The interview mentioned a Japanese business term that has no translation in English; I forget the word, but it meant something like "the faith that building products that people need and selling them for a fair price, long-term, will be profitable, long-term." That might be less true now than it once was, but it's interesting to note that Japanese companies do tend more toward the "Build useful stuff; sell it for cost + profit" model, and American ones toward "Make whatever we can market and sell it for whatever we can convince people to pay".

      The main exception to this that comes to mind immediately is Sony, who can go die in a fire. They've got their hands in lots of markets and are thus successful in that regard, but they don't seem to be market leader in any of them. I follow the camera market fairly closely, and Sony's main market in the US seems to be

      1) people buying point-and-shoot cameras that didn't do their research, and wind up paying >$100 more than the equivalent Canon or Panasonic that performs better;

      2) digital SLRs, which aren't really Sony's; they're rebranded Konica-Minolta stuff, which Sony bought out.

      As an example of Sony's failing, their top-end bridge camera still doesn't offer any sort of processing controls: you're stuck with a JPG with one compression setting, one saturation setting, one contrast setting, one (excessive) noise reduction setting, etc. There's no RAW mode. The lens is *very* prone to chromatic aberration.

      Canon and Panasonic's competitors are cheaper, use superior optics, and offer control over the processing; Panasonic's versions have RAW, and Canon's

      But, as a marketing matter, you can't sell stuff like this to Joe Sixpack by saying "Look! Good optics! Controllable processing! RAW mode!", so Sony didn't even bother trying to do this stuff.

      • Re: Japanese Proverb (Score:4, Informative)

        by TaoPhoenix ( 980487 ) * <TaoPhoenix@yahoo.com> on Sunday June 15, 2008 @11:42AM (#23800601) Journal

        "The interview mentioned a Japanese business term that has no translation in English; I forget the word, but it meant something like "the faith that building products that people need and selling them for a fair price, long-term, will be profitable, long-term."

        The translation is "Fast Bucks vs. Slow Dimes". America likes This Quarter's Sales. Japan likes Next Decade's sales.
        • by shanen ( 462549 )
          No, that is not the translation.

          Actually, I do speak a fair bit of Japanese, but his description did not bring any kotowaza (folk sayings) or yojijukugo (four-character slogans) to mind. He did remind me of kaizen, but his explanation went rather beyond that idea, which is often translated as 'continuous improvement'.
  • Short version (Score:3, Insightful)

    by Kohath ( 38547 ) on Sunday June 15, 2008 @09:14AM (#23799647)
    Short version:

    "I want everyone in the world to behave in a precise (but poorly defined) way to suit my personal sensibilities. Why don't they? Any ideas on how to make it happen?"

    Have you tried saying "please"? Other than that, I have no ideas. Maybe try to help people and solve problems instead of worrying about whether things are done exactly your way.
  • by terryducks ( 703932 ) on Sunday June 15, 2008 @09:15AM (#23799653)
    Follow the money. The ones with (power|control|money) want to stay on top and it's only the ones with better agility that corner the market and then become the top dog. So you're looking for a technical solution for the wrong problem.

    What's the problem?
    IMHO, it's the "last mile": a legislated limited monopoly controlling access, with an interest in keeping that position, so there's a high barrier to access put in place.

    Another problem is that what may work in a high density area will probably not work in a low density area. A wireless mesh may work in cities and towns but completely fails in rural areas. Another issue: making data retrieval a crime. "You're" responsible for someone else's actions, and that kills any open public access. Someone has to pay to connect to the backbone.

    If I had a solution that would work in all cases - I'd be rich :p

    Here's a lynchpin that needs to be removed: the last mile monopoly and its bundling with "providers". Here in the Northeast (US) the power line is a separate charge on your power bill from the generation. Break that up. Internet access "line" charge $0.02 per month. ISP charge $x. Anyone should be able to send data over the lines without the big guys restricting access - for the same cost. NO AT&T ISP should be able to send data cheaper than another ISP.

    It may be time for $TOWNs to own the lines, bid repair out to another party and anyone to sign up to an ISP.

    BUT it won't work. See any telecom endeavor.

    The Duck
    • Re: (Score:3, Informative)

      by Drogo007 ( 923906 )
      "It may be time for $TOWNs to own the lines, bid repair out to another party and anyone to sign up to an ISP. "

      UTOPIA (utopianet.org) is an attempt to do exactly that - and you wouldn't believe the dirty tactics Comcast and Qwest have been using to fight it (ok, so you probably WOULD believe the tactics they've been using, but still...)
  • by canipeal ( 1063334 ) on Sunday June 15, 2008 @09:21AM (#23799691)
    couldn't figure out why the darn thing kept blowing itself up....
  • I'd let them measure, but I was wondering a couple of things:

    Users that don't know much about the internet think they just check email. Now, what kind of emails? I've seen people (still) sending 40MB files attached to emails.

    Viruses, popups and advertisements, downloaded Flash animations -- things people would believe they can't be charged for. How will the companies deal with the "advertisement" issue, given that most of the advertisement these days is heavy and Flash based? Moreover, how do they
    • Re:What about... (Score:5, Interesting)

      by Idbar ( 1034346 ) on Sunday June 15, 2008 @09:27AM (#23799723)
      BTW, is Microsoft paying for the constant annoying updates of its OS, as well as Apple for the annoying connections of Quicktime (and iTunes) and Acrobat for their automated downloads too?

      • by nurb432 ( 527695 )
        Nope, when we go to metered service you get to foot the bill for that 200MB service pack.

        Us FreeBSD people that like to use the ports tree to be current will be screwed too.
    • Re: (Score:3, Interesting)

      The problem you've identified is similar to the spam problem: I could not only annoy people, but cost them money by sending them large unsolicited emails.

      A solution for this would be to charge for traffic, but charge the broadcasters, not the consumers. Home users would pay nothing for bytes received, but would be charged for every byte they send -- which is negligible for most home users but would cost prolific file-sharers and people running web sites on their home machines. (As a side effect, this woul
      • by nurb432 ( 527695 )

        The problem you've identified is similar to the spam problem: I could not only annoy people, but cost them money by sending them large unsolicited emails.
        Or the random DoS.
      • by shanen ( 462549 )
        We need to tackle the spam problem as the economic issue it is. We should work to make sure that the spammers can't make any money from their spamming activities. We're evolving in that direction now, but it would have been a lot better if SMTP had included some real economic modeling there...

        Rather a poor joke, but I blame Al Gore, even though I admire and respect him. He kept telling them not to worry about money--he'd worry about that part for them. Okay, so it led to faster progress that way, but they s
  • by DaveV1.0 ( 203135 ) on Sunday June 15, 2008 @09:23AM (#23799713) Journal
    I hereby revoke shanen's geek credentials for failing to understand that single source versus multiple sources doesn't matter if the problem is the total volume.

    The problem is not that one server or site is overloaded. The problem is that the provider's network, including things like routers and gateways, has finite bandwidth, and these applications, regardless of source, are using up most of it.

    Ever hear the phrase "You can't put 10lbs of shit in a 5lb bag"? Ever wonder why they put in new water mains and increase the size of water mains when they build more housing developments? Or why they widen roads with more housing? It is because the total volume has increased.
    • Re: (Score:3, Funny)

      by SoapBox17 ( 1020345 )
      We all know the internet is a series of tubes, like water pipes. But you aren't thinking outside the box. Instead of building larger water mains, these cities should just use catapults to throw "packets" of water through the air to parts of the city, thus reducing the load on the old "pipe" infrastructure.
    • by iamwahoo2 ( 594922 ) on Sunday June 15, 2008 @11:04AM (#23800341)
      The applications are not using up most of it, just their share. If twenty people are sending bits over a line, then the bandwidth can be divided up evenly between the twenty. If two of these people are torrent users downloading 4GB files and they remain online until peak hours, when the number of users jumps to 200 people, then they should only get 0.5% of the total bandwidth. If we kick them offline, then there are still 198 "normal" users on the line and it is still congested at peak hours.

      The problem for most users is the amount of available bandwidth at peak hours. If some guy is sucking up tons of bandwidth at non-peak hours, then he is not hurting anybody. It is not like we can take the unused bandwidth from non-peak hours and use it during peak hours.
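
      The arithmetic above, spelled out in Python (link capacity assumed):

          link_mbps = 100.0                   # the shared pipe

          off_peak_users = 20
          peak_users = 200

          print(link_mbps / off_peak_users)   # 5.0 Mbps each off-peak
          print(link_mbps / peak_users)       # 0.5 Mbps each at peak: 0.5% of the pipe
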

      The telecoms have not been able to follow through on their bandwidth promises during peak hours and they have managed to push the blame onto someone else. Now that people have bought into that excuse, they are going to try to make a few extra bucks off of it.

      Quite honestly, I have no problem with people who use more of a service getting charged more, if that is your business model. The phone companies have been charging for long-distance by the minute for years. But if we are going to start charging on a per bit basis, then shouldn't I, as a person who sends fewer bits, get a lower price? Or at least get to carry my bits over to another month? See, they want to treat each customer different based on what benefits them the most, and if it were not for their monopoly positions, they would not be able to get away with this.

  • I have remarked in the past that I am not sure that any government can really allow the free flow of information.
    And there is little we can do about the nature of government, but the second player is big business. We can and should limit the power of corporations and punish them when they work against public interests by doing such things as limiting the flow of the internet. People have rights. Corporations should not enjoy the same rights as people do. For example, the directors of
  • Simple reason (Score:3, Insightful)

    by Oktober Sunset ( 838224 ) <sdpage103@ y a h o o . c o.uk> on Sunday June 15, 2008 @09:31AM (#23799751)
    Why build more infrastructure to serve customers if you can find new ways to make them pay for the infrastructure you have now?
    • by mgblst ( 80109 )
      Yes, they should just spend the $10-$100 billion on new infrastructure, just to keep some fuckers happy. I can't understand why they wouldn't just do that. I mean, they wouldn't get any more money from us, but still, it is only $10-$100 billion, who cares?

  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Sunday June 15, 2008 @10:04AM (#23799957) Homepage
    If you had been paying attention at all, you'd understand the purpose of these "anti-tech techs" as you call them is explicitly to limit progress so the rich old fucktards can continue milking their obsolete business models until they retire or drop dead.

    To many people, progress is a scary, dangerous thing. Money, on the other hand, is a sultry lover that drives their every passion. Us folks on slashdot may prefer cheap plentiful bandwidth over money, but we're a tiny little minority in the grand scheme of things. The average Joe doesn't understand technological evolution, and most certainly does not see where it is all headed... it is far easier for Joe to stay ignorant and pay up.
    • by shanen ( 462549 )
      I have been paying attention, I am not new here, but I was trying to be polite. Also, I believe that in the long term you can't win against reality. For example, FM did become popular and displaced AM in many applications.

      I actually think the ultimate solutions call for local accounting and a lot of resource sharing. Essentially your computer may request shared resources, but the neighboring computers should 'compare notes' before deciding how much help to give. If you've been playing fair, for example, by hel
  • bullshit (Score:2, Informative)

    by speedtux ( 1307149 )
    I thought the exchange of information over the Internet was supposed to be a good thing?

    It is. And that's why it's a good thing if my neighbor is discouraged from eating up 99% of the bandwidth with hundreds of simultaneous connections while I'm trying to work over ssh, or if he is at least made to pay for the necessary upgrades to our shared wire.

    Why don't we use wireless networks to reduce the traffic on the wired infrastructure?

    Let us know if you come up with something that works. Having suffered throu
    • P2P is a technological success in reducing the load on any one point on the network. If you make the assumption that the cost of bandwidth grows nonlinearly, then it's highly useful for e.g. Ubuntu releases.
      • P2P is a technological success in reducing the load on any one point on the network.

        Reducing relative to what? Heck, USENET is a more efficient distribution mechanism than P2P.

        If you make the assumption that the cost of bandwidth grows nonlinearly,

        Using 'em big words again without knowing what they mean, eh?
        • P2P is a technological success in reducing the load on any one point on the network.

          Reducing relative to what? Heck, USENET is a more efficient distribution mechanism than P2P.

          I think he meant that P2P reduces the load on the original distributor of the information, which is definitely true as compared to hosting something on a website. It's good for the distributor, bad for everyone else (though faster with a decent swarm). Your counter about Usenet is true as well, Usenet is actually pretty efficien

    • by tylernt ( 581794 )

      it's a good thing if my neighbor is discouraged from eating up 99% of the bandwidth with hundreds of simultaneous connections while I'm trying to work over ssh

      Let him. Just make sure the router and switches prioritize traffic per-user, based on the number of packets they've sent/received in the last hour or 24 hours. You'll neither notice nor care if that other 99% is totally utilized or not used at all, because your trickle of SSH packets will always zip to the front of the queue ahead of your neighbor's
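
      A minimal sketch of that scheduling idea in Python, assuming the router tracks per-user byte counts over a recent window (details invented for illustration):

          import heapq

          class FairScheduler:
              """Dequeue packets from the lightest recent user first."""
              def __init__(self):
                  self.usage = {}   # user -> bytes sent in the current window
                  self.queue = []   # heap of (usage at enqueue, seq, user, packet)
                  self.seq = 0

              def enqueue(self, user, packet):
                  self.seq += 1     # tie-breaker keeps heap comparisons well-defined
                  heapq.heappush(self.queue,
                                 (self.usage.get(user, 0), self.seq, user, packet))

              def dequeue(self):
                  _, _, user, packet = heapq.heappop(self.queue)
                  self.usage[user] = self.usage.get(user, 0) + len(packet)
                  return user, packet

          # A light ssh session's packets sort ahead of a heavy neighbor's torrent
          # traffic even when the torrent keeps the link 100% busy.
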

      • Let him. Just make sure the router and switches prioritize traffic per-user, based on the number of packets they've sent/received in the last hour or 24 hours.

        That's not sufficient because the same bottlenecks occur all over the network, so this kind of logic needs to be deployed in all routers. In addition, some people are willing to pay for sustained 50 Mbps, so you need traffic classes. And to make it all work, you need more than the current TCP/IP protocols. So, although you may not be aware of it, you'
  • by GraZZ ( 9716 ) <jack&jackmaninov,ca> on Sunday June 15, 2008 @10:22AM (#23800095) Homepage Journal
    The internet providers were given massive tax breaks to improve their networks (fiber to the home and whatnot). Now they not only haven't done that with the money, but the inferior networks they've built instead are reaching capacity.

    Somebody should make your ISPs sleep in the bed they made.

    I also notice that TFA appears to reference only cable companies. Cable internet shares bandwidth to the endpoint, a pretty boneheaded move if a significant number of endpoints are going to be using it. Maybe this is simply the end of that technology's ability to improve. DSL and FTTH vendors could then capitalize and crush those companies, improving internet access for all. What is stopping this from happening (besides laziness)?
  • Technology is the systematic recording of how to do things.

    If I use technology to build a missile, and then use technology to build a laser to shoot it down, is one technology and one anti-technology? No, both are simply the application of the techniques learned and taught.

    Furthermore it would be difficult to know which is the technology and which is the anti technology. If I can't go about my work because kids are downloading pron 24/7 and clogging the pipe, even though the attempt is made to balanc

  • "I thought the exchange of information over the Internet was supposed to be a good thing? "

    Only if that information has been properly sanitized by the government and you pay a licensing fee to consume it.

    Otherwise, it's evil.
  • You may remember how FM radio was delayed for years

    FM radio was delayed for years because the enormous amounts of money being generated by RCA's investment in AM broadcasting was funding the development of the infinitely more disruptive technology of television.

    Then there were the minor setbacks of World War Two and Korea.

    FM doesn't come into its own until the Hi-Fi craze of the mid to late 50's. The LP. Magnetic Tape. Heathkit for the budget-minded hobbyist. H.H. Scott, Marantz and McIntosh for the audi

    • by shanen ( 462549 )
      Your version of the history of FM does not agree with my sources--but I acknowledge that they were little interested in TV per se. My research was more focused on the use of patents to prevent the use of FM technologies because they were satisfied with the existing AM business model.
      • by shanen ( 462549 )
        I try to avoid such ambiguities...

        The "they" who were little interested in TV were the people who were writing about the history of FM radio.
      • the AM business model.

        Business models tend to evolve for perfectly intelligible reasons.

        The AM station of the Thirties and Forties had a strong local and regional identity and a national network affiliation. That is a fairly good reflection of the much less homogenized American society and culture of the time.

        It shouldn't be surprising to discover that the 40 year run of the Midwest's National Barn Dance originated out of Chicago and WLS - then owned by Sears, Roebuck. "The World's Largest Store."

  • by aleph ( 14733 )
    The poster makes it sound like the sky is falling. OMG, if you download terabytes of data/month on your residential account they might *gasp* charge you!

    Australia has had metered plans pretty much since inception. Most are of the "XXgb then shape to 128kbps" variety. There have been companies offering unlimited, but they either go under, or oversell at a horrendous ratio.

    If the cost of bandwidth, as a proportion of operating cost, goes up for the ISP, then something has to break. Either they introduce some
    • ``The poster makes it sound like the sky is falling. OMG, if you download terabytes of data/month on your residential account they might *gasp* charge you!''

      That's fine, but then they have to be upfront about it. If they tell you you get so many bits per second, you should be able to expect to get that many bits per second. If they throttle your connection or send you extra charges if you generate more than some set limit of traffic, without having told you they would do so, you are not getting what you sig
  • Many of the posters here, including the one who authored the original article, seem to be forgetting a very simple but important point: bandwidth costs money. A lot of money, in fact, if you're an ISP outside a major city. Many ISPs pay $100 to $300 per megabit per second per month for their bandwidth. Can they afford to give bandwidth hogs unmetered, unrestricted access to it? Of course not! Add to this the fact that TCP/IP is the most inefficient way yet devised to distribute media (a simple analog TV tow
    • >>>Can they afford to give bandwidth hogs unmetered, unrestricted access to it?

      They don't give access. They sell it. Now they're complaining that someone is using the bandwidth that has been paid for. Most providers provision X bandwidth and sell Y x X. The problem is users are beginning to ask for the X they were sold.

      • No ISP could afford to do that. Currently, the practice has been to sell "all you can eat, subject to certain rules." These rules say that you can't, for example, run a server or engage in P2P. And these rules make sense. Think about an "all you can eat" buffet. There's virtually always a sign, posted prominently, that says you can't take food out to other people and can't stay all day. They also say that everyone who eats must pay. The rules for bandwidth are analogous. They vary between ISPs, but genera
  • What is this anti-technology you talk of? I thought the article was related, but no, and you don't even introduce the concept. Use wireless bandwidth? Haven't you noticed wireless is *more* expensive? Bittorrent? What could they share that we all download? A lot of download sites already use nearest-server detection, and many of us already use bittorrent. None of your examples make sense, and you fail to define your concept of anti-technology technology.

    FYI unlimited bandwidth does not mean unlimited bandwi
  • For years companies like Comcast, Time Warner, et al have been collecting monthly fees for UNLIMITED internet access.

    They could have beefed up their networks; they knew what was coming, yet they failed to plan for it.

    Oh some did. Why else would Cox keep upping their net speeds? I note they're the one company that's been noticeably quiet about any kind of metered service.

    And has Verizon tipped its hand yet? I know they're pushing hard to sell FiOS in many areas and metered woul
