Managing Bandwidth and Bandwidth Costs?

azav asks: "The company I work for has bandwidth requirements that occasionally spike, for example to satisfy the immediate demands of a several-megabyte download to, say, 30,000 users. We hope to grow this to several million in the future. With that in mind, this request is directed to anyone who manages a site that must deliver content on an irregular schedule. How do you manage your bandwidth costs? How do you manage the availability of bandwidth?"

"I'd like to illustrate the second concept. When you have your (for example) T1 and you're not really using it, you are still paying for all that bandwidth. It's like the car that sits in your garage, you're still paying insurance and car payments on it even though you're not using it. But then you put up a new game, serve new media or suddenly become the 'Site of the Day' and your bandwidth is flooded and maxed out. For that case, it's like you've bought a car that only goes 40 miles an hour but while the demand exists and only while that demand exists, you need a car that goes 150 miles an hour. You don't want to pay the money for a car that goes 150 because you only need it occasionally. Later, you know you'll need that car to go 220 but you're not there yet.

So if this makes sense with regard to bandwidth, what you'd want is burst bandwidth that scales with need. Do any of you face this problem? If you do and have solved it, I'd love to hear about your strategy. Once this is solved, we get back to the first question: how do you manage that cost, put a number on it, and either fit it into your business model or pass it on to your customers?"

  • Doh! (Score:3, Funny)

    by Anonymous Coward on Saturday June 21, 2003 @02:42PM (#6262699)
    We can do it the slackware way - tar it up and let someone else do it.

    Or we can do it the debian way - let someone else tar it up and then do it.

    But we like the Slashdot way most. Post it up and watch the traffic floooooooooooooooow....
  • Many Options (Score:3, Informative)

    by voxel ( 70407 ) on Saturday June 21, 2003 @02:42PM (#6262700)
    Too many to spell out here... Search Google; LOTS of people sell bandwidth at X Mbit/s with a BURST to Y Mbit/s.

    You figure out how much you estimate you will use in total per month, and you pay for that. You cruise at 2 Mbit/s mostly, then you explode to 145 Mbit/s, and at the end of the month you average out to 6 Mbit/s.

    The idea is you BUY your peak speed, but pay for a low average..

    EVERYONE does this now, call around... It was kind of silly for you to ask Slashdot how to do this. UGH.

    - Voxel
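
    For illustration, here's a rough Python sketch of that averaging (the traffic figures come from the comment above; the five-minute billing samples are an assumption, though they're a common industry convention):

        # Hypothetical month: cruise at 2 Mbit/s for 29 days, burst at
        # 145 Mbit/s for one day, sampled every 5 minutes.
        samples_per_day = 24 * 12
        month = [2.0] * (29 * samples_per_day) + [145.0] * samples_per_day

        average_mbps = sum(month) / len(month)
        print(f"billable average: {average_mbps:.1f} Mbit/s")  # ~6.8 Mbit/s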
    • I think the server would die before getting close to 145mbit..

      If it's a game, get it on FilePlanet, IGN and such. Places like IGN even have a built-in P2P client for people to use if downloads are heavy for a particular game.
    • Colocation! (Score:5, Informative)

      by danejasper ( 523409 ) <dane@sonic.net> on Saturday June 21, 2003 @04:31PM (#6263244) Homepage
      Colocation or contracted file hosting is probably your best bet. You'll pay by the gigabyte, or by peak utilization. Be careful as you get quotes: "95th percentile" means that they bill you monthly for the PEAK after tossing out the top 5% of samples. For a site which is pretty even all month long, this works great. However, if you're serving a single file, once in a while, and expect heavy traffic only then, you do NOT want to pay on 95th, as you'll pay for your peak utilization all month long.

      Be sure to tell your colo or file hosting provider what your projected usage is, and how many megabits you may want access to, to assure that they can handle it. You may also want to make a courtesy call a day or so prior to each launch to let them know what to expect.

      Remember when Eddie Van Halen got tongue cancer a couple years ago? THAT was a busy weekend for the band's website, which we host. Of course, they didn't have any warning, but boy-o, that was bigger than any Slashdot effect that I've ever seen. We also host O'Reilly (the computer book folks), so we certainly see plenty of slashdotting.

      We're at: http://www.sonic.net/sales/colo/ [sonic.net]

      Shop around - but keep in mind that buying from someone near your intended downloaders may help you with both latency and costs. The SF Bay Area has the best pricing for bandwidth, and the lowest-latency connections to the highest number of users - that said, if your target market is on the east coast, you should be in Herndon, VA or NY or Boston.

      -Dane Jasper (Sonic.net)
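
      To make the 95th-percentile caveat concrete, here's a minimal Python sketch (assuming the usual five-minute samples; the traffic figures are invented):

          # Hypothetical month: idle at 1 Mbit/s except for a two-day
          # release spike at 100 Mbit/s, sampled every 5 minutes.
          samples_per_day = 24 * 12
          month = [1.0] * (28 * samples_per_day) + [100.0] * (2 * samples_per_day)

          ranked = sorted(month)
          p95 = ranked[int(len(ranked) * 0.95)]    # top 5% of samples discarded
          print(f"95th percentile: {p95} Mbit/s")  # 100.0 -- billed at the spike!

      Five percent of a 30-day month is only about 36 hours, so any spike lasting longer than that sets the bill for the whole month - exactly the trap described above.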

      • (Article author here)

        Dane, what does the cost per megabyte work out to with your model? I think we are using the 1 cent per megabyte model now just in case this may need to be covered in the costs we have to pass on to the customer.

          • One cent per megabyte seems a bit high. That comes to $10/gigabyte (or about $7 for a CD). If your bandwidth usage gets high enough, you should be able to get down below $2/gigabyte and possibly even to $1/GB, if you're lucky. I've got a friend who runs a hosting company [75-hosting.com] charging under $2/GB for transfers [75-hosting.com] -- and that's retail pricing.

            The earlier posting about the 95th-percentile peak rate is pretty important to consider. That means you only get grace on the top day and a half per month. If you release somethi

  • by Anonymous Coward
    ..and let the kids distribute it for you.
  • Managing bandwidth (Score:4, Informative)

    by js62 ( 609777 ) on Saturday June 21, 2003 @02:42PM (#6262707)
    Where possible BitTorrent is your friend. If you can use it, let your customers help out with the bandwidth usage. They will probably get the file(s) quicker and you won't have servers getting torched.
    • by Neophytus ( 642863 ) * on Saturday June 21, 2003 @02:45PM (#6262730)
      What must be noted is that a custom tracker would almost be a necessity. Trackers overload REAL easy, so having one you can run over a load balanced system or that keeps CPU and memory usage to a bare minimum is a must.
    • by flowerp ( 512865 )
      Yeah, but you probably want to put out a branded version of BitTorrent with a nicer user interface.

      The current UI looks like it was puked out in 2 days ;)

      Also it might make sense to provide a BitTorrent version that installs through ActiveX. The download version should be offered for Linux, Mozilla and Opera users.

      Christian
      • by billstewart ( 78916 ) on Saturday June 21, 2003 @05:22PM (#6263467) Journal
        He'd be happy to make money customizing it for people...
      • Yeah, but you probably want to put out a branded version of BitTorrent with a nicer user interface.

        The current UI looks like it was puked out in 2 days ;)

        Also it might make sense to provide a BitTorrent version that installs through ActiveX. The download version should be offered for Linux, Mozilla and Opera users.

        The thing is, clients don't want to be leaving windows open to be helping the reseller, or whatever it is the poster's company does. It's simply not done - the client should be able to log on, get w

      • Also it might make sense to provide a BitTorrent version that installs through ActiveX.

        Didn't somebody do a Java BitTorrent client already? That would be better than an ActiveX version, especially if you're delivering something that's platform-neutral.

    • by KDan ( 90353 ) on Saturday June 21, 2003 @02:50PM (#6262763) Homepage
      Yes, this is one thing where it would probably be VERY worthwhile to write a custom BitTorrent client (one which looks like those installers that go and download the data from a server, but actually fetches it from everyone else). Just give people a choice in the maximum speed of the download, e.g. 20 KB/s if you don't want to share, 150 KB/s if you're willing to share the file 10 times, and max out bandwidth if they're willing to share it whenever the application is open. Seeing as it seems you're distributing updates to an application, incorporating a BitTorrent client in the app shouldn't be beyond possibility, and even if only 10% of people who say they'll share actually let it through their firewall, that will still make it a lot easier on your servers.

      Daniel
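
      The speed tiers suggested above come down to per-client rate limiting. Here's a minimal token-bucket sketch in Python (the tier value is just the one from the comment, and the chunk source is hypothetical):

          import time

          class TokenBucket:
              """Cap a transfer at `rate` bytes/sec; call consume() per chunk."""
              def __init__(self, rate):
                  self.rate = rate
                  self.tokens = rate
                  self.last = time.monotonic()

              def consume(self, nbytes):
                  # Refill tokens for the time elapsed, capped at one second's worth.
                  now = time.monotonic()
                  self.tokens = min(self.rate,
                                    self.tokens + (now - self.last) * self.rate)
                  self.last = now
                  self.tokens -= nbytes
                  if self.tokens < 0:               # over budget: sleep off the debt
                      time.sleep(-self.tokens / self.rate)
                      self.tokens = 0

          bucket = TokenBucket(20 * 1024)           # 20 KB/s non-sharer tier
          # for chunk in download_stream():         # hypothetical chunk source
          #     bucket.consume(len(chunk))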
      • by jhines ( 82154 )
        Is there any reason an FTP server couldn't support bit torrent as well?

        The combo of the two would be great for moving distributions and other mirroring of publicly available data.
      • by MortisUmbra ( 569191 ) on Saturday June 21, 2003 @03:18PM (#6262909)
        Why is it that BitTorrent is the fix-all solution for the /. crowd? I for one HATE BT because, being on DSL, I usually get no more than 20 KB/s even when sharing to my maximum potential. Why, you ask? Because due to the asymmetric nature of ADSL, once my upstream gets clogged, it cannot carry the necessary ACKs that need to be sent out to the other end in order to keep my download up.

        In other words, if I'm uploading at my maximum speed, I can't reply quickly enough to tell the nodes I am downloading from that the last packet got through, so they resend it instead of the next packet until I can squeak out an ACK or two to let them know to move on.

        BT is a nice concept, but in practice it needs work.
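
        The ACK starvation described above is easy to put numbers on. A back-of-the-envelope sketch in Python (assuming full-size 1500-byte segments, 40-byte ACKs, and one delayed ACK per two segments - typical values, not anything from the post):

            # Upstream needed just to ACK a saturated ADSL downstream.
            down_bps = 1_500_000                      # 1.5 Mbit/s downstream
            segment  = 1500 * 8                       # bits per TCP segment
            ack      = 40 * 8                         # bits per bare ACK
            acks_per_sec = (down_bps / segment) / 2   # delayed ACK: 1 per 2 segments

            print(f"{acks_per_sec * ack / 1000:.0f} kbit/s")  # ~20 kbit/s

        The ACKs themselves need only ~20 kbit/s; the stall comes from them queueing behind a saturated 128-256 kbit/s upstream, which is why prioritizing ACKs (see the wondershaper reply below) helps so much.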
        • by mskfisher ( 22425 ) *
          The newest experimental versions of the BitTorrent client allow you to throttle your upstream usage.

          I don't have the problem very often on my cable connection, but the one time I experienced a major BT-related slowdown, I was quite annoyed - so I can empathize with your hatred.

          Here's the link to the experimental client:
          http://ei.kefro.st/projects/btclient/ [kefro.st]
          • I've tried that version myself and, sadly, it does not properly limit the bandwidth being used; it either drops off several K below where I tell it, or several K above where I tell it. On average I get about a 3 KB/s window of control, which means I either risk dropping into a lower share bracket or max out my connection and kill my download.

            I love the idea of BT, but it's far from the greatest solution. I think the Kazaa method of assigning everyone a contribution rating is a MUCH better idea. That way,
          • eMule lets you throttle your upstream, and it has done so for as long as I've known it.
        • It does need to automatically try to determine the max upload speed, cut that in half or a quarter, try that, and see if the download speed becomes substantially better (e.g. you're sitting on an upstream-throttled line). That's not a huge amount of work required...
        • by Zarquon ( 1778 )
          Look at wondershaper. I have it set to prioritize my upload stream (I don't bother having it reduce queueing on my downstream; that hasn't been necessary) by putting ACKs, DNS, IRC, ssh, ping, and http (port 80 target; haven't addressed https) in the high-priority queue, and everything else in bulk. Works pretty well, although I'm considering adding more RAM to the NAT box and running a local DNS server again; Comcast's are flaky.
        • Use the experimental BitTorrent client.

          You can set it to just under your max upload... There's still enough bandwidth for your ACKs, and then you can DL as fast as other people upload to you (which can be a problem due to those other clients...)
      • by azav ( 469988 ) on Saturday June 21, 2003 @05:11PM (#6263430) Homepage Journal
        (Author here)

        What you're missing is that if you end up using your client's bandwidth to distribute your software without their knowledge or permission, you're screwed.

        In serving the general public, something like this would be subject to a large negative reaction, not to mention the problem of "have you ever gotten a file from BitTorrent that was invalid?" I have.

        Though technically neat, it's practically unfeasible for a mass-market product.

        1) You can't impact/rely on the user base to help you deliver your product.
        2) The added chance for error introduces risk and jeopardizes your distribution model, and therefore your business model.

    • by Specialist2k ( 560094 ) <slashdot-200408.10.spezi@spamgourmet.com> on Saturday June 21, 2003 @03:12PM (#6262882)
      One problem might be corporate customers who have to pay for their bandwidth:

      If you had to pay for your bandwidth, would you give it for free to some company from which you are currently downloading a product update? I wouldn't...

      • by jetmarc ( 592741 )
        > One problem might be corporate customers who have to pay for their bandwidth

        Actually, I was working on P2P software that solves this problem by introducing a "neighbourhood": when several machines on the same LAN receive the update, they prefer to connect to each other rather than to a machine out on the internet. This SAVES money (on ingress), provided that egress bandwidth isn't excessively donated back to the P2P community.
    • I keep saying this, after my own bandwidth blowout, but BitTorrent isn't generally appropriate. It's a great idea and I have no critique on the implementation, but many people are in situations where running a program like BitTorrent could get them arrested, fired, fined, or expelled.

      BitTorrent is a P2P-style app, even though it can have a specific aim in mind that's not illegal filesharing, and the vast majority of people actually are under an AUP (acceptable use policy) or EULA or whatever that prevents
      • Please provide an example where sharing legitimate content w/ BT would result in an arrest. What law is being broken?

        Violating an ISP AUP results in the contract being terminated and possibly a bill for early cancellation; I don't know if you would count that as a "fine".

        No university would expel someone for sharing legitimate content with BT. There would be a process and some other punishment (loss of computing privileges, prolly) before resorting to expulsion.

        Fired, is a possibility as private firms hav
        • You're so entirely wrong. Many universities and corporations are banning the CONCEPT of peer-to-peer networking, or any kind of collaborative filesharing.

          You're not following the news to make the statements you're making.

          If you use BT and you wind up with illegal content on your machine, even in parts, you're certainly subject to fines and arrest, just as with Kazaa, et al.

          What you're saying isn't unreasonable, it's just not how companies and institutions and governments are coping with file sharing at the
          • I asked you to cite an example where someone was arrested for sharing legitimate content w/ BT. You did not do so.

            If you use BT and you wind up with illegal content on your machine

            Of course if you violate copyright law you risk serious consequences. That was not the point you were making earlier. You maintained that one risked serious consequences, including arrest, whether using BT with legitimate material or not.

            Universities may well ban BT as a bandwidth hog. But they will not expel someone for
  • by Triumph The Insult C ( 586706 ) on Saturday June 21, 2003 @02:43PM (#6262711) Homepage Journal
    how can we get our company's bandwidth setup so we can survive a slashdotting?

    maybe if he got some comments from the peeps at osnews or cnet...
  • BitTorrent? (Score:1, Redundant)

    by SpotBug ( 228742 )

    Maybe?

    Check it out: http://bitconjurer.org/BitTorrent/
  • Get what you want... (Score:5, Informative)

    by anthony_dipierro ( 543308 ) on Saturday June 21, 2003 @02:44PM (#6262718) Journal
    If you want to pay for data by the gig, rather than by the pipe size, just sign up with an ISP which allows that.
  • Get Burstable Fibre (Score:5, Informative)

    by manly_15 ( 447559 ) on Saturday June 21, 2003 @02:45PM (#6262728)
    At the ISP where I work we offer fibre connections that allow for increased bandwidth for certain periods of time. For example, our burstable connections are usually around 1-3 megabits for normal times, then burstable up to 10 megabits. I'm sure you can find something suitable for you.

    *blatant sales pitch*
    If your business is near Southern Ontario, check out our website at www.sentex.net [sentex.net]. We rock :-)
  • Free Tip (Score:3, Funny)

    by fastdecade ( 179638 ) on Saturday June 21, 2003 @02:45PM (#6262731)
    Here's a free tip --- don't post your site to slashdot.

    Congratulations - you passed this one.

    Anyway the answer to all your prayers is obviously BitTorrent [bitconjurer.org].
  • by Anonymous Coward
    Whenever Savvis has end of quarter sales you can negotiate the price a ton.

    Right now they're about $700/month for a managed T1 with local loop included.
    • FYI, a T1 is just a bit over 1x CD-ROM speed.

      We see 180 KB/s max downstream on our T1.
      If we are serving all modem users at a max download speed of 5 KB/s, then a T1 can only serve 36 users at once at 5 KB/s per user.

      That ain't much.
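
      In Python, the same back-of-the-envelope math (units are kilobytes per second; a T1's 1.544 Mbit/s is roughly 190 KB/s raw, ~180 KB/s usable after overhead):

          t1_usable_KBps = 180       # observed usable T1 throughput
          per_user_KBps  = 5         # target speed for one modem user

          print(t1_usable_KBps // per_user_KBps)   # 36 concurrent users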

  • by bc90021 ( 43730 ) <`bc90021' `at' `bc90021.net'> on Saturday June 21, 2003 @02:46PM (#6262736) Homepage
    1) Agree with your ISP on a standard data rate, burstable to X as needed.
    2) Use RTG [sourceforge.net] to monitor traffic in and out, making sure that you know what switch/ports/etc. that client is using.
    3) Charge the client (this is usually done based on 95th percentile).
    4) Profit!!!
  • It seems like an obvious place to use some peer-to-peer method (unless these X-meg downloads are pay-per-download or members-only). BitTorrent is very good at this, but also post links so people can download it off other systems like Gnutella or Kazaa (whatever is the latest and best). The people who download your, err, whatever it is will pass it on.
    All your company needs to do is seed the files on the networks. Much lower bandwidth than serving each one.
  • by sully67 ( 550574 ) on Saturday June 21, 2003 @02:46PM (#6262743) Homepage
    This is what Akamai and the like are sold for.
    It's almost certainly going to be cheaper than just buying bandwidth.
    Or you could go for the approach of colocating your own box somewhere central for the heavily hit stuff.
    Even this will be a whole lot cheaper and won't impact your normal traffic to your organisation.
    • Totally. Akamai really isn't that expensive when compared to buying your own bandwidth. The only thing about Akamai (from what I've heard, I've never used it) is that you have to plan updates a few days ahead of time to give them time to propagate amongst the various Akamai balancing servers. But have you ever noticed that downloads from Akamai are usually the fastest around? And that pretty much every corporate high-traffic site uses them (CNN, Apple, IBM, ETrade, FedEx, Yahoo, etc.) so I think they'd be f
      • We just had an Akamai deployment that was setup in a week (though they claim to have had the FBI up in a day, post 9/11). Major cache updates are propagated in minutes or hours (depending on how you are configuring the change.)

        Akamai isn't cheap, but having done both, it is cheaper than setting up your own infrastructure.
  • One word (Score:3, Informative)

    by LittleLebowskiUrbanA ( 619114 ) on Saturday June 21, 2003 @02:47PM (#6262750) Homepage Journal
    ALTQ [sony.co.jp]
  • by jmt9581 ( 554192 ) on Saturday June 21, 2003 @02:47PM (#6262751) Homepage
    I don't know what solution you'll eventually decide on, but to test it you can post a link to the file on AskSlashdot.

    :)
  • How bout paying for a full time high bandwidth connection, and then whenever Slashdot posts a story about the release of Mandrake, Red Hat, etc., you post a mirror of it?

    Then this whole "unused bandwidth" issue would be kinda moot. Everyone wins (well, except the person paying the bill).

  • by Velocity44 ( 635370 ) on Saturday June 21, 2003 @02:48PM (#6262754)
    This is exactly what Kast was designed for. BitTorrent is great, but you still have to request the DL. With Kast, you subscribe to the channel and you get what is delivered. Go to http://konspire.sourceforge.net/index.shtml [sourceforge.net] to get what you need.
    If you need proof that Kast is better than BT in this situation, look at http://konspire.sourceforge.net/BitTorrent.shtml [sourceforge.net].

    Hope it helps.

    • The problem with the logic of the second link is that they assume users of BitTorrent close the client after their download is done. A study of any popular file shows a great number of seeds, as many people find it only fair to leave their client open as long as possible (I try to redistribute the file 10x whenever I download it, which is about 3 days open for the files I download). Because he does burst files, it would be very similar to a new ep released on BT, as a surge of people will ensure at
  • by grumpygrodyguy ( 603716 ) on Saturday June 21, 2003 @02:49PM (#6262758)
    As soon as we all have 50 Mbps 802.x connections and anonymous P2P software, this kind of question will never be asked.

    With 100-gig HDs and reliable high-speed connectivity 24/7, it's pretty easy to see what could be built from that.

    Someone wants to download the new 150 MB CS or BF1942 upgrade? Just enter the name of the file, select it... and the P2P software does the rest, initiating multiple downloads from 20 or so nodes in parallel. Then you just glue the program together once you have downloaded all the pieces.

    Mirroring is such a hack, and dynamic bandwidth is the last gasp of the client-server paradigm. Let's move on.
    • Indeed, as soon as we all have 50mbps connections (and file sizes remain the same), the question will no longer be asked. But it won't be due to P2P, just due to the cheap bandwidth.

      P2P is severely overrated. It's not a solution for anything that doesn't involve sharing illegal files. If everyone on P2P is on a standard 128k/768k ADSL connection, it will not work -- there will be more demand than supply. It only works when there are nodes with a fat upload line that never download anything. But those
      • I disagree (Score:3, Insightful)

        by ionpro ( 34327 )
        P2P recognizes the two facts that you obviously don't:
        1) Not everyone uses their Internet connection 24 hours per day
        2) Most people don't use all their hard drive space these days (i.e. storage is cheap)

        Combine these two developments, and you have a lot of upload bandwidth sitting idle. I would argue that P2P becomes MORE effective, not less, as you move to legitimate files, because people are more likely to leave it running when they aren't afraid of the RIAA/MPAA tracing their connection down. Since ISPs have been r
        • Actually, most people don't leave their computer on 24 hours/day. I'm not talking about hard-core internet users, I'm talking about your average Joe Sixpack whom the parent article seems to refer to. As for ISPs: they sure haven't been reluctant to artificially cap upload speeds to ridiculously slow speeds. That tells me that they probably don't like the whole P2P idea too much. Besides, it only drives people to broadband if there are illegal files to be gotten for free; otherwise, it just costs the ISP
      • [P2P's] not a solution for anything that doesn't involve sharing illegal files.

        Oh? So then I guess "illegal p2p" sites like GameTab [gametab.com] (bittorrent) and FileFront [filefront.com] (redswoosh) aren't a solution for getting game demos faster than CrapPlanet's [fileplanet.com] fantastic client-server lines eh?

        P2P is a hack

        Why do I get the impression that your job depends on the centralization of power that client-server allows?

        If I need to download a BF1942 patch, I'll get it and delete it

        Speak for yourself. Not everyone is as selfish as

        • Why do I get the impression that your job depends on the centralization of power that client-server allows?

          P2P allows much more centralization of power, actually. Just cap the upload or block some ports and it's over.

          Speak for yourself. Not everyone is as selfish as you apparently are, and p2p will eventually have reputation systems for weeding rogues and assholes out of our webs of trust.

          Webs of trust? Nice buzzword, not likely to ever happen. Unless you massively centralize the whole P2P network
          • Webs of trust? Nice buzzword, not likely to ever happen. Unless you massively centralize the whole P2P network around one registry. And even then such systems are easy to bypass unless you use something like Palladium.

            It's been implemented before -- without a central trusted server, and in a very difficult-to-circumvent fashion. See PGP's method for determining keys' trustworthiness.
    • Well, they _could_ publish their corporate website only on Freenet, but it isn't really widespread enough to make that practical yet. It's currently very much in the state the internet was in before it was viewed as just another glossy brochure cum marketing circle jerk, but that might change in another 5-10 years - just like the real internet did.
    • For corporate downloads of legal content, there's no reason a P2P system need be anonymous. See BitTorrent for an example of that. Having your P2P software handle searching as well as downloads -- well, you could, but why not use a specialized engine for each rather than trying to have one piece of software that Does It All?
  • by IpSo_ ( 21711 ) on Saturday June 21, 2003 @02:54PM (#6262781) Homepage Journal
    that offers "pay by the GB". The hosting company I work for has GiGE links but only pays for the exact amount of traffic we push through the link. The monthly cost on the line is minimal, or nonexistent depending on the provider.

    Since we have more than one link, it gives us redundancy and the upper hand to negotiate the best price per GB, so we can send 90% of our traffic out that link. If the next month a different provider comes back with a cheaper price, we switch it around and send the 90% out their link. Within days of cutting the traffic off a link, we can usually expect a phone call from the sales rep with a lower price offer.

    Any idle links we have don't cost us anything extra, since we _will not_ deal with any provider that doesn't offer pay by the GB. Paying for raw link speed regardless of how much traffic you push through, or paying 95th-percentile prices, is mostly a rip-off.

    • One such host is BSDwebhosting [bsdwebhosting.net]. Haven't used them myself, though. They require you to use FreeBSD, which in my opinion is a plus :-)
  • CDN, colocation (Score:5, Informative)

    by UnderAttack ( 311872 ) on Saturday June 21, 2003 @02:56PM (#6262801) Homepage
    A couple of different options. First, you could talk to the Content Delivery Networks (CDNs) like Akamai or Digital Island. They can probably help you (for a price).

    Another option is colocation, in particular if you have short traffic spikes. Many colocation places charge you at the "95th percentile". This cuts out about a day and a half's worth of "peak traffic" per month: you only pay for the maximum bandwidth you use after removing the top 5% of samples. Just make sure the colocation place has enough bandwidth to handle the spikes.

    Some ISPs (e.g. Yipes) offer flexible contracts that allow fast (daily?) bandwidth changes. So if you announce a new version of your product, you can increase your bandwidth until the rush is over.

    One hint: try to move the large files/content away from your "important" networks, so other things like e-mail keep flowing even if the content site runs into issues under load.
  • by ryuspeed ( 443132 ) on Saturday June 21, 2003 @02:57PM (#6262803)
    It seems to me that this is exactly the type of problem that places like akamai [akamai.com] and cable and wireless [cw.com] (was digital island) are trying to solve. Pay only for the bandwidth you use, leverage their existing distributed architecture, profit. You can try to get burstable bandwidth etc., but in the past I've found it to be more expensive. Things may have changed since then (a year ago), but you should still look into a content delivery network.
    • My employer is looking at CDNs. I understand that they had an impressive presentation from Speedera [speedera.com] as well. It looks like Speedera has a service specifically for downloads. [speedera.com]

      No, I didn't sit through the presentation; however, our admin guys seemed impressed. Just another option.
  • by ssclift ( 97988 ) on Saturday June 21, 2003 @02:59PM (#6262819)

    Based on recent research at the University of Waterloo, you may well be able to treat the bandwidth usage as a risk factor and treat the option to buy more bandwidth as exactly that: an option on a real commodity. You would likely be able, then, to price the value of waiting to invest versus the value of investing now with a given expected return. Basically the cost of holding off on investing would then be quantifiable and you could choose the best time for investment.

    There has been some good research [uwaterloo.ca] done on this lately which you can read up on at the U. Waterloo Scientific Computation Group [uwaterloo.ca] which did the work in co-operation with telecoms and the Finance department. The math is perhaps a little heavy going, but the results may put you on a firmer footing than doing the same computation with NPV or similar methods.

    Disclaimer: I'm currently doing research with this group, though not exactly on this topic.

    • Who in their right mind modded the parent to +5???

      "Bandwidth trading" escapades have bancrupted Williams and contributed to Enron having to go ever more aggressive with their "creative accounting." It reeks of the same foul smell as the "real options" nonsense. (Note: the article referenced even has "real options" in the keywords).

      Basically you've got a whole bunch of not-so-smart people who can fake expertise in nearly anything as long as they don't get their hands dirty (aka academics) colluding with s
  • by Peter Cooper ( 660482 ) on Saturday June 21, 2003 @03:00PM (#6262822) Homepage Journal
    Let's face it. Most of the suggestions above are useless. Since when is a company going to officially distribute stuff via Kazaa or BitTorrent? Sorry, but when Microsoft says 'To download our latest Service Pack, use Kazaa' then pigs will be flying. It's so unprofessional.

    The easiest solution is not to host it yourself, but to use specialized file hosting ISPs. There are lots of these around, and it's a trivial task on Google to find one at the price you want. These are ISPs that entirely focus on hosting large files for download, with servers optimized for that job.

    There's no point in lagging out your regular servers, which are probably optimized for something else... and a dedicated file host can scale as you go, which would usually cost you a packet to do yourself.
    • There was a time when selling software without a manual was considered unprofessional. Now you can buy software online and get NOTHING except for the string of bits - not even a CD.
    • Sorry, this is half-pure bollocks. Companies can easily distribute via P2P ... Nobody said you'd have to use the "official" client. Many bigger software installers download the bits and pieces they need from the net. There's absolutely no reason why you wouldn't be able to put a P2P technology in there instead of simple http gets.

      Even if you don't go the P2P route, look at the many dedicated server providers out there. If your tech staff is half-way decent, they can set up a "simple" fileserver in a very

    • > Since when is a company going to officially
      > distribute stuff via Kazaa or BitTorrent? Sorry,
      > but when Microsoft says 'To download our latest
      > Service Pack, use Kazaa' then pigs will be flying.
      > It's so unprofessional.

      Replace all the statements in here about kazaa with statements about linux.

      And you get the same thing people were saying 4-5 years ago. Don't underestimate the ability of the tech industry to evolve. ;)
    • Let's face it. Most of the suggestions above are useless. Since when is a company going to officially distribute stuff via Kazaa or BitTorrent? Sorry, but when Microsoft says 'To download our latest Service Pack, use Kazaa' then pigs will be flying. It's so unprofessional.

      They will do it if it saves them money, or if the alternative is simply not to host large files. Many sites already allow or even require you to have special download managers (Fileplanet etc.). What if they provide special rebranded versions

  • I don't know how much your future bandwidth requirements will be, and I doubt anyone knows that better than you...
    How much will be web content? Files? FTP? Small or large files? What about your customers? Are they "tech savvy"? US-only? Old? Young?

    Maybe there are other distribution models that suit you better than the traditional client-server approach?

    Some suggestions:
    1. I heard about a game company that saved *thousands* of dollars each month when they actively started to supply game magazines with demos a

  • Start with a best guess for the amount of bandwidth based on the number of users and type of usage. Then monitor and adjust. If you pay for unneeded bandwidth, you may be losing money. If you underestimate bandwidth needs and lose business, you may lose money...

    Some things to consider checking into:

    Use SNMP and RMON to manage and monitor your wide-area connections. You will be able to do trending on your traffic to see what percentage of capacity is used during any predetermined interval. Free tools like MRTG are a great pl
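
    The trending idea reduces to sampling interface octet counters and taking deltas, MRTG-style. A sketch in Python, with the SNMP fetch left as a hypothetical get_ifInOctets() since the polling library varies:

        import time

        def get_ifInOctets():
            """Hypothetical: fetch the router's ifInOctets counter via SNMP."""
            raise NotImplementedError

        def poll_mbps(interval=300):
            """Yield average inbound Mbit/s over each interval."""
            prev = get_ifInOctets()
            while True:
                time.sleep(interval)
                cur = get_ifInOctets()
                delta = (cur - prev) % 2**32   # survive 32-bit counter wrap
                prev = cur
                yield delta * 8 / interval / 1e6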
  • The basic solution is to move your servers -- at least the ones that will be handling high bursts of traffic -- to an offsite, dedicated hosting facility. A facility like this will have big pipes coming in with the cost split among all of their customers, and typically offer you pricing based on burstable bandwidth use.

    It's just not economical for a single company with occasional high-bandwidth requirements to bring in a pipe that sits idle 90% of the time. A co-location facility will serve many differen
    • A facility like this will have big pipes coming in with the cost split among all of their customers, and typically offer you pricing based on burstable bandwidth use.

      It's mentioned elsewhere, but worth mentioning here: be very, very careful if you're going the burstable route. Most providers charge based on 95th-percentile utilization, not actual bandwidth usage. What this means to you is that a single short burst of extremely high usage (of the sort described by the poster) causes you to pay as though you were

      • Tis true. I was 'lucky' enough at one time to be at an ISP that didn't manage their network very well -- they didn't even know what switch port I was connected to and they apparently weren't even trying to monitor my bandwidth usage. I was paying for 256K but was getting unmetered 10mbps. Ahh the good ol' days... :)
  • Most places are willing to sell "burstable" bandwidth -- they could run, say, an OC-3, but you might normally only use 3 Mbps of it. You'll pay more than if you just bought 3 Mbps of plain old traffic, but if you get hit with a deluge of traffic, you can use all 155 Mbps of it. (But then you get charged for it.) If your bandwidth usage is irregular, buying "burstable" bandwidth is definitely the best way to go. "Burstable" pricing is usually more expensive per Mbps, but it's cheaper than getting a super-big
  • Also, if your business can get away with it, try to spread the downloads out over time. For example: split your customer base by some hashing algorithm, and notify them in evenly spaced batches over a few days. Or even tie bandwidth monitoring to notification, so that as the usage peak from the last batch of notifications begins to drop, you start notifying the next batch. Again, this depends on your business model, of course.
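
    A sketch of that hash split in Python, assuming customers are keyed by e-mail address and the number of waves is arbitrary:

        import hashlib

        def wave_of(email, waves=5):
            """Deterministically assign a customer to one of N notification waves."""
            return hashlib.md5(email.lower().encode()).digest()[0] % waves

        customers = ["alice@example.com", "bob@example.com"]   # placeholder list
        first_wave = [c for c in customers if wave_of(c) == 0]
        # Notify first_wave today; release the next wave once the usage
        # peak from this one starts to drop, as suggested above.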
  • Your Options (Score:3, Informative)

    by Zebra_X ( 13249 ) on Saturday June 21, 2003 @04:03PM (#6263121)
    How to handle this situation primarily depends on where your site is hosted.

    If you have your servers hosted with a co-location provider like AT&T, Globix, Cable and Wireless (ha ha), Verio or the like, then you'll almost always have the option to "burst" above your monthly allotment of transfer - problem solved. Just do some simple math to figure out the max of the provider's LAN connection and how much your server(s) can handle with respect to transfer, and you'll be able to figure out if you need to add hardware or NICs to handle the load. Generally speaking, the "bursting" is usually calculated fairly. Also, in months that you use less bandwidth, you won't have to pay for a higher class of service above the already agreed-upon monthly transfer rates.

    However, it doesn't sound like this is the case. A T1 has a physical capacity of 1.544 Mbit/s. You will never be able to "burst" above this rate. As long as your servers are at the end of this pipe, they will never be able to transfer more than ~180 KB/s to the internet at large. Your options at this point are:

    * Add more capacity to your hosting site (expensive).

    * "Partner" with a company like Akamai for content delivery. It'll be a few thousand dollars plus the cost of bandwith to set up the content redistribution. Not really a bad deal if you are serious about delivering your content to your users. They also have great reporting about who and where your users are coming from.

    * Your last option is to set up a shared/dedicated hosting account with a provider that charges you by the GB. That way you only pay for what you use, plus the monthly cost of the server. Try Interland; they have some good deals.

    The bottom line is that if your site is hosted at the end of a physical connection you own - it's not going to be enough. You'll need a datacenter environment that has a PhatPipe to the internet and a machine or machine(s) that can handle the throughput. If you have the cash go with Akamai - they are good at what they do.
  • by Mordant ( 138460 ) on Saturday June 21, 2003 @04:22PM (#6263213)
    First of all, you need to know what's normal for your network, and how your bandwidth is actually being used:

    http://www.linuxgeek.org/netflow-howto.php

    http://wwwstats.net.wisc.edu/

    http://www.arbornetworks.com/products_sp.php

    Then you need to make use of technologies which allow you to provision accordingly:

    http://www.cisco.com/en/US/tech/tk543/tk757/technologies_white_paper09186a008017f93b.shtml

    http://www.cisco.com/pcgi-bin/Support/browse/index.pl?i=Technologies&f=1387

    Juniper and other vendors have equivalent capabilities, I'm just not very familiar with them. But the concepts are the same.
  • Put Apache on it. Try to get a contract based on bandwidth used. Configure it to accept connections only in the situations when you need it. Have robots.txt turned on to help keep away bots. Make your primary site issue redirects to the ISP Apache box when usage gets too high.
  • Well,

    since you are talking about FUTURE, and not now, I would first and foremost use Linux to manage the bandwidth. I assume you haven't made any buying decisions yet?

    You can buy proprietary heavy-lifting equipment such as Cisco gear, and collapse the entire management tier into one switch, but I wouldn't.

    It will work fine, BUT you can't reuse Cisco gear as desktop PCs, or even donate it to a school when it gets too old.

    With my way of building things, you can reuse the equipment when it's all done.

    Wit
    • I would connect them all with Gigabit Fiber cards into one box.

      Why would you use fibre and not copper for the "interconnects"? Surely the distance can't be *that* far?

  • Is what they do just what you want? I do not have a clue what they cost their customers, but I've seen a lot of Akamai hardware spread around different hosting facilities around Europe.

    I think their basic product is "x amount of space and bandwidth very close to the end user".

    For example, Ximian used Akamai at some point when their first GNOME 1 desktop came out. I downloaded it and the installer was *blazing* fast. I was wondering what was going on, as I never get 500+ KB/s speeds from the US and I was pretty su
  • First, install the caching server "Squid". Have all FTP and HTTP requests intercepted and diverted through the caching server. This'll save you massively on bandwidth, as you won't need to do multiple transfers of the same file.


    Have the timeout on a file reasonable, though. FTP files generally don't change that frequently, so hold those for 48 hours. HTTP pages can change a lot more often. Make sure the caching server is configured to respect the HTTP flag to not cache, and don't cache unmarked web pages for more than a few minutes.


    This should be sufficient to spread out the load some. I prefer to have a few "extras" in there, though. If I know there's going to be a popular file released, it makes sense to have a CRON job pre-load that file into the cache during off-peak hours. That way, it's in the cache and ready for users by the time anyone gets round to requesting it.


    If there are a few web sites that are VERY popular but largely static, set up a "neighbor" cache, which ONLY caches those web sites, with a very long time-out. Use a program like Harvest to grab the entire site, via the cache, and you'll then have everything ready for the users. (It'll also be searchable, via Harvest, which'll be a bonus.)


    The second option is at the network layer, and should be used only if the above is not sufficient. Enable "diff-services" and "Quality of Service" in the kernel. How to do this depends on the OS you use. Linux and all the *BSDs support these options, but how you set it up varies with each.


    Once you've done that, enable the "HQF" or "CBQ" queueing discipline, and attach it to the FTP and HTTP services. Configure them such that each user is guaranteed a certain level of bandwidth for that service, if they request that or more. They get more only if nobody else needs it. (This is usually described as a "soft ceiling".)


    You also want to enable the "RED" networking option.


    This isn't as hard to do as it sounds, and it can massively ease network congestion.


    (CBQ = Class Based Queueing. HQF = Hierarchical Queueing Function. RED = Random Early Detection.)


    Once you've applied both the "high level" and "low level" solutions, your network congestion should be massively eased. Again, though, use pre-loading for the cache as extensively as you can to ease those peak-time burdens.
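
    For the pre-loading step, a cron-driven fetch through the proxy is enough to land the object in Squid's cache. A Python sketch (the proxy address and URL are placeholders; 3128 is Squid's default port):

        import urllib.request

        # Fetch through the local Squid instance so the object gets cached.
        proxy = urllib.request.ProxyHandler({"http": "http://localhost:3128"})
        opener = urllib.request.build_opener(proxy)

        with opener.open("http://www.example.com/big/release.bin") as resp:
            while resp.read(1 << 16):   # drain the body; we only want it cached
                pass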


  • Put your server in a colocation center. You can normally get deals where you pay for a low bandwidth regularly, but have the ability to burst quite high. As an example, you can be connected to a 100 megabit port on their router, but pay a base rate for, say, 1 or 2 megabits.

    If that's not an option, you can always look into per-IP bandwidth limiting and QoS.

    steve

  • Here's the easiest solution yet: Distribute your file via BitTorrent. Problem solved!

    steve
  • I'm in the process of putting my high-bandwidth data at Download Technologies [downloadtech.com] for exactly this issue. They have a high-speed backbone connection and charge for actual data transfers, so it's pretty inexpensive --- even a slashdotting would only last a day or two and probably not add up to much. For example, I have videos from the 1997 DaVinci Days [igrbp.com] Kinetic Sculpture Race and Trebuchet events in Corvallis, Oregon there.

    Disclaimer: This particular site is run by friends of mine, but they're not the only ones

  • by drasfr ( 219085 ) <revedemoi&gmail,com> on Saturday June 21, 2003 @07:59PM (#6264118)
    What I would do is colocate all my servers in a datacenter.

    Take burstable bandwidth - say a link that can burst to 100 Mbit/s - but to control your costs most of the time, configure your router to allow no more than, say, 1 Mbit/s, or whatever maximum you want and are willing to pay for in normal times.

    You should then monitor your bandwidth usage in real time, as well as the logs on the machines, and adjust the traffic shaping to the amount of traffic you want to allow.

    For example, if you know that on a given day you will run a marketing operation and are willing to spend $xxx more on bandwidth, you change your setting right before the marketing push to the maximum bandwidth you are willing to pay for.

    my 2c...
  • The majority of modern transfer protocols - HTTP, FTP, RTSP - allow the systems administrator to limit the bandwidth available to each client. Yeah, this doesn't make the client very happy that he/she can't have that new version of xyz.exe in seconds, but at least they can get it.

    As the first local court system in the US that streamed their own court proceedings via the Internet, we were inundated with hits to our media server. By using the bandwidth throttling in Real Server we were able to max

  • Later, you know you'll need that car to go 220

    You should buy a ram jet powered locomotive... never a problem with speed or overloading your hauling capacity... be sure to lease it.


  • Getting a burstable line to the net is not hard at all; most ISPs work that way (you pay for 6 mbps, for instance, but have a 100 mbps ethernet line, so you just pay when you burst above 6).

    One thing you guys might want to think about is using a bandwidth arbitrator for when you do have a busy day. There's one good project I know of: the Linux Bandwidth Arbitrator [bandwidtharbitrator.com]. It's easy and free, and it'll keep individual users from hogging bandwidth -- and meter all users to whatever rate you choose. It's based on
