
Ask Slashdot: Why Does Wireless Gear Degrade Over Time?

acer123 writes "Lately I have replaced several home wireless routers because the signal strength has been found to be degraded. These devices, when new (2+ years ago) would cover an entire house. Over the years, the strength seems to decrease to a point where it might only cover one or two rooms. Of the three that I have replaced for friends, I have not found a common brand, age, etc. It just seems that after time, the signal strength decreases. I know that routers are cheap and easy to replace but I'm curious what actually causes this. I would have assumed that the components would either work or not work; we would either have a full signal or have no signal. I am not an electrical engineer and I can't find the answer online so I'm reaching out to you. Can someone explain how a transmitter can slowly go bad?"
This discussion has been archived. No new comments can be posted.

  • Obligatory (Score:3, Interesting)

    by SuperMooCow ( 2739821 ) on Sunday October 21, 2012 @03:05PM (#41723061)

    It could be the noise floor going up near your house, or just planned obsolescence [imdb.com].

  • by afgam28 ( 48611 ) on Sunday October 21, 2012 @03:09PM (#41723099)

    I once had a router where the signal started to go bad over time. I called up the company and the tech support guy told me that most routers "wear out" after around 2 years, and that I'd need to replace it. He struggled to give a logical answer when I asked him how a device with no moving parts could wear out so quickly.

    If you're right, and if this is the standard advice being given to everyone, we're in for a huge arms race.

  • by carlhaagen ( 1021273 ) on Sunday October 21, 2012 @03:13PM (#41723127)
    I've been using wifi instead of ethernet for about 7 years now. Almost all of the NICs/APs I've used have displayed this problem with time. It's as if the equipment somehow develops creeping signal attenuation. My guess is that it's something relating to capacitors gathering a slow overcharge of some sort, causing them to block current in a growing fashion - I seem to recall this being possible from my early days of electronics studies.

    Anyhoo, I fix the problem by simply switching the equipment to another channel, say, 3-4 steps away, to make sure the frequency some of the components will be switching at will be notably different. So far it has worked with all equipment I've had this problem show up on. After a while the signal attenuation develops on the new channel as well, upon which I simply switch back to the one I used before. Rinse, repeat.
  • by K. S. Kyosuke ( 729550 ) on Sunday October 21, 2012 @03:20PM (#41723169)
    I thought the channels overlap to a significant degree. If there is interference on channel 1 to such a degree that you feel the need to switch it, are you sure that switching to channel 2 would actually help?
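The overlap being questioned here can be roughed out numerically. Below is an illustrative sketch (not from the thread): 2.4 GHz channel centers per the 802.11 channel plan, with an assumed ~22 MHz occupied signal width.

```python
# Illustrative sketch: 2.4 GHz channel centers sit 5 MHz apart, but an
# 802.11b/g signal occupies roughly 22 MHz, so moving only 1-2 channels
# away leaves two signals still sharing most of their spectrum.
CHANNEL_SPACING_MHZ = 5
SIGNAL_WIDTH_MHZ = 22  # assumed approximate occupied bandwidth

def center_freq_mhz(channel):
    """Center frequency of 2.4 GHz channels 1-13."""
    return 2412 + CHANNEL_SPACING_MHZ * (channel - 1)

def channels_overlap(a, b):
    """True if the two ~22 MHz wide signals share spectrum."""
    return abs(center_freq_mhz(a) - center_freq_mhz(b)) < SIGNAL_WIDTH_MHZ

print(channels_overlap(1, 2))  # adjacent channels still collide
print(channels_overlap(1, 6))  # 25 MHz apart: the classic 1/6/11 spacing
```

By this rough model you have to move at least five channels to escape a neighbor's signal, which is why only 1, 6, and 11 are treated as non-overlapping in the US.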
  • by Tastecicles ( 1153671 ) on Sunday October 21, 2012 @03:22PM (#41723181)

    1. slow burnout of emitter gear due to thermal degradation (yes, clock chips and transistors get hot, as do solder tracks and joints). Thermal runaway can occur if a solder joint fails and arcs, or overvoltage causes signal tracks to vapourise.
    2. ionising radiation, particularly on unshielded components such as antenna conductors (I've seen something like this occur on an externally mounted amateur radio antenna: the sunward side of the antenna completely degraded, the result being that the only signals received (or sent) were on the shadow side).
    3. component quality on consumer gear is not as stringent as it could be. Components can and do fail, and considering the number of components in a lot of consumer gear, it's a wonder any of it actually leaves the factory.
    4. the noise floor of several years ago was far, far lower than it is now. The ERP of newer gear is (by design or by necessity) higher than older gear as more and more transmitters have to share the band. As a result, the signal quality taking a dive may be at least partly illusory. The equipment may actually be perfectly fine.
    5. parasitic structures in semiconductor packages may be the catalyst for failure, either immediate or delayed. Such structures may be as small as a single atom of chlorine embedded in a crystal of germanium - innocuous at first (undetectable, even), but over time and use, that contamination will alter the chemistry of the semiconductor, possibly causing it to bond with the package material and rendering it useless. This might not even be an issue in high powered gear like regulators but in something like a microprocessor, it's a showstopper.

  • by NJRoadfan ( 1254248 ) on Sunday October 21, 2012 @03:28PM (#41723237)
Use of channel 14 isn't permitted in the USA. Routers sold here disable it by default, although you can get the option back by flashing 3rd party firmware onto the router. I ran a router on channel 14 for a brief period of time to see if interference was causing connection issues. The problem I ran into was that some wireless devices wouldn't work on channel 14 (like ebook readers), since the radio was region locked.
  • by tehrhart ( 1713168 ) * on Sunday October 21, 2012 @03:37PM (#41723279)
I recall that the IETF meeting in Paris this year had some wifi troubles and they ended up using overlapping channels intentionally. It would seem to me that straddling two of the non-overlapping channels would at least allow you to compete for resources in two areas - i.e. if channel 1 was flooded, you'd still have some bandwidth available in your overlap of 6, but I could be mistaken. Reference article about IETF Paris: http://mobile.slashdot.org/story/12/03/29/140207/ietf-attendees-reengineer-their-hotels-wi-fi-net [slashdot.org]
    Quote:

    Elliot noted that France lets Wi-Fi use channels 1-13 in the 2.4 GHz band. "As three channels are very limiting in a very 3D structure, like this hotel, I've chosen to go with 4 channels, using 1, 5, 9, and 13," he said. "This is a layout that is well respected by others, and one [that] we've considered using at the IETF on numerous occasions--and very similar to what we used in Hiroshima. You get a slight bit more of cross-channel interference, but the additional channel is worth it, especially in this hotel's environment."

  • by girlintraining ( 1395911 ) on Sunday October 21, 2012 @03:37PM (#41723285)

    ...but the noise floor has increased substantially, degrading performance.

Bingo! You hit the nail on the head. Wifi is now commonly found in most homes, and the overwhelming majority are b/g routers. That means that everyone's "last mile" internet is running on only three non-overlapping channels (in the United States), with a maximum capacity of only 54 Mbps for each of those channels. While your effective range decreases, your signal still continues to interfere with others out to its maximum range, which is typically around 300 feet. Beyond that, it's only a decibel or so above the noise floor (about -96 dBm) and is basically ambient. So consider urban density: in a 300 foot hemisphere, how many transmitters will be in that space?

    Well, I live in a residential neighborhood that is mostly single-dwelling homes, which is about as ideal as you can get from a low-density city environment. Using a pringles can, I took a neighborhood survey and found about 26 access points within 300 feet of my home. Now, this is a survey that took several days to complete because of the marginal signal integrity, after which I drove my car in circles matching associated clients to those APs. Each access point had approximately 2.25 clients associated with it. So that's about 60 transmitting devices, in an ideal urban environment. And that's just those using wifi.

    2.4GHz is also used by: Wireless phones, microwaves, wireless "hifi" stereo systems, etc. It's also used by wireless mice/trackballs and keyboards. So, realistically, I've got at least 100 devices that are transmitting with a signal high enough to interfere with the front-end RF of my wifi.

Shannon's Law stands tall in all of this: as you raise the noise floor, the amount of data you can transmit, regardless of encoding scheme or receiver selectivity, falls. Every device added decreases your own devices' performance.
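The Shannon-Hartley bound behind this point can be sketched as a toy calculation (the signal and noise levels below are assumed for illustration, not measured):

```python
# Shannon-Hartley: C = B * log2(1 + S/N). Raising the noise floor
# shrinks the achievable rate no matter how clever the radio is.
import math

def shannon_capacity_mbps(bandwidth_hz, signal_dbm, noise_dbm):
    """Upper bound on error-free throughput, in Mbps."""
    snr = 10 ** ((signal_dbm - noise_dbm) / 10)
    return bandwidth_hz * math.log2(1 + snr) / 1e6

# Assumed: a 20 MHz channel and a -60 dBm received signal.
quiet = shannon_capacity_mbps(20e6, -60, -96)  # ambient noise floor
noisy = shannon_capacity_mbps(20e6, -60, -80)  # floor raised by neighbors
print(round(quiet), round(noisy))
```

Raising the floor 16 dB in this model roughly halves the theoretical ceiling; strictly, the drop is logarithmic in SNR rather than proportional, but the direction is the same.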

    Solutions
I found that by setting my router to 'g' only and then forcing the bitrate down to 24 Mbps, I was able to get a much more reliable and speedy signal. Every WiFi standard is designed to cope with interference by renegotiating to a higher or lower bitrate dynamically, which would be fine if they were isolated. But in an environment where they're in close proximity to each other, as each device broadcasts and interferes with the others, they detect this and renegotiate, generating more interference; and pretty soon you've got routers constantly in a state of renegotiation, with fluctuating bitrates. Manually force your router to a specific bitrate and don't allow re-negotiation, and you'll find that those momentary spikes in the noise level won't wash out your signal -- renegotiation takes 10--30ms, and during that time, you can't send/receive any data. The data burst that caused it is over long before the renegotiation completes.

So in short, it's not your transmitter, it's the environment. Take your transmitter out of its default settings and enable RTS/CTS (if available) and you'll be fine. Another, more sociopathic answer, is to get a 100W 2.4 GHz booster (you'll have to build it), mount it on your roof, tune it to one of the 3 non-overlapping channels (I suggest 1 or 11, since most microwave ovens tend to tune at the middle of the band -- channel 6), and then let it run for about 3--5 days. Everyone will bail off that channel because nothing tuned to it will operate over a distance of even a few feet. Again, very illegal, very sociopathic... but very effective. You'll have to "plow the spectrum" about once every month or two, so count on downtime.

  • by Migraineman ( 632203 ) on Sunday October 21, 2012 @03:51PM (#41723397)
    More than likely, the older router was expecting a relatively clean RF environment, and was crippled when all the neighbors deployed APs nearby. The newer APs were designed to handle cluttered environments, and their more-advanced algorithms provide improved performance over the previous generations' products. As old equipment is replaced with new, you'll probably see the same degradation in performance until new countermeasures are developed (in the next gen equipment, of course.) Ref: arms race.
  • by currently_awake ( 1248758 ) on Sunday October 21, 2012 @04:21PM (#41723585)
    semiconductors are known to degrade if run hot. home routers don't have fans or heat sinks. budget devices often run hot to save on heat sinks and fans.
  • by Artifakt ( 700173 ) on Sunday October 21, 2012 @04:27PM (#41723627)

You've said quite a few interesting and even correct things in your post, but I want to add one point (or quibble if you wish). You are probably not in an ideal urban environment, and don't want to be. That's because an 'ideal' environment might have the broader spacing given by single family homes, but a very low average income, where there would be both fewer access points and fewer average clients per AP. In theory, parts of Detroit that are now full of more abandoned homes than inhabited ones sound 'ideal'. 26 points and 2.25 clients per sounds like you qualify as upper middle class (if anyone does these days). You would actually see less traffic in a wealthier neighborhood, as there would be more average distance between houses, and less in a poorer neighborhood, as there would be fewer people on wireless. In addition, there's a northern/southern states bias in the US, in that there's more brick and concrete block and less wood used in low to medium priced residential neighborhoods as you look at more northerly cities, and this tends to attenuate some traffic over shorter ranges.

    I would like to know more about how you measured the number of associated clients though. I thought some of the mixed wired and wireless hubs reported the wired accesses mixed in with wireless, and/or indicated both active and non-active connections, but didn't give out the rest of the information needed to properly interpret what you can get. If you queried my parent's wireless router at their place, for example, and if you could get the info remotely, you might find 14 devices assigned an address, but those are for two Cat 5E wired desktop machines that are currently active, and the rest are for laptops and such the various kids, grandkids, nieces and nephews have brought over when staying for a visit, and are still on the router's lists months later. I've had five different laptop or tablet machines connected through that router at various times visiting, but never more than one on the same weekend or holiday.

    You're the first post I've come to on this thread that has mentioned the sad relationship between microwave ovens and channel 6. Other people mentioned the noise floor early in the discussion, but I didn't notice any of them actually referencing that to Shannon's Law, either. You should be modded informative a couple of times, but I won't be surprised if the 'sociopathic answer' part gets you downmodded instead.

  • by tlambert ( 566799 ) on Sunday October 21, 2012 @04:37PM (#41723705)

    Like, oh, say, "smart meters".

When they started installing "smart meters" in my area (because "too many" people were installing solar, and PGE panicked that they'd have to pay the same rate they were charging, instead of capping the payout at net zero with differential rates so they paid less when solar was active), my bandwidth went to hell, and I had to more centrally locate my AP to avoid the interference.

  • by NixieBunny ( 859050 ) on Sunday October 21, 2012 @04:43PM (#41723749) Homepage
    The RF transistors used to make a sizable fraction of a watt at 2.4 GHz tend to be made with many tiny transistor junctions in parallel on one die. Individual transistor junctions can fail, causing the output power to be reduced yet still non-zero.
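A toy model of that failure mode (purely illustrative; the linear power scaling per junction is an assumption):

```python
# RF power transistors at 2.4 GHz are often many parallel junctions
# ("fingers") on one die. Each finger that dies removes its share of
# output power, so the stage degrades gradually instead of going dark.
def output_fraction(total_fingers, dead_fingers):
    """Remaining fraction of rated output, assuming equal finger shares."""
    return (total_fingers - dead_fingers) / total_fingers

# A hypothetical 64-finger device losing fingers over years of hot operation:
for dead in (0, 8, 24, 48):
    print(dead, output_fraction(64, dead))
```

The point is the failure curve: reduced but non-zero output, exactly the slow signal fade the original question describes.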
  • by girlintraining ( 1395911 ) on Sunday October 21, 2012 @04:53PM (#41723815)
    I'm actually in a suburban city in the midwestern United States, and the neighborhood is very middle class. If I go downtown, there are over 250 access points and well over 1,500 clients visible using the same method of discovery. If I head to the 'rich' part of town, there are fewer access points, but a lot more clients connected to them on average. And the poor parts of town are usually apartment housing, which is much denser and so has a higher number of access points... but a lower number of associated clients per, on average. So overall, except for high density commercial environments, the number of devices per mile doesn't vary much based on income -- it's pretty locked in on the number of people per mile. But again, these are averages from having sampled dozens of neighborhoods. In any given neighborhood, you're going to find locations that are very dense and others almost barren, sometimes separated by only a few hundred feet. Statistics, you know...

    I would like to know more about how you measured the number of associated clients though.

I cheated, and it was probably illegal, though petty. I salvaged a grey fiberglass enclosure pod previously used for housing cable TV equipment, and then modified it so it could be quickly and discreetly attached to a telephone pole. Inside was a 'microATX' computer with a pair of SD cards to boot off of; it was underclocked and undervolted, with the fans stripped off and replaced with large passive fins. Water-resistant wire netting was added along the inside (the part facing the pole) and thermally bonded to oversized heatsinks. Two wifi antennas were mounted along the front at 45 degree angles (to catch both vertically and horizontally-aligned signals) and attached via SMC connectors to adapters. One was 'a', the other was 'a/b/g/n'. Last was a special diagnostic adapter with three antennas which cost an arm and a leg. It was a spectrum analyzer designed to identify sources of high EMR and localize the source. The remaining space was packed tight with high capacity deep cycle batteries. The unit could run for about 86 hours before it ran out of juice.

It recorded every MAC address and associated BSSID, etc., as well as encoding and some other properties. You can catch new clients because before they associate, they broadcast the SSID they want to connect to in the clear. Over a period of three days it collected all that information and then wrote it out to the SD cards. Drag it home, hook it up to the car charger and pull the cards. Rinse, wash, repeat.

    You're the first post I've come to on this thread that has mentioned the sad relationship between microwave ovens and channel 6. Other people mentioned the noise floor early in the discussion, but I didn't notice any of them actually referencing that to Shannon's Law, either. You should be modded informative a couple of times, but I won't be surprised if the 'sociopathic answer' part gets you downmodded instead.

    I wouldn't be surprised either. Slashdot ought to just replace their mod system with "like" and "dislike", since that's really what it boils down to. I frequently get downmodded for saying something that is technically correct or possible, without addressing the ethical considerations. I prefer to tell people the whole truth and let them make their own choices, rather than leaving out critical information because I don't trust them enough not to do something stupid. People need to know what's possible, not just what's legal or ethical. This idea was summed up beautifully for me years ago on a forum far, far, away, when someone wrote "You can't expect a terrorist to care that his car bomb is taking up two parking spaces." Criminals don't give a damn about you or the law; They are there to grab the low-hanging fruit, and unless you know what's possible, how they operate, you can't defend against attacks.

    You can't be a good white hat without having worn the black hat.

  • by whizbang77045 ( 1342005 ) on Sunday October 21, 2012 @04:56PM (#41723843)
I'm having problems on a 2+ year old router, yet I live in a remote area. There aren't any other routers anywhere around here, because there aren't any other people around here. You can scan for other routers, and you won't find any. Thus, in at least some cases, the noise floor isn't the culprit. I suspect there may be more than one problem. I don't doubt the noise floor is a problem in some areas, but I'd bet on component problems in the router in other cases, like mine.
  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Sunday October 21, 2012 @05:03PM (#41723877)
    Comment removed based on user account deletion
not kidding. newer gear uses junk parts that the vendor or builder decided to use instead of brand name trustable parts. or, they sought out real ones but got fakes. in electrolytics, it's a mostly fakes world ;(

You're 100% spot on with this. Back about 3 or 4 years ago I had two monitors, made by Samsung and LG, which had those knockoff capacitors in them; unfortunately I didn't know until one started failing to start up and the other started dimming out. It was an easy fix in both cases, and in both cases Samsung and LG were willing to reimburse me the cost of higher grade capacitors when I contacted them and sent them the knock-off defective caps. Both monitors were just out of the warranty period. I will admit, that was very good of them. They didn't have to.

    I haven't had a problem on my newer syncmaster e2220 though, so it looks like they're making sure that their supply is clean now.

  • by mysidia ( 191772 ) on Sunday October 21, 2012 @05:24PM (#41723975)

It's also possible that too much power was being transmitted, that it wasn't balanced properly, or that the antenna was inefficient, resulting in signal reflection and heat buildup in the transmitter on the old AP, causing eventual long-term damage/degradation to the transceiver.

  • by mysidia ( 191772 ) on Sunday October 21, 2012 @05:28PM (#41724001)

    How about a fix that doesn't involve the FCC knocking on your door in a few weeks?

  • by TheGratefulNet ( 143330 ) on Sunday October 21, 2012 @05:50PM (#41724101)

    Everything electronic degrades over time. The only electronics we have that are mostly proof against it are in orbit and cost exorbitant amounts of money.

    it didn't used to be that way.

a time-travel into the past, via ebay, can be insightful. over the last year or so, I've been collecting test gear from the '50s, '60s and '70s (from good brands: fluke, tek, HP back before agilent, etc). one brand that really impressed me was 'power designs'. search for it with the word 'precision' and you'll see samples of what was built tens of years ago (some half a century!) and yet they still hold their values to high tolerances, more so than quite a lot of 'high end' supplies made by high end brands today! I bought quite a few, tested them (after cleaning them up) and they showed amazing stability, down to microvolts. the switches were good, the meter was good, paint and metal were fine, wiring and circuit boards were fine. even 40 yr old caps were fine, to be honest. I replaced the caps as a matter of course, but I could probably get more time from the gear if I wanted to.

these cost a few hundred dollars back then (30-40 yrs ago). now, you can't even buy that quality (and the ebay-sourced PDI precision gear tends to go for $100-level prices, today). but back then, it was expensive but not outrageous. and it lasted more than many peoples' careers! the stuff STILL runs! and stays in spec.

    most people have no exposure to this. all they know is what they buy at frys or best buy or worse, ebay china stuff. they *assume* that things won't last more than a few years, usually less than 5. they're OK with junking it and starting with new gear! ;(

    older gear was meant to be repairable and last for decades. my test lab is mostly filled with vintage gear that was trivial to restore and would cost me 10x or more to get new equiv stuff, if even possible.

    look at 'general radio' and search for pics of what their manual switched voltage dividers look like. I have one that is 50 yrs old and can't be matched with even today's gear.

things USED to be built to last. they really did. and people who are in their 20's and 30's have no idea about this strange old idea, either. that's the pity of it all.

  • by realityimpaired ( 1668397 ) on Sunday October 21, 2012 @06:00PM (#41724161)

    Most smart meters also use Power line communication [wikipedia.org], rather than wireless or cellular networks to report in. At least, in this country, they do. That shouldn't interfere with wireless in any way, unless you have dirty power and a poor quality router that can be affected by it.

  • by Tastecicles ( 1153671 ) on Sunday October 21, 2012 @06:02PM (#41724167)

    wait, what? As frequency increases, the power needed to achieve the same range increases. Radio 101.

At 100 mW EIRP and 0 dB antenna gain (what you'd expect to find on a legally compliant consumer 2.4GHz AP), you're looking at a line of sight range of about 670 feet. Theoretically.
At 1 W EIRP and 0 dB antenna gain (what you'd expect to find on a legally compliant consumer 5.4GHz AP) you're looking at a LOSR of about 800 feet. Ten times the power at twice the frequency for not much in the way of range gain.

An unlicensed consumer radio transmitter in the UK cannot under any circumstances exceed an EIRP of 4 W. So for a 60GHz transmitter, with a 0 dB antenna, at 4 W, you're looking at an effective LOSR of...

less than thirty inches. Theoretically.

    There is another problem...

    Bear in mind that the 60GHz band (it's actually the midpoint of the FCC allocation between 57-64GHz) is very susceptible to oxygen absorption - to the tune of requiring a 22dB gain on the antenna just to overcome the problem, otherwise you're stuck with something that will not even cover the backplane on a rack. The FCC in the US is allowing 40dBm EIRP transmit power to account for this, though this extra power (we're into the several tens to several hundred Watts here: not a consumer-grade solution) will not improve the range over 2.4GHz at consumer-grade output, in fact it will not even come close. Current 60GHz designs are aimed at datacentre interconnects over distances probably not exceeding 40 to 60 feet.
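The frequency/power trade-off above can be roughed out with the free-space path loss formula. This is a sketch under idealized assumptions (isotropic antennas, an assumed -90 dBm receive floor, no walls and no oxygen absorption), so the absolute distances are optimistic; only the ratio matters:

```python
# Free-space range: received power falls with the square of both
# distance and frequency, so doubling the frequency costs ~6 dB.
import math

C = 3e8  # speed of light, m/s

def max_range_m(eirp_dbm, freq_hz, rx_floor_dbm=-90):
    """Distance at which free-space loss eats the whole link budget."""
    budget_db = eirp_dbm - rx_floor_dbm
    return 10 ** (budget_db / 20) * C / (4 * math.pi * freq_hz)

r_24 = max_range_m(20, 2.4e9)  # 100 mW EIRP at 2.4 GHz
r_48 = max_range_m(30, 4.8e9)  # 1 W EIRP at twice the frequency
print(round(r_24), round(r_48))
```

Ten times the power at double the frequency buys only about a 1.6x range gain in this model (10 dB of extra power against ~6 dB of extra path loss), in line with the comment's point.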

  • by Hadlock ( 143607 ) on Sunday October 21, 2012 @06:22PM (#41724255) Homepage Journal

Then wouldn't my Linksys WRT54 GS (the one you can swap out the firmware on) start working better when I put DD-WRT or OpenWRT on it? My WRT54 just up and died two months ago (this was already the warranty replacement); toward the end, the wifi would just die if two people started using it. Both WRT54s were using the same firmware version (it hasn't been updated in a while), and both of them dropped to about 20% of their original range... then just died completely, only allowing you to connect after a reboot.

  • by zenith1111 ( 1465261 ) on Sunday October 21, 2012 @06:24PM (#41724267) Homepage

Then of course there's the fact that 802.11n completely changed frequency bands, from the 2.4 GHz region (which is extremely cluttered) to the 5 GHz region, which is relatively empty.

    The better and more expensive Access Points can usually operate at both 2.4 and 5GHz, but most of the affordable 802.11n devices only operate at 2.4GHz (at least those I can find).

    I was looking for a wireless card to enable an old laptop to connect to my 5GHz AP and I found that pretty much all of the cheap wireless cards also only operate at 2.4GHz.

  • by Anonymous Coward on Sunday October 21, 2012 @06:31PM (#41724309)

    I had a HUGE argument with the manufacturer of an AirPrint capable printer because they had designed it to only work on ONE channel...the exact channel that all my neighbors were on and that I was avoiding like the plague. I have 12 devices on wireless in my house and have quite enough collision problems without adding the neighbors to it also.

  • WRT54G routers (Score:5, Interesting)

    by XB-70 ( 812342 ) on Sunday October 21, 2012 @06:51PM (#41724401)
Six years ago, I took some tupperware cereal containers, drilled two holes in the bottom of each, and pushed a WRT54G router's twin antennae through the holes. I caulked both holes, inverted the containers, took an old broom stick, jammed it up into the containers, and screwed each router up high outdoors on buildings spaced 400' apart. I configured all three routers with DD-WRT in bridged/AP mode. I have never taken them down. This is in Vermont. They work great and cover some 18 acres. What more can I say?
  • by Anonymous Coward on Sunday October 21, 2012 @08:05PM (#41724715)

    Actually, radio standards (such as 802.11) never specify the receiver algorithm or architecture, as it is not necessary for interoperability. The receiver is only described in general terms.

Standards specify what is to be transmitted over-the-air, and might give a reference (typically inefficient) implementation of a transmitter for testing purposes. They might also give some examples of expected receiver output, also for testing.

    Standards deliberately do not specify receivers, to give engineers an opportunity to innovate and provide differentiation in the market. Just one choice, for example, might be whether to do soft or hard decision decoding. (I happen to design receivers and transmitters for a living.)

As you've said, chances are any degradation is due to electrolytic capacitors drying out (and for that reason electrolytics are seeing less use), dirt and contamination, or maybe corrosion on connectors. Dropping output power might be one issue, but instability or inaccuracy developing in the timebase (crystal oscillator) is probably a bigger issue, and most actual failures end up being a power supply problem.

  • by pem ( 1013437 ) on Sunday October 21, 2012 @08:55PM (#41724953)
    No, silly, to provide stable power to the circuit trying to transmit or decode at 2.4GHz.

    Noise (because of bad caps) from the power supply could easily cause jitter, which can reduce viable range considerably.

  • by gfxguy ( 98788 ) on Sunday October 21, 2012 @09:30PM (#41725131)
This is going to sound stupid, but I'm going to say it anyway - I'm not an electrical engineer, but my first wireless router died because the wall-wart power supply failed. I don't imagine the electronics need all that much to work, but the transmitter might - is it possible to lose some power output over time? When I measured the output with my meter, it was like 3.2V, but it was supposed to put out 9V. I replaced it with a universal wall-wart and brought the router back to life, good as new.
  • Crystal Aging (Score:3, Interesting)

    by Anonymous Coward on Sunday October 21, 2012 @09:55PM (#41725261)

Having worked on some of the 802.11 algorithms, I can safely predict that all equipment with a crystal oscillator will eventually degrade.
802.11g was spec'd for a differential frequency deviation of 40 ppm. Nominally that's 20 ppm allocated to the AP and 20 to the remote.
In an effort to cut costs as much as possible (because most of you shop only on price, right?), the crystals used do not have good aging properties. It's not unusual to see crystals used that have as much as a 5-10 ppm/yr aging spec.

Unlucky customers will see their AP and remote device diverge, and within 3-4 years the frequencies are out of range of the algorithm's ability to compensate.
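The arithmetic behind that prediction can be sketched as follows (figures taken from the comment itself: a 40 ppm differential budget split 20/20, and a cheap 5 ppm/yr aging crystal):

```python
# At 2.4 GHz, every ppm of crystal error is 2.4 kHz of carrier offset.
CARRIER_HZ = 2.4e9
BUDGET_PPM_PER_SIDE = 20  # half of the 40 ppm differential budget

def offset_khz(ppm):
    """Carrier offset in kHz for a given ppm error at 2.4 GHz."""
    return CARRIER_HZ * ppm / 1e9

def years_until_out_of_budget(initial_ppm, aging_ppm_per_year):
    """Years before one side's total error exceeds its 20 ppm share."""
    return (BUDGET_PPM_PER_SIDE - initial_ppm) / aging_ppm_per_year

print(offset_khz(20))                   # kHz of offset at the budget edge
print(years_until_out_of_budget(5, 5))  # a 5 ppm part aging 5 ppm/yr
```

A part that ships at 5 ppm and ages 5 ppm/yr blows through its share in about 3 years, which matches the 3-4 year lifetime claimed above.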

  • by adolf ( 21054 ) <flodadolf@gmail.com> on Monday October 22, 2012 @01:12AM (#41726011) Journal

    Except for the competing theory of failing capacitors allowing for less-stable operation due to power supply wonkiness.

    I've seen my share of bad caps doing all manner of strange things in modern ("digital") electronics. Things can turn very weird a long time before they outright fail.

  • Built to fail. (Score:3, Interesting)

    by Billgatez ( 2652403 ) on Monday October 22, 2012 @04:03AM (#41726505)
Almost all cheap and even some expensive routers are built to fail. Cheap components and no cooling end up causing weak BGA joints and dried-up caps, causing them to fail. It's the same with new TVs: they last a few years, then they die because of bad caps or overheated voltage regulators. Look at old TVs; they go forever... well, until the tube burns out. I have an old console TV that still works.

"A car is just a big purse on wheels." -- Johanna Reynolds

Working...