
Ask Slashdot: Why Does Wireless Gear Degrade Over Time?

acer123 writes "Lately I have replaced several home wireless routers because their signal strength had degraded. These devices, when new (2+ years ago), would cover an entire house. Over the years, the strength seems to decrease to the point where it might only cover one or two rooms. Among the three that I have replaced for friends, I have not found a common brand, age, etc. It just seems that over time the signal strength decreases. I know that routers are cheap and easy to replace, but I'm curious what actually causes this. I would have assumed that the components would either work or not work; we would either have a full signal or no signal. I am not an electrical engineer and I can't find the answer online, so I'm reaching out to you. Can someone explain how a transmitter can slowly go bad?"


  • by pentabular ( 2609873 ) on Sunday October 21, 2012 @03:05PM (#41723071)

    Cheap electrolytic capacitors have a tendency to degrade and fail over time.

  • by Anonymous Coward on Sunday October 21, 2012 @03:09PM (#41723095)

    Probably capacitors degrading, transient spikes making it through the hardware, electrostatic discharge during assembly, just plain overheating as dust coats the internals, more people using the same frequencies (possibly including yourself as you add more wireless devices), and a bevy of other reasons I can't think of at the moment.

  • by Anonymous Coward on Sunday October 21, 2012 @03:12PM (#41723117)

    Check the power supply. Usually the electrolytic capacitors are already dry.

  • by spongman ( 182339 ) on Sunday October 21, 2012 @03:13PM (#41723125)

    Using the same channel does not increase signal interference; radios on the same channel can hear each other and take turns transmitting. Interference comes from APs using partially overlapping neighboring channels in close proximity, which just look like noise to each other. If you're looking for greater range, try switching to the same channel as your neighbor. Your bandwidth may be lower, but the interference will be reduced.

  • by Anonymous Coward on Sunday October 21, 2012 @03:16PM (#41723143)

    In my experience, power adaptor degradation is the main culprit. Over time the adaptor will provide lower voltages and a less stable current, which translates into lower signal output and higher noise respectively. I've seen a bad adaptor turn a repeater into a signal jammer - trust me, that was not an easy issue to troubleshoot...

  • by TechyImmigrant ( 175943 ) on Sunday October 21, 2012 @03:29PM (#41723241) Homepage Journal

    >cheap electrolytic capacitors have a tendency to degrade and fail over time.

    Not significantly over 2 years, and you don't use electrolytics in the IF/RF signal path in 2.4 & 5.8 GHz radios.
    I don't think electrolytics are it.

    I have my suspicions about the noise figure of LNAs changing over time. There are some very highly strung, teeny weeny transistors in LNAs (Low Noise Amplifiers) right in the signal path.

  • by Theaetetus ( 590071 ) <theaetetus,slashdot&gmail,com> on Sunday October 21, 2012 @03:33PM (#41723257) Homepage Journal

    >cheap electrolytic capacitors have a tendency to degrade and fail over time.

    >Not significantly over 2 years, and you don't use electrolytics in the IF/RF signal path in 2.4 & 5.8 GHz radios.

    True, but you do use them in your cheap switch-mode power supply, and as they degrade you get additional AC noise on the rails of the amplifiers that are in the IF/RF signal path. Particularly in cheap routers that operate near the limits of their amplifiers, voltage drops on the rails can cause clipping of the high-frequency signal, which results in dropped packets, retransmissions, etc.
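
    To picture that, here's a toy model (my own illustration; the rail and signal numbers are assumptions, not measurements from any real router): as ripple eats into the supply rail, the amplifier's available swing shrinks and the signal peaks get flattened.

        # Toy model: supply-rail ripple eating amplifier headroom (illustrative only).
        import math

        RAIL_NOMINAL = 3.3   # volts; assumed amplifier supply rail
        SIGNAL_PEAK = 3.2    # volts; amp driven close to the rail

        def clipped_peak(ripple_volts):
            """Peak output once the sagging rail limits the swing."""
            effective_rail = RAIL_NOMINAL - ripple_volts
            return min(SIGNAL_PEAK, effective_rail)

        for ripple in (0.0, 0.2, 0.5):
            peak = clipped_peak(ripple)
            loss_db = 20 * math.log10(peak / SIGNAL_PEAK)
            print(f"ripple {ripple:.1f} V -> peak {peak:.2f} V ({loss_db:.1f} dB)")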

  • by Ironhandx ( 1762146 ) on Sunday October 21, 2012 @03:43PM (#41723331)

    There is overlap to a significant degree: channel 2 won't help, but channel 4 or 5 might. There are generally about 3 channels' worth of overlap on those WiFi channels. If it's only a small amount of interference that's causing the signal to drop, channel 3 might even do it. However, if there's enough interference to cause problems, swapping from channel 1 to channel 2 won't help, because they share about 80% of the same band.

  • Old and tired. (Score:4, Informative)

    by SuperTechnoNerd ( 964528 ) on Sunday October 21, 2012 @03:49PM (#41723385)
    Perhaps it's frequency drift. As components age, their values change slightly, and when dealing with 2.4 GHz and above, tolerances are strict. It's just my guess... take it or leave it.
  • by ATMD ( 986401 ) on Sunday October 21, 2012 @04:02PM (#41723465) Journal

    If you read the OP carefully, all he says is that he replaced the routers - not that doing so actually fixed the problem.

    Deteriorating SNR does seem the most likely explanation.

  • by TubeSteak ( 669689 ) on Sunday October 21, 2012 @04:14PM (#41723541) Journal

    >I'm not an electrical/radio engineer, but I could have designed a better standard...blindfolded.

    I'd like to see you try, and I'm sure there are multinational corporations that would pay you megabucks to make it happen.
    The problem isn't the standard, it's the amount of unlicensed spectrum available at 2.4 GHz.

    The unlicensed 2.4 GHz band has 83.5 MHz of bandwidth.
    The unlicensed 5.8 GHz band has 125 MHz of bandwidth.
    The unlicensed 60 GHz band has 7 GHz of bandwidth, from 57 GHz to 64 GHz.

  • by TheGratefulNet ( 143330 ) on Sunday October 21, 2012 @04:17PM (#41723561)

    Chinese capacitors.

    not kidding. newer gear uses junk parts that the vendor or builder decided to use instead of brand-name trustable parts. or they sought out real ones but got fakes. in electrolytics, it's mostly a fakes world ;(

    they last a year or a few years, tops. you'll see the cans bulge and burst at the expansion vents at the top (the creased areas in the aluminum).

    if it's on the digital side, you lose all function.

    if it's on the analog side (rf, etc) then things can degrade before fully failing. I've seen this in audio gear, too, btw.

    the cure is to buy known-good caps from known vendors (digikey, mouser, newark, etc) and install them yourself. get a hakko desoldering tool, pull out ALL the electrolytics, and fit caps of the same lead spacing and value. try to go up a step in rated voltage, as the vendor often gets that part wrong in its safety margins.

    usually it's the power supply that goes bad, and usually it's the caps. if you replace your caps, you can convert a $50 consumer throw-away into $5000 enterprise-class gear that will actually outrun most commercial gear, simply by using GOOD low-ESR caps instead of the fakes almost everything ships with.

    panasonic, nichicon, and others make good low-ESR filter caps. they cost a dollar or so. not expensive. not hard to replace.

    swap them now or wait for a failure. either way, this is almost always the cause of dying networking and computer gear these days.

  • by Calos ( 2281322 ) on Sunday October 21, 2012 @04:19PM (#41723571)

    "Algorithms" aren't going to change because that requires a standard that must be followed by the transmitter and receiver. Unless s/he's upgrading from something like 802.11b to 802.11g, then there shouldn't be any such change. Possible exception would be a proprietary addition, but the problem remains.

    It would be interesting to know whether, when switching out the router, s/he changed the frequency it operates on. There are different channels that can be chosen even within the 802.11g spec; a newer router might have selected a less busy channel automatically.

    Then of course there's the fact that 802.11n added the 5 GHz band alongside the 2.4 GHz region (which is extremely cluttered); the 5 GHz region is relatively empty. That said, the higher frequency is more impeded by solid barriers, e.g. walls. It may compensate with higher transmit power, I don't know.

    Hard to say if transmit power is really changing without being able to rule out other factors. But electronics do degrade. First suspect I'd think would be cheap capacitors. Poorly designed transistors could degrade, but this seems unlikely as RF band usually uses BJTs. Dust buildup could increase temperatures, which could hurt the efficiency and gain of these devices, but that's a rather long shot.

  • by EmagGeek ( 574360 ) on Sunday October 21, 2012 @04:33PM (#41723685) Journal

    Electrolytic capacitors can very well be it, even if they are not used in the RF signal path.

    They are used in power supplies, and a drying-out electrolytic can cause a number of things:

    1) Lessened capacitance leads to reduced power-supply output capacity, which can cause the voltage to droop if the supply hits maximum duty cycle

    2) Drying-out caps have higher ESR, so the same ripple current dissipates more heat in them, which consumes more of the power supply's capacity and can also lead to drooping output

    3) Lower capacitance can lead to increased ripple voltage on the output, and that ripple can leak into the RF signal path, increasing the symbol error rate and even causing frequency drift in the VCO (rough numbers in the sketch below)
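
    As a rough back-of-the-envelope (my own numbers, using the textbook full-wave rectifier approximation, not measurements from any particular adapter), ripple scales inversely with capacitance, so a cap that has lost more than half its capacitance more than doubles the ripple:

        # Ripple on an unregulated full-wave rectified supply:
        # V_ripple ~= I_load / (2 * f_line * C)   (standard approximation)

        def ripple_voltage(i_load, f_line, capacitance):
            return i_load / (2 * f_line * capacitance)

        I_LOAD = 0.5       # amps drawn by the router (assumed)
        F_LINE = 60        # Hz mains frequency
        C_NEW = 2200e-6    # farads; filter cap as built (assumed)
        C_AGED = 880e-6    # farads; same cap after drying out (assumed)

        print(f"new cap:  {ripple_voltage(I_LOAD, F_LINE, C_NEW):.2f} V ripple")
        print(f"aged cap: {ripple_voltage(I_LOAD, F_LINE, C_AGED):.2f} V ripple")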

  • by sneakyimp ( 1161443 ) on Sunday October 21, 2012 @04:45PM (#41723769)
    I'd wager there are more algorithms involved than just the 802.11 protocol -- that protocol is the top layer in a stack of technology. Before you even get to the part where you are doing any kind of data handshaking, you might have a proprietary algorithm that filters your raw radio signal to weed out interference. There are also implementations of 802.11 on the market with non-standard features [wikipedia.org]. Furthermore, there is the inevitable forward march of ever-improving 802.11 standards. 802.11g is really old now; 802.11n is even old news. I've seen gobs of 802.11ac gear on sale at newegg.com, and I don't think the standard has even been formalized. The "algorithms" are absolutely, definitely changing.
  • by artor3 ( 1344997 ) on Sunday October 21, 2012 @04:58PM (#41723855)

    I don't think so. I'm an RF electronics engineer, and we do all sorts of accelerated stress testing to check out the second half of the bathtub curve, and I've never seen degradation on anything in the RF/IF path inside the chip. The typical "wear out" failure mode is cracks forming in the protective seals around the chip and letting in moisture (from the air), which causes leakage (high current draw) and eventually just shorts out something important and the chip dies. At any rate, that shouldn't be happening in just a couple years, unless the submitter moves his PC back and forth between the freezer and the sauna every week.

    One possible explanation would be crystal aging. RF equipment tends to rely on extremely accurate quartz crystals to provide reference frequencies. Those crystals tend to drift as they age. If the design was already near the edge of acceptable frequencies, an extra 10 or 20 ppm from aging could easily result in several dB of degradation.
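
    For scale, some quick arithmetic (my own, with an assumed channel-6 carrier, not from any datasheet): a 20 ppm shift at 2.4 GHz moves the carrier by nearly 50 kHz.

        # Carrier offset caused by crystal aging.
        CARRIER_HZ = 2.437e9   # channel 6 center frequency (assumed example)
        for ppm in (10, 20):
            offset_hz = CARRIER_HZ * ppm * 1e-6
            print(f"{ppm} ppm of aging -> {offset_hz/1e3:.0f} kHz off channel center")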

    Another poster pointed out the possibility of the router using a crappy switched-mode power supply, which is also a good explanation. I would hope that they would power the RF chip through a linear regulator with good noise rejection, but who knows? That sort of switching noise can absolutely interfere with the radio's performance.

  • by Anonymous Coward on Sunday October 21, 2012 @05:16PM (#41723933)

    It takes 10 minutes in all to set up my soldering iron and get this done. I don't know about you, but my time is not worth that much.

  • by sortius_nod ( 1080919 ) on Sunday October 21, 2012 @05:17PM (#41723941) Homepage

    There was a well-known problem of Linksys caps becoming detached from the board due to heat. They'd be fine when cool, but leave them on for a few hours & the signal would degrade almost completely. The only solution was to either (as you did) replace them or, if you were lazy, just resolder them to the board.

    There are many factors at play with wireless networks. The only way to be sure it's a failing cap/chip/board rather than interference from other devices is to manually test every component & run a frequency-mapping scanner in every room of the house.

  • by Anonymous Coward on Sunday October 21, 2012 @05:38PM (#41724057)

    They'll swap brand parts for something their brother, cousin, or whoever makes for a fraction of the cost, bugger the value tolerances or the longevity of the substituted components. Then, in that tiny unobtrusive case, the crappy parts cook, degrade, and eventually blow up if you haven't replaced the whole device in the meantime.

    Remember the problems with the first batch of Raspberry Pis? The manufacturing samples came back all to spec and with the correct components, and the contractor was given the go-ahead to produce the first batch. The first thing they did was switch the ethernet socket with magnetics for a cheaper one without magnetics. It passed the static check on the production line fine, arrived back in the UK, and failed when actually required to work in a live situation. Luckily, the Foundation had access to some non-destructive investigation equipment, and x-ray examination of the ethernet socket revealed the lack of those pesky little ferrite cores.

    If that happens to a small outfit with its finger on the pulse, then I suppose the scope for sharp practice is even greater with large companies and their long production runs and more remote quality testing.

  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Sunday October 21, 2012 @06:06PM (#41724189) Homepage Journal

    "Which is exactly why any geek worth his or her salt would blow away that factory firmware and install 3rd party firmware that allows you to reach higher up into the channels that normally aren't used (at least in the US)."

    Going to those frequencies in the router won't do jack shit for you if the transceiver in your wireless device can't use those frequencies.

    So, if you run Windows or OSX, you're pretty much fucked.

  • by realityimpaired ( 1668397 ) on Sunday October 21, 2012 @06:19PM (#41724239)

    Each channel is 20 MHz wide, and you need to go 5 channels up to get one with no overlap at all. Do the math: that works out to channels at 4 MHz intervals (they're actually 5 MHz apart, for a little extra white space). Channels 1, 6, and 11 do not overlap with each other; neither do channels 2 and 7, channels 3 and 8, etc.

    If there's enough noise at 2412 MHz (channel 1) that it's essentially unusable, then switching to 2417 MHz (channel 2) will probably not make any difference at all... the frequency listed is the center frequency, but channel 1 actually spans 2402-2422 MHz and channel 2 spans 2407-2427 MHz. Going to a higher channel that's farther away, however, probably would make a decent difference.

    That's also why lots of newer routers have a 40 MHz channel-bonding option... they transmit/receive on two adjacent channels at the same time, in the hope of getting more bandwidth through the noise.
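
    The overlap is easy to check numerically. A quick sketch of the 2.4 GHz channel plan (20 MHz channel width, 5 MHz channel spacing; simplified, ignoring spectral masks):

        # 2.4 GHz plan: centers 5 MHz apart, each channel ~20 MHz wide.
        def span(channel, width=20.0):
            center = 2412 + (channel - 1) * 5   # MHz; channel 1 centered at 2412
            return center - width / 2, center + width / 2

        def overlap_mhz(a, b):
            lo_a, hi_a = span(a)
            lo_b, hi_b = span(b)
            return max(0.0, min(hi_a, hi_b) - max(lo_a, lo_b))

        for other in (2, 3, 6):
            print(f"channel 1 vs {other}: {overlap_mhz(1, other):.0f} MHz shared")
        # channel 1 vs 2: 15 MHz shared
        # channel 1 vs 3: 10 MHz shared
        # channel 1 vs 6: 0 MHz shared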

  • by MSG ( 12810 ) on Sunday October 21, 2012 @06:27PM (#41724283)

    PLEASE STOP OFFERING THIS ADVICE.

    Increasing your WAP's broadcast power does nothing to improve the signal in the other direction, so while it will make your mobile devices show more bars, it won't actually improve network performance. TCP doesn't work unless a host can both send and receive (packets need to be ACKed), so even if the client can hear the WAP from further away, it'll stop getting new packets if it can't notify the sender that those packets were received.

    All that really happens when you increase broadcast power is an increase in interference with neighboring WAPs, which tends to lead other people to the conclusion that they also need to increase broadcast power in order to overcome the interference that you created.
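
    A toy link budget makes the asymmetry obvious (all numbers here are assumptions for illustration, not from any real AP or client):

        # Boosting only the AP's TX power helps only the downlink.
        PATH_LOSS_DB = 75        # assumed loss between AP and client
        SENSITIVITY_DBM = -70    # assumed receive sensitivity at both ends
        CLIENT_TX_DBM = 15       # client TX power; fixed, you can't boost it

        for ap_tx_dbm in (20, 27):   # stock vs. boosted AP
            downlink = ap_tx_dbm - PATH_LOSS_DB    # level heard by the client
            uplink = CLIENT_TX_DBM - PATH_LOSS_DB  # level heard by the AP
            print(f"AP at {ap_tx_dbm} dBm: "
                  f"downlink margin {downlink - SENSITIVITY_DBM} dB, "
                  f"uplink margin {uplink - SENSITIVITY_DBM} dB (unchanged)")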

  • by adri ( 173121 ) on Sunday October 21, 2012 @06:56PM (#41724429) Homepage Journal

    Hi. FreeBSD open source wireless developer here. I also work for a wireless company but this is all my own writing and is not endorsed or linked to my employer.

    Don't do that. Let me repeat - don't increase TX power from what the card and regulatory limits say you can transmit.

    Besides the regulatory limitations, the card may actually degrade if you increase the TX power. You may end up pulling more power than the card is designed or rated for. You may end up causing the output amplifiers to distort, which means you're not only breaking regulatory limits by spewing noise into adjacent channels, you're actually making your transmissions _worse_. It gets worse at higher transmission rates (especially 802.11n, where the higher TX rates have much higher power density than the lower ones) - the Atheros driver implements per-rate TX power limits for this specific reason.

    Chances are the manufacturer just has poor cooling, cheap part selection and all of that finely tuned RF front end is slowly degrading as a result. Buy an AP with better cooling or add better cooling yourself.

    In fact, if you run the hardware at a _lower_ power output, you may find it lasts longer.

  • by postbigbang ( 761081 ) on Sunday October 21, 2012 @10:17PM (#41725341)

    It's not so simple. Let's review the history:

    802.11a came first (there is another, obscure predecessor) and used 5 GHz. Its advanced modulation techniques blew the doors off 802.11b at 2.4 GHz, which at 11 Mbit/s (on a good day) had a very low yield.

    802.11g also uses 2.4 GHz, but early products had trouble going back and forth between b and g, thus slowing throughput down, despite faster yields in g via advanced modulation techniques.

    Enter n, an advanced modulation scheme with higher throughput, at first largely found on 2.4 GHz. Early on, n was really fast and had only a bit of trouble SLOWING DOWN for b and g. You have one collision domain unless you break out the b/g radio from the n radio, so you have to dawdle while b goes through; then the channel is free again for something faster.

    Then came the dual-band radios, and the dual-band, dual-radio routers that can walk and chew gum -- handling a 2.4 GHz and a 5 GHz conversation in parallel -- major throughput.

    The transceivers in the older routers appear to slow down, but in fact they stay the same compared to newer ones, for several reasons: 1) better firmware design that can switch quickly between protocols (where present), 2) dual radios for b/g/n and (maybe a) high-band n, and 3) the more recent the device, the more likely it has faster processing power inside the router. And finally, your backhaul might be getting faster without you knowing it; DSL gets faster, but so do cable broadband connections. Plus it's likely the driver in your machine is faster; drivers change all the time with small improvements, sometimes in real throughput.

    Summary: the router didn't change, but newer stuff is faster given the same conditions, for the reasons stated.

  • by Pentium100 ( 1240090 ) on Sunday October 21, 2012 @10:58PM (#41725507)

    Devices that switch at high frequencies (like a switching power supply) need low-ESR caps. Normal ones will get very hot and fail soon, and you will also have more ripple.

    Devices that use low frequencies (linear power supplies) or low currents (coupling caps in audio amps) can use normal-ESR caps. Using low-ESR ones will not hurt the device; it's just that low-ESR caps are more expensive and may not be available at the required voltage/capacitance.
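
    The heating argument is just I-squared-R in the cap's ESR; a sketch with made-up but plausible values:

        # Power dissipated inside a capacitor: P = I_ripple^2 * ESR.
        RIPPLE_AMPS = 1.0   # assumed ripple current in a small SMPS
        for name, esr_ohms in (("low-ESR cap", 0.03), ("general-purpose cap", 0.5)):
            watts = RIPPLE_AMPS ** 2 * esr_ohms
            print(f"{name}: {watts * 1000:.0f} mW dissipated")
        # ~30 mW vs. ~500 mW: half a watt inside a small sealed can runs hot.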

  • by dotgain ( 630123 ) on Monday October 22, 2012 @12:51AM (#41725939) Homepage Journal
    It's an exploit of a loophole in /. moderation. Funny mods provide no karma. The basic idea is to mod the post up Funny, then back down Overrated. This can repeat without end given enough moderators, and the poster ends up at full negative karma for just one post.
  • by RPI Geek ( 640282 ) on Monday October 22, 2012 @01:24AM (#41726029) Journal
    No, we know [wigle.net].
  • by postbigbang ( 761081 ) on Monday October 22, 2012 @11:32AM (#41729033)

    The range problem is an illusion. The output to the antenna(s) is the same, and the omnidirectional radiation pattern doesn't change. What changes is that newer hardware is faster.

    Should you take a field-strength meter and walk around the area, noting the polar strength, you'll see that it doesn't change in two, three, four, etc. years.

    Can you reduce the coverage area? Certainly! Block the antenna. Add paint, drywall, furniture, or anything else but air (even relative humidity has a bearing on the distance). If you use 802.11a or high-band n, the penetration radius is much smaller, as the transmission characteristics of 5 GHz aren't nearly as nice as those of 2.4 GHz. There are power limitations and antenna limitations (technically self-imposed by the vendors) in 5 GHz transmission devices that limit effective radiated power. If you get a 5 GHz signal at range, it's sometimes by reflection rather than by a direct path, as 5 GHz has a short-range bouncing effect.

    The same AP on the same freq/channel using the same modulation technique (there are more than 25 different types) will yield the same effective radius given the same antenna and receiver. Other things change, as noted, that give the *appearance* that things slow down. The APs don't "age", and the device inside your notebook or USB WiFi adapter doesn't change. At best, on a bad day, you might get oxidation on the TNC or other coaxial connector (if present) for the antenna on a 2.4 GHz device. Otherwise, it's not going to change electrically, and so the slowdowns are perceived, but the source of the perception isn't a geriatric or entropic device combination.
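
    The 5 GHz penalty is easy to put a number on, at least in free space (my own calculation using the standard free-space path loss formula; walls make it worse):

        # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
        import math

        def fspl_db(distance_m, freq_mhz):
            return (20 * math.log10(distance_m / 1000)
                    + 20 * math.log10(freq_mhz) + 32.44)

        d = 10  # meters; a couple of rooms away
        print(f"2.4 GHz: {fspl_db(d, 2412):.1f} dB path loss")
        print(f"5.8 GHz: {fspl_db(d, 5800):.1f} dB path loss")
        # about 7.6 dB extra loss at 5.8 GHz before counting any walls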

  • by tlhIngan ( 30335 ) <slashdot.worf@net> on Monday October 22, 2012 @11:57AM (#41729377)

    >As you've said, chances are any degradation is due to electrolytic capacitors drying out

    >Really? Electrolytic capacitors at 2.4 GHz?

    No, but in the power supply. Routers are built to a price, even the more expensive $200 ones.

    If you've built a PC, you know skimping on a power supply is a sure way to end up with random reboots, BSODs, crashes, data corruption and other oddball behavior that makes absolutely no sense at all.

    And guess where Linksys/Netgear/DLink/etc save pennies per router? Exactly.

    First, the power adapter is never anything special - it literally costs around $1.50 to $2.50 (more expensive routers, maybe $5). That doesn't buy much in the way of safety or power cleanliness, especially after months of 24/7 operation.

    A crapped-out adapter is probably the root cause of most router problems - it provides the necessary power, but it's so full of ripple and noise that the router craps out in random, unexpected ways.

    Next come the power supplies IN the router itself - again, they save a penny by not including a reverse-polarity protection diode, and you can bet the caps are like the ones in the adapter - bottom-of-the-barrel brands, probably reused from the capacitor-plague replacements (because they're cheap).

    On the bright side, the big silicon parts tend to be good (as are the non-electrolytic caps), but the power feeding them is uneven and can exceed limits during high-power use. So when that wireless power amplifier chip starts drawing power to transmit, the power rails dip (and with bad caps, they can dip a lot) due to inductance, and the power amp struggles to amplify the signal, being starved of power. So RF output drops, and more importantly, RF output quality drops as well (a crappy strong signal is just as useless as a weak one).

    Couple that with bottom-of-the-barrel U.FL connectors and pigtails that oxidize and work loose from metal fatigue, and the antenna path itself degrades too.
