Ask Slashdot: Why Does Wireless Gear Degrade Over Time? 615
acer123 writes "Lately I have replaced several home wireless routers because their signal strength had degraded. These devices, when new (2+ years ago), would cover an entire house. Over the years, the strength seems to decrease to the point where it might only cover one or two rooms. Among the three that I have replaced for friends, I have not found a common brand, age, etc. It just seems that over time the signal strength decreases. I know that routers are cheap and easy to replace, but I'm curious what actually causes this. I would have assumed that the components would either work or not work; we would either have a full signal or no signal. I am not an electrical engineer and I can't find the answer online, so I'm reaching out to you. Can someone explain how a transmitter can slowly go bad?"
The Hamsters get tired (Score:3, Funny)
and worn out. Also I think they have pretty short life spans.
Re: (Score:3, Funny)
Uh, yeah, but how do you think reproduction works? The stork (Fedex) will have to drop off your new baby hamster WiFi device before it can be trained.
Re:The Hamsters get tired (Score:5, Funny)
The transmitter range isn't decreasing.
It's actually due to the expansion of the universe. It's because your house is getting bigger. You just don't notice it because you are expanding at the same rate. Try going on a diet.
Signal isn't changing, the noise floor is (Score:5, Insightful)
As all of your neighbors add wireless routers, the noise floor goes up, and the usable signal goes down, even though the signal strength is the same.
Re:Signal isn't changing, the noise floor is (Score:5, Informative)
"Algorithms" aren't going to change, because that requires a standard followed by both the transmitter and the receiver. Unless s/he's upgrading from something like 802.11b to 802.11g, there shouldn't be any such change. A possible exception would be a proprietary addition, but the problem remains.
It would be interesting to know whether, when switching out the router, s/he changed the frequency it operates on. There are different bands that can be chosen even within the 802.11g spec; a newer router might have selected a less busy band automatically.
Then of course there's the fact that 802.11n completely changed frequency bands, from the 2.4 GHz region (which is extremely cluttered) to the 5 GHz region, which is relatively empty. That said, the higher frequency is more impeded by solid barriers, e.g. walls. But it may compensate with higher transmit power, I don't know.
Hard to say if transmit power is really changing without being able to rule out other factors. But electronics do degrade. First suspect I'd think would be cheap capacitors. Poorly designed transistors could degrade, but this seems unlikely as RF band usually uses BJTs. Dust buildup could increase temperatures, which could hurt the efficiency and gain of these devices, but that's a rather long shot.
Re:Signal isn't changing, the noise floor is (Score:5, Interesting)
Then of course there's the fact that 802.11n completely changed frequency bands, from the 2.4 GHz region (which is extremely cluttered) to the 5 GHz region, which is relatively empty.
The better and more expensive Access Points can usually operate at both 2.4 and 5GHz, but most of the affordable 802.11n devices only operate at 2.4GHz (at least those I can find).
I was looking for a wireless card to enable an old laptop to connect to my 5GHz AP and I found that pretty much all of the cheap wireless cards also only operate at 2.4GHz.
Re:Signal isn't changing, the noise floor is (Score:5, Informative)
It's not so simple. Let's review the history:
802.11a was first (there is another, more obscure predecessor) and used 5 GHz. Its advanced modulation techniques blew the doors off 802.11b at 2.4 GHz, which at 11 Mbit/s (on a good day) had a very low yield.
802.11g also uses 2.4 GHz, but early products had trouble going back and forth between b and g, slowing throughput down despite g's faster rates via advanced modulation techniques.
Enter N, an advanced modulation scheme with higher throughput, first largely found on 2.4 GHz. At first N was really fast, and had only a bit of trouble SLOWING DOWN for b and g. You have one collision domain unless you break the b/g radio out from the n radio, so you have to dawdle while b goes through; only then is the channel free again for something faster.
Then came the dual-band radios, and the dual-band, dual-radio routers that can walk and chew gum, handling a 2.4 GHz and a 5 GHz conversation in parallel: major throughput.
The transceivers in the older routers appear to slow down, but in fact they stay the same; newer ones just look faster, for three reasons: 1) better firmware design that can switch back and forth quickly between protocols (where present), 2) dual radios for b/g/n and (maybe a) high-band N, and 3) the more recent the device, the more processing power inside the router. A final reason: your backhaul might be getting faster without you knowing it; DSL gets faster, but so do cable broadband connections. And the driver in your machine is likely faster too; drivers change all the time with small improvements, sometimes in real throughput.
Summary: the router didn't change, but newer stuff is faster given the same conditions for the reasons stated.
Re:Signal isn't changing, the noise floor is (Score:4, Informative)
The range problem is an illusion. The output to the antenna(s) is the same, and the radial or omnidirectional radiation pattern doesn't change. What changes is that newer hardware is faster.
Should you take a field-strength meter and walk around the area, noting the polar strength, you'll see that it doesn't change in two, three, four, etc. years.
Can you reduce the coverage area? Certainly! Block the antenna. Add paint, drywall, furniture, or anything else but air (even relative humidity has a bearing on the distance). If you use 802.11a or high-band n, the penetration radius is much smaller, as the transmission characteristics of 5 GHz aren't nearly as nice as those of 2.4 GHz. There are power limitations and antenna limitations (technically self-imposed by the vendors) in 5 GHz transmission devices that limit effective radiated power. If you get a signal, it's sometimes by reflection rather than direct means, as 5 GHz has a short-range bouncing effect.
The same AP on the same freq/channel using the same transmission modulation technique (there are more than 25 different types) will yield the same effective radius given the same antenna and receiver. Other things change, as noted, that give the *appearance* that things slow down. The APs don't "age", and the device inside your notebook or USB WiFi adapter doesn't change. At worst, on a bad day, you might get oxidation on the TNC or other coaxial connector (if present) for the antenna on a 2.4 GHz device. Otherwise, it's not going to change electrically, so the slowdowns are perceived, but the source of the perception isn't a geriatric or entropic device combination.
Re:Signal isn't changing, the noise floor is (Score:5, Interesting)
Actually, radio standards (such as 802.11) never specify the receiver algorithm or architecture, as it is not necessary for interoperability. The receiver is only described in general terms.
Standards specify what is to be transmitted over the air, and might give a reference (typically inefficient) implementation of a transmitter for testing purposes. They might also give some examples of expected receiver output, also for testing.
Standards deliberately do not specify receivers, to give engineers an opportunity to innovate and provide differentiation in the market. Just one choice, for example, might be whether to do soft or hard decision decoding. (I happen to design receivers and transmitters for a living.)
As you've said, chances are any degradation is due to electrolytic capacitors drying out (one reason electrolytics are seeing less use), dirt and contamination, or maybe corrosion on connectors. Dropping output power might be one issue, but instability or inaccuracy developing in the timebase (crystal oscillator) is probably a bigger one, and most actual failures end up being a power supply problem.
Re:Signal isn't changing, the noise floor is (Score:5, Interesting)
Noise (because of bad caps) from the power supply could easily cause jitter, which can reduce viable range considerably.
Re:Signal isn't changing, the noise floor is (Score:5, Insightful)
There are 14 channels (frequencies) in 802.11b/g/n (2.4 GHz), but only 3 that do not overlap (1, 6, 11). The best thing to do if you plan to use the 2.4 GHz range is to run something like inSSIDer, see which of those 3 channels is least congested, and then set your router to use that channel. The problem with 2.4 is the lack of non-overlapping channels, plus the fact that most routers default to picking the "least congested channel" without conforming to the 1/6/11 convention. As a result, neighbors sitting on a channel in between congest two channels at once, overlapping 1 and 6, or 6 and 11. This is a nightmare for high-density areas (I do wireless for large conferences; it's a huge challenge).
In 802.11a/n (5 GHz), there are 23 channels you can use (depending on whether you bond for N or not). This is like comparing a 3-lane highway to a 23-lane highway: your density capacity is FAR higher than 2.4. The downside is that most mobile devices do not have 5 GHz radios, and 5 GHz, because of the nature of the higher frequency, does not penetrate (giggidy) as far as 2.4 (as you said). From a management point of view this is a benefit, but when you are trying to cover a large house it leads to weak signals at the edges if you center the AP. It is a good case for using multiple access points, though (routers without the routing, in lay terms).
So the near future for wireless is 5 GHz (or "Wireless A and N"). 5 GHz is catching on (the iPhone 5 and a few Androids and tablets have it now) and will start to get much busier, but the great thing is that it's designed for it. The downside in the less-near future is that 802.11ac will add larger bonding in the 5 GHz range, which will lead to fewer available channels and less capacity. We'll see how that goes...
One correction to your comment: N is not 5 GHz-specific, and it has not at all moved people from 2.4 to 5. It has only allowed more throughput than G or A did on those frequencies. It's caused more problems than not, because it can use the 40 MHz window, taking up 1 of the 3 non-overlapping channels and pissing off your neighbors.
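The scan-then-pick workflow described above can be sketched in a few lines of Python (a hypothetical helper, assuming 20 MHz-wide channels with centers 5 MHz apart; something like inSSIDer would supply the neighbor channel list):

```python
def congestion(candidate, neighbor_channels, width_mhz=20):
    """Count neighboring APs whose signal overlaps `candidate`.

    In the 2.4 GHz band, channel centers sit 5 MHz apart, so two
    channels overlap when their centers are closer than one channel width.
    """
    return sum(1 for n in neighbor_channels
               if abs(n - candidate) * 5 < width_mhz)

def least_congested(neighbor_channels, candidates=(1, 6, 11)):
    # Stick to the non-overlapping trio and pick the quietest one.
    return min(candidates, key=lambda c: congestion(c, neighbor_channels))

# Neighbors parked on 1, 1, 3, and 6: channel 11 is the clear winner.
print(least_congested([1, 1, 3, 6]))  # → 11
```

Note how the neighbor on channel 3 counts against both 1 and 6, which is exactly the "congesting two channels at once" problem described above.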
Re:Signal isn't changing, the noise floor is (Score:4, Interesting)
Then wouldn't my Linksys WRT54GS (the one you can swap the firmware on) start working better when I put DD-WRT or OpenWRT on it? My WRT54 just up and died two months ago (this was already the warranty replacement); toward the end, the wifi would just die whenever two people started using it. Both WRT54s were running the same firmware version (it hasn't been updated in a while), and both of them dropped to about 20% of their original range... then just died completely, only allowing you to connect after a reboot.
Re:Signal isn't changing, the noise floor is (Score:4, Informative)
If you read the OP carefully, all he says is that he replaced the routers - not that doing so actually fixed the problem.
Deteriorating SNR does seem the most likely explanation.
Re:Signal isn't changing, the noise floor is (Score:5, Insightful)
That wouldn't explain why replacing the router fixes the problem, unless he just happens to be replacing the old router with one that has a stronger transmitter or a better antenna. The pessimist in me says that can't be happening 100% of the time.
From my experience with all the major manufacturers of consumer routers and switches, the problem is capacitor quality primarily.
Transmission range and stability will suffer over time because of unstable or insufficient voltage.
Also these devices get really hot these days. Most if not all are passively cooled, and don't even have much real ventilation in their casings.
They're designed to be cheap, and to last for at least 12 months generally.
I've replaced all the capacitors on several Linksys "Business class" Gigabit switches; they all started failing after about 14 months.
I did this to my own switch about 4 years ago, and it's still going strong today. I've also done this on an old Linksys WRT54G.
Re:Signal isn't changing, the noise floor is (Score:5, Informative)
There was a well known problem of Linksys caps becoming detached from the board due to heat. They'd be fine when cool, but leave them on for a few hours & signal would degrade almost completely. The only solution was to either (as you did) replace them, or, if you were lazy, resolder them to the board.
There are many factors at play with wireless networks. The only way to be sure it's a failing cap/chip/board, rather than interference from other devices, is to manually test every component and run a frequency-mapping scanner in every room of the house.
Re:Signal isn't changing, the noise floor is (Score:4, Funny)
Damn neighbors - I see 35 wireless networks from my mac: Jes's Awesome Network, Doris Family, Alex, Bellclaire Hotel, I Win, Epsteinland, buckduke, toujoursavectoi, sheilajaffe1, ming. And a bunch of hexcodes. Bit sad to not see HotWorkoutPants on the list today.
Re:Signal isn't changing, the noise floor is (Score:4, Funny)
Oh, Cliffs of Insanity just showed up - nice name.
Re:Signal isn't changing, the noise floor is (Score:4, Funny)
The FBI is not conducting surveillance today, which is why you don't see HotWorkoutPants. See you next week.
That's not it. It always shows up here as "FBI surveillance Van #27" (although I do sometimes see #33).
Re:Signal isn't changing, the noise floor is (Score:5, Funny)
Bit sad to not see HotWorkoutPants on the list today.
It's a guy, don't be deceived.
Not to mention other squatters on that band (Score:5, Interesting)
Like, oh, say, "smart meters".
When they started installing "smart meters" in my area (because "too many" people were installing solar, and PG&E panicked that they'd have to pay the same rate they were charging, instead of capping the payout at net zero via differential rates so they paid less when solar was active), my bandwidth went to hell, and I had to locate my AP more centrally to avoid the interference.
Re:Not to mention other squatters on that band (Score:4, Interesting)
Most smart meters also use Power line communication [wikipedia.org], rather than wireless or cellular networks to report in. At least, in this country, they do. That shouldn't interfere with wireless in any way, unless you have dirty power and a poor quality router that can be affected by it.
Re:Signal isn't changing, the noise floor is (Score:4, Interesting)
It's also possible that too much power was being transmitted, the feed wasn't balanced properly, or the antenna was inefficient, resulting in reflected signal and heat buildup in the transmitter of the old AP, causing eventual long-term damage/degradation to the transceiver.
Re:Signal isn't changing, the noise floor is (Score:5, Informative)
PLEASE STOP OFFERING THIS ADVICE.
Increasing your WAP broadcast power does nothing to improve signal in the other direction, so while it will make your mobile devices show more bars, it won't actually improve network performance. TCP doesn't work unless a host can both send and receive (packets need to be ACKed), so even if the client can hear the WAP from further away, it'll stop getting new packets if it can't notify the sender that those packets were received.
All that really happens when you increase broadcast power is an increase in interference with neighboring WAPs, which tends to lead other people to the conclusion that they also need to increase broadcast power in order to overcome the interference that you created.
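The asymmetry is easy to see with a back-of-the-envelope link budget (a sketch only, assuming free-space path loss and made-up but typical numbers for AP and client transmit power):

```python
import math

def rssi_dbm(tx_power_dbm, distance_m, freq_hz=2.4e9):
    # Received strength = transmit power minus free-space path loss (Friis):
    # FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55
    fspl = 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55
    return tx_power_dbm - fspl

CLIENT_TX = 15   # dBm, a typical laptop card (assumed)
DIST = 40.0      # metres

for ap_tx in (20, 30):  # stock AP vs. boosted AP
    downlink = rssi_dbm(ap_tx, DIST)      # what the laptop hears
    uplink = rssi_dbm(CLIENT_TX, DIST)    # what the AP hears
    print(f"AP at {ap_tx} dBm: downlink {downlink:.1f} dBm, "
          f"uplink {uplink:.1f} dBm")
```

Boosting the AP by 10 dB moves only the downlink number; the uplink never budges, and the connection stalls on the weaker direction.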
Re:Signal isn't changing, the noise floor is (Score:5, Insightful)
The FCC doesn't seem to care when people shit all over the bands used for WiFi in some random suburb. I'm going to guess they're more interested in fining radio DJs and filing paperwork for the million or so new consumer electronics devices each year.
Re:Signal isn't changing, the noise floor is (Score:4, Informative)
"Which is exactly why any geek worth his or her salt would blow away that factory firmware and install 3rd party firmware that allows you to reach higher up into the channels that normally aren't used (at least in the US)."
Going to those frequencies in the router won't do jack shit for you if the transceiver in your wireless device can't use those frequencies.
So, if you run Windows or OSX, you're pretty much fucked.
Re:Signal isn't changing, the noise floor is (Score:5, Informative)
Hi. FreeBSD open source wireless developer here. I also work for a wireless company but this is all my own writing and is not endorsed or linked to my employer.
Don't do that. Let me repeat - don't increase TX power from what the card and regulatory limits say you can transmit.
Besides the regulatory limitations, the card may actually degrade if you increase the TX power. You may end up pulling more power than the card is designed or rated at. You may end up causing the output amplifiers to distort, which means you're not only breaking regulatory by spewing noise into adjacent channels, you're actually making your transmissions _worse_. It gets worse with higher transmission rates (especially 802.11n where the higher TX rates have much higher power density than the lower ones) - the Atheros driver implements per-rate TX power limits for this specific reason.
Chances are the manufacturer just has poor cooling, cheap part selection and all of that finely tuned RF front end is slowly degrading as a result. Buy an AP with better cooling or add better cooling yourself.
In fact, if you run the hardware at a _lower_ power output, you may find it lasts longer.
built in failure (Score:4, Insightful)
built in failure. bow to your corporate masters and go consume.
Did the signal degrade, or the noise increase? (Score:5, Insightful)
Over 3 years I'd imagine a greater density of wifi devices all sharing the same spectrum to have appeared. Perhaps the signal level is the same, but the noise floor has increased substantially, degrading performance.
Re:Did the signal degrade, or the noise increase? (Score:5, Informative)
There is overlap to a significant degree; channel 2 won't help, but channel 4 or 5 might. There are generally about 3 channels' worth of overlap between those wifi channels. If it's only a small amount of interference that's causing the signal to drop, channel 3 might even do it. However, if there's enough interference to cause problems, swapping from channel 1 to channel 2 won't help, because they share about 80% of the same band.
Re:Did the signal degrade, or the noise increase? (Score:5, Informative)
Each channel is 20MHz wide, but the channel centers are only 5MHz apart, so you need to go 5 channels up to get one with no overlap at all (the 1/6/11 spacing even leaves 5MHz of extra white space). Channels 1, 6, and 11 do not overlap with each other; neither do channels 2, 7, and 12, or channels 3, 8, and 13, and so on.
If there's enough noise at 2412MHz (channel 1) that it's essentially unusable, then switching to 2417MHz (channel 2) will probably not make a difference at all... the frequency listed is the center frequency, but channel 1 actually spans 2402-2422MHz and channel 2 spans 2407-2427MHz. Going to a higher channel that's farther away, however, probably would make a decent difference.
That's also why lots of newer routers have a channel bonding 40MHz option... they transmit/receive on two channels at the same time, in the hopes of being able to get better bandwidth through the noise.
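That channel arithmetic is easy to check (a small sketch; channel numbers and center frequencies per the 2.4 GHz band plan):

```python
def center_mhz(channel):
    # 2.4 GHz band: channel 1 is centered at 2412 MHz, centers 5 MHz apart
    return 2412 + 5 * (channel - 1)

def overlaps(ch_a, ch_b, width_mhz=20):
    # Two 20 MHz-wide channels overlap when their centers are
    # closer together than one channel width
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

print(center_mhz(1), center_mhz(2))    # 2412 2417
print(overlaps(1, 2), overlaps(1, 6))  # True False
```

Adjacent channels share almost all of their spectrum, which is why hopping from 1 to 2 buys nothing while hopping from 1 to 6 or 11 does.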
Re: (Score:3, Insightful)
Don't get caught by the FCC; there are some pretty hefty fines for intruding on reserved parts of the spectrum.
Re: (Score:3)
No desert, no third world country (yet). But only ONE WiFi around: mine.
When I first installed my router, about four years ago, I was able to reach the signal from the garage. I'm still able to. I didn't measure numerically whether the signal degraded over the years, but for practical purposes the answer is NO: I can still connect from the garage, with the same two "dots" on the "intensity" display.
So, in my case, the signal did not degrade with time. Therefore, I think the neighbours' dep
Re:Did the signal degrade, or the noise increase? (Score:4, Interesting)
Except for the competing theory of failing capacitors allowing for less-stable operation due to power supply wonkiness.
I've seen my share of bad caps doing all manner of strange things in modern ("digital") electronics. Things can turn very weird a long time before they outright fail.
Re:Did the signal degrade, or the noise increase? (Score:5, Insightful)
Yeah. Just change it to channel 2 and interfere with everyone using channels 1 and 6, BECAUSE THE ONLY NON-OVERLAPPING CHANNELS on 2.4 GHz ARE 1, 6, and 11.
I'm not an electrical/radio engineer, but I could have designed a better standard...blindfolded.
Re:Did the signal degrade, or the noise increase? (Score:5, Informative)
I'm not an electrical/radio engineer, but I could have designed a better standard...blindfolded.
I'd like to see you try, and I'm sure there are multinational corporations that would pay you megabucks to make it happen.
The problem isn't the standard; it's the amount of unlicensed spectrum available at 2.4 GHz:
The unlicensed 2.4 GHz band has 84.5 MHz of bandwidth.
The unlicensed 5.8 GHz band has 125 MHz of bandwidth.
The unlicensed 60 GHz band has 7 GHz of bandwidth, from 57 GHz to 64 GHz.
Re:Did the signal degrade, or the noise increase? (Score:4, Interesting)
wait, what? As frequency increases, the power needed to achieve the same range increases. Radio 101.
At 100mW EIRP and 0dBi antenna gain (what you'd expect to find on a legally compliant consumer 2.4GHz AP), you're looking at a line-of-sight range of about 670 feet. Theoretically.
At 1W EIRP and 0dBi antenna gain (what you'd expect to find on a legally compliant consumer 5.8GHz AP), you're looking at a LOSR of about 800 feet. Ten times the power at more than twice the frequency, for not much in the way of range gain.
An unlicensed consumer radio transmitter in the UK cannot under any circumstances exceed an EIRP of 4W. So for a 60GHz transmitter, with a 0db antenna, at 4W, you're looking at an effective LOSR of...
less than thirty inches. Theoretically.
There is another problem...
Bear in mind that the 60GHz band (it's actually the midpoint of the FCC allocation between 57-64GHz) is very susceptible to oxygen absorption, to the tune of requiring 22dB of antenna gain just to overcome the problem; otherwise you're stuck with something that will not even cover the backplane of a rack. The FCC in the US is allowing 40dBm EIRP transmit power to account for this, though even that extra power (not a consumer-grade solution) will not improve the range over 2.4GHz at consumer-grade output; in fact it will not even come close. Current 60GHz designs are aimed at datacentre interconnects over distances probably not exceeding 40 to 60 feet.
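The frequency/range trade-off in those numbers follows directly from the Friis free-space model (a sketch with an assumed receiver sensitivity; real-world walls and oxygen absorption only make the higher bands worse):

```python
import math

def max_range_m(eirp_dbm, sensitivity_dbm, freq_hz):
    # Solve Friis free-space path loss for distance:
    # FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55
    budget_db = eirp_dbm - sensitivity_dbm
    exponent = (budget_db - 20 * math.log10(freq_hz) + 147.55) / 20
    return 10 ** exponent

# Same 20 dBm EIRP and -85 dBm sensitivity (assumed) across three bands:
for f in (2.4e9, 5.8e9, 60e9):
    print(f"{f/1e9:>4.1f} GHz: ~{max_range_m(20, -85, f):.0f} m line of sight")
```

With a fixed link budget, free-space range scales inversely with frequency: double the frequency, halve the range, before absorption is even counted.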
Re:Did the signal degrade, or the noise increase? (Score:4, Funny)
I'm not an electrical/radio engineer, but I could have designed a better standard...blindfolded.
Of course, you're a Slashdot Armchair Expert. We're just better than everybody at everything.
Goes without saying...
Re:Did the signal degrade, or the noise increase? (Score:5, Insightful)
This used to work, but with WiFi now so common, a scan of your local neighborhood will rarely find a channel with more than 1 channel separating you from neighbors (auto channel switching isn't aggressive enough... why is that?). The only way I've found to keep ahead of it is to invest in new frequencies as they become available. I've had the 5GHz spectrum to myself for quite a few years, with no neighbors using it until this month; the first one popped up on my scanner a few weeks ago.
The other factor is the quality of the equipment. I used to use Linksys, then Netgear, then tried Buffalo, and was disappointed with each, either through hardware issues or poor performance. The Linksys gear seemed to go downhill after Cisco bought them, though I'd always thought of Cisco as an industry leader (I'm not in the telecom field, so feel free to chime in). My old 10Mb switches are still working after a decade, but it seems rare to find gear that lasts that long these days.
I finally ended up with an Apple Time Capsule, which worked well for wireless backup in a mixed environment of Windows and Macs, and my original printer didn't have WiFi, so the print server was ideal. I have a WiFi printer now, but that works well too.
4 years later and I'm still pulling 16 MB/s (granted, with very little competition on the 5GHz band) in mixed mode (g for the printer and some older smartphones that can't hit 5GHz).
Holding out for the newest frequency, after which I'll switch again.
Re:Did the signal degrade, or the noise increase? (Score:5, Informative)
chinese capacitors.
not kidding. newer gear uses junk parts that the vendor or builder decided to use instead of brand-name trustable parts. or they sought out real ones but got fakes. in electrolytics, it's a mostly-fakes world ;(
they last a year or a few years, tops. you'll see the cans bulge and burst at the expansion creases at the top (the scored aluminum areas).
if it's on the digital side, you lose all function.
if it's on the analog side (rf, etc.) then things can degrade before fully failing. i've seen this in audio gear too, btw.
the cure is to buy known-good caps from known vendors (digikey, mouser, newark, etc.) and install them yourself. get a hakko desoldering tool, pull out ALL the electrolytics, and fit caps of the same LS (lead spacing) and value. consider going up a step in rated voltage, as the vendor often gets the safety margin on that part wrong.
usually it's the power supply that goes bad, and usually it's the caps. replace them with GOOD low-ESR caps instead of the fakes almost everything ships with, and you can turn a $50 consumer throw-away into something rivaling $5000 enterprise-class gear that will actually outrun most commercial equipment.
panasonic, nichicon, and others make good low-ESR filter caps. they're a dollar or so each. not expensive, not hard to replace.
swap them now or wait for a failure. either way, this is almost always the cause of failures in networking and computer gear these days.
Re:Did the signal degrade, or the noise increase? (Score:4, Informative)
Devices that switch at high frequencies (like a switching power supply) need low-ESR caps. Normal ones will get very hot and fail sooner, and you will also see more ripple.
Devices that run at low frequencies (linear power supplies) or low currents (coupling caps in audio amps) can use normal-ESR caps. Using low-ESR ones will not hurt the device; they're just more expensive and may not be available at the required voltage/capacitance.
Re:Did the signal degrade, or the noise increase? (Score:5, Interesting)
not kidding. newer gear uses junk parts that the vendor or builder decided to use instead of brand name trustable parts. or, they sought out real ones but got fakes. in electrolytics, its a mostly fakes world ;(
You're 100% spot-on with this. About 3 or 4 years ago I had two monitors, made by Samsung and LG, which had those knockoff capacitors in them; unfortunately I didn't know until they started failing: one wouldn't start up, the other started dimming out. It was an easy fix in both cases, and in both cases Samsung and LG were willing to reimburse me the cost of higher-grade capacitors when I contacted them and sent in the knock-off defective caps. Both monitors were just out of the warranty period. I will admit, very good of them; they didn't have to.
I haven't had a problem with my newer SyncMaster E2220, though, so it looks like they're making sure their supply is clean now.
Re: (Score:3, Informative)
It takes 10 minutes in all to set up my soldering iron and get this done. I don't know about you, but my time is not worth that much.
Re: (Score:3)
It isn't just their cheap stuff. One of my Cisco 877s started randomly rebooting.
The two large caps on the board were faulty, so I replaced them and now it works perfectly.
Re:Cisco is an industry leader (Score:5, Insightful)
Some idiot decided that spending $1 less on caps was worth the cost of customers repeatedly calling tech support about flaky performance.
A lot of tech companies have bean counters in charge, except they don't even know much about beans.
Re:Did the signal degrade, or the noise increase? (Score:4, Funny)
How the hell did the parent post get modded funny?
Re:Did the signal degrade, or the noise increase? (Score:5, Interesting)
I once had a router where the signal started to go bad over time. I called up the company and the tech support guy told me that most routers "wear out" after around 2 years, and that I'd need to replace it. He struggled to give a logical answer when I asked him how a device with no moving parts could wear out so quickly.
If you're right, and if this is the standard advice being given to everyone, we're in for a huge arms race.
Re: (Score:3)
Solid state amplifiers do degrade over time, faster if poorly cooled and/or driven harder than they are designed for. Two years seems like a pretty dreadful lifespan, even for cheap shit shoved into unventilated plastic boxes; but perhaps cost sensitivity has driven us to that...
Re:Did the signal degrade, or the noise increase? (Score:5, Interesting)
...but the noise floor has increased substantially, degrading performance.
Bingo! You hit the nail on the head. Wifi is now commonly found in most homes, and the overwhelming majority are b/g routers. That means everyone's "last mile" internet is running on only three non-overlapping channels (in the United States), with a maximum capacity of only 54Mbps on each of those channels. While your effective range decreases, your signal still continues to interfere with others out to its maximum range, which is typically around 300 feet. Beyond that, it's only a decibel or so above the noise floor (about -96dBm) and is basically ambient. So consider urban density: in a 300-foot hemisphere, how many transmitters will be in that space?
Well, I live in a residential neighborhood that is mostly single-dwelling homes, which is about as ideal as you can get for a low-density city environment. Using a Pringles-can antenna, I took a neighborhood survey and found about 26 access points within 300 feet of my home. (This survey took several days to complete because of the marginal signal integrity, after which I drove my car in circles matching associated clients to those APs.) Each access point had approximately 2.25 clients associated with it. So that's about 60 transmitting devices, in an ideal urban environment. And that's just those using wifi.
2.4GHz is also used by: Wireless phones, microwaves, wireless "hifi" stereo systems, etc. It's also used by wireless mice/trackballs and keyboards. So, realistically, I've got at least 100 devices that are transmitting with a signal high enough to interfere with the front-end RF of my wifi.
Shannon's Law stands tall in all of this: as you raise the noise floor, the amount of data you can transmit, regardless of encoding scheme or receiver selectivity, falls. Every device added degrades your own devices' performance.
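Shannon's point can be sketched numerically. The -70 dBm received signal and the two noise floors below are assumed illustrative values, not anyone's measurements:

```python
import math

def shannon_capacity_mbps(bandwidth_hz, signal_dbm, noise_dbm):
    """Shannon limit C = B * log2(1 + S/N), with S and N given in dBm."""
    snr_linear = 10 ** ((signal_dbm - noise_dbm) / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

quiet = shannon_capacity_mbps(20e6, -70, -96)  # quiet neighborhood: ~173 Mbps ceiling
noisy = shannon_capacity_mbps(20e6, -70, -80)  # floor raised 16 dB: ~69 Mbps ceiling
```

The ceiling falls with the log of the SNR, so every extra transmitter shaves off capacity even though your own signal strength never changed.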
Solutions
I found that by setting my router to 'g' only and then forcing the bitrate down to 24 Mbps, I was able to get a much more reliable and speedy signal. Every WiFi standard is designed to cope with interference by renegotiating to a higher or lower bitrate dynamically. That would be fine if devices were isolated, but when they're in close proximity, each device broadcasts and interferes with the others, which detect this and renegotiate, generating more interference; pretty soon you've got routers constantly in a state of renegotiation, with fluctuating bitrates. Manually force your router to a specific bitrate and don't allow renegotiation, and you'll find that momentary spikes in the noise level won't wash out your signal -- renegotiation takes 10--30 ms, and during that time you can't send or receive any data. The data burst that caused it is over long before the renegotiation completes.
So in short, it's not your transmitter, it's the environment. Take your transmitter out of its default settings and enable RTS/CTS (if available) and you'll be fine. Another, more sociopathic answer is to get a 100 W 2.4 GHz booster (you'll have to build it), mount it on your roof, tune it to one of the three non-overlapping channels (I suggest 1 or 11, since most microwave ovens tend to radiate at the middle of the band -- channel 6), and then let it run for about 3--5 days. Everyone will bail off that channel because nothing tuned to it will operate over a distance of even a few feet. Again, very illegal, very sociopathic... but very effective. You'll have to "plow the spectrum" about once every month or two, so count on downtime.
Re:Did the signal degrade, or the noise increase? (Score:4, Interesting)
You've said quite a few interesting and even correct things in your post, but I want to add one point (or quibble, if you wish). You are probably not in an ideal urban environment, and don't want to be. That's because an 'ideal' environment might have the broader spacing given by single family homes, but a very low average income, where there would be both fewer access points and fewer average clients per AP. In theory, parts of Detroit that are now full of more abandoned homes than inhabited ones sound 'ideal'. 26 points and 2.25 clients per AP sounds like you qualify as upper middle class (if anyone does these days). You would actually see less traffic in a wealthier neighborhood, as there would be more average distance between houses, and less in a poorer neighborhood, as there would be fewer people on wireless. In addition, there's a northern/southern states bias in the US, in that there's more brick and concrete block and less wood used in low to medium priced residential neighborhoods as you look at more northerly cities, and this tends to attenuate some traffic over shorter ranges.
I would like to know more about how you measured the number of associated clients, though. I thought some of the mixed wired and wireless hubs reported the wired accesses mixed in with wireless, and/or indicated both active and non-active connections, but didn't give out the rest of the information needed to properly interpret what you can get. If you queried my parents' wireless router at their place, for example, and if you could get the info remotely, you might find 14 devices assigned an address, but those are for two Cat 5E wired desktop machines that are currently active, and the rest are for laptops and such the various kids, grandkids, nieces and nephews have brought over when staying for a visit, and are still on the router's lists months later. I've had five different laptop or tablet machines connected through that router at various times visiting, but never more than one on the same weekend or holiday.
You're the first post I've come to on this thread that has mentioned the sad relationship between microwave ovens and channel 6. Other people mentioned the noise floor early in the discussion, but I didn't notice any of them actually referencing that to Shannon's Law, either. You should be modded informative a couple of times, but I won't be surprised if the 'sociopathic answer' part gets you downmodded instead.
Re:Did the signal degrade, or the noise increase? (Score:5, Interesting)
I would like to know more about how you measured the number of associated clients though.
I cheated, and it was probably illegal, though petty. I salvaged a grey fiberglass enclosure pod previously used for housing cable TV equipment, and then modified it so it could be quickly and discreetly attached to a telephone pole. Inside was a 'microATX' computer with a pair of SD cards to boot off of, underclocked and undervolted, with the fans stripped off and replaced with large passive fins. Water-resistant wire netting was added along the inside (the part facing the pole) and thermally bonded to oversized heatsinks. Two wifi antennas were mounted along the front at 45 degree angles (to catch both vertically and horizontally-aligned signals) and attached via SMC connectors to adapters. One was 'a', the other was 'a/b/g/n'. Last was a special diagnostic adapter with three antennas which cost an arm and a leg. It was a spectrum analyzer designed to identify sources of high EMR and localize the source. The remaining space was packed tight with high capacity deep-cycle batteries. The unit could run for about 86 hours before it ran out of juice.
It recorded every MAC address and associated BSSID, etc., as well as encoding and some other properties. You can catch new clients because before they associate, they broadcast the SSID they want to connect to in the clear. Over a period of three days it collected all that information and then wrote it out to the SD cards. Drag it home, hook it up to the car charger, and pull the cards. Rinse, wash, repeat.
You're the first post I've come to on this thread that has mentioned the sad relationship between microwave ovens and channel 6. Other people mentioned the noise floor early in the discussion, but I didn't notice any of them actually referencing that to Shannon's Law, either. You should be modded informative a couple of times, but I won't be surprised if the 'sociopathic answer' part gets you downmodded instead.
I wouldn't be surprised either. Slashdot ought to just replace their mod system with "like" and "dislike", since that's really what it boils down to. I frequently get downmodded for saying something that is technically correct or possible, without addressing the ethical considerations. I prefer to tell people the whole truth and let them make their own choices, rather than leaving out critical information because I don't trust them enough not to do something stupid. People need to know what's possible, not just what's legal or ethical. This idea was summed up beautifully for me years ago on a forum far, far away, when someone wrote "You can't expect a terrorist to care that his car bomb is taking up two parking spaces." Criminals don't give a damn about you or the law; they are there to grab the low-hanging fruit, and unless you know what's possible and how they operate, you can't defend against attacks.
You can't be a good white hat without having worn the black hat.
Re:Re Jamming your neighbors (Score:4, Insightful)
Too stupid? Actually most people aren't 'stupid', it's just that most of the population don't live and breathe IT issues. An equally narrow-minded, self-absorbed mom in a typical household would also think you're stupid because you can't cook as well as she can.
When you grow up and leave your parent's basement, here's a tip -- try to be more diplomatic and really try to understand society as a whole. Not everyone knows what you do and conversely you're not stupid because you don't have the various skill-sets of everyone around you. Some day soon you'll be out in the job market and when that happens you won't score points in any job interview by stating how everyone else is stupid because you know something like how to change channels on your wireless router and they don't.
Re:Did the signal degrade, or the noise increase? (Score:5, Interesting)
Everything electronic degrades over time. The only electronics we have that are mostly proof against it are in orbit and cost exorbitant amounts of money.
it didn't use to be that way.
a time-travel into the past, via ebay, can be insightful. over the last year or so, I've been collecting test gear from the '50s, '60s, and '70s (from good brands: fluke, tek, HP back before agilent, etc). one brand that really impressed me was 'power designs'. search for it with the word 'precision' and you'll see samples of what was built tens of years ago (some half a century!) and yet they still hold their values to high tolerances, more so than quite a lot of 'high end' supplies made by high end brands today! I bought quite a few, tested them (after cleaning them up) and they showed amazing stability, down to microvolts. the switches were good, the meter was good, paint and metal were fine, wiring and circuit boards were fine. even 40 yr old caps were fine, to be honest. I replaced the caps as a matter of course, but I could probably get more time from the gear if I wanted to.
these cost a few hundred dollars back then (30-40 yrs ago). now, you can't even buy that quality (and the ebay-sourced PDI precision gear tends to go for $100-level prices, today). but back then, it was expensive but not outrageous. and it lasted longer than many peoples' careers! the stuff STILL runs! and stays in spec.
most people have no exposure to this. all they know is what they buy at frys or best buy or worse, ebay china stuff. they *assume* that things won't last more than a few years, usually less than 5. they're OK with junking it and starting with new gear! ;(
older gear was meant to be repairable and last for decades. my test lab is mostly filled with vintage gear that was trivial to restore and would cost me 10x or more to get new equiv stuff, if even possible.
look at 'general radio' and search for pics of what their manual switched voltage dividers look like. I have one that is 50 yrs old and can't be matched with even today's gear.
things USED to be built to last. they really did. and people who are in the 20's and 30's have no idea about this strange old idea, either. that's the pity of it all.
Obligatory (Score:3, Interesting)
It could be the noise floor going up near your house, or just planned obsolescence [imdb.com].
cheap electrolytic capacitors (Score:5, Informative)
...have a tendency to degrade and fail over time.
Re:cheap electrolytic capacitors (Score:4, Informative)
>cheap electrolytic capacitors have a tendency to degrade and fail over time.
Not significantly over two years, and you don't use electrolytics in the IF/RF signal path in 2.4 and 5.8 GHz radios.
I don't think electrolytics are it.
I have my suspicions about the noise figure of LNAs changing over time. There are some very highly strung, teeny weeny transistors in LNAs (Low Noise Amplifiers) right in the signal path.
Re:cheap electrolytic capacitors (Score:5, Informative)
>cheap electrolytic capacitors have a tendency to degrade and fail over time.
Not significantly over two years, and you don't use electrolytics in the IF/RF signal path in 2.4 and 5.8 GHz radios.
True, but you do use them in your cheap switch-mode power supply, and as they degrade, you get additional AC noise on the rails of your amplifiers that are in the IF/RF signal path. Particularly in cheap routers that are operating near the limits of their amplifiers, voltage drops on the rails could cause clipping of the high frequency signal, which will result in dropped packets, required rebroadcasting, etc.
Re:cheap electrolytic capacitors (Score:4, Informative)
Electrolytic capacitors can very well be it, even if they are not used in the RF signal path.
They are used in power supplies, and a drying out electrolytic can cause a number of things:
1) Lessened capacitance leads to reduced p/s output capacity, which can cause voltage to droop if the supply hits maximum duty cycle
2) Drying out caps have higher ripple current, and higher heat dissipation, which consumes more of the power supply's capacity, and can also lead to drooping output
3) Lower capacitance can lead to increased ripple voltage on the output, and that ripple can leak into the RF signal path, increasing the symbol error rate, and even causing frequency drift in the VCO
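As a rough sketch of that last point, the peak-to-peak ripple on a capacitor-filtered switcher output scales inversely with capacitance. The load current, switching frequency, and capacitance values here are assumed, purely for illustration:

```python
def ripple_vpp(i_load_a, f_sw_hz, c_farads):
    """Approximate peak-to-peak output ripple of a switch-mode supply: dV = I / (f * C).
    Ignores ESR, which also rises as a cap dries out and makes things worse."""
    return i_load_a / (f_sw_hz * c_farads)

healthy = ripple_vpp(1.0, 100e3, 1000e-6)  # fresh 1000 uF cap -> 10 mV ripple
dried = ripple_vpp(1.0, 100e3, 200e-6)     # dried out to 200 uF -> 50 mV ripple
```

A 5x increase in ripple on the rails is exactly the sort of thing that leaks into the RF path of a cost-optimized design.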
Re:cheap electrolytic capacitors (Score:5, Informative)
I don't think so. I'm an RF electronics engineer, and we do all sorts of accelerated stress testing to check out the second half of the bathtub curve, and I've never seen degradation on anything in the RF/IF path inside the chip. The typical "wear out" failure mode is cracks forming in the protective seals around the chip and letting in moisture (from the air), which causes leakage (high current draw) and eventually just shorts out something important and the chip dies. At any rate, that shouldn't be happening in just a couple years, unless the submitter moves his PC back and forth between the freezer and the sauna every week.
One possible explanation would be crystal aging. RF equipment tends to rely on extremely accurate quartz crystals to provide reference frequencies. Those crystals tend to drift as they age. If the design was already near the edge of acceptable frequencies, an extra 10 or 20 ppm from aging could easily result in several dB of degradation.
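The arithmetic on crystal aging is simple; the 20 ppm figure below is an assumed aging error at the high end of what a cheap crystal might accumulate, not a measured value:

```python
def drift_hz(carrier_hz, ppm):
    """Absolute frequency offset for a given ppm error on the carrier."""
    return carrier_hz * ppm * 1e-6

offset = drift_hz(2.437e9, 20)  # 20 ppm on channel 6 -> roughly 49 kHz off-center
```

Tens of kHz of offset is small against a 22 MHz channel, but if the demodulator's tracking range was already marginal by design, it can still cost several dB of effective sensitivity.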
Another poster pointed out the possibility of the router using a crappy switched-mode power supply, which is also a good explanation. I would hope that they would power the RF chip through a linear regulator with good noise rejection, but who knows? That sort of switching noise can absolutely interfere with the radio's performance.
Magic Smoke (Score:4, Funny)
Obviously the magic smoke, although not released suddenly, does gradually leach out of the components, leading to loss of performance over time.
Re:Magic Smoke (Score:5, Funny)
The real secret: The router is an automated mage machine, and needs mana potions to keep casting Lv. 11 Transmit Packet.
(What do you mean "the antenna isn't a Rubber-Sheathed Metal Staff of Aethertransmission +1"!?)
Amplifiers/Filters? (Score:3)
Could the analog components of the amplifier/filter circuits be degrading? If capacitors are leaking, etc., then that would definitely make the performance decrease, but maybe not enough to stop working completely.
You should consider another option: older equipment may not have firmware as good at dealing with congestion (802.11n helps with this), or maybe the new box has 5 GHz, which has far fewer interference issues? Maybe the real degradation was the neighbors installing access points? You may also have had certain pieces of gear installed that interacted badly with your access point (some of them have really awful firmware or very loose implementations of the standard).
These are just guesses... I haven't personally had any degradation except for interference in the 2.4 GHz band. When I bought this house, devices would only detect my network and maybe one other. Now seven show up. Interference isn't just a problem in apartments anymore.
analog transistors age (Score:5, Insightful)
This is a hypothesis based on peripheral involvement with analog and digital RF at 0.5 and 1.5 GHz for twenty years.
AFAIK, the output stage of anything broadcasting above about 2 GHz has to be analog, with the lower frequency signal mixed into a carrier at the higher frequency. Digital synthesizers and chips which can deal with 1.5 GHz directly are still very expensive and are unlikely to be used in the consumer routers. So the final output stage is likely an analog RF transistor.
Analog transistors change characteristics with age at elevated temperature, where elevated is anything over 20C. Implanted ions diffuse with time and temperature, changing junction characteristics. The small structures required by high frequencies are more sensitive to such things.
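The temperature sensitivity of that aging is usually modeled with the Arrhenius equation. The 0.7 eV activation energy below is a commonly assumed value for silicon wear-out mechanisms, not something specific to any router:

```python
import math

def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
    """Arrhenius acceleration factor between two operating temperatures."""
    k = 8.617e-5  # Boltzmann constant in eV/K
    t1 = t_use_c + 273.15
    t2 = t_stress_c + 273.15
    return math.exp((ea_ev / k) * (1 / t1 - 1 / t2))

af = arrhenius_af(25, 60)  # a router baking at 60 C vs a 25 C baseline
```

Under that assumed activation energy, running at 60 C instead of 25 C ages the silicon on the order of 17 times faster, which is why the stacked, unventilated router in the next post matters.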
Re: (Score:3)
Analog transistors change characteristics with age at elevated temperature, where elevated is anything over 20C.
Ever notice how hot wireless routers get, especially when they are stacked? I find this to be the most plausible explanation yet posted...
Re:analog transistors age (Score:5, Interesting)
Check the power supply (Score:3, Informative)
Check the power supply. Usually the electrolytic capacitors are already dry.
Designed to fail (Score:3, Funny)
Not sure about wireless gear, but some devices (e.g. printers, light bulbs, fridges) are designed to break after a certain period of time, so that you would buy a new one.
Familiar with the problem, and here's how I fix it (Score:4, Interesting)
Anyhoo, I fix the problem by simply switching the equipment to another channel, say, 3-4 steps away, to make sure the frequency some of the components will be switching at will be notably different. So far it has worked with all equipment I've had this problem show up on. After a while the signal attenuation develops on the new channel as well, upon which I simply switch back to the one I used before. Rinse, repeat.
Re:Familiar with the problem, and here's how I fix (Score:5, Funny)
"Bingo, I would wager that most households use wireless only now, since wireless only devices are becoming so popular. I just bought a house...not one inch of ethernet in the place. I don't know about you guys...but that would drive me crazy to make all my desktops wireless!"
Same here. My house was built in the 1950s. Guess they used mostly wifi back then, too.
Power adaptors to blame. (Score:5, Informative)
In my experience, power adaptor degradation is the main culprit. Over time the adaptor will provide lower voltages and a less stable current. These translate into lower signal output and higher noise, respectively. I've seen a bad adaptor turn a repeater into a signal jammer - trust me, that was not an easy issue to troubleshoot...
Several causes, but a few that spring to mind... (Score:5, Interesting)
1. slow burnout of emitter gear due to thermal degradation (yes, clock chips and transistors get hot, as do solder tracks and joints). Thermal runaway can occur if a solder joint fails and arcs, or overvoltage causes signal tracks to vapourise.
2. ionising radiation, particularly on unshielded components such as antenna conductors (I've seen something like this occur on an externally mounted amateur radio antenna: the sunward side of the antenna completely degraded, the result being that the only signals received (or sent) were on the shadow side).
3. component quality on consumer gear is not as stringent as it could be. Components can and do fail, and considering the number of components in a lot of consumer gear, it's a wonder any of it actually leaves the factory.
4. the noise floor of several years ago was far, far lower than it is now. The ERP of newer gear is (by design or by necessity) higher than older gear as more and more transmitters have to share the band. As a result, the signal quality taking a dive may be at least partly illusory. The equipment may actually be perfectly fine.
5. parasitic structures in semiconductor packages may be the catalyst for failure, either immediate or delayed. Such structures may be as small as a single atom of chlorine embedded in a crystal of germanium - innocuous at first (undetectable, even), but over time and use, that contamination will alter the chemistry of the semiconductor, possibly causing it to bond with the package material and rendering it useless. This might not even be an issue in high powered gear like regulators but in something like a microprocessor, it's a showstopper.
GREMLINS! (Score:3)
Eating away at the PCB!
Old and tired. (Score:4, Informative)
It's the tech priests (Score:3)
The poor ministrations of the duties of the tech priests leads to decay. It's either that or it's been touched by Nurgle.
Over 100 comments (Score:3)
and nobody asked:
"Did you measure the signal quality?"
There are lots of apps that show you various signal parameters for most platforms (Linux, iOS, Windows). Even if you don't know what the signal parameters mean, the magnitude of difference between an old and new router can tell you something.
I think, if there is no difference, it is perception-based; otherwise, an overworked flux capacitor.
Apple AirPort... (Score:3)
Go get some Apple AirPort Expresses. :)
Note: I'm an Apple fanboy and heavily invested.
I've tried DLink, Linksys, Cisco [which works, but on the $$$ corporate level], a few others, and Zyxel. Zyxel came close -- but the configuration has to be specific [repeater talk to SSID w/ specific MAC id]. The default quick setup could leave the sub-routers chattering amongst themselves... But I digress.
The AirPorts at $99 pay for themselves in setup alone. And frankly, they "just work". Unlike all the others, the AirPort DOES PROPERLY PASS ALONG MULTICAST THROUGHOUT THE NETWORK. All the other products' sub-routers ... dropped multicast. No more AirPrint, AirVideo, etc... Yeah -- there's a ton of iOS devices along with Macs involved on my networks now. :)
They can dynamically be set up as a sub-sub-repeater. Wander the network rather seamlessly. I've just recently gone through this headache, and with the AirPorts they will *OWN* the area I want to cover -- add AirPorts as needed to get the signal strength / coverage. Just did a 6,000 sq ft house -- all three floors, my home, and the office at 18,000 sq ft plus yard coverage [as the bay doors are opened :-].
Amazing product.
You need to fill your house with "Monster Air(tm)" (Score:4, Funny)
Monster Cables(tm) are much better for carrying electric audio signals, because they are outrageously expensive.
You need to buy cans of Monster Air to spray around your house.
You can find the product on their web site, and you really will be able to hear the difference on your wireless connections.
Probably.
WRT54G routers (Score:5, Interesting)
Re:Check if your channel is too crowded (Score:5, Informative)
Using the same channel does not increase signal interference. Signal interference comes from APs using neighboring channels in close proximity. If you're looking for greater range, try switching to the same channel as your neighbor. Your bandwidth could be lower, but the interference will be reduced.
Re: (Score:3)
Using the same channel does not increase signal interference. Signal interference comes from APs using neighboring channels in close proximity.
Err, that makes zero sense. Wifi access points do not coordinate their transmissions or do any sort of code division multiplexing or anything else that might help with interference. Two transmitters on the same channel will absolutely interfere, worse than if they were on neighbouring channels. If you are lucky, they will interfere enough that the other access point decides to switch channel.
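For reference, the channel arithmetic behind the "three non-overlapping channels" rule: 2.4 GHz channel centers sit 5 MHz apart, while an 802.11b/g signal occupies roughly 22 MHz (the 22 MHz width is the usual rule-of-thumb figure, not an exact spectral mask):

```python
def channel_center_mhz(channel):
    """Center frequency of a 2.4 GHz Wi-Fi channel (channels 1-13): 5 MHz spacing from 2412."""
    return 2412 + 5 * (channel - 1)

def channels_overlap(ch_a, ch_b, width_mhz=22):
    """True if the two channels' ~22 MHz-wide signals spectrally overlap."""
    return abs(channel_center_mhz(ch_a) - channel_center_mhz(ch_b)) < width_mhz
```

Channels 1, 6, and 11 are 25 MHz apart, so they clear each other; anything closer overlaps. Whether same-channel operation (carrier-sense deference) or adjacent-channel operation (raw noise) hurts more is exactly the dispute in the posts above.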
Re: (Score:3)
Tell their stinkin' signals to get off your lawn!
Lawn, hell! They've gotten into his house...
Re:Capacitors and other parts are not invulnerable (Score:4, Funny)
You are probably on to something here.
I volunteered at Free Geek, a computer recycler / refurbisher, years ago when "fat caps" on the motherboard were a frequent reason for computers to be sent there. AIR, the problem was a lot of counterfeit components being sold to reputable manufacturers. There were several big name manufacturers involved. Something like that could be happening in the router market. Going with the lowest bidder for components is still important in low-margin markets.
Another possibility is that a kid in the neighborhood is collecting the innards of smoke detectors as part of his unofficial science project, and is storing them too close to your house. A radioactive environment will shorten the life of capacitors and other components. Have you noticed whether any of your kids glow in the dark?
I'm kidding with that last, of course. Sort of.