
Ask Slashdot: Why Does Wireless Gear Degrade Over Time?

acer123 writes "Lately I have replaced several home wireless routers because the signal strength has been found to be degraded. These devices, when new (2+ years ago) would cover an entire house. Over the years, the strength seems to decrease to a point where it might only cover one or two rooms. Of the three that I have replaced for friends, I have not found a common brand, age, etc. It just seems that after time, the signal strength decreases. I know that routers are cheap and easy to replace but I'm curious what actually causes this. I would have assumed that the components would either work or not work; we would either have a full signal or have no signal. I am not an electrical engineer and I can't find the answer online so I'm reaching out to you. Can someone explain how a transmitter can slowly go bad?"
  • by Anonymous Coward on Sunday October 21, 2012 @03:01PM (#41723023)

    and worn out. Also I think they have pretty short life spans.

  • by Anonymous Coward on Sunday October 21, 2012 @03:02PM (#41723027)

    As all of your neighbors add wireless routers, the noise floor goes up, and the usable signal goes down, even though the signal strength is the same.

    • by mk1004 ( 2488060 ) on Sunday October 21, 2012 @03:44PM (#41723335)
      That wouldn't explain why replacing the router fixes the problem, unless he just happens to be replacing the old router with one that happens to have a stronger transmitter or a better antenna. The pessimist in me says the odds of that happening every time can't be 100%.
      • by Migraineman ( 632203 ) on Sunday October 21, 2012 @03:51PM (#41723397)
        More than likely, the older router was expecting a relatively clean RF environment, and was crippled when all the neighbors deployed APs nearby. The newer APs were designed to handle cluttered environments, and their more-advanced algorithms provide improved performance over the previous generations' products. As old equipment is replaced with new, you'll probably see the same degradation in performance until new countermeasures are developed (in the next gen equipment, of course.) Ref: arms race.
        • by Calos ( 2281322 ) on Sunday October 21, 2012 @04:19PM (#41723571)

          "Algorithms" aren't going to change, because that would require a standard followed by both transmitter and receiver. Unless s/he's upgrading from something like 802.11b to 802.11g, there shouldn't be any such change. A possible exception would be a proprietary addition, but the problem remains.

          It would be interesting to know whether, when switching out the router, s/he changed the frequency it operates on. There are different channels that can be chosen even within the 802.11g spec; a newer router might have selected a less busy one automatically.

          Then of course there's the fact that 802.11n completely changed frequency bands, from the 2.4 GHz region (which is extremely cluttered) to the 5 GHz region, which is relatively empty. That said, the higher frequency is more impeded by solid barriers, e.g. walls. But it may compensate with higher transmit power; I don't know.

          Hard to say if transmit power is really changing without being able to rule out other factors. But electronics do degrade. First suspect I'd think would be cheap capacitors. Poorly designed transistors could degrade, but this seems unlikely as RF band usually uses BJTs. Dust buildup could increase temperatures, which could hurt the efficiency and gain of these devices, but that's a rather long shot.

          • by sneakyimp ( 1161443 ) on Sunday October 21, 2012 @04:45PM (#41723769)
            I'd wager there are more algorithms involved than just the 802.11 protocol -- that protocol is the top layer in a stack of technology. Before you even get to the part where you are doing any kind of data handshaking, you might have a proprietary algorithm that filters your raw radio signal to weed out interference. There are also implementations of 802.11 on the market with non-standard features [wikipedia.org]. Furthermore, there is the inevitable forward march of ever-improving 802.11 standards: 802.11g is really old now, 802.11n is already old news, and I've seen gobs of 802.11ac on sale at newegg.com when I don't think the standard has even been formalized. The "algorithms" are absolutely, definitely changing.
          • by zenith1111 ( 1465261 ) on Sunday October 21, 2012 @06:24PM (#41724267) Homepage

            Then of course there's the fact that 802.11n completely changed frequency bands, from the 2.4 GHz region (which is extremely cluttered) to the 5 GHz region, which is relatively empty.

            The better and more expensive Access Points can usually operate at both 2.4 and 5GHz, but most of the affordable 802.11n devices only operate at 2.4GHz (at least those I can find).

            I was looking for a wireless card to enable an old laptop to connect to my 5GHz AP and I found that pretty much all of the cheap wireless cards also only operate at 2.4GHz.

            • by postbigbang ( 761081 ) on Sunday October 21, 2012 @10:17PM (#41725341)

              It's not so simple. Let's review the history:

              802.11a came first (there is another, obscure predecessor) and used 5 GHz. Its advanced modulation techniques blew the doors off 802.11b at 2.4 GHz, which at 11 Mbit/s (on a good day) had a very low yield.

              802.11g also uses 2.4 GHz, but early products had trouble going back and forth between b and g, slowing throughput down despite g's faster rates from more advanced modulation.

              Enter n, an advanced modulation scheme with higher throughput, at first found largely on 2.4 GHz. Early n was really fast and had only a bit of trouble SLOWING DOWN for b and g. You have one collision domain unless you break the b/g radio out from the n radio, so you have to dawdle while b goes through before the channel is free again for something faster.

              Then came the dual-band radios, and the dual-band, dual-radio routers that can walk and chew gum -- handling a 2.4 GHz and a 5 GHz conversation in parallel -- major throughput.

              The transceivers in the older routers appear to slow down, but in fact they stay the same; newer ones just outpace them, for three reasons: 1) better firmware design that can switch quickly back and forth between protocols (where present), 2) dual radios for b/g/n and (maybe a) high-band n, and 3) the more recent the device, the more processing power inside the router. A final reason: your backhaul might be getting faster without you knowing it -- DSL gets faster, but so do cable broadband connections -- and the driver in your machine is likely faster too; drivers change all the time with small improvements, sometimes in real throughput.

              Summary: the router didn't change, but newer stuff is faster given the same conditions for the reasons stated.

          • by Anonymous Coward on Sunday October 21, 2012 @08:05PM (#41724715)

            Actually, radio standards (such as 802.11) never specify the receiver algorithm or architecture, as it is not necessary for interoperability. The receiver is only described in general terms.

            Standards specify what is to be transmitted over the air, and might give a reference (typically inefficient) implementation of a transmitter for testing purposes. They might also give some examples of expected receiver output, also for testing.

            Standards deliberately do not specify receivers, to give engineers an opportunity to innovate and provide differentiation in the market. Just one choice, for example, might be whether to do soft or hard decision decoding. (I happen to design receivers and transmitters for a living.)

            As you've said, chances are any degradation is due to electrolytic capacitors drying out (one reason electrolytics are seeing less use), dirt and contamination, or corrosion on connectors. Dropping output power might be one issue, but instability or inaccuracy developing in the timebase (crystal oscillator) is probably a bigger one, and most actual failures end up being a power supply problem.

          • by needsomemoola ( 966634 ) on Sunday October 21, 2012 @08:49PM (#41724923)

            There are 14 channels (frequencies) in 802.11b/g/n (2.4 GHz), but only 3 that do not overlap (1, 6, 11). The best thing to do if you plan to use the 2.4 GHz range is to run something like inSSIDer, see which of those 3 channels is least congested, then set your router to use it. The problem with 2.4 GHz is the lack of non-overlapping channels, plus the fact that most routers default to picking the "least congested channel" without conforming to the 1/6/11 convention. So your neighbors end up congesting multiple channels by straddling 1 and 6, or 6 and 11, using a channel in between. This is a nightmare for high-density areas (I do wireless for large conferences; it's a huge challenge).

            In 802.11a/n (5 GHz), there are 23 channels you can use (depending on whether you bond channels for n). This is like comparing a 3-lane highway to a 23-lane highway: your density capacity is FAR higher than on 2.4 GHz. The downside is that most mobile devices do not have 5 GHz radios, and 5 GHz, by the nature of the higher frequency, does not penetrate (giggidy) as far as 2.4 GHz (as you said). From a management point of view this is a benefit, but when you are trying to cover a large house it leads to weak signals at the edges if you center the router. That's a good case for multiple access points, though (routers without the routing, in lay terms).

            So the near future for wireless is 5 GHz (or "wireless a and n"). 5 GHz is catching on (the iPhone 5 and a few Androids and tablets have it now) and will start to get much busier, but the great thing is that it's designed for it. The downside, in the less-near future, is that 802.11ac will add larger bonding in the 5 GHz range, which will mean fewer available channels and less capacity. We'll see how that goes...

            One correction to your comment: n is not 5 GHz-specific, and it has not at all moved people from 2.4 to 5 GHz. It has only allowed more throughput than g or a did on those frequencies. It has caused more problems than not, because it can use a 40 MHz window, taking up an extra one of the 3 non-overlapping channels and pissing off your neighbors.
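As a concrete illustration of the 1/6/11 advice above, here is a minimal sketch of the "scan, then pick the least congested clean channel" procedure. It assumes the 20 MHz channel width and 5 MHz channel spacing described in this thread; the scan data is made up for the example:

```python
# Pick the least-congested of the three non-overlapping 2.4 GHz channels,
# given the channel numbers your neighbors' APs were seen on in a scan.
def least_congested(neighbor_channels, candidates=(1, 6, 11)):
    def overlaps(a, b):
        # 20 MHz-wide channels on 5 MHz centers overlap when the channel
        # numbers are fewer than 5 apart (1 and 6 are the first clean pair).
        return abs(a - b) < 5

    def load(ch):
        # Count every neighbor whose channel bleeds into this one.
        return sum(1 for n in neighbor_channels if overlaps(ch, n))

    return min(candidates, key=load)

# Neighbors straddling 1 and 6 with an off-grid channel 3 hurt both:
print(least_congested([1, 3, 3, 6, 6]))   # 11
```

This is the same reasoning inSSIDer-style surveys support: an AP parked on channel 3 counts against both channel 1 and channel 6, so the clean channel with the fewest overlapping neighbors wins.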

        • by Hadlock ( 143607 ) on Sunday October 21, 2012 @06:22PM (#41724255) Homepage Journal

          Then wouldn't my Linksys WRT54GS (the one you can swap the firmware out on) start working better when I put DD-WRT or OpenWRT on it? My WRT54 just up and died two months ago (it was already the warranty replacement); one day the wifi started dying whenever two people used it at once. Both WRT54s were running the same firmware version (it hasn't been updated in a while), and both of them dropped to about 20% of their original range... then died completely, only allowing you to connect after a reboot.

      • by ATMD ( 986401 ) on Sunday October 21, 2012 @04:02PM (#41723465) Journal

        If you read the OP carefully, all he says is that he replaced the routers - not that doing so actually fixed the problem.

        Deteriorating SNR does seem the most likely explanation.

      • by blackicye ( 760472 ) on Sunday October 21, 2012 @04:51PM (#41723803)

        That wouldn't explain why replacing the router fixes the problem, unless he just happens to be replacing the old router with one that happens to have a stronger transmitter or a better antenna. The pessimist in me says the odds of that happening every time can't be 100%.

        From my experience with all the major manufacturers of consumer routers and switches, the problem is capacitor quality primarily.
        Transmission range and stability will suffer over time because of unstable or insufficient voltage.

        Also, these devices get really hot these days. Most if not all are passively cooled, and don't even have much real ventilation in their casings.
        They're designed to be cheap, and generally to last at least 12 months.

        I've replaced all the capacitors on several Linksys "business class" Gigabit switches; they all started failing after about 14 months.
        I did this to my own switch about 4 years ago, and it's still going strong today. I've also done it on an old Linksys WRT54G.

        • by sortius_nod ( 1080919 ) on Sunday October 21, 2012 @05:17PM (#41723941) Homepage

          There was a well known problem of Linksys caps becoming detached from the board due to heat. They'd be fine when cool, but leave them on for a few hours & signal would degrade almost completely. The only solution was to either (as you did) replace them, or, if you were lazy, resolder them to the board.

          There are many factors at play with wireless networks. The only way to be sure whether it's a failing cap/chip/board or interference from other devices is to manually test every component & run a frequency-mapping scanner in every room of the house.

      • by gfxguy ( 98788 ) on Sunday October 21, 2012 @09:30PM (#41725131)
        This is going to sound stupid, but I'm going to say it anyway - I'm not an electrical engineer, but my first wireless router died because its wall-wart power supply did. I don't imagine the electronics need all that much to work, but the transmitter might - is it possible to lose some power output over time? When I measured the supply's output with my meter, it read about 3.2V when it was supposed to put out 9V. I replaced it with a universal wall-wart and brought the router back to life, good as new.
    • by Gorobei ( 127755 ) on Sunday October 21, 2012 @03:44PM (#41723339)

      Damn neighbors - I see 35 wireless networks from my mac: Jes's Awesome Network, Doris Family, Alex, Bellclaire Hotel, I Win, Epsteinland, buckduke, toujoursavectoi, sheilajaffe1, ming. And a bunch of hexcodes. Bit sad to not see HotWorkoutPants on the list today.

    • by currently_awake ( 1248758 ) on Sunday October 21, 2012 @04:21PM (#41723585)
      Semiconductors are known to degrade if run hot. Home routers don't have fans or heat sinks, and budget devices often run hot because makers save on heat sinks and fans.
    • by tlambert ( 566799 ) on Sunday October 21, 2012 @04:37PM (#41723705)

      Like, oh, say, "smart meters".

      When they started installing "smart meters" in my area -- because "too many" people were installing solar, and PGE panicked that they'd have to pay the same rate they were charging, instead of cutting off the payout at net zero with differential rates so they paid less when solar was active -- my bandwidth went to hell, and I had to locate my AP more centrally to avoid the interference.

    • by whizbang77045 ( 1342005 ) on Sunday October 21, 2012 @04:56PM (#41723843)
      I'm having problems with a 2+ year old router, yet I live in a remote area. There aren't any other routers anywhere around here, because there aren't any other people around here. You can scan for other routers, and you won't find any. Thus, in at least some cases, the noise floor isn't the culprit. I suspect there may be more than one problem: I don't doubt the noise floor is the issue in some areas, but I'd bet on component problems in the router in other cases, like mine.
    • by mysidia ( 191772 ) on Sunday October 21, 2012 @05:24PM (#41723975)

      It's also possible that too much power was being transmitted, the antenna wasn't matched properly, or the antenna was inefficient, resulting in reflected signal and heat buildup in the transmitter of the old AP, causing eventual long-term damage/degradation to the transceiver.

  • built in failure (Score:4, Insightful)

    by Anonymous Coward on Sunday October 21, 2012 @03:02PM (#41723031)

    built in failure. bow to your corporate masters and go consume.

  • by SClitheroe ( 132403 ) on Sunday October 21, 2012 @03:03PM (#41723033) Homepage

    Over 3 years I'd imagine a greater density of wifi devices all sharing the same spectrum to have appeared. Perhaps the signal level is the same, but the noise floor has increased substantially, degrading performance.

    • Re: (Score:3, Funny)

      by epSos-de ( 2741969 )
      Yes, you are correct. Switch the WiFi channel to a number that is not already in use. Channels 1, 6, and 11 are the most commonly used ones. Just use channel 2 and you will already be 50% better off than on channel 1.
      • by K. S. Kyosuke ( 729550 ) on Sunday October 21, 2012 @03:20PM (#41723169)
        I thought the channels overlap to a significant degree. If there is interference on channel 1 to such a degree that you feel the need to switch it, are you sure that switching to channel 2 would actually help?
        • by Ironhandx ( 1762146 ) on Sunday October 21, 2012 @03:43PM (#41723331)

          There is overlap to a significant degree; channel 2 won't help, but channel 4-5 might. There are generally about 3 channels' worth of overlap between those wifi channels. If it's only a small amount of interference that's causing the signal to drop, channel 3 might even do it. However, if there's enough interference to cause problems, swapping to channel 2 from channel 1 won't help, because they share about 80% of the same band.

        • by realityimpaired ( 1668397 ) on Sunday October 21, 2012 @06:19PM (#41724239)

          Each channel is 20 MHz wide, and you need to go 5 channels up to get one with no overlap at all. Do the math: channel centers sit at 5 MHz intervals, so each channel spans several of its neighbors. Channels 1, 6, and 11 do not overlap with each other; likewise channels 2 and 7, channels 3 and 8, etc.

          If there's enough noise at 2412 MHz (channel 1) that it's essentially unusable, then switching to 2417 MHz (channel 2) will probably not make any difference at all... the listed frequency is the center frequency; channel 1 actually spans 2402-2422 MHz and channel 2 spans 2407-2427 MHz. Going to a higher channel that's farther away, however, probably would make a decent difference.
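The frequency arithmetic above can be sketched in a few lines. This is an illustration of the numbers quoted in the comment (5 MHz channel spacing, 20 MHz occupied width), not a full regulatory channel map:

```python
# 2.4 GHz Wi-Fi channels: centers are 5 MHz apart starting at 2412 MHz,
# but each 802.11b/g channel occupies roughly 20 MHz (center +/- 10 MHz).
def channel_span(ch):
    """Return (low, high) edge frequencies in MHz for a 20 MHz-wide channel."""
    center = 2407 + 5 * ch      # channel 1 -> 2412 MHz
    return center - 10, center + 10

def channels_overlap(a, b):
    """True if the 20 MHz spans of channels a and b share any spectrum."""
    lo_a, hi_a = channel_span(a)
    lo_b, hi_b = channel_span(b)
    return lo_a < hi_b and lo_b < hi_a

print(channel_span(1))          # (2402, 2422)
print(channels_overlap(1, 2))   # True  -- moving from 1 to 2 barely helps
print(channels_overlap(1, 6))   # False -- 1, 6, 11 are the clean choices
```

Running it reproduces the spans in the comment: channel 1 covers 2402-2422 MHz, channel 2 covers 2407-2427 MHz, and only channels 5 or more apart are disjoint.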

          That's also why lots of newer routers have a channel bonding 40MHz option... they transmit/receive on two channels at the same time, in the hopes of being able to get better bandwidth through the noise.

      • by pepsikid ( 2226416 ) on Sunday October 21, 2012 @03:20PM (#41723173)
        NO. If you use channel 2, then you're straddling channel 1 and 6, so you actually have to compete with more interference. Unless you live somewhere with no other wifi neighbors, like out in a desert or 3rd world country, never use anything but channels 1, 6, 11 or 14!
        • by NJRoadfan ( 1254248 ) on Sunday October 21, 2012 @03:28PM (#41723237)
          Use of channel 14 isn't permitted in the USA. Routers sold here disable it by default, although you can get the option back by flashing 3rd party firmware onto the router. I ran a router on channel 14 for a brief period of time to see if interference was causing connection issues. The problem I ran into is some wireless devices wouldn't work on channel 14 (like ebook readers) since the radio was region locked.
          • Re: (Score:3, Insightful)

            by Anonymous Coward

            Don't get caught by the FCC; there are some pretty hefty fines for transmitting in that reserved part of the spectrum.

        • Re: (Score:3, Interesting)

          by tehrhart ( 1713168 ) *
          I recall that the IETF meeting in Paris this year had some wifi troubles, and they ended up using overlapping channels intentionally. It would seem to me that straddling two of the non-overlapping channels would at least allow you to compete for resources in two areas - i.e. if channel 1 was flooded, you'd still have some bandwidth available in your overlap of 6 - but I could be mistaken. Reference article about IETF Paris: http://mobile.slashdot.org/story/12/03/29/140207/ietf-attendees-reengineer-their-hotel [slashdot.org]
        • by Tirs ( 195467 )

          No desert, no third world country (yet). But only ONE WiFi around: mine.

          When I first installed my router, about four years ago, I was able to reach the signal from the garage. I still can. I didn't measure numerically whether the signal degraded over the years, but for practical purposes the answer is NO: I'm still able to connect from the garage, with the same two "dots" on the "intensity" display.

          So, in my case, the signal did not degrade with time. Therefore, I think the neibourghs dep

      • by transporter_ii ( 986545 ) on Sunday October 21, 2012 @03:24PM (#41723199) Homepage

        Yeah. Just change it to channel 2 and interfere with everyone using channels 1 and 6, BECAUSE THE ONLY NON-OVERLAPPING CHANNELS on 2.4 GHz ARE 1, 6, and 11.

        I'm not an electrical/radio engineer, but I could have designed a better standard...blindfolded.

        • by TubeSteak ( 669689 ) on Sunday October 21, 2012 @04:14PM (#41723541) Journal

          I'm not an electrical/radio engineer, but I could have designed a better standard...blindfolded.

          I'd like to see you try, and I'm sure there are multinational corporations that would pay you megabucks to make it happen.
          The problem isn't the standard; it's the amount of unlicensed spectrum available at 2.4 GHz.

          The unlicensed 2.4 GHz band has 84.5 MHz of bandwidth.
          The unlicensed 5.8 GHz band has 125 MHz of bandwidth.
          The unlicensed 60 GHz band has 7 GHz of bandwidth, from 57 GHz to 64 GHz.

        • by ColdWetDog ( 752185 ) on Sunday October 21, 2012 @05:41PM (#41724071) Homepage

          I'm not an electrical/radio engineer, but I could have designed a better standard...blindfolded.

          Of course, you're a Slashdot Armchair Expert. We're just better than everybody at everything.

          Goes without saying...

      • by DJRumpy ( 1345787 ) on Sunday October 21, 2012 @03:33PM (#41723265)

        This used to work, but with WiFi now commonplace, a scan of your local neighborhood will rarely find a channel with more than 1 channel separating you from your neighbors (auto channel switching isn't aggressive enough... why is that?). The only way I've found to keep ahead of it is to invest in new frequencies as they become available. I've had the 5 GHz spectrum to myself for quite a few years, with no neighbors using it until this month; the first one popped up on my scanner a few weeks ago.

        The other factor is the quality of the equipment. I used to use Linksys, then Netgear, then tried Buffalo, and was disappointed with each, through either hardware issues or poor performance. The Linksys gear seemed to go downhill after Cisco bought them, though I'd always thought of Cisco as an industry leader (I'm not in the telecom field, so feel free to chime in). My old 10Mb switches are still working after a decade, but it seems rare to find gear that lasts that long these days.

        I finally ended up with an Apple Time Capsule, which worked well for wireless backup in a mixed environment of Windows and Macs, and since my original printer didn't have WiFi, the print server was ideal. I have a WiFi printer now, but that works as well.

        4 years later and I'm still pulling 16 MB/s (granted, with very little competition on the 5 GHz band) in mixed mode (g for the printer, and some older smartphones that can't hit 5 GHz).

        Holding out for the newest frequency, after which I'll switch again.

      • by WilliamGeorge ( 816305 ) on Sunday October 21, 2012 @03:38PM (#41723287)

        How the hell did the parent post get modded funny?

    • by afgam28 ( 48611 ) on Sunday October 21, 2012 @03:09PM (#41723099)

      I once had a router where the signal started to go bad over time. I called up the company and the tech support guy told me that most routers "wear out" after around 2 years, and that I'd need to replace it. He struggled to give a logical answer when I asked him how a device with no moving parts could wear out so quickly.

      If you're right, and if this is the standard advice being given to everyone, we're in for a huge arms race.

      • Solid state amplifiers do degrade over time, faster if poorly cooled and/or driven harder than they are designed for. Two years seems like a pretty dreadful lifespan, even for cheap shit shoved into unventilated plastic boxes; but perhaps cost sensitivity has driven us to that...

    • by girlintraining ( 1395911 ) on Sunday October 21, 2012 @03:37PM (#41723285)

      ...but the noise floor has increased substantially, degrading performance.

      Bingo! You hit the nail on the head. Wifi is now commonly found in most homes, and the overwhelming majority are b/g routers. That means everyone's "last mile" internet is running on only three non-overlapping channels (in the United States), with a maximum capacity of only 54 Mbps on each of those channels. While your effective range decreases, your signal still continues to interfere with others out to its maximum range, which is typically around 300 feet. Beyond that, it's only a decibel or so above the noise floor (about -96 dBm) and is basically ambient. So consider urban density: in a 300 foot hemisphere, how many transmitters will be in that space?

      Well, I live in a residential neighborhood that is mostly single-dwelling homes, which is about as ideal as you can get from a low-density city environment. Using a pringles can, I took a neighborhood survey and found about 26 access points within 300 feet of my home. Now, this is a survey that took several days to complete because of the marginal signal integrity, after which I drove my car in circles matching associated clients to those APs. Each access point had approximately 2.25 clients associated with it. So that's about 60 transmitting devices, in an ideal urban environment. And that's just those using wifi.

      2.4GHz is also used by: Wireless phones, microwaves, wireless "hifi" stereo systems, etc. It's also used by wireless mice/trackballs and keyboards. So, realistically, I've got at least 100 devices that are transmitting with a signal high enough to interfere with the front-end RF of my wifi.

      Shannon's Law stands tall in all of this: as you raise the noise floor, the amount of data you can transmit, regardless of encoding scheme or receiver selectivity, falls. Every device added decreases your own devices' performance.
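The "Shannon's Law" invoked here is the Shannon-Hartley theorem, C = B * log2(1 + S/N). A quick sketch of how a rising noise floor (falling SNR) pulls down the capacity ceiling of a 20 MHz channel; the SNR values are illustrative, not measurements from the post:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 20 MHz 802.11g channel as the noise floor rises around it:
for snr_db in (25, 15, 5):
    c = shannon_capacity_bps(20e6, snr_db)
    print(f"SNR {snr_db:2d} dB -> capacity ceiling {c / 1e6:6.1f} Mbit/s")
```

Each neighbor's transmitter raises N, shrinking the S/N ratio, so the ceiling drops even though your own transmit power is unchanged; that is exactly why the comment blames the environment rather than the radio.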

      I found that by setting my router to 'g' only and then forcing the bitrate down to 24 Mbps, I was able to get a much more reliable and speedy signal. Every WiFi standard is designed to cope with interference by dynamically renegotiating to a higher or lower bitrate. That would be fine if devices were isolated, but in close proximity each device broadcasts and interferes with the others, which detect this and renegotiate, generating more interference; pretty soon you've got routers constantly in a state of renegotiation, with fluctuating bitrates. Manually force your router to a specific bitrate and don't allow renegotiation, and you'll find that momentary spikes in the noise level won't wash out your signal -- renegotiation takes 10--30ms, and during that time you can't send or receive any data. The data burst that caused it is over long before the renegotiation completes.

      So in short, it's not your transmitter, it's the environment. Take your transmitter out of its default settings and enable RTS/CTS (if available) and you'll be fine. Another, more sociopathic answer is to get a 100W 2.4 GHz booster (you'll have to build it), mount it on your roof, tune it to one of the 3 non-overlapping channels (I suggest 1 or 11, since most microwave ovens tend to radiate at the middle of the band -- channel 6), and then let it run for about 3--5 days. Everyone will bail off that channel, because nothing tuned to it will operate over a distance of even a few feet. Again, very illegal, very sociopathic... but very effective. You'll have to "plow the spectrum" about once every month or two, so count on downtime.

      • by Artifakt ( 700173 ) on Sunday October 21, 2012 @04:27PM (#41723627)

        You've said quite a few interesting and even correct things in your post, but I want to add one point (or quibble, if you wish). You are probably not in an ideal urban environment, and don't want to be. An 'ideal' environment might have the broader spacing of single-family homes but a very low average income, giving both fewer access points and fewer average clients per AP. In theory, the parts of Detroit that are now more abandoned homes than inhabited ones sound 'ideal'. 26 APs at 2.25 clients each sounds like you qualify as upper middle class (if anyone does these days). You would actually see less traffic in a wealthier neighborhood, since there would be more distance between houses on average, and less in a poorer neighborhood, since fewer people would be on wireless. In addition, there's a northern/southern bias in the US: there's more brick and concrete block and less wood in low- to medium-priced residential neighborhoods in the more northerly cities, and this tends to attenuate some traffic over shorter ranges.

        I would like to know more about how you measured the number of associated clients, though. I thought some of the mixed wired/wireless hubs reported wired accesses mixed in with the wireless ones, and/or listed both active and inactive connections, without giving out the rest of the information needed to interpret what you get. If you queried my parents' wireless router, for example, and could get the info remotely, you might find 14 devices assigned an address - but two of those are Cat 5E-wired desktop machines that are currently active, and the rest are laptops and such that the various kids, grandkids, nieces and nephews have brought over when staying for a visit, still on the router's lists months later. I've had five different laptops or tablets connected through that router at various times while visiting, but never more than one on the same weekend or holiday.

        You're the first post I've come to on this thread that has mentioned the sad relationship between microwave ovens and channel 6. Other people mentioned the noise floor early in the discussion, but I didn't notice any of them actually referencing that to Shannon's Law, either. You should be modded informative a couple of times, but I won't be surprised if the 'sociopathic answer' part gets you downmodded instead.

        • by girlintraining ( 1395911 ) on Sunday October 21, 2012 @04:53PM (#41723815)
          I'm actually in a suburban city in the midwestern United States, and the neighborhood is very middle class. If I go downtown, there are over 250 access points and well over 1,500 clients visible using the same method of discovery. If I head to the 'rich' part of town, there are fewer access points, but a lot more clients connected to them on average. And the poor parts of town are usually apartment housing, which is much denser and so has a higher number of access points... but a lower number of associated clients per, on average. So overall, except for high density commercial environments, the number of devices per mile doesn't vary much based on income -- it's pretty locked in on the number of people per mile. But again, these are averages from having sampled dozens of neighborhoods. In any given neighborhood, you're going to find locations that are very dense and others almost barren, sometimes separated by only a few hundred feet. Statistics, you know...

          I would like to know more about how you measured the number of associated clients though.

          I cheated, and it was probably illegal, though petty. I salvaged a grey fiberglass enclosure pod previously used for housing cable TV equipment, and then modified it so it could be quickly and discreetly attached to a telephone pole. Inside was a microATX computer with a pair of SD cards to boot off of, underclocked and undervolted, the fans stripped off and replaced with large passive fins. Water-resistant wire netting was added along the inside (the part facing the pole) and thermally bonded to oversized heatsinks. Two wifi antennas were mounted along the front at 45 degree angles (to catch both vertically and horizontally-aligned signals) and attached via SMC connectors to adapters. One was 'a', the other was 'a/b/g/n'. Last was a special diagnostic adapter with three antennas which cost an arm and a leg: a spectrum analyzer designed to identify sources of high EMR and localize them. The remaining space was packed tight with high capacity deep cycle batteries. The unit could run for about 86 hours before it ran out of juice.

          It recorded every MAC address and associated BSSID, etc., as well as encoding and some other properties. You can catch new clients because before they associate, they broadcast the SSID they want to connect to in the clear. Over a period of three days it collected all that information and then wrote it out to the SD cards. Drag it home, hook it up to the car charger and pull the cards. Rinse, wash, repeat.
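
          The capture itself needs monitor-mode hardware (and raises the legal issues mentioned), but the client-counting step is just offline aggregation. A minimal sketch over hypothetical (client MAC, BSSID) sightings like the ones the pod logged — every address below is made up:

          ```python
          from collections import defaultdict

          # Hypothetical log records: (client MAC, BSSID the client associated with)
          observations = [
              ("aa:bb:cc:00:00:01", "de:ad:be:ef:00:01"),
              ("aa:bb:cc:00:00:02", "de:ad:be:ef:00:01"),
              ("aa:bb:cc:00:00:01", "de:ad:be:ef:00:01"),  # same client seen twice
              ("aa:bb:cc:00:00:03", "de:ad:be:ef:00:02"),
          ]

          clients_per_ap = defaultdict(set)
          for client_mac, bssid in observations:
              clients_per_ap[bssid].add(client_mac)  # sets de-duplicate repeat sightings

          counts = {bssid: len(macs) for bssid, macs in clients_per_ap.items()}
          print(counts)
          ```

          De-duplicating on MAC is what separates "associated clients" from raw packet counts — which is also why stale DHCP leases (the 14-device example above) would overcount if you only read the router's address table.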

          You're the first post I've come to on this thread that has mentioned the sad relationship between microwave ovens and channel 6. Other people mentioned the noise floor early in the discussion, but I didn't notice any of them actually referencing that to Shannon's Law, either. You should be modded informative a couple of times, but I won't be surprised if the 'sociopathic answer' part gets you downmodded instead.

          I wouldn't be surprised either. Slashdot ought to just replace their mod system with "like" and "dislike", since that's really what it boils down to. I frequently get downmodded for saying something that is technically correct or possible, without addressing the ethical considerations. I prefer to tell people the whole truth and let them make their own choices, rather than leaving out critical information because I don't trust them enough not to do something stupid. People need to know what's possible, not just what's legal or ethical. This idea was summed up beautifully for me years ago on a forum far, far away, when someone wrote "You can't expect a terrorist to care that his car bomb is taking up two parking spaces." Criminals don't give a damn about you or the law; they are there to grab the low-hanging fruit, and unless you know what's possible, how they operate, you can't defend against attacks.

          You can't be a good white hat without having worn the black hat.

  • Obligatory (Score:3, Interesting)

    by SuperMooCow ( 2739821 ) on Sunday October 21, 2012 @03:05PM (#41723061)

    It could be the noise floor going up near your house, or just planned obsolescence [imdb.com].

  • by pentabular ( 2609873 ) on Sunday October 21, 2012 @03:05PM (#41723071)

    ..have a tendency to degrade and fail over time.

    • by TechyImmigrant ( 175943 ) on Sunday October 21, 2012 @03:29PM (#41723241) Homepage Journal

      >cheap electrolytic capacitors have a tendency to degrade and fail over time.

      Not significantly over 2 years, and you don't use electrolytics in the IF/RF signal path in 2.4 & 5.8 GHz radios.
      I don't think electrolytics are it.

      I have my suspicions about the noise figure of LNAs changing over time. There are some very highly strung, teeny weeny transistors in LNAs (Low Noise Amplifiers) right in the signal path.

      • >cheap electrolytic capacitors have a tendency to degrade and fail over time.

        Not significantly over 2 years, and you don't use electrolytics in the IF/RF signal path in 2.4 & 5.8 GHz radios.

        True, but you do use them in your cheap switch-mode power supply, and as they degrade, you get additional AC noise on the rails of your amplifiers that are in the IF/RF signal path. Particularly in cheap routers that are operating near the limits of their amplifiers, voltage drops on the rails could cause clipping of the high frequency signal, which will result in dropped packets, required rebroadcasting, etc.

      • by EmagGeek ( 574360 ) <(gterich) (at) (aol.com)> on Sunday October 21, 2012 @04:33PM (#41723685) Journal

        Electrolytic capacitors can very well be it, even if they are not used in the RF signal path.

        They are used in power supplies, and a drying out electrolytic can cause a number of things:

        1) Lessened capacitance leads to reduced p/s output capacity, which can cause voltage to droop if the supply hits maximum duty cycle

        2) Drying-out caps have higher ESR, so they dissipate more heat at a given ripple current, which consumes more of the power supply's capacity and can also lead to drooping output

        3) Lower capacitance can lead to increased ripple voltage on the output, and that ripple can leak into the RF signal path, increasing the symbol error rate, and even causing frequency drift in the VCO
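
        Points 1 and 3 are easy to put numbers on with the standard bulk-capacitor ripple approximation V ≈ I / (f·C). A sketch with illustrative values — the 100 kHz switching frequency and the capacitances are assumptions, not figures from any particular router:

        ```python
        def ripple_voltage(load_current_a, switching_freq_hz, capacitance_f):
            """Peak-to-peak ripple approximation V = I / (f * C) for a bulk output cap."""
            return load_current_a / (switching_freq_hz * capacitance_f)

        new_cap = ripple_voltage(1.0, 100e3, 1000e-6)  # healthy 1000 uF cap
        aged_cap = ripple_voltage(1.0, 100e3, 300e-6)  # dried out to 300 uF

        print(f"new: {new_cap * 1e3:.0f} mV, aged: {aged_cap * 1e3:.1f} mV")
        ```

        Losing two thirds of the capacitance triples the ripple on the rail, and that AC garbage is what ends up modulating the RF stages downstream.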

      • by artor3 ( 1344997 ) on Sunday October 21, 2012 @04:58PM (#41723855)

        I don't think so. I'm an RF electronics engineer, and we do all sorts of accelerated stress testing to check out the second half of the bathtub curve, and I've never seen degradation on anything in the RF/IF path inside the chip. The typical "wear out" failure mode is cracks forming in the protective seals around the chip and letting in moisture (from the air), which causes leakage (high current draw) and eventually just shorts out something important and the chip dies. At any rate, that shouldn't be happening in just a couple years, unless the submitter moves his PC back and forth between the freezer and the sauna every week.

        One possible explanation would be crystal aging. RF equipment tends to rely on extremely accurate quartz crystals to provide reference frequencies. Those crystals tend to drift as they age. If the design was already near the edge of acceptable frequencies, an extra 10 or 20 ppm from aging could easily result in several dB of degradation.
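
        The drift arithmetic is worth spelling out. A quick sketch, using channel 6's center frequency (2.437 GHz) and a 20 ppm aging error as illustrative numbers:

        ```python
        def drift_hz(nominal_hz, ppm):
            """Frequency offset produced by a given parts-per-million reference error."""
            return nominal_hz * ppm / 1e6

        # 20 ppm of crystal aging, multiplied up to the 2.437 GHz carrier:
        offset = drift_hz(2.437e9, 20)
        print(f"{offset / 1e3:.1f} kHz off-center")  # ~48.7 kHz
        ```

        Tens of kHz sounds small against a 20 MHz channel, but if the design margin was already spent, it's enough to degrade the demodulator's lock.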

        Another poster pointed out the possibility of the router using a crappy switched-mode power supply, which is also a good explanation. I would hope that they would power the RF chip through a linear regulator with good noise rejection, but who knows? That sort of switching noise can absolutely interfere with the radio's performance.

  • Magic Smoke (Score:4, Funny)

    by lobiusmoop ( 305328 ) on Sunday October 21, 2012 @03:08PM (#41723085) Homepage

    Obviously the magic smoke, although not released suddenly, does gradually leach out of the components, leading to loss of performance over time.

  • by rabtech ( 223758 ) on Sunday October 21, 2012 @03:08PM (#41723087) Homepage

    Could the analog components of the amplifier/filter circuits be degrading? If capacitors are leaking, etc then that would definitely make the performance decrease but maybe not enough to completely stop working.

    You should consider another option: older equipment may not have firmware as good at dealing with congestion (802.11n helps with this), or maybe the new box has 5 GHz, which has far fewer interference issues? Maybe the real degradation was the neighbors installing access points? You may also have had certain pieces of gear installed that interacted badly with your access point (some of them have really awful firmware or very loose implementations of the standard).

    These are just guesses... I haven't personally had any degradation except for interference in the 2.4 GHz band. When I bought this house, devices would only detect my network and maybe one other. Now seven show up. Interference isn't just a problem in apartments anymore.

  • by jfb2252 ( 1172123 ) on Sunday October 21, 2012 @03:10PM (#41723103)

    This is a hypothesis based on peripheral involvement with analog and digital RF at 0.5 and 1.5 GHz for twenty years.

    AFAIK, the output stage of anything broadcasting above about 2 GHz has to be analog, with the lower frequency signal mixed onto a carrier at the higher frequency. Digital synthesizers and chips which can deal with 1.5 GHz directly are still very expensive and are unlikely to be used in consumer routers. So the final output stage is likely an analog RF transistor.

    Analog transistors change characteristics with age at elevated temperature, where elevated is anything over 20C. Implanted ions diffuse with time and temperature, changing junction characteristics. The small structures required by high frequencies are more sensitive to such things.
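
    The temperature dependence is usually modeled with the Arrhenius relation — diffusion and other degradation rates scale as exp(-Ea/kT). A sketch comparing a router at room temperature with one baking at 65 C; the 0.7 eV activation energy is an assumed, mechanism-dependent value, not a spec:

    ```python
    import math

    BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

    def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
        """Arrhenius acceleration factor between two operating temperatures.
        ea_ev is an assumed activation energy; real values vary by failure mechanism."""
        t1 = t_use_c + 273.15
        t2 = t_stress_c + 273.15
        return math.exp((ea_ev / BOLTZMANN_EV) * (1 / t1 - 1 / t2))

    # A router running at 65 C ages roughly this much faster than one at 25 C:
    print(f"{arrhenius_af(25, 65):.1f}x")
    ```

    With those assumptions, a 40 C rise accelerates aging by a factor of roughly 25 — which is why the hot, stacked routers in the comment below are a plausible culprit.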

    • by pongo000 ( 97357 )

      Analog transistors change characteristics with age at elevated temperature, where elevated is anything over 20C.

      Ever notice how hot wireless routers get, especially when they are stacked? I find this to be the most plausible explanation yet posted...

    • by NixieBunny ( 859050 ) on Sunday October 21, 2012 @04:43PM (#41723749) Homepage
      The RF transistors used to make a sizable fraction of a watt at 2.4 GHz tend to be made with many tiny transistor junctions in parallel on one die. Individual transistor junctions can fail, causing the output power to be reduced yet still non-zero.
  • by Anonymous Coward on Sunday October 21, 2012 @03:12PM (#41723117)

    Check the power supply. Usually the electrolytic capacitors are already dry.

  • by Anonymous Coward on Sunday October 21, 2012 @03:12PM (#41723121)

    Not sure about wireless gear, but some devices (e.g. printers, light bulbs, fridges) are designed to break after a certain period of time, so that you would buy a new one.

  • by carlhaagen ( 1021273 ) on Sunday October 21, 2012 @03:13PM (#41723127)
    I've been using wifi instead of ethernet for about 7 years now. Almost all of the NICs/APs I've used have displayed this problem with time. It's as if the equipment somehow develops creeping signal attenuation. My guess is that it's something relating to capacitors gathering a slow overcharge of some sort, causing them to block current in a growing fashion - I seem to recall this being possible from my early days of electronics studies.

    Anyhoo, I fix the problem by simply switching the equipment to another channel, say, 3-4 steps away, to make sure the frequency some of the components will be switching at will be notably different. So far it has worked with all equipment I've had this problem show up on. After a while the signal attenuation develops on the new channel as well, upon which I simply switch back to the one I used before. Rinse, repeat.
  • by Anonymous Coward on Sunday October 21, 2012 @03:16PM (#41723143)

    In my experience power adaptor degradation is the main culprit. Over time the adaptor will provide lower voltages and a less stable current. This translates into lower signal output and higher noise, respectively. I've seen a bad adaptor turn a repeater into a signal jammer - trust me, that was not an easy issue to troubleshoot...

  • by Tastecicles ( 1153671 ) on Sunday October 21, 2012 @03:22PM (#41723181)

    1. slow burnout of emitter gear due to thermal degradation (yes, clock chips and transistors get hot, as do solder tracks and joints). Thermal runaway can occur if a solder joint fails and arcs, or overvoltage causes signal tracks to vapourise.
    2. ionising radiation, particularly on unshielded components such as antenna conductors (I've seen something like this occur on an externally mounted amateur radio antenna: the sunward side of the antenna completely degraded, the result being that the only signals received (or sent) were on the shadow side).
    3. component quality on consumer gear is not as stringent as it could be. Components can and do fail, and considering the number of components in a lot of consumer gear, it's a wonder any of it actually leaves the factory.
    4. the noise floor of several years ago was far, far lower than it is now. The ERP of newer gear is (by design or by necessity) higher than older gear as more and more transmitters have to share the band. As a result, the signal quality taking a dive may be at least partly illusory. The equipment may actually be perfectly fine.
    5. parasitic structures in semiconductor packages may be the catalyst for failure, either immediate or delayed. Such structures may be as small as a single atom of chlorine embedded in a crystal of germanium - innocuous at first (undetectable, even), but over time and use, that contamination will alter the chemistry of the semiconductor, possibly causing it to bond with the package material and rendering it useless. This might not even be an issue in high powered gear like regulators but in something like a microprocessor, it's a showstopper.

  • by zrbyte ( 1666979 ) on Sunday October 21, 2012 @03:38PM (#41723291)

    Eating away at the PCB!

  • Old and tired. (Score:4, Informative)

    by SuperTechnoNerd ( 964528 ) on Sunday October 21, 2012 @03:49PM (#41723385)
    Perhaps it's frequency drift. As components age their values change slightly. And when dealing with 2.4 Ghz and above tolerances are strict. It's just my guess.. Take it or leave it.
  • by __aaqvdr516 ( 975138 ) on Sunday October 21, 2012 @04:24PM (#41723609)

    The poor ministrations of the duties of the tech priests leads to decay. It's either that or it's been touched by Nurgle.

  • by mevets ( 322601 ) on Sunday October 21, 2012 @04:44PM (#41723765)

    and nobody asked:
    "Did you measure the signal quality?"

    There are lots of apps that show you various signal parameters for most platforms (Linux, iOS, Windows). Even if you don't know what the signal parameters mean, the magnitude of difference between an old and new router can tell you something.
    I think, if there is no difference, it is perception based; otherwise an overworked flux capacitor.
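
    A sketch of what "magnitude of difference" means on the dBm scale those apps report — the two RSSI readings below are hypothetical, not measurements:

    ```python
    def dbm_to_mw(dbm):
        """Convert an RSSI reading in dBm to milliwatts."""
        return 10 ** (dbm / 10)

    old_router = -75  # hypothetical reading from the aging router
    new_router = -55  # hypothetical reading from its replacement, same spot

    ratio = dbm_to_mw(new_router) / dbm_to_mw(old_router)
    print(f"{ratio:.0f}x more power at the receiver")  # 20 dB difference = 100x
    ```

    Because the scale is logarithmic, a 20 dB gap between old and new readings is a hundredfold power difference — hard to write off as perception.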

  • by krray ( 605395 ) on Sunday October 21, 2012 @04:51PM (#41723801)

    Go get some Apple AirPort Expresses.
    Note: I'm an Apple fanboy and heavily invested. :)

    I've tried DLink, Linksys, Cisco [which works, but on the $$$ corporate level], a few others, and Zyxel. Zyxel came close -- but the configuration has to be specific [repeater talk to SSID w/ specific MAC id]. The default quick setup could leave the sub-routers chattering amongst themselves... But I digress.

    The AirPorts at $99 pay for themselves in setup alone. And frankly, they "just work". Unlike all the others, the AirPort DOES PROPERLY PASS ALONG MULTICAST THROUGHOUT THE NETWORK. All the other products' sub-routers dropped multicast. No more AirPrint, AirVideo, etc... Yeah -- there's a ton of iOS devices along with Macs involved on my networks now. :)

    They can dynamically be set up as a sub-sub-repeater, and you can wander the network rather seamlessly. I've just recently gone through this headache, and with the AirPorts they will *OWN* the area I want to cover -- add AirPorts as needed for signal strength / coverage. Just did a 6,000 sq ft house -- all three floors, my home, and the office at 18,000 sq ft plus yard coverage [as the bay doors are opened :-].

    Amazing product.

  • Monster Cables(tm) are much better for carrying electric audio signals, because they are outrageously expensive.

    You need to buy cans of Monster Air to spray around your house.

    You can find the product on their web site, and you really will be able to hear the difference on your wireless connections.


  • WRT54G routers (Score:5, Interesting)

    by XB-70 ( 812342 ) on Sunday October 21, 2012 @06:51PM (#41724401)
    Six years ago, I took some tupperware cereal containers, drilled two holes in the bottom and pushed one of a WRT54G router's twin antennae through each hole. I caulked both holes. I inverted the containers, took an old broom stick and jammed it up into the containers, and screwed each router up high outdoors on buildings spaced 400' apart. I configured all three routers with DD-WRT in bridged/AP mode. I have never taken them down. This is in Vermont. They work great and cover some 18 acres. What more can I say?

Always leave room to add an explanation if it doesn't work out.