
How Bad Can Wi-fi Be?

An anonymous reader writes "Sunday night in the UK, the BBC broadcast an alarmist Panorama news programme that suggested wireless networking might be damaging our health. Their evidence? Well, they admitted there wasn't any, but they made liberal use of the word 'radiation', along with scary graphics of pulsating wifi base stations. They rounded up a handful of worried scientists, but ignored the majority of those who believe wifi is perfectly harmless. Some quotes from the BBC News website companion piece: 'The radiation Wi-Fi emits is similar to that from mobile phone masts ... children's skulls are thinner and still forming and tests have shown they absorb more radiation than adults'. What's the science here? Can skulls really 'absorb' EM radiation? The wifi signal is in the same part of the EM spectrum as cellphones but it's not 'similar' to mobile phone masts, is it? Isn't a phone mast several hundred/thousand times stronger? Wasn't safety considered when they drew up the 802.11 specs?"
This discussion has been archived. No new comments can be posted.

  • WiFi is microwaves (Score:5, Informative)

    by QuietLagoon ( 813062 ) on Tuesday May 22, 2007 @09:44AM (#19220423)
    Can skulls really 'absorb' EM radiation?

    802.11b/g uses 2.4GHz radio waves. That's the same frequency range as microwave ovens. Microwave ovens work because the microwaves are absorbed by the bonds in the water molecules of food (which is why dry food does not cook in microwave ovens).

    So yes, human tissue that contains water can absorb WiFi radiation. That is a fact.

    What is not known is: how much absorption of that radiation is bad for the kids?

  • by peppy ( 312411 ) on Tuesday May 22, 2007 @09:45AM (#19220443)
    And in other news from the BBC http://news.bbc.co.uk/2/hi/technology/6676129.stm [bbc.co.uk]
  • by swschrad ( 312009 ) on Tuesday May 22, 2007 @09:46AM (#19220455) Homepage Journal
    not a pretty sight, is it?

    the FCC has specifications of radiation density versus frequency that are limits in their rulebooks, limits used to isolate access to radio facilities from microwaves to commercial broadcasters... to ham radio operators burning electrons in the basement. these have been codified by medical research. if you're going for an advanced ham license, you have to study the milliwatts-per-square-centimeter limits, the question occasionally comes up on the test.

    so there are 3/4 million americans who know this, not just ten academics in the tower.

    where the hell did this whining of Luddites come from, and why wasn't it left there?
  • by supersnail ( 106701 ) on Tuesday May 22, 2007 @09:47AM (#19220473)
    Here [theregister.co.uk]

    Basically, in the old country they have a government official who is unprepared to admit that radio waves, mobile phones, etc. are safe, no matter what the evidence.

     
  • by QuietLagoon ( 813062 ) on Tuesday May 22, 2007 @09:54AM (#19220577)
    802.11a uses the 5GHz range, out of the way of microwave ovens.

    2.4GHz was used because it was available for use, i.e., it would not interfere with frequencies already allocated to other services in the microwave area.

    In other words, the thought process (if you can call it that) was not, "let's find a frequency for 802.11b that is free of interference from other sources". It was more along the lines of, "let's find a frequency for 802.11b so that 802.11b won't mess up anything of import" -- i.e., microwave ovens don't really care about interference from your wireless router.

    By the way, the same "thought" process was used to pick a frequency for the 2.4GHz wireless phones.

  • by Shakrai ( 717556 ) on Tuesday May 22, 2007 @09:54AM (#19220585) Journal

    I've always wondered why these networks use 2.4GHz radio waves.

    I think it mainly had to do with the fact that the same part of the 2.4GHz band is open for unlicensed use globally. The other unlicensed ISM (industrial-scientific-medical) bands in the United States are used for other stuff in other nations. The easiest example is 900MHz. Part of it is available for unlicensed use in the United States. But as anybody with a quad-band GSM phone knows, that's a cellular band in most of the rest of the world.

  • Re:Eek! (Score:3, Informative)

    by Xest ( 935314 ) * on Tuesday May 22, 2007 @09:54AM (#19220593)
    You obviously didn't see the program, one person in it complaining wifi gives her headaches had covered her entire room in tin foil to protect her from it all :p
  • by morgan_greywolf ( 835522 ) * on Tuesday May 22, 2007 @10:06AM (#19220787) Homepage Journal
    Typical 802.11b/g: 1 mW to 100 mW
    Typical microwave oven: 750 W to 1500 W (750,000 to 1,500,000 mW)

    Big difference.
  • Re:FUD (Score:3, Informative)

    by mdsolar ( 1045926 ) on Tuesday May 22, 2007 @10:07AM (#19220801) Homepage Journal
    So, your WiFi is 1 meter away and the cell tower is 1 kilometer away: which delivers more power where you are? Take the cell tower's output and divide by a million (1000^2) and you'll see that WiFi yields greater exposure. Doesn't mean there is a problem, but it is not just power level at the antenna that is important.
    --
    Fusion power from your roof: http://mdsolar.blogspot.com/2007/01/slashdot-users-selling-solar.html [blogspot.com]
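    The inverse-square comparison above can be sketched in a few lines of Python. The power figures (100 mW for a Wi-Fi access point, 20 W per channel for a cell tower) are illustrative assumptions, not measured values:

    ```python
    # Free-space, inverse-square comparison of a nearby Wi-Fi AP
    # versus a distant cell tower. Power levels are illustrative
    # assumptions: ~100 mW for the AP, ~20 W per channel for the tower.
    import math

    def power_density(p_watts, r_meters):
        """Power density (W/m^2) of an isotropic radiator at distance r."""
        return p_watts / (4 * math.pi * r_meters ** 2)

    wifi = power_density(0.100, 1.0)      # Wi-Fi AP, 1 m away
    tower = power_density(20.0, 1000.0)   # cell tower, 1 km away

    print(f"Wi-Fi at 1 m:  {wifi:.2e} W/m^2")
    print(f"Tower at 1 km: {tower:.2e} W/m^2")
    print(f"Ratio (Wi-Fi / tower): {wifi / tower:.0f}")
    ```

    With these assumed numbers the nearby AP delivers roughly 5000 times the power density of the distant tower; different assumed powers shift the ratio, but not the inverse-square conclusion.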
  • by Ellis D. Tripp ( 755736 ) on Tuesday May 22, 2007 @10:07AM (#19220813) Homepage
    The operating frequency of microwave ovens was chosen to be in an unlicensed (ISM) frequency band, that would provide good penetration into foods, and lent itself to the mass production of inexpensive magnetron tubes.

    The lowest resonant frequency for a water molecule is 22.235 GHz, or nearly 10X the operating frequency of a microwave oven.
  • by Moby Cock ( 771358 ) on Tuesday May 22, 2007 @10:08AM (#19220831) Homepage
    And the oven is a resonant cavity. Huge difference.
  • by Flying pig ( 925874 ) on Tuesday May 22, 2007 @10:13AM (#19220903)
    Try re-reading Richard Feynman's lecture on Cargo Cult Science, with its demolition of "experiments" without controls and its account of how people kept on doing pointless lab rat experiments after the methodology was debunked. It's a sad saga, and just as true today after so much "progress".

    Unfortunately, in the UK at least, the number of scientifically trained journalists can probably be counted on one of Ben Goldacre's fingers.

    Interesting that none of the phone mast posts seem to have remembered the inverse square law - sorry if you did and I missed you - which means that radiation levels at the ground are a tiny fraction of what you get from the phone. And that nobody has mentioned all the radiation we used to get from TV and radio sets. As I recall, the radiation you get from an old tube superhet set (from the IF) is much more intense than the radiation from WiFi. It is lower frequency, but then the skin effect is less, and as anybody who ever played about with NMR will recall, VHF does things to organic molecules.

    We'd better take action now. Let's get rid of all that nasty radioactivity - oops, Madam, there go your granite kitchen work surfaces and your low-sodium salt. And all the radiation sources, beginning with the most intense. So we've now turned off the Sun, mobile phones, radio, TV, electrical generation. We can't use coal (have you looked at what you get in the ash?). So we can just sit in the dark and freeze.

    As for the leukaemia cases - I have long believed that a far more convincing explanation is exposure to farm chemicals, pesticides, and the new virus and bacterial strains resulting from population movement. It is possible that farming overspray with chemicals which have been subsequently banned is a more probable cause of leukaemia clusters than, say, living near a rural electrical supply line. In the UK, and probably in the US too, the parts of Government which deal with farming tend to be extremely secretive and their decisions are often hard to understand. To my mind, they are far more likely to suppress information about such things than the relatively open parts of Government which deal with non-farming health and safety.

  • by littleghoti ( 637230 ) on Tuesday May 22, 2007 @10:15AM (#19220931) Journal
    Actually, 2.45 GHz isn't the maximum of the absorbance for microwaves. If it was, all the energy would be dumped at the surface of food, and there would be virtually no penetration. Water absorbs over a broad spectral range, at least in the liquid phase, where quantised rotational bands can be ignored.

    And what you say about the different energies of radiation is mostly true, although EM radiation covers a range that includes UV, x-rays and gamma radiation, which are not very good for you.
  • by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Tuesday May 22, 2007 @10:16AM (#19220951) Homepage
    I believe there is a scientific reason for the ISM band being there - I think water has a bit of an absorption peak in the 2.4 GHz region.

    For this reason, 2.4 GHz wasn't too hot for long-haul communications due to water vapor in the air, so no one was in a rush to license spectrum for it, and no one fought designating it as an "Industrial, Scientific, Medical" band. (with the primary use in all three of those categories being to take advantage of that water absorption peak for heating.) Now, because the band is such a cesspool, no one minded allowing low-power unlicensed communications in that band.

    Now, as to the health effects of this - Yes, the water in your body is more likely to absorb 2.4 GHz RF. No, that absorption will not do any cumulative damage. Absorbing 2.4 GHz RF will make the water molecules in your body vibrate a little more (i.e. it will heat you up.) At high powers, this does become dangerous as the heat basically cooks you from the inside (just like a microwave oven). At low powers (with 802.11 being a great example), the body is able to safely dissipate the heat rapidly enough so that not only is no damage done, the change in temperature at any point in the body is negligible. You're more likely to get burned by touching the heatsink of the RF amp than you are by touching a circuit trace carrying RF at those power levels.

    RF radiation is nothing like nuclear radiation - the critical difference is that nuclear radiation is ionizing, that is to say that it can not only vibrate molecules a bit, but it has enough energy to alter them. This has the effect of "flipping bits" in your DNA and other such nasty stuff. Since "bit flipping" can have cumulative effects, low levels of ionizing radiation can be dangerous in the long term, because the damage accumulates. With RF, it doesn't unless power levels are so high as to induce temperatures that cause thermal damage.

    Prior to graduate school, I worked at a company that built RF power amplifiers for cell towers (30-45W average power output), and many of my coworkers had been working with microwave RF amps since the very first cell system Motorola deployed. (Yes, we had some ex-Motorola old hands there, who had interesting stories from the early days when the system designers were also heavily involved with the installation process of new base stations.) No health problems whatsoever.

    Since graduate school, one of the tasks of my department is taking equipment through EMI testing. We're frequently right at OSHA RF exposure limits - no health problems with any of us (Well, at least no new ones that weren't preexisting conditions), even our mentor who has been doing this for 20-30 years.
  • rubbish! (Score:1, Informative)

    by ico2 ( 817589 ) <ico2ico2@gmIIIail.com minus threevowels> on Tuesday May 22, 2007 @10:19AM (#19220997)
    The frequency of wireless networks is well below that of visible light, which is not dangerous. EM radiation only begins to become dangerous at around the frequency of UV.
    The only possible danger (as with microwave ovens) is a heating effect, but the transmitters are far too weak, and it would be relatively easy to prove if they were powerful enough.

  • by kebes ( 861706 ) on Tuesday May 22, 2007 @10:28AM (#19221129) Journal

    Planck's constant is so small that interactions between electromagnetic waves and molecules cannot be chemically specific.
    What do you mean by that? If that were true, then spectroscopy [wikipedia.org] wouldn't be possible. Different molecules do indeed interact with the EM-spectrum quite differently. They absorb at different wavelengths, and exhibit other effects (like Raman scattering [wikipedia.org]) that are indeed chemically-specific. In fact, spectroscopy is the most common way of identifying chemical species.

    Different parts of the EM-spectrum probe different aspects of molecules. (Visible light probes electronic structure, infrared light interacts with molecular vibrations, etc.) Even the radiofrequency range of the spectrum interacts with molecules in a chemically-specific way: microwave-region EM-radiation probes the rotational modes of molecules, and radiofrequency spectroscopy can also probe nuclear states (see NMR [wikipedia.org]).

    If I've misunderstood what you meant, please set me straight.

    (By the way, I do agree that the energy from a WiFi signal will be absorbed by most common materials and lead to a barely noticeable increase in temperature. But that doesn't mean that the process is not chemical-specific. For instance, some materials will absorb more of the WiFi signal than others.)
  • by ishmalius ( 153450 ) on Tuesday May 22, 2007 @10:32AM (#19221215)
    I stand corrected. I found out that my knowledge of the topic was totally wrong:
    http://en.wikipedia.org/wiki/Microwave_oven [wikipedia.org]
  • by malsdavis ( 542216 ) on Tuesday May 22, 2007 @10:38AM (#19221341)
    Panorama isn't a "news" program, it's an investigative reporting program, which is quite different.

    Towards the end of the program in question, they did start to admit more and more that there is absolutely no evidence or even much likelihood of harm from Wi-Fi, which was good although it was maybe too little, too late. My (and I think many others') main issue with the program was their over-use of the scare word "radiation" in a way that implied every Wi-Fi router is a mini unshielded nuclear reactor.

    But, I've seen many far worse "this common piece of technology is going to kill us all" programs on TV and was really expecting it to be far more "scare story" like than it actually was.
  • Re:Eek! (Score:4, Informative)

    by metamatic ( 202216 ) on Tuesday May 22, 2007 @10:42AM (#19221399) Homepage Journal
    7 real studies have been done.

    The "electrosensitive" crackpots couldn't detect a mobile phone signal even after 50 minutes of continuous exposure.

    http://www.bmj.com/cgi/eletters/bmj.38765.519850.5 5v1 [bmj.com]

    It could be psychosomatic, it could be some other mental or physical illness, but it isn't EM radiation that's making them ill.
  • by Rocketship Underpant ( 804162 ) on Tuesday May 22, 2007 @10:43AM (#19221423)
    Guess what:

    1. Your body absorbs EM radiation from the infra-red band! Also known as heat, IR sources are everywhere and can eliminate the need for you to wear thick clothing.

    2. Your skin absorbs EM radiation from the optical spectrum! Black people are particularly vulnerable to this type of radiation absorption.

    3. Your skin absorbs radiation from the UV spectrum! Millions of people develop tans and synthesize vitamin D every year due to UV radiation absorption.

    Notice that in all these cases, we're talking about the conversion of energy to *heat* by the absorbing tissue. Raising an alarm about this is like getting up in arms about the dangers of "dihydrogen monoxide". In fact, radio-band emissions are even lower-energy than the energy spectra listed above, and are thus generally even more benign.

    Dangerous radiation is high-energy ionizing radiation, like that found in the X-ray and gamma spectra. Such radiation has the capacity to damage cell DNA and cause radiation sickness, but that's a completely different animal than what this article is dealing with.

  • by anticypher ( 48312 ) <anticypher.gmail@com> on Tuesday May 22, 2007 @10:48AM (#19221493) Homepage
    This has been the way of the BBC for as long as anyone can remember.

    There are two sides to every story. Exactly TWO. Two diametrically opposed sides. Never a third. Never just one. Always TWO. No shades of grey permitted. No announcing a discovery without finding a skeptic to denounce it.

    If 99 scientists were to state that the sky is blue, the BBC would go out of their way to find some crackpot to claim the sky is actually red. And then give the two sides equal standing.

    Worse, Panorama has never been held up even to the standards of the BBC, as they go after the tabloid illiterate crowd.

    the AC
  • by amper ( 33785 ) * on Tuesday May 22, 2007 @10:50AM (#19221525) Journal
    Oblig. citation...

    http://www.fcc.gov/Bureaus/Engineering_Technology/Documents/bulletins/oet56/oet56e4.pdf [fcc.gov]

    see page 15 for limits on acceptable uncontrolled exposure in the relevant frequency range (1 mW/cm^2).
  • by amper ( 33785 ) * on Tuesday May 22, 2007 @10:56AM (#19221651) Journal
    First of all, just because a microwave oven dissipates 1500 W of power, that doesn't mean that it actually *radiates* 1500 W of power. Second of all, the FCC has guidelines for microwave oven emissions. Total leakage at the time of manufacture is limited to 1 mW/cm^2, and 5 mW/cm^2 over the lifetime of the unit. This generally falls into the acceptable ANSI/IEEE C95.1-1992 guidelines for exposure, given that microwave oven usage is generally intermittent.
  • by richard.cs ( 1062366 ) on Tuesday May 22, 2007 @11:19AM (#19222007) Homepage

    RF radiation is nothing like nuclear radiation
    Except, you know...the nuclear radiation that is RF radiation...which is all of it.

    There are 3 forms of nuclear radiation, two of which are particles and have *nothing* in common with RF radiation whatsoever. Then there's gamma, which is electromagnetic but has wavelengths about ten orders of magnitude shorter than microwave radiation. The energy per photon is hence around 10**10 times greater. You could argue that the total energy emitted by a large microwave transmitter can be higher than that from a gamma source, but this only affects its ability to heat things. To cause molecular changes, many microwave photons would have to strike the same molecule in a short enough time that the energy is not re-radiated. In practice this is so unlikely as to never happen.
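    A quick sanity check on the orders-of-magnitude figure above, taking an illustrative ~1 pm gamma-ray wavelength against the 2.4 GHz Wi-Fi wavelength:

    ```python
    # Wavelength (and hence per-photon energy) ratio between an assumed
    # ~1 pm gamma ray and a 2.4 GHz microwave photon.
    c = 3.0e8                          # speed of light, m/s
    gamma_wavelength = 1e-12           # ~1 pm, illustrative hard gamma ray
    microwave_wavelength = c / 2.4e9   # 2.4 GHz -> ~0.125 m

    ratio = microwave_wavelength / gamma_wavelength
    print(f"wavelength ratio: {ratio:.1e}")
    ```

    With this assumed gamma wavelength the ratio comes out near 1e11; softer gammas land closer to 1e10, consistent with the "about ten orders of magnitude" claim.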

  • by j_square ( 320800 ) on Tuesday May 22, 2007 @11:32AM (#19222227)

    I believe there is a scientific reason for the ISM band being there - I think water has a bit of an absorption peak in the 2.4 GHz region.

    For this reason, 2.4 GHz wasn't too hot for long-haul communications due to water vapor in the air, so no one was in a rush to license spectrum for it, and no one fought designating it as an "Industrial, Scientific, Medical" band. (with the primary use in all three of those categories being to take advantage of that water absorption peak for heating.)
    This is plain wrong. Water vapor has an absorption peak at about 22 GHz. Liquid water has a very broad resonance centered around 10 - 30 GHz (depending on temperature).
    The ISM frequency band around 2.4 GHz is a trade-off between absorption, penetration depth, uniformity of heating, and availability of cheap sources (magnetrons), and is thus just a regulatory thing. There was an alternative band at 905 MHz as well.
    2-8 GHz (S- and C-band) is actually optimum for low-noise operation (e.g. deep-space probe comms) due to the absorption loss from atmospheric gases, background radiation, etc. being minimum here.

    Please stop propagating myths, and please stop labeling junk like this as "informative".
  • Re:Eek! (Score:3, Informative)

    by justasecond ( 789358 ) on Tuesday May 22, 2007 @11:49AM (#19222513)

    They did however mention that Sweden recognises electro-sensitivity as an official disability

    The show's out of date then. There was a WSJ article last week or the week before that specifically discussed Sweden kicking so-called electro-sensitive people off disability.

  • by jmv ( 93421 ) on Tuesday May 22, 2007 @11:54AM (#19222569) Homepage
    Except, you know...the nuclear radiation that is RF radiation...which is all of it.

    While gamma radiation is indeed electromagnetic, what pretty much everyone calls RF is actually whatever is below the infrared (i.e. microwave downward). Also, not all nuclear radiation is electromagnetic. Ever heard of alpha and beta particles -- those are ionising too.

    What about UV? That causes mutations too. Does that have as much energy as gamma (the answer: not if the amplitude is the same)? This is just crap. Any kind of radiation can have three effects on cells:

    What the hell is "amplitude" supposed to mean? This isn't about the amount of power, but the nature of the radiation (quantum physics 101). Either a certain radiation is ionising or it's not (well, of course, there's a range where it depends on the exact molecule). For both UV and gamma, the energy of a photon is enough to eject an electron (or move it where it's not supposed to be) and thus cause damage to the DNA. For microwaves, you can pour in as much energy as you like, it's just not going to happen. The only potential harm from microwaves is that they can heat up the body (but it takes more than a few mW).

    The more energy, the more likely to get #3. However, there are agents in the skin to absorb most of the energy in most of the RF spectrum. Any part of the spectrum can cause mutations if you can get it to do step #2 and not step #3.

    No, mutations can only be caused by ionising radiation. A microwave oven will cook you, but it will *not* cause mutations because the microwave photons simply don't have enough energy to displace electrons. Also, why do you think we put on sunscreen to protect our skin from UV radiation while leaving it fully exposed to infrared and visible light, which make up most of the total radiated power from the sun (and far more than UV)?

    Your story aside, that much power could easily burn someone to cinders if they happened to be sitting on the focal point of a microwave dish.

    No, it will have about the same effect as using a 20 cm magnifier in the sun. Would probably hurt, but not kill you.
  • by Brandon30X ( 34344 ) on Tuesday May 22, 2007 @11:55AM (#19222581)
    This is so incredibly wrong that it makes my brain hurt thinking about it. In fact, no one should be able to discuss these types of threads without an EE degree, and even then most EEs don't bother to study electromagnetics because it's hard, or boring, or whatever.

    Ok, first off, here is a disclaimer: I do research in microwave wireless power transmission. AKA we send power (about 40-60 Watts) in a beam of 5.8GHz microwaves to a receiving antenna that converts it back to DC. And guess what, at the receiver you can stand in front of the beam because the power DENSITY (key word) is under the IEEE standard limit for safety. Your calculation above is completely wrong because you forgot a few key things.

    1. WiFi will use a lower power if it can, so it's not always 100mW
    2. It's not always transmitting; the signal is modulated, so the average power is lower.
    3. And this is the most important: it's 100mW delivered to the antenna! Which, assuming it's isotropic, will radiate in all directions. So as the spherical "shell" of power is radiated, the power DENSITY goes down with the square of the distance.

    power density = (Power to the antenna)/(4*pi*r^2) (assuming isotropic, AKA in all directions)
    so for 100mW that is about 0.88mW per meter squared at 3 meters away (10 ft away)

    For a comparison, noontime sunshine has a power density of about 1000 WATTS per meter squared. (http://en.wikipedia.org/wiki/Solar_power)

    So the WiFi is over a million times less powerful than daylight when standing 10 ft away.
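    The isotropic power-density arithmetic can be redone in a few lines (same assumptions as above: 100 mW into an isotropic antenna, 3 m range, ~1000 W/m^2 noontime sun):

    ```python
    import math

    p_tx = 0.100   # 100 mW delivered to an isotropic antenna
    r = 3.0        # distance in meters (~10 ft)
    sun = 1000.0   # rough noontime solar irradiance, W/m^2

    # Transmitted power spread over a sphere of radius r
    density = p_tx / (4 * math.pi * r ** 2)
    print(f"Wi-Fi power density at {r} m: {density * 1000:.2f} mW/m^2")
    print(f"Sunlight is roughly {sun / density:,.0f} times more intense")
    ```

    This gives about 0.88 mW/m^2 at 3 m, putting noon sunlight around a million times more intense than the Wi-Fi signal at that distance.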

  • by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Tuesday May 22, 2007 @12:02PM (#19222689) Homepage
    I stand corrected regarding the "cooks from inside", although I don't think things are as clear cut as the article says. Some things will have much deeper skin depth because of low water content (for example, biscuits cook quite evenly.) Humans do have much higher water content, but there are still plenty of situations where permanent internal damage is suffered due to high power RF exposure before external damage becomes visible.

    I do stand corrected regarding 2.4 GHz absorption. My bad.

    "Except, you know...the nuclear radiation that is RF radiation...which is all of it."
    Nope.
    Yes, RF radiation and gamma radiation are both electromagnetic radiation. They have vastly different wavelengths and energies per photon though - RF radiation has extremely long wavelength (low energy) compared to visible light. (Yes, visible light falls into the same category) Gamma, on the other hand, has a much shorter wavelength (higher energy per photon) than visible light. UV is far closer to gamma than RF is in terms of energy and wavelength. In the case of UV and gamma, individual photons have the ability to make the electron shells of individual atoms change states (hence the term "ionizing"). As a result, UV and gamma can change the chemical makeup of molecules by breaking and rearranging individual molecular bonds (this is why it can damage DNA permanently). RF, on the other hand, acts to cause entire molecules to vibrate (heating) but does not change their chemical composition unless the temperature exceeds that required to start a chemical reaction.

    Let's not forget that nuclear radiation also can be particle radiation (alpha and beta particles), which are also ionizing.

    "Your story aside, that much power could easily burn someone to cinders if they happened to be sitting on the focal point of a microwave dish. They don't actually get 45W of microwave energy hitting them ever, so it's not a problem."
    Oh yeah, agreed, many such coworkers did get RF burns from accidental brief contact with circuit traces in open PAs they were working with. That said, most of them (including myself) did far more tissue damage with soldering iron accidents than with RF burns. Still, even when not having accidental contact with traces, we were all (in general) exposed to far more RF than the average person is on a regular basis.
  • by VeriTea ( 795384 ) on Tuesday May 22, 2007 @12:06PM (#19222767) Journal
    It must be embarrassing when you write a post to discredit someone, and it ends up revealing that you didn't understand what was being said.

    Let's go back to quantum physics / physical chemistry / modern physics (depending on the curriculum you studied in college). Electromagnetic energy has a dual wave-particle nature. The particle nature is revealed by the fact that EM comes in specific quanta (photons, for EM in the light frequency range) of energy directly related to its frequency. The higher the frequency, the greater the energy contained in the quantum, or photon. This means that high frequency EM sources like X-rays and gamma rays (in order of increasing frequency) contain much more powerful quanta than low frequency EM sources (radio waves).

    So why is the energy level in the quanta important? Well, if you recall your chemistry, electrons can be moved to higher orbits, or even dislodged from an atom, by adding an exact amount of energy to them (only the exact amount will cause a change; energy amounts greater or lower than the exact amount needed will have no effect on the electrons of an atom). The very lowest level of energy required to disturb an electron from the outermost shell of any atom just happens to correspond to the energy of a quantum of an EM wave at the frequency of ultraviolet light. This means that all EM energy below this minimum frequency threshold is unable to disturb electrons in an atom, but above this frequency it can begin to alter the atomic structure of matter, and the higher the frequency, the more it can alter that structure. Radiation capable of changing atomic structures is known as ionizing radiation; radiation incapable of causing changes is known as non-ionizing radiation. So this explains why ultraviolet light is carcinogenic: it is just over the threshold of ionizing radiation, while red, orange, yellow, green, and blue light (Roy G. Biv) are perfectly safe (well, not carcinogenic anyway).

    So, back to the whole point, RF radiation is nothing like nuclear radiation, unless you are ignorant and easily swayed by scaremongering tactics that use the word 'radiation' as a synonym for 'evil'.
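    The quantum argument above comes down to per-photon energy, E = hf, which a few lines can illustrate (the ~300 nm UV wavelength is an illustrative choice near the ionizing threshold):

    ```python
    # Per-photon energy E = h*f decides whether radiation can ionize,
    # regardless of how many photons (how much power) the source emits.
    h = 6.626e-34    # Planck's constant, J*s
    eV = 1.602e-19   # joules per electronvolt
    c = 3.0e8        # speed of light, m/s

    wifi_photon = h * 2.4e9 / eV     # 2.4 GHz Wi-Fi photon
    uv_photon = h * c / 300e-9 / eV  # ~300 nm UV photon

    print(f"Wi-Fi photon: {wifi_photon:.1e} eV")   # ~1e-5 eV
    print(f"UV photon:    {uv_photon:.1f} eV")     # ~4 eV
    ```

    A 2.4 GHz photon carries about ten microelectronvolts, some five orders of magnitude short of the few-eV scale where electrons start getting disturbed, no matter how many such photons arrive.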

  • by Brandon30X ( 34344 ) on Tuesday May 22, 2007 @12:09PM (#19222815)
    The U.S. legal limit of leaking radiation is 1 mW/cm^2 at 5 cm (about 2 inches) from a new oven.

    Quoted from wikipedia.
    -Brandon
  • by Fantastic Lad ( 198284 ) on Tuesday May 22, 2007 @02:07PM (#19224661)
    Why do people constantly focus on ionization as the problem?

    Brain cells respond to EM in ways inherent in biological design. EM has been demonstrated to have all manner of effects upon the human body and nervous system. Acupuncture is one of the more obvious ones; (metal needle inserted and set to rotating cuts through the Earth's magnetic field and 'injects' a current into the patient. This affects how cells function. Pain responses can be turned off.)

    Basically, EM as random noise makes the brain fuzz out and makes people easier to manipulate. It makes them dozy and dumb.


    -FL

  • by jstomel ( 985001 ) on Tuesday May 22, 2007 @04:08PM (#19226555)
    All of what you said is true, except that UV is not ionizing radiation. DNA absorbs in the UV range at 260 nm. UV radiation at that specific wavelength causes the DNA to become a reactive species and chemically crosslink with nearby DNA. It never enters an ionic state.
