
Do the 5.1 Stereo Headphones Really Work?

Tamor asks: "Zalman, the company behind some extremely high-quality PC noise-reduction products, is now selling real 5.1 surround sound headphones. The surround effect is achieved by placing three drivers in each earpiece. As a geek with a young family, this product pushes all the right buttons for me: it looks cool, and it means I can finally have surround sound without waking the kids. Or does it? I was sure that to place a sound spatially your brain relies on the delay between hearing the sound in one ear and then the other. If your left ear only hears the left three channels, and your right ear only hears the right three channels, doesn't this make spatial placement more difficult? Does anyone know if/how these headphones achieve a surround effect when each ear hears only half of the audio field?"
This discussion has been archived. No new comments can be posted.

  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Monday January 26, 2004 @04:40PM (#8092110)
    Comment removed based on user account deletion
  • Re:Physics Problem (Score:4, Interesting)

    by pbox ( 146337 ) on Monday January 26, 2004 @04:42PM (#8092134) Homepage Journal
    Beyond that, unless you have a really big head, the difference in arrival time to each ear is less than a microsecond. That is surely too small for your brain to comprehend.

    No it is not. Strange but true. You can always tell the direction (not just left-right, but any direction in three dimensions) a sound comes from (true only for tones above 100 Hz or so). Therefore your ears/brain can somehow decipher the minute differences between the sounds arriving at your two ears.
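Those minute differences can be roughly quantified with Woodworth's classic spherical-head approximation. A minimal sketch (the head radius and speed of sound are assumed round numbers, not measured values):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C
HEAD_RADIUS = 0.0875    # m, a typical adult head radius (assumed)

def itd(azimuth_deg):
    """Interaural time difference (seconds) via Woodworth's
    spherical-head formula: ITD = (r / c) * (theta + sin(theta)),
    where theta is the source azimuth off straight ahead."""
    theta = math.radians(azimuth_deg)
    return HEAD_RADIUS / SPEED_OF_SOUND * (theta + math.sin(theta))

# A source 90 degrees to the side arrives roughly 0.66 ms earlier
# at the near ear -- far more than "less than a microsecond".
print(round(itd(90) * 1e6))  # -> 656 (microseconds)
```

Under this model the delay grows smoothly from zero (dead ahead) to about 0.66 ms (full side), which is the range the brain actually exploits.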

    However, back to the topic. You have 2 ears, therefore 2 speakers are enough to create a complete 3D soundscape. The 5.1 headphones are pure gimmick. You are better off spending some money on a decent pair of 2 speaker headphones, like AKG/Grado and my personal favorite, Sennheiser. If you like music, you are much better off spending $500 on a pair of headphones, than spending the same on 5.1 speakers.

    There are some (classical) recordings out there that are made using a dummy head, with microphones in place of the eardrums. When listening with in-ear-canal headphones (think Shure / ER) you are placed in the sound environment exactly where that head was. I believe it is called binaural recording, but please reply if you have better info.

    BTW, comp buffs: EAX by Creative is a model of how our ears work that creates a 3D sound environment. You can think of it as 3D modeling, where you specify the 3D coordinates of a source and the enclosing space (and the types of walls), and the system outputs an L/R (or 5.1) signal which tries (with varying degrees of success) to place the sound in the right spot.
  • Re:Physics Problem (Score:5, Interesting)

    by Otter ( 3800 ) on Monday January 26, 2004 @04:46PM (#8092175) Journal
    There was a famous neuroethology experiment with barn owls. They have asymmetrical ruffs on their ears, one pointing up and one down. Sounds have a different volume in each ear depending on the altitude of the source.

    But it was also shown (by putting headphones on them, playing a mouse sound, and watching how their heads moved) that they use volume to determine altitude and time offset to determine bearing. So it's definitely possible -- although I have no idea what system human perception uses for the same problem.

  • Sony alternative (Score:3, Interesting)

    by HeroicAutobot ( 171588 ) * on Monday January 26, 2004 @04:47PM (#8092188) Homepage
    Sony also has the much more expensive MDR-DS5100 [sonystyle.com] and the even more expensive MDR-DS8000 [sonystyle.com].

    I've been very tempted by these, but haven't been able to find many reviews. (I haven't looked for a few months though. Maybe there's more information available now.)

  • by esm ( 54188 ) * on Monday January 26, 2004 @04:51PM (#8092236) Homepage
    Sorry, I have no experience with these headphones. But: about 10 years ago, I volunteered as a test subject for some experiments done at NASA Ames (Mountain View, CA, USA). One of them was for a device called the Convolvotron [o-art.org]. It's a thingy (sorry for the technical term) which takes multiple sound sources and "localizes" each one so it sounds like it's coming from a different place. It worked incredibly well with only two speakers. The big problem was distinguishing between straight-in-front and straight-behind. With headphones and human ears, I suspect that's just a darn difficult problem. But side-front, side, and side-rear were very easy to differentiate.

    Although the tests took place in a sound chamber, they were kind enough to give me a demo tape -- and this tape is amazing. They demo about 5 different voices (simultaneous ATC conversations), both flat and spatialized. Flat, it's impossible to differentiate them. With the convolvotron, it was possible and easy to track each conversation separately. Each one sounded like it came from a different place.

    This was early 90s. Processing power has certainly increased since then. It should be possible, and relatively cheap, for someone to use Convolvotron-like technology to convert a 5.1-channel signal to spatialized L-and-R ones for use with regular headphones. There shouldn't be a need for special headphones.

    Lots of Google hits [google.com] for "Convolvotron". Enjoy.
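A toy sketch of the idea the Convolvotron implements: give the far ear a delayed, attenuated copy of a mono signal. A real spatializer convolves with measured head-related impulse responses; the ear spacing, attenuation factor, and sample rate here are all assumptions for illustration:

```python
import math

RATE = 44100       # samples per second (assumed CD rate)
SPEED = 343.0      # m/s, speed of sound
EAR_GAP = 0.18     # m between ears (assumed round figure)

def spatialize(mono, azimuth_deg):
    """Crude two-cue localization: delay and attenuate the far-ear
    copy of a mono signal (a list of samples).  Returns (left, right).
    Real HRTF processing also filters per-ear frequency response."""
    frac = math.sin(math.radians(azimuth_deg))          # -1 left .. +1 right
    delay = round(abs(frac) * EAR_GAP / SPEED * RATE)   # samples of lag
    near = mono + [0.0] * delay                         # pad to equal length
    far = [0.0] * delay + [0.7 * s for s in mono]       # lagged, quieter
    return (far, near) if frac > 0 else (near, far)

left, right = spatialize([1.0] * 100, 45)  # source 45 degrees to the right
```

Even this crude version makes the source pull convincingly to one side on ordinary stereo headphones, which supports the parent's point that special hardware shouldn't be needed.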

  • cues (Score:5, Interesting)

    by ballpoint ( 192660 ) on Monday January 26, 2004 @04:53PM (#8092254)
    Left-right stereo has been here a long time and it works wonders with headphones. No doubt about that.

    And since any sound arrives at your two cochleas, it must be possible to simulate any sound position just by exciting your two ears, preferably with in-ear phones.

    But I have a hunch that cues about whether a sound is at the back or front come subconsciously from:

    1. Turning your head and registering the changes in sound.
    2. Echoes and reverb. This only works if you know and 'feel' the room. (*)
    3. Changes in frequency response due to the structure of your ears. This only works for sounds you know.

    As the headphones are fixed to your head the first, and probably the most important, cue disappears. The room where the sounds were recorded does not match the room you're in, so the second cue disappears. And finally you will be listening to new, unknown sounds. There goes the third cue as well.

    But in true /. fashion, I'm posting this without actually having experienced 5.1 headphones with more than one speaker on each side. I'd like to try though.

    (*) While I'm listening with isolating in-ear buds, it is strange that the sound changes dramatically the moment I enter a building from the outside. Hard to explain by reverb and echo as there is little sound leakage from the buds to the outside and vice-versa.
  • Hearing (Score:3, Interesting)

    by DarkDust ( 239124 ) <marc@darkdust.net> on Monday January 26, 2004 @04:53PM (#8092260) Homepage

    I was sure that to place a sound spatially your brain relies on the delay between hearing the sound in one ear and then the other.

    Yes, this information is used for left/right localization. But AFAIK (IANAES, I Am Not An Ear Specialist) interference caused by sonic reflections from your shoulders is also needed to locate whether a sound comes from above or below. I don't know how the brain distinguishes front from rear, though.

  • by TrebleJunkie ( 208060 ) <ezahurakNO@SPAMatlanticbb.net> on Monday January 26, 2004 @05:01PM (#8092346) Homepage Journal
    Many many moons ago, when I was doing video production work, I received a sample CD from an audio library collection billed as "3D-sound".

    I don't know how the stuff was recorded, but it was recorded such that you really could localize the sound, in space, in three dimensions, through regular ol' stereo headphones. The most memorable track on the CD was of someone getting a haircut. You could hear *where* the scissors "were" around your head. You could tell where the hairdryer was blowing. Not just left-or-right, but *around* your head. The stuff was amazing.

    I'm guessing that not just volume and left-or-right determines where you hear things, but phase as well.

    But, anyhoo, the point being that you can very likely achieve good surround-sound sounding stuff with just one speaker per ear, and not three.
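The phase guess above can be illustrated with a little arithmetic: the interaural delay corresponds to a frequency-dependent phase offset between the ears. The ear spacing is an assumed round figure:

```python
import math

SPEED = 343.0    # m/s, speed of sound
EAR_GAP = 0.18   # m between ears (assumed)

def interaural_phase_deg(freq_hz, azimuth_deg=90):
    """Phase offset between the ears for a pure tone: the path
    difference divided by the speed of sound gives a delay, and
    delay * frequency * 360 gives degrees of phase (mod 360)."""
    delay = EAR_GAP * math.sin(math.radians(azimuth_deg)) / SPEED
    return (delay * freq_hz * 360) % 360

# At low frequencies the offset is unambiguous; above ~1 kHz it
# wraps past 360 degrees and the cue breaks down, which is why
# hearing switches to level differences at high frequencies.
print(round(interaural_phase_deg(500)))  # -> 94 (degrees)
```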
  • No. (Score:3, Interesting)

    by 3-State Bit ( 225583 ) on Monday January 26, 2004 @05:15PM (#8092541)
    I was sure that to place a sound spatially your brain relies on the delay between hearing the sound in one ear and then the other.

    Knowing nothing about human hearing, we can almost rule out this conjecture. Sound travels at about 761.207051 mph [google.com] and your ears are about a foot apart.

    That means there is a difference of 895.706603 microseconds [google.com] between when the first ear would hear the sound and when the second one would.

    This is 1/1116th [google.com] of a second, meaning that if your brain 'ticks' subconsciously at anything less than 1100 hertz its timing would be too coarse to catch this minute difference.
    The brain, in fact, ticks [everything2.com] a couple of orders of magnitude slower than this, and moreover the theoretical maximum a single neuron can tick is 2000 hertz [mit.edu], so there would have to be ~0 ms delay in signal propagation between neurons, and the signals would have to make a straight line from each ear toward the area in which the signal is to be processed in order for comparison to occur together with pertinent timing information. (The brain, of course, is not so precisely wired that it could take into account some kind of fixed minute differences in timing among various input sources.)
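The figures above can be reproduced directly (the one-foot ear spacing is the post's own generous assumption):

```python
# Reproduce the post's back-of-the-envelope interaural delay.
MPH_TO_FPS = 5280 / 3600              # feet per second per mph
speed_fps = 761.207051 * MPH_TO_FPS   # speed of sound, ~1116 ft/s
ear_gap_ft = 1.0                      # the post's assumed ear spacing

delay_us = ear_gap_ft / speed_fps * 1e6
print(round(delay_us, 1))  # -> 895.7 (microseconds, matching the post)
```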

    So we can rule that out. The next idea continues with your implicit assumption that each ear is, logically, a fixed point of input, with the brain reconstructing all spatial information. (Ears, in fact, have a complex set of ridges precisely because they do convey spatial information.)

    But if we thought of ears as mere fixed points of frequency/amplitude sampling, we might be tempted to think that all spatial information is reconstructed from minute differences in amplitude -- the ear nearer the sound source would hear it more loudly. We can eliminate this conjecture too, because the two spheres of possible sound locations around each ear intersect not in one point but in a whole circle of possible places. What I mean is: if all your brain knew were "ear 1 hears the source at loudness A and ear 2 hears it at loudness B, and ear 1 is at (x1, y1) and ear 2 is at (x2, y2)", then, together with information about how sound loses amplitude with the square of the distance it travels and inversely with the frequency (assume the pertinent natural laws are hard-wired), it could produce the fact: a-ha! The source must be 10 feet from ear 1 but 10.23 feet from ear 2.

    The problem is, there is not ONE point that fits that description, but infinitely many.

    If your ears were just input points, then, if you start playing a sound file on the computer in front of you, it should sound the same with your eyes closed as it would if you turned around and heard it from behind: each ear hears an equally loud sound, only now from behind instead of in front. The problem is, you can tell that it's from behind and not in front of you! (Try a double-blind test if you're not sure -- place one speaker dead in front of you and one speaker an equal distance dead behind you, write a script to randomly play either full-left or full-right balance, close your eyes, and listen to the random tests; you'll always be able to tell where the sound is coming from.)

    Okay, so now we've long-windedly debunked the naive assumptions about how the brain might reconstruct spatial information. How does it?

    Beats me.
  • by stvangel ( 638594 ) on Tuesday January 27, 2004 @12:49AM (#8096923)
    I've worn hearing aids for the last 20 years or so, and I've come to the realization that sound perception involves a lot more than just what the ears hear. My hearing isn't that bad, but it's difficult to get along without them.

    The first thing you have to realize is that it isn't just what your ears hear; the vibrations you feel all over your body also affect your spatial perception. Think about it: if spatial perception were just the difference in sound arrival time between each ear, there would be no way for you to tell whether a sound is coming from in front of you or behind you. Yet when you close your eyes, it's obvious what direction it's coming from.

    All sound is, after all, is air pressure changes. Your entire body is a hearing device. Your body can feel the sound waves hitting your front and back and can deduce direction pretty easily from that. Your skull makes a very good resonating cavity to collect and amplify these vibrations. Just go to a rave or live concert and feel the vibrations. One of the tests they always give me is bone conduction, where they put two transducers on my skull behind my ears. I can hear the sounds they produce almost as well as if they were coming through my ears.

    I doubt the three transducers in each ear are worth the effort. I'd think it'd be much more successful if you had some sort of band all the way around your head that would vibrate your skull from various directions. You need your ears for clarity, to understand what you're hearing, but your skin and body can handle sound sensations. Even completely deaf people can sense sounds and directions through the vibrations.

    I'd think true 5.1 is just unachievable through a simple pair of headphones, no matter how many drivers are in each ear. Granted, it may still sound good, but not as good as a good set of speakers and a subwoofer.
