
Don't get too hyped up over the newest and best!!


IndyKlipschFan


LEDs are inside lasers (more or less).
A laser diode uses an LED-like semiconductor junction inside a resonant cavity with a semi-transparent mirror, so the light is built up by stimulated emission rather than simply being focused.

I would like to see LED backlights implemented in LCD monitors, like they are doing in some of the ultraportable laptops' LCDs. It will produce a better picture color-wise and reduce energy use, and it will also make an LCD really thin, like an inch thick. Currently LCDs are backlit by cold-cathode tubes (like fluorescent light bulbs).


You are sending a signal to a converter, and presumably using the embedded clock, in which case an optical cable--even in short 'home use' lengths--certainly may degrade a 'digital' signal enough to be audible.

Do you have a source for that claim? Modern codecs have multiple clocks and even detect the edges of the signal itself. They have their own little buffer inside, where I'm sure one could verify that the signal is always bit-for-bit identical to the input. If I were feeling more energetic, I could run any DAC through my lab bench to verify accuracy rates...but it's really a lost cause. ANY errors are going to be drastically apparent as clicks and pops. You can't get partial degradation of a digital signal.

I'm certainly not an expert on the subject, but I'm doing two design projects right now that involve direct hardware and software interfacing with codecs, DACs, and ADCs (from both TI and Analog Devices). Digital accuracy certainly is not an issue. In fact, if one were to claim that there is data loss, then one would need to explain why the I2S or SPI standards have no problem perfectly communicating digital information in just about every digital device on the market...it's just serial data. Heck, how long has RS-232 been around?

As with most things digital, this subject rapidly becomes complex, but the notion of perfect digital is frequently forfeited in the hardware/software reality--unless *lots* of care is taken. The scale of this particular problem will ultimately depend on the quality of the D/A converter. My very minimal exposure to HT products (a brief look at a high-endish $4k receiver) did not look encouraging re D/A quality--which is not a surprise.

The D/As being used in "HT products" are identical to what you're going to find in most any overpriced exotic product. If there is any difference, it is probably going to be in the analog output section of the unit--I've been told that usually the biggest difference is the cheap DC-blocking electrolytic caps in series with the output, but I suppose crappy op-amps could be used too. If really, really cheap D/As are being used, then the audible result is most likely going to be in the noise floor (usually due to crappy dithering and anti-aliasing filters). Nevertheless, D/As are a totally different issue than digital input buffers...

In fact, I would be kinda impressed to see any digital transport show a loss of accuracy. All you gotta do is send a known digital signal, store the output of the receiving end into a buffer, and then compare the results. I could probably write a script that automatically compares the two just to see how long it takes until a single error pops up...then I could store the sample and listen to the audible result to see how much it matters.
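Something along these lines would do it -- a minimal sketch, assuming the reference signal and the captured receiver output have both been saved as raw 16-bit PCM and are already time-aligned (the file names are placeholders, not anything from this thread):

```python
# Minimal sketch of the bit-for-bit comparison described above.
# Assumes reference and capture are raw 16-bit PCM, already time-aligned;
# the file names are placeholders for illustration only.
import numpy as np

ref = np.fromfile("reference.pcm", dtype=np.int16)
cap = np.fromfile("captured.pcm", dtype=np.int16)

n = min(len(ref), len(cap))      # compare only the overlapping portion
diff = ref[:n] != cap[:n]        # True wherever a sample differs

if not diff.any():
    print(f"{n} samples compared: bit-for-bit identical")
else:
    first = int(np.argmax(diff))             # index of the first mismatch
    print(f"{int(diff.sum())} mismatched samples out of {n}; "
          f"first error at sample {first}")
```

In practice the two captures would need to be aligned first (e.g., by cross-correlation), since the receiving end adds some fixed delay.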

All I gotta say is that if accuracy were a concern, then engineers would have already come out with a solution that is 100% perfect. The fact of the matter is that the issues involving serial data transfer were solved in the '60s...

It's quite possible that I could be wrong about all of this, but I would be much more interested in facts or demonstrations that I could repeat on my own. I think some of my profs would be interested too (one in particular who worked extensively on digital standards).

Nothing in my post is unusual or controversial--just basic digital practice, principles, and stuff.
Do you have a digital textbook? It will be in there.

OK. I'll try to be helpful :-)

Here is the 'principle'--slightly expanded. Two scenarios:
1) Send a signal from a digital source to a 'digital' destination (something which only needs the numbers: a recorder of some sort, say a hard drive) via optical cable, and.....no problem! 100% perfect capture! No worries. But we are not doing that.
2) Send a signal from a digital source, via optical cable, to an *analog* device (a D/A converter), and use the embedded clock (also analog)....then there is a potential problem: you have to deal with *jitter*.

So, how much jitter is there? And does it matter?
Optical cable can cause significant jitter--generally much more than a coax connection.
Basic modeling of a short section of optical cable: 3 m will cause c. 15 nanoseconds of jitter. Bend the cable a bit and jitter increases!
And yes, 15 ns at the converter is within the worrisome range for audibility.

That was the important bit re this thread.
Then you can move on to the rest of the complexities.....e.g., how good is your box's converter at rejecting jitter?
At 5 kHz? At 10 kHz? At 20 kHz? etc......
Hopefully the answer to the second half of your question is now apparent as well.
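If it helps to picture that last question, here is a minimal sketch that treats the receiver's clock recovery as a first-order low-pass filter acting on incoming jitter. The corner frequency is an assumed placeholder, not a number from any datasheet, and real S/PDIF receivers will differ:

```python
# Rough sketch: jitter attenuation of clock recovery modeled as a
# first-order low-pass filter on the incoming jitter. The corner
# frequency is an assumed placeholder, not a measured or quoted value.
import math

def jitter_attenuation_db(jitter_freq_hz, corner_hz=1_000.0):
    """Attenuation (positive dB) applied to jitter at jitter_freq_hz."""
    gain = 1.0 / math.sqrt(1.0 + (jitter_freq_hz / corner_hz) ** 2)
    return -20.0 * math.log10(gain)

for f in (5_000, 10_000, 20_000):
    print(f"{f // 1000} kHz jitter: ~{jitter_attenuation_db(f):.1f} dB attenuation")
```

The point is simply that rejection is a function of jitter frequency: a box may attenuate high-frequency jitter well while passing low-frequency jitter almost untouched.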

Mark


So, how much jitter is there? And does it matter?
Optical cable can cause significant jitter--generally much more than a coax connection.
Basic modeling of a short section of optical cable: 3 m will cause c. 15 nanoseconds of jitter. Bend the cable a bit and jitter increases!
And yes, 15 ns at the converter is within the worrisome range for audibility.

Not to be picky, but what is the source of this information? Did you mean 15 ps? It's not very difficult to find ADCs that have rejection above 100 ps.

In the end, the audible result of jitter is clicks...if you're not hearing clicks, then there is nothing to worry about.


Not to be picky, but what is the source of this information? Did you mean 15 ps? It's not very difficult to find ADCs that have rejection above 100 ps.

In the end, the audible result of jitter is clicks...if you're not hearing clicks, then there is nothing to worry about.

That is wrong.
The audible result of jitter is NOT clicks.

Jitter modulates the signal. It creates distortion.
Only extreme levels (beyond anything we are talking about) might cause the converter to break lock/click.
My example above of 15 ns of jitter (from an optical cable) would create sidebands at roughly -70 dB relative to a 10 kHz test tone.
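For anyone who wants to sanity-check that figure, the small-angle phase-modulation estimate puts each jitter sideband at about 20*log10(pi * f * tj) dB below the tone, where f is the tone frequency and tj the peak timing jitter (sinusoidal jitter assumed). Whether the 15 ns above is peak or peak-to-peak is my assumption in the sketch below; either reading lands near -70 dB:

```python
# Small-angle phase-modulation estimate for jitter-induced sidebands:
# each sideband sits at ~20*log10(pi * f * tj) dBc for sinusoidal jitter,
# where f is the tone frequency and tj the peak timing jitter.
import math

def sideband_dbc(tone_hz, peak_jitter_s):
    beta = 2 * math.pi * tone_hz * peak_jitter_s   # peak phase deviation, radians
    return 20 * math.log10(beta / 2)               # one sideband, relative to the tone

print(round(sideband_dbc(10_000, 15e-9), 1))    # 15 ns peak         -> about -66.5 dBc
print(round(sideband_dbc(10_000, 7.5e-9), 1))   # 15 ns peak-to-peak -> about -72.5 dBc
```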

Regarding the attenuation capabilities--if any--of a particular converter: you will almost never find that specified (it is almost never tested)--and if it is, it will be in fairly meaningless numbers (for good reason)--except in expensive pro boxes.
Finding an 'ADC' (I assume that is a typo)--is this an HT box?--which claims 'rejection above 100 ps' (a meaningless number when unqualified) is not the issue. Assumptions of competency are often wrong when dealing with digital, and jitter attenuation is the kind of information you will only know for most converters by measuring yourself.

Remember.....this is about "does it (cable induced distortion in 'digital') exist" in this scenario--see my first post--and the answer is unequivocally 'yes'. How important it is in any particular usage is much more complex--but not trivial.

I will put together and post some refs for you later today/tonight....if I get time!

Mark


Sorry Riv - you are incorrect. The result of jitter in audio would in fact be audible, as it would result in lost bits and their subsequent discarding. Lost bits = incomplete data. Incomplete data in the audio realm, without any type of buffering or ACK, will result in incomplete program material, not distorted material, as the D/A converters do not have the ability to magically "bridge" and "smooth out" over the missing data.

You are also making the medium out to be far more volatile than it really is, especially in that capacity. The routers and gear I run at the data center have very extensive logging and error monitoring. I have extremely long and radically curved optical cable runs between buildings and between floors, and run them at full OC-48 capacity (2.4 Gb/s) with recorded jitter rates of less than 0.005% at all points. This is a level of quality and service that we are *required* to keep, due mostly to the data rates and volumes we are talking about.

Home audio doesn't even begin to scratch that kind of traffic rate, nor push the "digital envelope" for streaming traffic. As such - any jitter would be horribly and exponentially amplified and most certainly would be audible. The rate is so slow that if there was a resulting time shift, the only thing that would happen is the lock between the components would be lost entirely as there is no buffering, no ACK and no retry, and it would carry across ALL the data traffic, not sporadically here and there... the data rate is not remotely high enough on home audio gear to be that touchy and at any real risk.

The closest you could ever come would be to lay optical cable between components in an un-stressed state, play your source material, then rapidly bend it past threshold, and then back again - hoping the distortion in the cable itself isn't so great that it renders it useless. Your result will only be a sharp click in your program material, then silence - while the lock is lost and tries to re-establish. Again - it is streamed 1's and 0's with no buffer, no ACK, and no retry. It either works, or it doesn't. The slow data rates and short cable lengths do not warrant any worthwhile measurements of jitter on a digital level, and certainly not on an audible one.



Srobak--
I don't want to be too hard on you but....yikes. You are misapplying your experience to audio.
What I described above re jitter is basic: Digital Audio 101.
Maybe it will help if I repeat this: 'digital' must be viewed (from an engineering perspective) as at least semi-analog when it goes to an analog device (and uses the embedded clock), as in most consumer use.

Srobak says: "It either works, or it doesn't. The slow data rates and short cable lengths do not warrant any worthwhile measurements of jitter on a digital level, and certainly not on an audible one"

Here....since you deny the fundamental point--that jitter modulates the signal, adding distortion--read this:

"Jitter modulates the audio signal. The result is that phase modulation sidebands are created above and below every tone in the audio signal. Worse yet, these sidebands are often widely separated from the audio tone they are modulating. Consequently, the distortion caused by jitter is far more audible than THD+N measurements would suggest. Phase modulation sidebands are not musical in nature (they are not harmonically related to the audio), and they are poorly masked (because of the wide separation from the fundamental). In short jitter induced distortion closely resembles intermodulation distortion (IMD) and is much more audible than harmonic distortion (THD). Jitter creates new “audio” that is not harmonically related to the original audio signal. This “audio” is unexpected and unwanted. It will cause a loss of imaging in the stereo signal and when severe, will create what some have called muddiness (a lot of extra low and mid frequency sound that was not in the original). Jitter induced sidebands can be measured very easily using an FFT analyzer."
--John Siau, Director of Engineering at Benchmark (formerly of CBS Labs--one of the better digital engineers out there).....
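The "measured very easily using an FFT analyzer" part is also easy to simulate. Here is a minimal sketch (all parameter values are my own illustrative choices, not anything quoted in this thread) that samples a 10 kHz tone with sinusoidally jittered sample times and reads off the sideband levels:

```python
# Sketch: sample a 10 kHz tone with sinusoidally jittered sample instants
# and measure the resulting sidebands with an FFT. All parameter values
# are illustrative choices, not numbers from this thread.
import numpy as np

fs = 96_000          # sample rate, Hz
f_tone = 10_000      # test tone, Hz
f_jit = 3_000        # jitter (clock wander) frequency, Hz
tj_peak = 7.5e-9     # peak timing jitter, s (i.e. 15 ns peak-to-peak)
n = 1 << 16

t_ideal = np.arange(n) / fs
t_jittered = t_ideal + tj_peak * np.sin(2 * np.pi * f_jit * t_ideal)
x = np.sin(2 * np.pi * f_tone * t_jittered)      # tone seen through a jittered clock

spec = np.abs(np.fft.rfft(x * np.hanning(n)))
freqs = np.fft.rfftfreq(n, 1 / fs)

def level_dbc(f):
    """Spectrum level at f, relative to the 10 kHz tone."""
    bin_f = np.argmin(np.abs(freqs - f))
    bin_tone = np.argmin(np.abs(freqs - f_tone))
    return 20 * np.log10(spec[bin_f] / spec[bin_tone])

print("sideband at 13 kHz: %.1f dBc" % level_dbc(f_tone + f_jit))
print("sideband at  7 kHz: %.1f dBc" % level_dbc(f_tone - f_jit))
```

The sidebands show up right where the quote says they should--at the tone frequency plus and minus the jitter frequency--and with these numbers they come out around -72 dBc, not harmonically related to anything in the original signal.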


Mark


You are trying to mix analog signal and digital into the same realm on the same cable. It doesn't work that way. There is no "audio signal" on an optical cable. There is only a digital bitstream - regardless what *type* of traffic is on it - at that point it is just bits. Yes - of course phase modulation causes distortion... that is not what is at issue. The issue is two-fold, per your argument: a> is there any phase modulation in the digital signal that would distort the digital signal stream (because it's not audio at this point), and b> would it consequently cause audible distortion?

The answer to both is no. Now - understand I am not saying that the light in an un-stressed optical cable cannot be distorted - of course it can. The question is will it distort to the point of corrupting even one bit within the stream. The answer is - at that very slow speed (even 96k) - no. I don't care what you do to it - a 1 is not going to metamorphose into a 0, nor a 0 into a 1... there will be no long 0's or short 1's due to phasing or wavelength changes. If anything other than a 1 or a 0 is received - then it is discarded. It is not managed, error corrected, or otherwise interpreted or handled. If it isn't a 0 or a 1 then it is discarded, and the next bit is handled. This discarded bit would be treated as missing or incomplete data by the DAC, and would appear as a click or silence on the analog end and would be clearly audible, but it would not appear as otherwise distorted audio (boomy bass, etc. that you mentioned before). If too many bits are discarded in too short a period of time - then the lock is lost entirely.

Yes, you can bend and distort light, of course, and this is what one can measure. But at such a slow data rate - the distortions would not change the content or order of the light stream. The distortion would also be a constant unless the cable was being moved around all the time. The bitstream is transmitted in a specific order - and no distortion of even the cheapest poly optical cable is going to magically change that bit order short of cutting it off entirely. As long as the bits are received in the same order they are transmitted - any distortion you might perceive is the fault of your DAC or other components after the bitstream.

Sorry, but that's how physics works... you cannot transmit two different light pulses one after the other and expect the latter to arrive at its destination first. The speed of light is a constant - and if the first bit goes through an occlusion in an optical cable, then so will the 2nd, and they will both be "slowed" or distorted exactly the same, making them arrive in exactly the same order they were sent.

Measurable light distortion on a 48 or 96k bitstream does not equal measurable audio distortion. If the bits arrive in the same order they were transmitted - then your optical cable is not the source of any distortion or other problem.


It really just comes down to what you can actually see.

This chart has been passed around a lot and deserves attention: http://www.soundandvisionmag.com/assets/download/0602_tech_talk2_large.jpg

As can be seen, if you're 10 feet away from your screen, the display will have to exceed 50 inches before you can actually see any benefit from anything better than 720p. Heck, even a regular DVD will look just as good as HD if you're looking at a 30-inch display at 10 feet! Fret all you want about the details, but the eye has limitations.
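If you want the arithmetic behind charts like that one, here is a minimal sketch using the usual 20/20 rule of thumb of about one arcminute of visual acuity (the screen sizes are just examples):

```python
# Sketch of the viewing-distance rule of thumb behind charts like the one
# linked above: a pixel stops being resolvable once it subtends less than
# roughly one arcminute (about 20/20 acuity). Screen sizes are examples.
import math

ARCMIN = math.radians(1 / 60)    # one arcminute, in radians

def max_useful_distance_ft(diagonal_in, vertical_pixels):
    """Distance beyond which individual pixels can no longer be resolved."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # height of a 16:9 screen
    pixel_pitch_in = height_in / vertical_pixels
    return (pixel_pitch_in / ARCMIN) / 12             # small-angle approximation, in feet

for res in (720, 1080):
    print(f"50-inch {res}p: pixel detail is lost beyond ~"
          f"{max_useful_distance_ft(50, res):.1f} ft")
```

For a 50-inch screen that works out to roughly 10 ft for 720p and about 6.5 ft for 1080p, which lines up with the chart.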

This is the same story on the audio side...the new "hi-def" audio formats are not distinguishable from the old "Redbook" CD standard in double-blind listening tests (at least through two channels). Check it out: http://theaudiocritic.com/blog/index.php?op=ViewArticle&articleId=41&blogId=1

Once again....there are physical limitations....and we are just about there. Beware the hype!

