
iTunes w/ AirPlay to Apple TV -- Does it stay lossless?


Beechnut


So the little Apple TV box has HDMI and digital audio out. Two questions:

If you use HDMI, does that mean some processing happens before the signal hits your DAC?

And if you use the digital audio out, does it stay digital with no processing all the way to the DAC, so you get the best lossless transmission?

I'm trying to make sure nothing happens to the signal before it hits the DAC. I've chosen Apple Lossless for my digital library.
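If it helps, here is a quick way to double-check that the files in the library really are Apple Lossless rather than AAC. Just a sketch, assuming ffprobe (part of FFmpeg) is installed, with a placeholder filename:

```python
# Minimal sketch: confirm a library file is actually Apple Lossless (codec "alac").
# Assumes ffprobe (from FFmpeg) is on the PATH; the filename below is a placeholder.
import subprocess

def audio_codec(path: str) -> str:
    """Return the codec name of the first audio stream, e.g. 'alac' or 'aac'."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "a:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True)
    return out.stdout.strip()

print(audio_codec("example_track.m4a"))  # expect 'alac' for Apple Lossless
```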


Quoting Beechnut: "And if you use the digital audio out, does it stay digital with no processing all the way to the DAC, so you get the best lossless transmission?"

No. When using optical, the digital signal is sent to an electrical-to-optical converter, then travels optically to the optical receiver in your DAC or audio device, and the receiver converts it back to an electrical digital signal before it reaches the DAC decoder chip. So any method that bypasses this conversion and de-conversion process is the most straightforward one.
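To picture the conversion being described: S/PDIF, including its TOSLINK optical form, re-encodes each data bit as transitions using biphase-mark coding. A toy sketch of that line code (frame preambles and subcode omitted, purely illustrative):

```python
# Toy sketch of biphase-mark coding, the line code S/PDIF data bits use
# over both coax and TOSLINK. Encoding then decoding returns the original bits.

def bmc_encode(bits, level=0):
    """Two half-cells per bit: toggle at every bit boundary,
    and toggle again mid-cell only for a '1'."""
    out = []
    for b in bits:
        level ^= 1              # transition at the start of every bit cell
        out.append(level)
        if b:
            level ^= 1          # extra mid-cell transition encodes a '1'
        out.append(level)
    return out

def bmc_decode(halves):
    """A bit cell whose two half-cells differ is a '1'; identical halves mean '0'."""
    return [int(halves[i] != halves[i + 1]) for i in range(0, len(halves), 2)]

data = [1, 0, 1, 1, 0, 0, 1]
assert bmc_decode(bmc_encode(data)) == data   # the bits survive the trip unchanged
```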


Alright, I should have thought that through. So avoid the optical converter. HDMI is digital so that'd be the best choice with this setup. Thanks Speakerfritz!

So the Apple TV sends video as well. I wonder whether the audio and video are combined into one signal or kept separate as they pass through HDMI to the DAC. I know it's the same cable, but there are different wire paths inside it.

Wifi is very convenient. Trying to minimize the compromises.


There should be no loss in content going from electrical digital to optical digital to electrical digital, in the sense that a bit is a bit is a bit. Formats such as Apple Lossless and FLAC preserve all the information that was presented to the encoder, just in a more compressed form than on a CD, for example.
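As a concrete check of "a bit is a bit," you can decode both the original rip and its Apple Lossless copy to raw PCM and compare checksums. A minimal sketch, assuming ffmpeg is installed; the two filenames are placeholders for a 16-bit rip and the ALAC file encoded from it:

```python
# Minimal sketch of "a bit is a bit": decode the original WAV and the Apple
# Lossless copy to raw PCM and compare hashes. Assumes ffmpeg is installed;
# the filenames are placeholders for a 16-bit rip and its ALAC encode.
import hashlib
import subprocess

def pcm_md5(path: str) -> str:
    """Decode to raw 16-bit little-endian PCM on stdout and hash it."""
    raw = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path,
         "-f", "s16le", "-acodec", "pcm_s16le", "-"],
        capture_output=True, check=True).stdout
    return hashlib.md5(raw).hexdigest()

print(pcm_md5("track_rip.wav") == pcm_md5("track_rip_alac.m4a"))  # True if truly lossless
```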

Jitter can be an issue when converting digital formats, and it must be considered when you go from one box to another in either the optical or electrical domain. A DAC requires a clock to determine when to decide whether a zero or a one has been received. Typically, that clock is derived from the incoming signal itself using a circuit that locks onto the transitions in the signal. When the transitions are evenly spaced, the jitter is low; when the time between transitions varies, the jitter increases and the recovered clock may cause the DAC to produce the analog signal earlier or later than it should. Under normal circumstances, this should not be a problem.
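A rough way to get a feel for the scale: compute a test tone at ideally spaced sample instants versus instants perturbed by a given RMS timing error, and see how far the waveform moves. A toy model only; the jitter values below are assumptions for illustration, not measurements of any real gear:

```python
# Toy model of sampling-clock jitter: evaluate a 1 kHz tone at ideal instants
# and at instants perturbed by Gaussian timing error, then compare.
# The jitter values are illustrative assumptions, not measurements.
import numpy as np

fs = 44_100                      # sample rate, Hz
f = 1_000                        # test tone, Hz
t_ideal = np.arange(fs) / fs     # one second of ideal sample instants

for jitter_rms in (1e-9, 10e-9, 1e-6):          # seconds of RMS clock jitter
    t_jittered = t_ideal + np.random.normal(0.0, jitter_rms, size=t_ideal.shape)
    ideal = np.sin(2 * np.pi * f * t_ideal)
    jittered = np.sin(2 * np.pi * f * t_jittered)
    err = jittered - ideal
    snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(err**2))
    print(f"{jitter_rms:.0e} s RMS jitter -> error roughly {snr_db:.0f} dB below the tone")
```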

A bigger problem is that when streaming bits from the cloud, fewer bits means less transmission expense. Transmitting a highly compressed signal might be good enough for iPod use, but disappointing on a quality system. Material that is specially mastered for iTunes might be pleasing to most, but may still lack some of the information in the original.

Bottom line: Rather than worry about optical or electrical interconnections, be aware of the quality of the source material. Wifi is convenient, but may limit what you are hearing.


It's true that optical is 1's and 0's, but the technological shortfall is the error rate. All sorts of technologies have been introduced to minimize the loss of 1's and 0's during optical processing; some of the loss is in the sender, some in the medium, some in the receiver. The standard has not yet achieved 100% error-free transmission of 1's and 0's, in terms of both missing bits and latent bits. So it remains optimal to bypass or avoid optical processing where other topologies are available. This is less of an issue as you climb the cost ladder: once you get into multi-thousand-dollar DACs, the contributing factors to error rates and latency are stabilized through the use of elaborate technologies. So you can spend 300 bucks on a DAC solution that does not involve optical, or you can spend 2000 on a DAC solution with an elaborate optical topology designed to minimize error rates and latency.
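To put error rates in rough perspective, here is some back-of-the-envelope arithmetic: the expected number of flipped bits in an hour of CD-format audio at a few assumed bit error rates. The BER figures are illustrative assumptions, not measurements of any particular link:

```python
# Back-of-the-envelope: expected flipped bits in one hour of CD-format audio
# (44.1 kHz, 16-bit, stereo) at a few assumed bit error rates.
# The BER values are illustrative assumptions, not measurements of any link.

bits_per_hour = 44_100 * 16 * 2 * 3600        # about 5.08e9 payload bits

for ber in (1e-12, 1e-9, 1e-6):
    expected_errors = ber * bits_per_hour
    print(f"BER {ber:.0e}: about {expected_errors:,.3f} flipped bits per hour")
```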


I'm not sure how much BER (bit error rate) comes into play when going from electrical to optical, but it is a conversion, so I'm sure it exists to some degree.

Also, I won't be streaming from an online cloud. It will transmit wirelessly from my PC to the Apple TV unit, which acts as the receiver. From there, it goes over HDMI directly into the DAC.

I guess the optimal thing would be to leave out the wifi upconversion and just go straight HDMI. The upconversion to the wifi transmit frequency may inject errors into the signal, just as the other types of media conversion do. Would it be bad enough to be audible? Don't know. It would be interesting to put a BER tester in line just before the DAC; it would need HDMI in and out for a pass-through. Then you could compare wifi versus straight HDMI, or fiber for that matter.
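That comparison could also be done in software if you can capture the raw PCM just before the DAC for each path. A sketch under those assumptions; the capture filenames are hypothetical, and it assumes both captures are the same length and sample-aligned:

```python
# Sketch of the "wifi vs. straight HDMI" comparison in software: given two
# time-aligned raw PCM captures taken just before the DAC (hypothetical files),
# count how many bits differ. Assumes equal length and sample-accurate alignment.

def count_bit_errors(path_a: str, path_b: str) -> int:
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    if len(a) != len(b):
        raise ValueError("captures differ in length; align them first")
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

print(count_bit_errors("capture_wifi.pcm", "capture_hdmi.pcm"), "differing bits")
```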

Also, I understand jitter and timing. We have dejitterizers in my line of work. (even if I didn't spell that right...)

