
LP errors vs. CD errors


Parrot

Recommended Posts

----------------

On 12/7/2004 3:21:20 PM paulparrot wrote:

I listen to LPs almost every day, and I wouldn't do it if it were unpleasant.

----------------

I think you would get far greater enjoyment from your LPs if you paid a bit closer attention to the ISOLATION of your vinyl rig. Even if the Belle Klipsch you are resting your table on is not hooked up to your music system (and I assume it is NOT!), it makes for a terrible turntable stand. In other pictures of your room I see the table sitting NEXT to your power amps on the SAME rack! YIKES--that's probably worse than sitting on the Belle! I can see why you keep moving the table around to try to improve the sound! I have spent quite a bit of time working out the problems of turntable isolation with both suspended and non-suspended designs, and I would be happy to help find a workable solution so that you might better enjoy your record albums. I just can't IMAGINE how bad they must sound, and how much "error" you are adding to the mix with your current setup.

Just drop me an email and I'd be happy to offer some good advice!

Link to comment
Share on other sites


----------------

On 12/7/2004 2:40:55 PM paulparrot wrote:

We're well acquainted with the concept of tens of thousands of errors per CD, albeit corrected before we hear them. With easy digital analysis tools, one can check errors of a burned CDR against its source.

Because of the nature of analog, particularly the LP, it seems to me that it gets a free pass on this score. But if we'd consider it an error whenever a stylus in a groove is not tracking absolutely precisely perfectly and failing to recreate the changes of the waveform absolutely precisely perfectly, wouldn't it be fair to say that there might be millions of small errors per side? Depending on what degree of approximate perfection you're shooting for, it seems like it'd be fair to say that the entire side of a record is a continuous error!

When playing a record we're listening to an approximation of the recording, analogous to it, not a perfect display of it. Based solely on accuracy, it seems to me that a quality digital recording must win hands down.

----------------

The digital is a not-so-perfect display, as well as merely a sample, of a not-so-perfect display. The LP is at least a complete not-so-perfect display. There is no real accuracy with either, if you think about it, but the LP trumps digital, as it is more complete by definition to begin with. If you think digital beats LP hands down even in theory, you must be deaf. WCF


I don't get it about purported errors in digital music copies. I just don't get it. Bits are bits and digital words are made up of them.

I make up files in a spreadsheet or word processor of digitally encoded words and numbers (and isn't this about 16 bits, like a music encoding on a CD?), write them to hard drive or CD-ROM, and make copies back and forth. Everything comes back perfect. We all do that.
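Gil's point is easy to demonstrate with a few lines of Python (the file name and contents are made up for illustration; only the standard library is used): copy a file through ten generations and hash each copy, and every generation comes back bit-identical.

```python
import hashlib
import os
import tempfile

def sha256_of(path):
    """Hash a file's contents so two copies can be compared exactly."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        h.update(f.read())
    return h.hexdigest()

# Write some "data" to disk, copy it ten times over, and confirm that
# the last generation is still bit-identical to the original.
original = os.path.join(tempfile.mkdtemp(), "gen0.bin")
with open(original, "wb") as f:
    f.write(bytes(range(256)) * 1000)

src = original
for gen in range(1, 11):
    dst = original.replace("gen0", f"gen{gen}")
    with open(src, "rb") as s, open(dst, "wb") as d:
        d.write(s.read())
    src = dst

assert sha256_of(original) == sha256_of(src)  # 10th copy == original
```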

Yet people tell me there are systemic errors in using the same hardware and OS when there is music encoded digitally. Same word length, etc. How can this be?

Error correction in music to fill in gaps? Why think so in good conditions? Yeah, sure, we have some messed up music CDs with skips. But those flaws are obvious and last for milliseconds. I'm talking good files.

There is the overall supposition that music CDs have thousands of errors in playback which are patched by error correction algorithms which degrade the sound.

If so, does this mean to say that letters and numbers in the files created by word processors or spreadsheets get corrupted and error correction gets them back correctly? I doubt there are such errors to the extent imagined.

But if there are, and there is some error correction to make word processing or spreadsheets work, why doesn't it work the same for the numbers in music?

So, you can see my rant. Why is anyone assuming that our music data files are suffering from lack of integrity and accuracy? We see that when we, ourselves, make up data files, they work well in all forms of transcription.

I've done this rant before. We make backup copies of Bill Gates's software and it works well when loaded into the Intel processor. Yet when the data is music and it is going into a B-B DAC, someone paints the devil on the wall. "Oh, errors, which are poorly corrected."

Ya can't have it both ways. Bits are bits.

I'll go to sleep and check in tomorrow.

Smile,

Gil


"There is the overall supposition that music CDs have thousands of errors in playback which are patched by error correction algorithms which degrade the sound."

There are two methods a CD player deals with errors... and the distinction between them is important and hasn't been made in this thread yet.

The first has been mentioned... 'error correction.' When a CD is made it has a lot of extra data on it specifically for the purpose of error checking and error correction. Error correction is as its name implies... it *corrects* errors. If/when the error correction does its job, the data that comes off the disc is 100% accurate to what it was supposed to be. The error was corrected and the data is accurate. This may happen fairly frequently in playback, but again the data is accurate after it goes through the error correction. IOW the delivery was accurate to what was encoded on the disc.
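To make the distinction concrete, here is a toy Python sketch. It uses a simple 3x repetition code, not the Reed-Solomon CIRC coding a real CD uses, but it shows the principle Shawn describes: when correction succeeds, the output is bit-identical to the original.

```python
def encode(data: bytes) -> bytes:
    """Toy 3x repetition code: store every byte three times."""
    return bytes(b for byte in data for b in (byte, byte, byte))

def decode(coded: bytes) -> bytes:
    """Majority vote per position: any single corrupted copy is *corrected*,
    and the output is bit-identical to the original data."""
    out = bytearray()
    for i in range(0, len(coded), 3):
        a, b, c = coded[i], coded[i + 1], coded[i + 2]
        # bitwise majority of the three stored copies
        out.append((a & b) | (a & c) | (b & c))
    return bytes(out)

data = b"audio frame"
coded = bytearray(encode(data))
coded[4] ^= 0xFF                     # corrupt one of the three stored copies
assert decode(bytes(coded)) == data  # the error is corrected *exactly*
```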

For major problems (scratches or whatever) there is another level of action that can be taken. This is called 'error concealment' and, as its name implies, its job is to conceal errors that cannot be corrected. This IS sort of a guess to fill in problems. Unless there are major playback problems this very, very rarely occurs during playback.
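A minimal Python sketch of concealment by interpolation (the sample values are made up, and real players use more sophisticated interpolation, but the principle is the same: the result is a plausible guess, not the original data):

```python
def conceal(samples, bad_index):
    """Error *concealment*: the lost sample cannot be recovered, so the
    player guesses by interpolating its neighbours."""
    patched = list(samples)
    patched[bad_index] = (samples[bad_index - 1] + samples[bad_index + 1]) // 2
    return patched

pcm_true = [0, 100, 180, 300, 400]  # made-up PCM values; sample 2 is lost
patched = conceal(pcm_true, 2)
assert patched[2] == 200            # a plausible guess, but the truth was 180
```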

As a delivery format CD is far more accurate than vinyl. A person may not like what is being delivered but, like I said earlier, that is another issue entirely.

Shawn


Gil,

You are absolutely right about that point. I don't believe that digital's shortcomings are due to errors in the way that it is stored or played back. Analog sure does suffer from that. A bad LP pressing is not subtle. The accuracy and precision with which the information is stored and retrieved is not the problem. It's the fact that dividing up a second's worth of music into 44,100 discrete events, or even 96,000 events, eliminates a subtle yet significant element of the music.

Folks talk about jitter, and it can be an issue if not dealt with, but I don't believe that it is at the heart of what is lacking to many of us when dealing with digital media. Jitter will exacerbate the issue of information loss that I describe, but even theoretically perfect digital playback will not solve the problems of missing information. I'm not trying to say that digital doesn't work. It can work very well, and continues to be refined in its implementation. I'm just saying that it is too limited in scope to compete with better analog reproduction for my ears and brain.

It's too bad for me, really. Some of my favorite records are CDs.

Ben


Shawn,

Crossed posts here. It took some effort and time to try to be clear and concise in what I'm trying to get across. I see it as a limitation of design, not implementation. This is not a contradiction of what many have asserted lately, and really renders error correction and jitter somewhat immaterial. I think this goes along with what you're hinting at. Wondering what you think...

Ben


Hmmm... Another way to put it - accuracy and fidelity may be two different things. Accuracy to what is recorded - edge goes to digital, obviously. Repeatability and faithfulness to the digital master is really its strength. But when we're talking about fidelity to the music, my choice is well-handled analog recordings.


As a delivery format CD is far more accurate than vinyl. A person may not like what is being delivered but, like I said earlier, that is another issue entirely.

Shawn

Delivery format? They are both a rotating disc encoded with data. Far more accurate? According to and measured against what - the live event, the all-powerful master tape? One methodology suffers only electro-mechanical limitations; the other, the damage derived from at least two deconstruction-reconstructions of only a sample of the real-time event. All improvements in digital are made after the fact, not unlike tone controls with regard to flaws in frequency response. It never ceases to amaze me, the reluctance in today's people to acknowledge the superiority of old technology. In this case 100 years old. WCF


A number of minor comments:

1. Someone here said that vinyl has not got a long shelf life. I don't know how that can be stated. I have records from the 1950's that still play fine. Perfect sound forever was a lie in both ways. Neither perfect, nor, apparently, forever. I have a number of CD's that will no longer play - rather higher as a proportion than vinyl disks (although that is not a really fair comparison - I test play disks before I buy them - I tend to take CD's on trust).

2. Recording an Audio CD - how can it not be an exact copy? It is an interesting point because obviously, in theory, a copy should be identical to the original. That it is not is easily demonstrable - just copy a CD, then copy the copy, and repeat for 10 generations (10 copies - not 10 * 25 years).

You will find the 10th gen copy is unlistenable. A part of the reason for this is the error correction system - as Paul said, a disk can have thousands of errors on it and still comply. The problem is that your copy has the thousands of errors from the original plus an unknown number of its own. Each further copy adds errors till you get beyond the limits of the error correction process.

Vinyl errors - well, guess what - it is full of them - almost innumerable per side, I would guess. In fact I would not be surprised to discover that true accuracy - as in the reproduction of something EXACTLY as it was - is a random chance event and happens for less than 1 second in 1000 records. That is a truly wild guess, of course.

The question, however, would be: How important is ABSOLUTE accuracy? My guess is not at all. It has to be close (How close? - no idea), at least enough to fool the ear, or to fall within the ear's own orders of accuracy.

I like the accuracy assessment - comparing the CD or LP to the studio sound and/or master tape - because it is completely useless.

As I have concluded in another thread, acoustic memory is unsound, to say the least - so comparisons would have to be done within seconds - not easy to accomplish with either format. You might be able to monitor the streams going to the medium this way - but not the performance from the medium itself - which is kind of self-defeating.

Add in the issues of where you are doing your listening, over what speakers - via what amps etc. etc. and there are just too many variables to provide any meaningful analysis.

Finally - to beat the drum that is my own favorite - Synergy!

Accept for the moment that no source is perfect and each has its strengths and weaknesses. Imagine that one of those sources happens to be much stronger on, for example, treble than the other. If your system happens to boost treble - for example with a horn-loaded tweeter (now who would run one of those?) - the source that is strong, and actually more accurate, in this area may sound far worse ON THAT SYSTEM.

Further, and this is where it gets interesting - because of the nature of the synergy of the overall playback system, the least accurate medium in absolute terms might produce the most accurate rendition on your system, simply because the system is biased towards compensating for its weaknesses.

Just a theory...


"According to and measured against what"

Against what was input to the format to deliver.

I can make dozens of generations from a CD (meaning a copy of a copy... then copying the copy etc..etc..) and it will still be the same as the original version. That means the delivery is accurate to the source.

Try doing the same thing on analog. Every error in the playback is cumulative... every generation is going to pick up the playback errors of every earlier copy and add its own errors to it. It won't take many generations before the sound quality is very degraded. That is the sign of an inaccurate delivery format.
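A quick Python simulation of the difference (the "analog" error model here is a deliberately crude assumption - just a little uniform noise added on every playback pass - while a digital generation is simply the same numbers again):

```python
import random

random.seed(0)

def analog_copy(samples):
    """Each analog generation re-plays the signal, adding small random
    playback errors on top of whatever the previous copies added."""
    return [s + random.uniform(-1, 1) for s in samples]

def digital_copy(samples):
    """A digital generation is just the same numbers again."""
    return list(samples)

master = [float(i % 100) for i in range(1000)]

a = d = master
for _ in range(10):
    a = analog_copy(a)
    d = digital_copy(d)

analog_error = max(abs(x - y) for x, y in zip(a, master))
assert d == master       # 10th digital generation is bit-identical
assert analog_error > 0  # analog playback errors have accumulated
```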

Shawn


"You will find the 10th gen copy is unlistenable. A part of the reason for this is the error correction system - as Paul said, a disk can have thousands of errors on it and still comply. The problem is that your copy has the thousands of errors from the original plus an unknown number of its own. Each further copy adds errors till you get beyond the limits of the error correction process."

No it won't. The error correction on playback will correct the errors and they will not make it onto the copy.... that is the whole point of the error correction.

Shawn


"It's the fact that dividing up a second's worth of music into 44,100 discrete events, or even 96,000 events..."

On playback the music that is spit out isn't divided up at all; that is what the reconstruction filter does: it spits out a continuous analog wave. You can input a 1kHz test signal through an A/D and a D/A and look at it on playback. You will not see divisions... it will look like what was input.

"but even theoretically perfect digital playback will not solve the problems of missing information. "

The only missing information that is related to the sampling rate is that at or above half the sampling rate of the system. So CDs won't have information in them above about 22kHz. A 96kHz recording makes no difference at all to the material below 22kHz... it just lets you 'capture' to a higher frequency... that being just under 48kHz in this case.
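The fs/2 limit can be shown numerically. In this Python sketch, a 34.1 kHz tone sampled at 44.1 kHz produces exactly the same sample values as a phase-inverted 10 kHz tone, so information above half the sampling rate simply cannot be represented (the tone frequencies are arbitrary choices for illustration):

```python
import math

fs = 44100        # CD sampling rate
f_lo = 10000      # a 10 kHz tone (below fs/2)
f_hi = fs - f_lo  # a 34.1 kHz tone (above fs/2)

lo = [math.sin(2 * math.pi * f_lo * n / fs) for n in range(64)]
hi = [math.sin(2 * math.pi * f_hi * n / fs) for n in range(64)]

# The samples of the 34.1 kHz tone are indistinguishable from those of a
# phase-inverted 10 kHz tone: content above fs/2 aliases back down.
assert all(abs(h + l) < 1e-9 for h, l in zip(hi, lo))
```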

Shawn


Audio Flynn,

I agree, the average mastering for a popular (non-classical) CD nowadays is abysmal, with heavy compression, maximizing of levels, and too much high end EQ boost. It doesn't have to be that way, that's the sad thing.

++++++++++++++++++++++

Not that I do not like the music, but new digital releases on Windham Hill, Blue Note or Maple Shades, for example, are much more musical than mass market production and mastering.

I have quite a bit of old Santana on LP and CD. The latest 2 Santana CDs seem to have some pretty good playing, but I just cannot get past the compression. I hate listening to them.

Carlos could sound much better.


Sfogg,

"I can make dozens of generations from a CD (meaning a copy of a copy... then copying the copy etc..etc..) and it will still be the same as the original version. That means the deliver is accurate to the source."

Are you using some specialist software to do this? We did it once on a PC at the office and the 10th copy sounded nothing like the original.

"No it won't. The error correct on playback will correct the errors and they will not make it onto the copy.... that is the whole point of the error correction."

Error correction is not perfect. Some errors are not corrected successfully - although the effects are inaudible the vast majority of the time. Copying copies increases the chances of accumulation of untrappable, unfixable errors. 10 generations is normally enough to accomplish this. This should be true even on a theoretical level - assuming the best copying equipment - although there the number of generations may jump by a factor of 10. For the vast majority of CD recorders out there - which are VERY cheaply made - the 10 generations should suffice.
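The accumulation argument can be sketched as a toy model in Python (the correction capacity and per-generation error counts are invented numbers, purely to illustrate the claimed mechanism, not measurements of any real recorder):

```python
T = 2  # toy code: corrects up to 2 symbol errors per block

raw_errors_per_read = [1, 2, 1, 3, 1, 1]  # assumed raw errors per generation

errors = 0
trace = []
for new in raw_errors_per_read:
    total = errors + new
    # Within the code's power: fully corrected before writing the copy.
    # Beyond it: every error is inherited by this copy and all later ones.
    errors = 0 if total <= T else total
    trace.append(errors)

# Once a generation exceeds the correction limit, errors only accumulate.
assert trace == [0, 0, 0, 3, 4, 5]
```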


I remember that a Radio Station copy of an LP sounded better than the usual off the shelf copies.

Wouldn't it be extremely nice for the listener if copies were made specifically for Radio Stations and a second mix, utilizing the full dynamic range were made for public sale?

Shawn - I think that you are being a little generous by saying the 10th copy would not be listenable. I believe by the 7th or 8th.

Now another question: some CD writers take up the information, then spit it out at high speed. Would that not put some of the frequencies above the human hearing level?

That would be akin to recording at 3 3/4 ips with playback at 7 1/2 ips. Doubling the speed brought frequencies beyond hearing range.

Or is it that I got 1/2 hour of sleep last night?

dodger


Gil,

The "bits is bits" argument misses one fundamental, and very significant, difference between reading / writing data files and reconstructing an analog music signal out of a digital datastream.

The difference is timing.

In a data environment, when retrieving a file off a disc, as long as the timing of the read mechanism is within tolerance, the data will be read. If a disc is so badly bollixed up that the system cannot accurately retrieve data, you'll get some sort of error message whining about the file being corrupt or unreadable. You will not ever get a file with jumbled data.

Likewise, when reading a music file, you'll either retrieve it (with any read errors 100% corrected) or you won't.

The problem is that once you've retrieved it, you've got to convert it back to an analog signal. This requires that each sample be "clocked out" to the reconstruction filter at precisely the right time - if the data word is converted to an analog voltage a snick too early, or a snack too late, the resulting analog signal will not be a perfect reproduction of the original signal.

It's not too hard to design circuits that do this clocking with a sufficient degree of accuracy (and we're talking accuracy on the magnitude of a hundred picoseconds here to be inaudible), but it does cost some money for the dual phase-locked loops, or highly accurate RAM buffers, or whatever technique is used. Unfortunately, many DACs are made cheaply, and suffer more or less of this timing error ("jitter"). Also, on the recording side, if you don't set things up right so that one master clock is syncing all the devices (recorders, mixers, DSP, whatever) in the recording chain, but each has its own clock, then you can "jitter" the signal that is being written to the digital master, and once that's happened, you are fubar.
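A rough Python sketch of why jitter matters more at high frequencies (the 2 ns figure is illustrative, not a spec; the point is that the same timing slips produce larger voltage errors where the waveform changes faster):

```python
import math
import random

fs = 44100
jitter_rms = 2e-9  # assumed 2 ns RMS clock error, purely illustrative

def worst_error(freq, jitter):
    """Largest amplitude error from clocking each sample out slightly
    early or late, relative to a perfectly timed conversion."""
    random.seed(0)  # same jitter sequence for every frequency tested
    err = 0.0
    for n in range(2000):
        t_ideal = n / fs
        t_real = t_ideal + random.gauss(0, jitter)
        err = max(err, abs(math.sin(2 * math.pi * freq * t_real)
                           - math.sin(2 * math.pi * freq * t_ideal)))
    return err

# A fixed timing slip hurts a 10 kHz tone far more than a 100 Hz tone.
assert worst_error(10000, jitter_rms) > worst_error(100, jitter_rms)
```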

You're probably old enough to appreciate this analogy. Remember those big old line printers that IBM built in the late 60's early 70's, when print speed was measured in lines per minute? They had these looped chains with letter blocks that continually spun around inside the cabinet, and 136 little hammers in a line from left to right that would bang the letter block against the paper when the proper letter was in front of the hammer. They could print 1100 lines per minute. If you looked at the output, each letter was correct - they never hit the wrong letter. However, if the timing was off just a bit (yuck yuck) the letters wouldn't line up in perfect columns - there'd be this kind of wavy looking sway to the report because the hammer hit a letter just a bit earlier or later than it was supposed to. You could print the same document 100 times, and every one would have exactly the same words made up of exactly the same letters, but if you simply looked at them, every one was different.

Do a google on "jitter" and you'll get a bazillion hits that go into painfully deep levels of detail about the phenomenon, its effects, and its prevention.


Just to add a little to this I headed over to a rather useful site http://www.cdrfaq.org/faq02.html#S2-17

The following is extracted from that page - the highlighting is mine:

"Subject: <2-17> Why don't audio CDs use error correction?

(2001/08/01)

Actually, they do. It is true that audio CDs use all 2352 bytes per block for sound samples, while CD-ROMs use only 2048 bytes per block, with most of the rest going to ECC (Error Correcting Code) data. The error correction that keeps your CDs sounding the way they're supposed to, even when scratched or dirty, is applied at a lower level. So while there isn't as much protection on an audio CD as there is on a CD-ROM, there's still enough to provide perfect or near-perfect sound quality under adverse conditions.

All of the data written to a CD uses CIRC (Cross-Interleaved Reed-Solomon Code) encoding. Every CD has two layers of error correction, called C1 and C2. C1 corrects bit errors at the lowest level, C2 applies to bytes in a frame (24 bytes per frame, 98 frames per sector). In addition, the data is interleaved and spread over a large arc. (This is why you should always clean CDs from the center out, not in a circular motion. A circular scratch causes multiple errors within a frame, while a radial scratch distributes the errors across multiple frames.)

If there are too many errors, the CD player will interpolate samples to get a reasonable value. This way you don't get nasty clicks and pops in your music, even if the CD is dirty and the errors are uncorrectable. Interpolating adjacent data bytes on a CD-ROM wouldn't work very well, so the data is returned without the interpolation. The second level of ECC and EDC (Error Detection Codes) works to make sure your CD-ROM stays readable with even more errors.

It should be noted that not all CD players are created equal. There are different strategies for decoding CIRC, some better than others."

The 2 highlighted sections are a better explanation of why - I think - you can't make endless copies of copies of CDs.
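The interleaving idea from the FAQ can be sketched in Python with a simple block interleaver (real CIRC uses delay-line cross-interleaving, which this simplifies heavily): a burst of consecutive damaged bytes on the disc de-interleaves into at most one error per frame, which a per-frame code can then correct.

```python
def interleave(frames):
    """Write column-by-column so consecutive disc bytes come from
    different frames."""
    return [frames[r][c] for c in range(len(frames[0]))
                         for r in range(len(frames))]

def deinterleave(stream, n_frames, frame_len):
    """Undo interleave(): rebuild the original frames."""
    frames = [[None] * frame_len for _ in range(n_frames)]
    i = 0
    for c in range(frame_len):
        for r in range(n_frames):
            frames[r][c] = stream[i]
            i += 1
    return frames

frames = [[(r, c) for c in range(6)] for r in range(8)]  # 8 frames, 6 symbols
disc = interleave(frames)

# A radial scratch wipes out a burst of 8 consecutive symbols on disc...
for i in range(8, 16):
    disc[i] = "X"

back = deinterleave(disc, 8, 6)
damage_per_frame = [row.count("X") for row in back]
# ...but after de-interleaving, each frame lost only ONE symbol, which a
# code that corrects one error per frame can fully repair.
assert max(damage_per_frame) == 1
```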


And at the risk of flogging a good source to death I found this:

"Subject: <3-18> Can I make copies of copies?

(2002/12/09)

The following was part of an e-mail message from Jeff Arnold back in mid-1997:

"I do not recommend making "copies of copies" with SNAPSHOT. The reason this does not always work is because many CDROM readers do not perform error correction of the data when doing raw sectors reads. As a result, you end up with errors on the copy that may or may not be correctable. When you make a second-generation copy of the same disc, you will make a disc that has all of the errors of the first copy, plus all of the new errors from the second reading of the disc. The cumulative errors from multiple copies will result in a disc that is no longer readable."

This initially generated some confusion, so further explanation is needed. The heart of the problem is the way that the data is read from the source device. When a program does "raw" sector reads, it gets the entire 2352-byte block, which includes the CD-ROM error correction data (ECC) for the sector. Instead of applying the ECC to the sector data, many drives just hand back the entire block, including any errors that couldn't be corrected by the first C1/C2 layer of error correction (see section (2-17)). When the block is written to the CD-R, the uncorrected errors are written along with it.

The problem can be avoided completely by using "cooked" reads and writes. Rather than create an exact duplicate of the 2352-byte source sector, cooked reads pull off the error-corrected 2048-byte sector. The CD recorder regenerates the appropriate error correction when the data is written.

Some drives and some software will error-correct the 2048 bytes of CD-ROM data read in "raw" mode. This limits the risk of generation loss to errors introduced in the ECC bytes. If the software also regenerates the ECC, it is effectively emulating "cooked" reads and writes in "raw" mode.

This begs the question, why not just use cooked writes all the time? First of all, some older recorders (e.g. Philips CDD2000 and HP4020i) didn't support cooked writes. (Some others will do cooked but can't do raw, e.g. the Pinnacle RCD-5040.) Second, not all discs use 2048-byte MODE-1 sectors. There is no true "cooked" mode for MODE-2 data tracks; even a block length of 2336 is considered raw, so using cooked reads won't prevent generation loss.

It is important to emphasize that the error correction included in the data sector is a *second* layer of protection. A clean original disc may well have no uncorrectable errors, and will yield an exact duplicate even when copying in "raw" mode. After a few generations, though, the duplicates are likely to suffer some generation loss.

The original version of this quote went on to comment that Plextor and Sony CD-ROM drives were not recommended for making copies of copies. The reason they were singled out is because they are the only drives that explicitly warned about this problem in their programming manuals. It is possible that *all* CD-ROM drives behave the same way. (In fact, it is arguably the correct behavior... you want raw data, you get raw data.)

The final answer to this question is, you can safely make copies of copies, so long as the disc is a MODE-1 CD-ROM and you're using "cooked" writes. Copies made with "raw" writes may suffer generation loss because of uncorrected errors.

Audio tracks don't have the second layer of ECC, and will be susceptible to the same generation loss as data discs duplicated in "raw" mode. Some drives may turn off some error-correcting features, such as dropped-sample interpolation, during digital audio extraction, or may only use them when extracting at 1x. If you want to find out what your drive is capable of, try extracting the same track from a CD several times at different speeds, then do a binary comparison on the results. PC owners can use the DOS "FC" command to do this, as described in section (3-3).

It's worth noting that the C1/C2 error correction present on all types of CDs is pretty good, so it is entirely possible to make multi-generation copies with no errors whatsoever. The "cooked" approach for CD-ROMs just happens to be safer."

Which - in summary - means I am right but Sfogg is not wrong!!

Chuckle chuckle chuckle....


" Are you using some specialist software to do this? We did it once on a PC at the office and the 10th copy sounded nothing like the original."

If you are talking about ripping the files, many software packages now will automatically re-read sections in which it thinks there were possibly playback errors. And some CD-R drives are much better at extracting the data than others... Plextors have always been very good at this. If you are using good extraction and writing software you can do bit-for-bit perfect copies just about ad infinitum.
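The re-reading strategy Shawn mentions can be sketched in Python (the flaky-drive model and 10% error rate are invented for illustration; real secure rippers use more elaborate schemes):

```python
import random

random.seed(2)

def read_sector(sector_id):
    """Simulated drive: occasionally returns a randomly corrupted sector."""
    good = bytes([sector_id % 256]) * 2352   # 2352 bytes per audio sector
    if random.random() < 0.1:                # assumed 10% flaky-read rate
        bad = bytearray(good)
        bad[random.randrange(2352)] ^= random.randrange(1, 256)
        return bytes(bad)
    return good

def secure_read(sector_id, retries=20):
    """Keep re-reading until two consecutive reads agree, the way 'secure'
    extraction software guards against silent read errors."""
    prev = read_sector(sector_id)
    for _ in range(retries):
        cur = read_sector(sector_id)
        if cur == prev:
            return cur
        prev = cur
    raise IOError(f"sector {sector_id}: no stable read")

track = b"".join(secure_read(s) for s in range(20))
assert track == b"".join(bytes([s % 256]) * 2352 for s in range(20))
```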

How many times have your burnt data CDs where the data was not exactly what you put onto it?

There are CD players (Meridians) that will read the disc at high speed and store that data into memory. The actual playback occurs out of the memory buffer. If the player has any problems on reading it can go back and re-read those sections of the discs multiple times to try to avoid any sort of data concealment.

"Error correction is not perfect. Some errors are not corrected successfully "

When the error correction works it is perfect. When it doesn't work, the player uses error concealment, like I already mentioned in this thread. I've seen tests of players that are modded to have counters for when either of the two above occur during playback. I'm trying to find that info now, but as I recall the error correction kicked in maybe 10 times during the complete playback of a CD; the error concealment never kicked in.

" For the vast majority of CD recorders out there - that are VERY cheaply made - the 10 generations should suffice."

Were you recording on the analog or digital input on them? Using cheap equipment doesn't show that the format itself isn't accurate... just that the equipment you used wasn't.

Shawn

