Posts posted by Rivendell61

  1. Btw, a CD source can contain at best only 96dB of dynamic range, and it's usually going to be closer to 90dB. So in this example, if you're using a CD player for the playback source, the amplifier isn't going to be the limiting factor in loudest versus softest signal. It also doesn't help that digital noise floors tend to sound relatively nasty. Vinyl is going to be even worse, but it has the advantage of a much more pleasing sound to its noise floor. 24-bit audio (like on DVDs) is capable of 144dB, but you're usually only going to get about 130dB or so. I believe human hearing is usually quoted to be around 140dB total, which puts some perspective on just how good our ears are.

    Not really.....

    If the CD was done by a competent recording engineer, the noise floor will be random noise, aka 'white' (at c. -93 dBFS): it will sound like some tape hiss.

    Signals far below it can be encoded and heard (see the sketch below).

    If the noise floor sounds bad.....it ain't the CD's fault: blame the incompetent recording engineer (they are out there!).
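
    A minimal numpy sketch (my illustration, not from the original post): a 1 kHz tone at -110 dBFS--well below the 16-bit noise floor--survives TPDF-dithered 16-bit encoding and shows up clearly in an FFT.

        import numpy as np

        fs, n = 44100, 1 << 18
        t = np.arange(n) / fs
        tone = 10**(-110/20) * np.sin(2*np.pi*1000*t)          # 1 kHz at -110 dBFS

        q = 1/32768                                            # 16-bit quantization step
        dither = (np.random.rand(n) - np.random.rand(n)) * q   # TPDF dither, +/-1 LSB
        enc = np.round((tone + dither)/q) * q                  # dithered 16-bit encode

        spec = 20*np.log10(np.abs(np.fft.rfft(enc*np.hanning(n))) + 1e-20)
        k = int(round(1000*n/fs))
        # the tone stands ~30+ dB above the per-bin dither noise in the FFT,
        # even though its amplitude is only about a tenth of one LSB
        print("1 kHz tone:", round(spec[k-2:k+3].max() - np.median(spec), 1),
              "dB above the median noise bin")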

    Mark
  2. The MR-1's slightly bigger brother, the MR-1000, has been discussed voluminously in pro recording forums since hitting US shores c. April 2007.
    Mainly quite favorably.

    There was a question in the other recent Korg thread about burning an SACD from the MR-1 DSD file: you cannot--you must create a Scarlet Book file and have it pressed, which is really not feasible.

    But.....see here:
    http://www.ps3sacd.com/dsddiscguide.html
    here:
    http://taperssection.com/index.php/topic,89333.0.html
    and here:
    http://www.ps3sacd.com/downloads/DSDDiscFormatSpecs.pdf

    I'm not too clear on playback options for DSD Discs, but I think the newer Sony VAIOs can do it--and PS3s (only at the 2.8 MHz rate, not 5.6 MHz). There may be other ways.....one of those links above seems to have a plug-in for playing DSD Discs on Windows Media Player.

    The included Audiogate software does--from reports--seem to contain a fairly good decimator, so going straight to PCM may cost very little.
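
    A decimator is just a steep low-pass filter plus downsampling. A toy sketch of the idea (mine, not Korg's--real DSD decimators are far more refined; the first-order delta-sigma modulator here is a crude stand-in DSD source):

        import numpy as np
        from scipy import signal

        fs_pcm = 44100
        fs_dsd = 64 * fs_pcm                       # DSD64: 1-bit stream at 2.8224 MHz
        t = np.arange(int(0.02 * fs_dsd)) / fs_dsd
        x = 0.5 * np.sin(2*np.pi*1000*t)           # test tone to encode

        # crude first-order delta-sigma modulator -> +/-1 bitstream
        bits = np.empty_like(x)
        acc, y = 0.0, 1.0
        for i in range(len(x)):
            acc += x[i] - y
            y = 1.0 if acc >= 0 else -1.0
            bits[i] = y

        # the decimator: low-pass well below 22.05 kHz, keep every 64th sample
        taps = signal.firwin(4095, 20000, fs=fs_dsd)
        pcm = signal.upfirdn(taps, bits, down=64)  # 44.1 kHz PCM, tone intact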


    Mark

  3. PWK had a pilot's license and used to fly around the country to sell his products. It would not have been terribly unusual for him to 'set up' a set of Klipschorns for a customer.

    In 1958 Bill Bell (owner of the Wellesley Music Box) and Don Davis (then--I believe--V.P. of Klipsch) took a pair of Klipschorns and Marantz amps to the World's Fair in Brussels to demonstrate.....Stereo.

    Bell knew PWK.

    So the basic story of PWK helping him with an install in the '60s does not seem too unlikely--he may have been visiting......


    Mark

  4. CDs may measure very well, but low-level information tends to be missing, since the 16-bit (15 bits for signal plus one bit for checksum) spec doesn't allow for as much dynamic range as you might expect.

    Islander,
    There really is no 'low level' information missing (if it is in the original signal arriving from the microphones).

    A correctly made (dithered) 16-bit recording will encode and retrieve signal at -120 to -130 dBFS....or lower.
    There is no theoretical lower limit--just the practical limits of microphones, mic pres, converter quality, rooms, etc.
    And those very smallest digitally recorded signals will have.....no distortion.
    (FYI: the average lowest perceptible signal in a good home listening environment is c. 3 dB SPL....maybe a bit sub-zero in some cases.)
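
    The arithmetic behind those noise-floor numbers (my sketch, not from the original post):

        # ideal quantization SNR of an N-bit channel: 6.02*N + 1.76 dB
        for bits in (16, 24):
            print(bits, "bits:", round(6.02*bits + 1.76, 1), "dB")   # 98.1 / 146.2

        # TPDF dither raises that broadband floor a few dB but turns it into
        # benign white noise -- and because the noise is white, a narrowband
        # signal 20-30+ dB below it is still encoded and still detectable.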

    If you are hearing otherwise do not blame the medium.

    Mark

  5. I am not saying they may not have other qualities that outshine CD's to your ears, but it really cannot be dynamics.

    Yes, what people hear--or think they hear--can be misleading.

    Actual LP dynamic range is not bad in the 'middle' frequencies but....not equal to CD.

    LPs can give the impression of greater dynamic range than they have.
    As J.D. Johnston points out in his loudness-vs-intensity presentation:

    "LP distortion grows with level. That means that as level grows, the signal bandwidth (including the distortion) increases.
    -- This means that an increase in intensity is over-represented by the increase in loudness.
    + This can create an illusion of 'more dynamic range'."


    Mark

  6. Dave,

    This is going all around the barn --
    My posts made a few basic points:

    Sampling rates have nothing to do with 'accuracy' or 'resolution'.
    Sampling faster will not create a more accurate signal.
    There is an 'optimal' rate for sound quality of around 60kHz.
    The output of a converter is analog and has no steps and no missing 'detail'.
    Redbook CD has nanosecond timing accuracy.
    The limits of Redbook are the limits of its filters.
    If a converter sounds better at faster rates, that indicates some converter quality 'issues'.
    Very fast rates--e.g., past c. 96kHz--are damaging (they add distortion) to the audio signal.

    Nothing new or really controversial.
    You can find page and quote for all of it in a good digital audio textbook.....
    It is broadly understood and accepted even in the 'field'.
    Particularly in high-end 'classical' recording circles, where signal transparency tends to be an obsession.
    Katz is quite well respected by both the audiophile and engineering folks. He did that very elegant experiment (1996) with many good ears and it is something of a landmark in that it made the information more widely available and understandable. It is simply a good example of one point.


    Mark

  7. I have decided not to respond in detail until I've read the book. It's been some years since I read Mr. Katz. I certainly recall his pooh-poohing of upsampling, but do not recall any suggestion that less accurate is better. Such a suggestion would indicate a need for filters in wide-band amplifiers and preamps and other things that appear counterintuitive.

    Further, even if higher accuracy digital waveforms sound worse than filtered lower sample rates, why not go the cheaper route and use a more accurate sample rate? Not everyone wants to spend a lot of money on an expensive filter.

    Dr. Diamond's original findings back in 1980 or so were what took me down the road of determining whether he was right or not. He suggested that the minimum accuracy of the digital stair steps chosen for Redbook somehow registered in the brain. His tests were done in old folks' homes, where high-frequency information was certainly not much of a factor. They were blasted at the time, but have much more credibility today.

    As to my DAC, I am unprepared to believe there is better, at least audibly. As I've said before, I have something of a photographic memory of live audio events (or at least believe that I do) and I am satisfied when the playback meshes perfectly with my memory of the event. If it doesn't, it will never see the light of day. There is absolutely no question that I can hear the lesser accuracy of the Redbook samples as opposed to the 24/88.2, and I certainly do not believe I need a filter to rid myself of the extra accuracy of the waveforms.

    Dave,
    No one is suggesting that "less accurate is better".
    Sampling rate is not connected to accuracy.
    A faster rate is not more accurate--there is no accuracy problem--it is imaginary.
    Changing sampling rate really only changes the bandwidth.

    No stair steps. Dr. Diamond was misled. The output of an AD/DA conversion cycle is.....analog.
    All the original signal detail--fully curvaceous, no steps--within the chosen bandwidth is output totally intact.
    Even at poor old 44.1--no lost resolution. There may be other issues (those filters) but no 'resolution' problem.
    The sampled signal even reconstructs what happens between the samples.....see the sketch below.
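
    A quick way to convince yourself (my sketch, not from the original post): sample a band-limited tone at 44.1kHz, then evaluate the Whittaker-Shannon (sinc) reconstruction at points *between* the samples and compare with the true continuous waveform.

        import numpy as np

        fs, n = 44100.0, 400
        k = np.arange(n)
        env = np.sin(np.pi * k / n)**2                 # smooth fade in/out at the edges
        x = env * np.sin(2*np.pi*5000.0*k/fs)          # the 44.1 kHz samples

        def reconstruct(t):
            # Whittaker-Shannon interpolation from the samples
            return np.sum(x * np.sinc(t*fs - k))

        t = (np.arange(50, 350) + 0.5) / fs            # points halfway between samples
        rec = np.array([reconstruct(ti) for ti in t])
        true = np.sin(np.pi*t*fs/n)**2 * np.sin(2*np.pi*5000.0*t)
        print("max error between samples:", np.max(np.abs(rec - true)))  # tiny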

    It is non-intuitive. This is why so much misinformation about digital has stayed in circulation all these years.

    Re better converters.....they are excessively expensive.
    Rough estimate: Johnston runs about $65,000 worth for 8 channels.
    Yours seems to be fine. It just isn't perfect.
    There is always something better.


    Mark

  8. Since my 58 year old ears clearly hear these differences...

    Dave, not at all questioning that you hear what you hear.
    My point is that what you are hearing--in this instance--is a converter misbehaving.

    The faster sampling=better notion has gone through a transformation over the past several years.
    Very high sampling rates--and attributions to ultrasonics--have died a well-deserved death in high-end recording.
    More information/better understanding, etc.
    A few points:

    1) Try the same comparisons with a high-end converter.

    2) Read an excellent short paper by Bob Katz (not an EE but an 'audiophile' recording/mastering engineer) called 'The Ultimate Listening Test'. He thought 96kHz sounded much better--he had always liked high bandwidth in analog gear, etc.
    An exceptionally well designed audio test: high-end gear, many testers around the world.
    He used EEs to write the digital filters, etc.
    They recorded an orchestra at a 96kHz sample rate, then created secondary files filtered to a 20kHz bandwidth.
    What they found was that the FILTER quality made the difference between files audible or inaudible.
    **Repeated with various sources, including castanets (heavy ultrasonics).
    Same result in all cases.
    It is the quality of the FILTER, not the sample rate.
    It can't be said too often--if you hear a difference at different sample rates.....you need better filters (see the toy comparison below).
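
    To see how different two 'correct-looking' filters can be (my toy sketch, not part of Katz's test; the tap counts and cutoff are arbitrary illustration numbers): two 20kHz-ish low-pass filters for 96kHz material, one short/cheap and one long, measured where it matters--the top of the audio band and just above it.

        import numpy as np
        from scipy import signal

        fs = 96000
        for ntaps in (63, 1023):                        # cheap vs. serious FIR
            taps = signal.firwin(ntaps, 21000, fs=fs)
            w, h = signal.freqz(taps, worN=16384, fs=fs)
            db = 20*np.log10(np.abs(h) + 1e-12)
            at20k = db[np.argmin(np.abs(w - 20000))]    # passband-edge droop
            at22k = db[np.argmin(np.abs(w - 22050))]    # leakage above the band
            print(ntaps, "taps:", round(at20k, 1), "dB at 20kHz,",
                  round(at22k, 1), "dB at 22.05kHz")

    The short filter sags at 20kHz and leaks above the band; the long one does neither. Same sample rate, very different sound.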

    3) There is an interesting near-consensus among high-end converter designers that about 60kHz is the ideal sampling rate for best sonics.
    Not faster, faster, faster.....
    And even slower CAN be just as good--the filters are just harder to write.

    4) Fast rates (beyond say 60/88/96) are actually worse; they damage the signal, and you do not need them to get a transparent result.
    If you seem to....again....look at your converter. Chips universally perform worse when operating at 192.

    From listeners' reports you seem to get good results.
    I am only objecting to the assumptions about causation made from the use of your particular converter--or extrapolating to theory from it.
    There is a reason better converters are made.


    Mark

  9. If the equipment were there, I'd record at 24/352.8. Why? Why not!! Is there such a thing as too much accuracy?

    Hmmm, as I recollect, the AES article does not add much constructive to the CD vs DSD discussion.
    Not a very well designed ABX test if it was trying to claim inaudibility....but maybe it was not.

    The prime article Don links has some misinformation:
    High sampling rates do NOT provide more accurate timing resolution.
    Redbook timing is in the nanosecond range--timing resolution is NOT constrained by sample rate (see the sketch below).
    Maybe I'm nit-picking, but that incorrect contention breeds constant false implications.....
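
    A demonstration (my sketch, not from the original post): delay a band-limited tone by 10 nanoseconds--about 1/2000th of a 44.1kHz sample period--and recover that delay from the 44.1kHz samples.

        import numpy as np

        fs, n = 44100, 1 << 14
        k0 = 1115                                    # put the tone exactly on an FFT bin
        f0 = k0 * fs / n                             # ~3 kHz
        t = np.arange(n) / fs
        x = np.sin(2*np.pi*f0*t)

        delay = 10e-9                                # 10 ns (sample period is ~22,676 ns)
        f = np.fft.rfftfreq(n, 1/fs)
        y = np.fft.irfft(np.fft.rfft(x) * np.exp(-2j*np.pi*f*delay))  # band-limited delay

        est = -np.angle(np.fft.rfft(y)[k0] / np.fft.rfft(x)[k0]) / (2*np.pi*f0)
        print("recovered delay:", est, "s")          # ~1e-8, from 44.1 kHz samples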


    Mallette,
    Am I misinterpreting you?...or are you falling into the 'higher sample rates capture more detail' fallacy?
    Rates have nothing to do with what is usually implied by 'resolution'.
    A good converter should NOT sound better at very high rates. Quite the opposite in fact.
    If your CardDeluxe seems to sound better--it may be its way of suggesting that you one day invest in a better converter :-)


    Mark

  10. I guess there is no interest in measuring the slew rate?

    Raw slew rate by itself--high or low--is not reliably useful as performance/sonic info.
    An amplifier with a high slew rate may have more slewing distortion at lower levels than a slower circuit.

    What we are really trying to discover is: how linear is a particular amp at the fastest-slewing frequencies?
    A twin-tone IMD test (CCIF/ITU-R, 18.5 kHz and 19.5 kHz) is the appropriate diagnostic tool (a toy version is sketched below).
    If that result is good.....the amplifier's slew rate is fine.
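
    A toy version of that measurement (my sketch; the nonlinearity is a hypothetical stand-in for the amp under test):

        import numpy as np

        fs, n = 192000, 1 << 16
        t = np.arange(n) / fs
        stim = 0.5*np.sin(2*np.pi*18500*t) + 0.5*np.sin(2*np.pi*19500*t)

        def amp(x):
            # stand-in device under test: mild 2nd- and 3rd-order nonlinearity
            return x + 0.002*x**2 - 0.001*x**3

        spec = np.abs(np.fft.rfft(amp(stim) * np.hanning(n)))
        f = np.fft.rfftfreq(n, 1/fs)
        ref = spec.max()
        # IMD products: f2-f1 = 1 kHz, 2f1-f2 = 17.5 kHz, 2f2-f1 = 20.5 kHz
        for fr in (1000, 17500, 20500):
            k = np.argmin(np.abs(f - fr))
            print(fr, "Hz:", round(20*np.log10(spec[k-2:k+3].max()/ref), 1), "dB re tone")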

    Mark

  11. InnerTuber:
    That Benchmark DAC1 USB would be a great choice (and outboard conversion tends to be preferred).
    It will stream 96kHz/24-bit via USB, *bit-perfect on native drivers*, no ASIO needed (the only device that can do it).
    Hard to beat signal quality.

    ASIO has been suggested earlier but....it does not guarantee bit-perfect output.
    Benchmark tested several ASIO USB devices and found that NONE of them were bit-perfect (which does not mean none exist). The test itself is simple in principle--see the sketch below.
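
    What such a test amounts to (my sketch; the file names are hypothetical): play a known file out over USB, capture the device's digital output, and require an exact sample-for-sample match.

        import numpy as np
        import soundfile as sf   # any WAV reader will do

        sent, _ = sf.read("test_signal.wav", dtype="int32")       # what was played
        recv, _ = sf.read("loopback_capture.wav", dtype="int32")  # what came out

        # slide the capture against the source to absorb start-up latency,
        # then demand an exact integer match for every remaining sample
        for off in range(len(recv) - 1000):
            if np.array_equal(recv[off:off+1000], sent[:1000]):
                m = min(len(sent), len(recv) - off)
                print("aligned at", off, "-> bit-perfect:",
                      np.array_equal(recv[off:off+m], sent[:m]))
                break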

    Consumer/pro-sumer digital gear (a la 'M-Audio') does not always perform to claim/spec.
    If you 'need' a high-quality signal--for whatever reason--be cautious if looking there.
    Unfortunately pro gear tends to be expensive--so priorities must be considered.
    Likewise 'player' software such as iTunes (an evolving minefield; in v.7 of iTunes the SRC + vol. control are better, CoreAudio still poor). VLC is good (OS X friendly); Foobar is very good (but not for OS X).

    Mark

  12. Since he was asking about harpsichords, I was wondering what are some excellent recordings for harpsichord and flute?

    There is a set of two CDs by Janet See (flute) and Davitt Moroney (harpsichord): 'Bach--Flute Sonatas'.
    Very good, sold cheap on a sub-label of Harmonia Mundi called 'Classical Express'.

    For Scarlatti the Pierre Hantai recordings on Mirare are a good choice. They seem to be more available at Presto Classical (UK) than places I see in the US.

    Mark

  13. Something completely different--
    Not termed a 'pre-amp' but....it is--and it would make a superb home pre.

    If you are into maximal SQ and have moderately deep pockets....hard to beat.
    By Chris Muth (respected designer of high-end custom pro gear)--used in many mastering studios (Sterling), etc....
    Will take a digi signal too (built-in Troisi).
    http://www.dangerousmusic.com/monitor.html

    Another maker (nice remote vol.):
    http://www.cranesong.com/AVOCET.html

    OTOH--
    If remote/multi channel is essential:
    http://www.dangerousmusic.com/stsr.html


    Mark

  14. I am unaware of any quirks to providing complete coverage of common sample rates. It's done in $30.00 sound cards these days. My leap of faith was the number of audiophile niceties provided, causing me to hope for this one. I think they will do it soon. Someone will. Not a big market, but a market nonetheless.

    Consumer and 'pro-sumer' gear sometimes make claims that are not entirely true--they usually present an output signal at most common input sample rates, but that output signal is not always bit-perfect--or output at the original rates.
    For example, the Oppo 970HD reduces all bit widths to 16 (and then the question arises: having added truncation distortion....does it dither correctly? Not all DVD players do! Goodbye high-res....). A sketch of what truncation without dither does is below.
    That Oppo also converts all input sample rates to 44.1 or 48.
    None of which says it may not have an appealing sound.
    Maybe the newer Oppos have changed this behavior.
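
    The truncation-vs-dither point, illustrated (my sketch, not a measurement of any Oppo): truncate a quiet 'high-res' tone to 16 bits with and without TPDF dither and compare the worst spurious component.

        import numpy as np

        fs, n = 44100, 1 << 16
        t = np.arange(n) / fs
        x = 10**(-60/20) * np.sin(2*np.pi*1000*t)    # quiet 1 kHz 'high-res' tone
        q = 1/32768                                  # 16-bit step

        trunc = np.floor(x/q) * q                    # truncated: harmonic distortion
        tpdf = (np.random.rand(n) + np.random.rand(n) - 1) * q
        dith = np.floor((x + tpdf)/q) * q            # dithered: benign white noise

        for name, y in (("truncated", trunc), ("dithered", dith)):
            spec = np.abs(np.fft.rfft(y * np.hanning(n)))
            k = int(round(1000*n/fs))
            tone = spec[k-3:k+4].max()
            spec[k-8:k+9] = 0                        # blank out the tone itself
            spec[:8] = 0                             # ignore the DC bias of floor-quantization
            print(name, ": worst spur", round(20*np.log10(spec.max()/tone), 1), "dB re tone")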

    Per Mike: for many years now, converters have been WAY oversampling (MHz, even GHz) at the input--mainly to aid in skating past filter issues. Very few converters (maybe 2-3) use 'correct' filters......

    Mark

  15. Sorry Riv - you are incorrect. The result of jitter in audio would in fact be audible as it would result in lost bits with the subsequent discarding. Lost bits = incomplete data. Incomplete data in the audio realm without any type of buffering or ACK will result in incomplete program material, not distorted material, as the D/A converters do not have the ability to magically "bridge" and "smooth out" over the missing data.

    You are also making the medium out to be far more volatile than it really is, especially in that capacity. The routers and gear I run at the data center have very extensive logging and error monitoring. I have extremely long and radically curved optical cable runs between buildings and between floors, and run them at full OC-48 capacity (2.4 Gb/s) with recorded jitter rates at less than .005 % at all points. This is a level of quality and service that we are *required* to keep due mostly to the data rates and volumes we are talking.

    Home audio doesn't even begin to scratch that kind of traffic rate, nor push the "digital envelope" for streaming traffic. As such - any jitter would be horribly and exponentially amplified and most certainly would be audible. The rate is so slow that if there was a resulting time shift, the only thing that would happen is the lock between the components would be lost entirely as there is no buffering, no ACK and no retry, and it would carry across ALL the data traffic, not sporadically here and there... the data rate is not remotely high enough on home audio gear to be that touchy and at any real risk.

    The closest you could ever come would be to lay optical cable between components in an un-stressed state, play your source material, then rapidly bend it past threshold, and then back again - hoping the distortion in the cable itself isn't so great that it renders it useless. Your result will only be a sharp click in your program material, then silence - while the lock is lost, and tries to re-establish. Again - it is streamed 1's and 0's with no buffer, no ACK, and no retry. It either works, or it doesn't. The slow data rates and short cable lengths do not warrant any worthwhile measurements of jitter on a digital level, and certainly not on an audible one.

    Srobak--
    I don't want to be too hard on you but....yikes. You are misapplying your experience to audio.
    What I described above re jitter is basic: Digital Audio 101.
    Maybe it will help if I repeat this: 'digital' must be viewed (from an engineering perspective) as at least semi-analog when it goes to an analog device (and uses the embedded clock), as in most consumer use.

    Srobak says: "It either works, or it doesn't. The slow data rates and short cable lengths do not warrant any worthwhile measurements of jitter on a digital level, and certainly not on an audible one"

    Here....since you deny the fundamental point--that jitter modulates the signal, adding distortion--read this:

    "Jitter modulates the audio signal. The result is that phase modulation sidebands are created above and below every tone in the audio signal. Worse yet, these sidebands are often widely separated from the audio tone they are modulating. Consequently, the distortion caused by jitter is far more audible than THD+N measurements would suggest. Phase modulation sidebands are not musical in nature (they are not harmonically related to the audio), and they are poorly masked (because of the wide separation from the fundamental). In short jitter induced distortion closely resembles intermodulation distortion (IMD) and is much more audible than harmonic distortion (THD). Jitter creates new “audio” that is not harmonically related to the original audio signal. This “audio” is unexpected and unwanted. It will cause a loss of imaging in the stereo signal and when severe, will create what some have called muddiness (a lot of extra low and mid frequency sound that was not in the original). Jitter induced sidebands can be measured very easily using an FFT analyzer."
    --John Siau, Director of Engineering at Benchmark (formerly of CBS Labs--one of the better digi engineers out there).....


    Mark

  16. Not to be picky, but what is the source of this information? Did you mean 15ps? It's not very difficult to find ADC's that have rejection above 100ps.

    In the end, the audible result of jitter is clicks...if you're not hearing clicks, then there is nothing to worry about.

    That is wrong.
    The audible result of jitter is NOT clicks.

    Jitter modulates the signal. It creates distortion.
    Only extreme levels (beyond anything we are talking about) might cause the converter to break lock and click.
    My example above of 15ns of jitter (from an optical cable) would create c. -70dB sidebands on a 10kHz test tone--simulated below.
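
    A quick simulation of that number (my sketch, not from the original post): sample a 10kHz tone on a clock with 15ns of sinusoidal jitter and look at the sidebands.

        import numpy as np

        fs, n, f0 = 48000.0, 1 << 16, 10000.0
        t = np.arange(n) / fs
        jit = 15e-9 * np.sin(2*np.pi*1000.0*t)     # 15 ns peak jitter, 1 kHz jitter tone
        x = np.sin(2*np.pi*f0*(t + jit))           # tone as seen through the jittered clock

        spec = np.abs(np.fft.rfft(x * np.hanning(n)))
        f = np.fft.rfftfreq(n, 1/fs)
        db = 20*np.log10(spec/spec.max())
        for sb in (9000.0, 11000.0):               # sidebands land at f0 +/- 1 kHz
            k = np.argmin(np.abs(f - sb))
            print(sb, "Hz sideband:", round(db[k-2:k+3].max(), 1), "dB re tone")
        # prints roughly -67 dB -- right in the c. -70dB ballpark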

    Regarding the attenuation capabilities--if any--of a particular converter....you will almost never find that specified (it is almost never tested), and if it is, it will be in fairly meaningless numbers (for good reason)--except in expensive pro boxes.
    Finding a 'DAC' (I assume 'ADC' was a typo)--is this an HT box?--which claims 'rejection above 100ps' (a meaningless number when unqualified) is not the issue. Assumptions of competency are often wrong when dealing with digital, and jitter attenuation is the kind of info you will only know, for most converters, by measuring yourself.

    Remember.....this is about "does it (cable induced distortion in 'digital') exist" in this scenario--see my first post--and the answer is unequivocally 'yes'. How important it is in any particular usage is much more complex--but not trivial.

    I will put together and post some refs for you later today/tonight....if I get time!

    Mark

  17. You are sending a signal to a converter, and presumably using the embedded clock, in which case an optical cable--even in short 'home use' lengths--certainly may degrade a 'digital' signal enough to be audible.

    Do you have a source for that claim? Modern CODECs have multiple clocks and even detect the edges of the signal itself. They have their own little buffer inside where I'm sure one could verify that the signal is always bit for bit identical to the input. If I was feeling more energetic, I could run any DAC through my lab bench to verify accuracy rates...but it's really a lost cause. ANY errors are going to be drastically apparent as clicks and pops. You can't get partial degradation of a digital signal. I'm certainly not an expert on the subject, but I'm doing two design projects right now that involve direct hardware and software interfacing with CODECs, DACs and ADCs (from both TI and Analog Devices). Digital accuracy certainly is not an issue. In fact, if one were to claim that there is data loss, then one would need to explain why the I2S or SPI standards have no problems perfectly communicating digital information in just about every digital device on the market...it's just serial data. Heck, how long has RS-232 been around?

    As with most things digital this subject rapidly becomes complex but the notion of perfect digital is frequently forfeited in the hardware/software reality--unless *lots* of care is taken. The scale of this particular problem will ultimately depend on the quality of the D/A converter. My very minimal exposure to HT products (brief look at a high-endish $4k receiver) did not look encouraging re D/A quality--which is not a surprise.

    The D/A's being used in "HT products" are identical to what you're going to find in most any overpriced exotic product. If there is any difference, it is probably going to be in the analog output section for the unit - I've been told that usually the biggest difference is the cheap DC blocking electrolytic caps in series with the output, but I suppose crappy opamps could be used too. IF really really cheap D/A's are being used, then the audible result is most likely going to be in the noise floor (usually due to crappy dithering and anti-aliasing filters). Nevertheless, D/A's are a totally different issue than digital input buffers...

    In fact, I would be kinda impressed to see any digital transport showing a loss of accuracy. All you gotta do is send a known digital signal and then store the output of the receiving end into a buffer and then compare the results. I could probably write a script that automatically compares the difference just to see how long it takes until a single error pops up...then I can store the sample and listen to the audible result to see how much it matters.

    All I gotta say is that if accuracy were a concern, then engineers would have already come out with a solution that is 100% perfect. The fact of the matter is that the issues involving serial data transfer were solved in the '60s...

    It's quite possible that I could be wrong about all of this, but I would be much more interested in facts or demonstrations that I could repeat on my own. I think some of my profs would be interested too (one in particular who worked extensively on digital standards).

    Nothing in my post is unusual or controversial--just basic digital practice, principles, and stuff.
    Do you have a digi textbook?--it will be in there.

    OK. I'll try to be helpful :-)

    Here is the 'principle'--slightly expanded. Two scenarios:
    1) Send a signal from a digital source to a 'digital' destination (something which only needs numbers: a recorder of some sort, say an HD) via optical cable, and.....no problem! 100% perfect capture! No worries. But we are not doing that.
    2) Send a signal from a digital source, via optical cable, to an *analog* device (a D/A converter), and use the embedded clock (also analog)....then there is a potential problem: you have to deal with *jitter*.

    So, how much jitter is there? And does it matter?
    Optical cable can cause significant jitter--generally much more than a coax connection.
    Basic modeling of a short section of optical cable: 3m will cause c. 15 nanoseconds of jitter. Bend the cable a bit and the jitter increases!
    And yes, 15ns at the converter is within the worrisome range for audibility (rough numbers below).
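
    To put rough numbers on it (my back-of-envelope sketch, using the standard small-angle phase-modulation approximation):

        import numpy as np

        J = 15e-9                            # c. 15 ns from the 3m optical run above
        for f0 in (5000.0, 10000.0, 20000.0):
            # peak sinusoidal jitter J on a tone f0 puts sidebands at roughly
            # 20*log10(pi * f0 * J) dB relative to the tone
            print(int(f0), "Hz tone:", round(20*np.log10(np.pi*f0*J), 1), "dB sidebands")
        # higher frequencies suffer more -- which is why converter jitter
        # rejection needs to be asked about across the band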

    That was the important bit re this thread.
    Then you can move on to the rest of the complexities.....e.g., how good is your box's converter at rejecting jitter?
    At 5kHz? At 10kHz? At 20kHz? etc......
    Hopefully the answer to the second half of your question is now apparent as well.

    Mark

  18. of course the optics have to be clear and clean and perfect... otherwise it won't work, period. It is an all-or-nothing technology. If you are getting sound, then it is working. Dirty or bad optics aren't going to randomly choose 1's and 0's to drop... if it will drop one, it will drop them all... it is not a dynamic medium.

    You are sending a signal to a converter, and presumably using the embedded clock, in which case an optical cable--even in short 'home use' lengths--certainly may degrade a 'digital' signal enough to be audible.
    As with most things digital this subject rapidly becomes complex but the notion of perfect digital is frequently forfeited in the hardware/software reality--unless *lots* of care is taken.
    The scale of this particular problem will ultimately depend on the quality of the D/A converter. My very minimal exposure to HT products (brief look at a high-endish $4k receiver) did not look encouraging re D/A quality--which is not a surprise.
    Should anyone worry about this for watching movies? I doubt it.

    Mark

  19. PS - now would be a good time to stock up on olive oil - the Peloponnese is a major olive-growing region here - and much of the "Italian" product you get over there is actually from over here - exported to Italy and then re-badged to go to you (old Mafia thing, apparently).

    The New Yorker article 'Slippery Business: The Trade in Adulterated Olive Oil' discusses what is virtually worldwide olive-oil fraud! Not exactly like French control over wine-making, AFAIK.

    Good link Larry - nice to see the Romans had better control over quality than the Italians do today....

    Re those Romans....a few years prior to the yuppie introduction of drizzled olive oil and granite-topped kitchen counters to Slough and the Edinburgh suburbs, the Romans were importing and enjoying the extra-virgin at leafy places like Chichester, Cirencester, and even northern Haltwhistle.

    After the Romans left, c. 410 AD--in the so-called "Dark Ages"--someone was still importing shiploads of the stuff (along with Mediterranean wines, etc.) via Tintagel in Cornwall.

    Then it was back to lard, sausage, beans and chips for 1400 years.

    Mark
