Everything posted by Rivendell61

  1. http://news.bbc.co.uk/2/hi/uk_news/scotland/8368895.stm
  2. Not really..... If the CD was made by a competent recording engineer the noise floor will be random noise, aka 'white' (roughly 93 dB below full scale): it will sound like faint tape hiss. Signals far below it can be encoded and heard. If the noise floor sounds bad.....it ain't the CD's fault: blame the incompetent recording engineer (they are out there!). Mark
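A back-of-envelope check of that ~93 dB figure (a minimal Python sketch; the ~4.8 dB TPDF-dither penalty is my own addition, not from the post):

```python
# Theoretical dynamic range of an ideal N-bit quantizer with a full-scale sine,
# and the ~4.8 dB penalty when flat TPDF dither is added.
def ideal_dynamic_range_db(bits: int) -> float:
    return 6.02 * bits + 1.76

print(ideal_dynamic_range_db(16))        # ~98.1 dB, undithered
print(ideal_dynamic_range_db(16) - 4.8)  # ~93.3 dB with TPDF dither: the 'c.93dB' above
```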
  3. The MR-1's slightly bigger brother, the MR-1000, has been voluminously discussed in pro recording forums since hitting US shores c. April 2007. Mainly quite favorably. There was a question in the other recent Korg thread about burning an SACD from the MR-1 DSD file: you cannot--you must create a Scarlet Book image and have it pressed, which really isn't feasible. But.....see here: http://www.ps3sacd.com/dsddiscguide.html here: http://taperssection.com/index.php/topic,89333.0.html and here: http://www.ps3sacd.com/downloads/DSDDiscFormatSpecs.pdf I'm not too clear on playback options for DSD Discs but I think the newer Sony VAIOs can do it--and PS3s (only at the 2.8 MHz rate, not 5.6 MHz); there may be other ways.....one of those links above seems to have a plug-in for playing DSD Discs on Windows Media Player. The included AudioGate software does--from reports--seem to contain a fairly good decimator, so going straight to PCM may be possible without much loss. Mark
  4. In 1958 Bill Bell (owner of the Wellesley Music Box) and Don Davis (then--I believe--V.P. of Klipsch) took a pair of Klipschorns and Marantz amps to the World's Fair in Brussels to demonstrate.....Stereo. Bell knew PWK. So the basic story of PWK helping him with an install in the 60's does not seem too unlikely--may have been visiting...... Mark
  5. Islander, There really is no 'low level' information missing (if it is in the original signal arriving from the microphones). A correctly made 16-bit recording will encode and retrieve signal at -120 to -130 dBFS....or lower. There is no theoretical lower limit--just the practical limits of microphones, mic pres, converter quality, rooms, etc. And those very smallest digitally recorded signals will have.....no distortion. (FYI: the average threshold of audibility in a good home listening environment is c. 3 dB SPL....maybe a bit below zero in some cases.) If you are hearing otherwise, do not blame the medium. Mark
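To illustrate the 'below the noise floor' point, a minimal sketch (my own numbers, not from the post; requires numpy): a tone at -110 dBFS survives dithered 16-bit quantization and shows up clearly in a long FFT.

```python
import numpy as np

fs, n = 44100, 1 << 18
k = round(997 * n / fs)                                    # put the tone exactly on an FFT bin
f0 = k * fs / n                                            # ~997 Hz
t = np.arange(n) / fs
tone = 10 ** (-110 / 20) * np.sin(2 * np.pi * f0 * t)      # -110 dBFS

lsb = 2 / (2 ** 16)                                        # 16-bit step for a +/-1.0 range
dither = (np.random.rand(n) - np.random.rand(n)) * lsb     # TPDF dither, +/-1 LSB
quantized = np.round((tone + dither) / lsb) * lsb          # dithered 16-bit samples

spec = np.abs(np.fft.rfft(quantized * np.hanning(n))) / (n / 4)
print(20 * np.log10(spec[k]))    # ~-110 dBFS: the tone is still there, riding on benign noise
```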
  6. Yes, what people hear--or think they hear--can be misleading. Actual LP dynamic range is not bad in the 'middle' frequencies but....not equal to CD. LPs can give the impression of greater dynamic range than they actually have. As J.D. Johnston points out in his loudness vs. intensity slides: "LP distortion grows with level. That means that as level grows, the signal bandwidth (including the distortion) increases. This means that an increase in intensity is over-represented by the increase in loudness. This can create an illusion of 'more dynamic range'." Mark
  7. Dave, This is going all around the barn -- My posts made a few basic points: Sampling rates have nothing to do with 'accuracy' or 'resolution'; sampling faster will not create a more accurate signal. There is an 'optimal' rate for sound quality of around 60kHz. The output of a converter is analog and has no steps and no missing 'detail'. Redbook CD has nanosecond timing accuracy; the limits of Redbook are the limits of filters. If a converter sounds better at faster rates, it indicates some converter quality 'issues'. Very fast rates--e.g., past c. 96kHz--are damaging (they add distortion) to the audio signal. Nothing new or really controversial: all of it is page-and-quote from a good digital audio textbook, and broadly understood and accepted even in the 'field', particularly in high-end 'classical' recording circles where signal transparency tends to be an obsession. Katz is quite well respected by both the audiophile and engineering folks. He did that very elegant experiment (1996) with many good ears and it is something of a landmark in that it made the information more widely available and understandable. It is simply a good example of one point. Mark
  8. Dave, No one is suggesting that "less accurate is better". Sampling rate is not connected to accuracy. A faster rate is not more accurate--there is no accuracy problem--it is imaginary. Changing the sampling rate really only changes the bandwidth. No stair steps. That Dr Diamonds was misled. The output of an AD/DA conversion cycle is.....analog. All the original signal detail--fully curvaceous, no steps--within the chosen bandwidth is output totally intact. Even at poor old 44.1--no lost resolution. There may be other issues (those filters) but no 'resolution' problem. The sampled signal even reconstructs what happens between the samples (see the sketch below)..... It is non-intuitive, which is why so much misinformation about digital has stayed in circulation all these years. Re better converters.....they are excessively expensive. Rough estimate: Johnson runs about $65,000 worth for 8 channels. Yours seems to be fine. It just isn't perfect. There is always something better. Mark
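A minimal sketch of the 'between the samples' point (my own illustration, not from the post; requires numpy and scipy): band-limited interpolation of 44.1 kHz samples reproduces values that were never stored.

```python
import numpy as np
from scipy.signal import resample

fs, n, f0 = 44100, 4410, 5000            # 0.1 s buffer holding 500 whole cycles of a 5 kHz tone
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)           # the 44.1 kHz samples: all we keep

up = 8
y = resample(x, n * up)                  # band-limited (FFT) interpolation to 352.8 kHz
t_fine = np.arange(n * up) / (fs * up)
truth = np.sin(2 * np.pi * f0 * t_fine)  # what the continuous waveform really does

print(np.max(np.abs(y - truth)))         # ~1e-12 or less: the in-between values were all encoded
```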
  9. Dave, not at all questioning that you hear what you hear. My point is that what you are hearing--in this instance--is a converter misbehaving. The faster-sampling=better notion has gone through a transformation over the past several years. Very high sampling rates--and attributions to ultrasonics--have died a well-deserved death in high-end recording. More information, better understanding, etc. A few points: 1) Try the same comparisons with a high-end converter. 2) Read an excellent short paper by Bob Katz (not an EE but an 'audiophile' recording/mastering engineer) called the Ultimate Listening Test. He thought 96kHz sounded much better--always liked high bandwidth in analog gear, etc. Exceptionally well designed audio test, high-end gear, many testers around the world. Used EEs to write digital filters, etc. Recorded orchestra at a 96kHz sample rate, created secondary files filtered to a 20kHz bandwidth. What they found was that the FILTER quality made the file difference audible or inaudible. Repeated with various sources including castanets (high ultrasonic content). Same result in all cases. It is the quality of the FILTER, not the sample rate. 3) It can't be said too often--if you hear a difference at different sample rates.....you need better filters. 4) There is an interesting near consensus among high-end converter designers that about 60kHz is the ideal sampling rate for best sonics. Not faster, faster, faster..... And even slower CAN be just as good--the filters are just harder to write. 5) Fast rates (beyond say 60/88/96) are actually worse: they damage the signal, and you do not need them to get a transparent result. If you seem to....again....look at your converter. All chips universally perform worse when operating at 192kHz. From listener reports you seem to get good results. I am only objecting to the assumptions about causation made from use of your particular converter--or extrapolating to theory from it. There is a reason better converters are made. Mark
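A rough sketch of that kind of comparison (not Katz's actual filter chain; the file name and filter length are placeholders; requires scipy and soundfile): band-limit a 96 kHz recording to roughly 20 kHz and compare it by ear with the original.

```python
import soundfile as sf
from scipy.signal import firwin, filtfilt

x, fs = sf.read("orchestra_96k.wav")          # hypothetical 96 kHz source file
assert fs == 96000

taps = firwin(2047, 20000, fs=fs)             # long linear-phase FIR low-pass at ~20 kHz
y = filtfilt(taps, [1.0], x, axis=0)          # zero-phase filtering of all channels

sf.write("orchestra_20k_bandlimited.wav", y, fs)   # same 96 kHz rate, ~20 kHz bandwidth
```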
  10. Hmmm, as I recollect the AES article does not add much constructive to the CD vs DSD discussion. Not a very well designed ABX test if it was trying to claim inaudibility....but maybe it was not. The prime article Don links has some misinformation: High sampling rates do NOT provide more accurate timing resolution. Redbook timing is in the nanosecond range--timing resolution is NOT constrained by sample rate. Maybe I'm nitpicking but that incorrect contention breeds constant false implications..... Mallette, Am I misinterpreting you?...or are you falling into the 'higher sample rates capture more detail' fallacy? Rates have nothing to do with what is usually implied by 'resolution'. A good converter should NOT sound better at very high rates. Quite the opposite in fact. If your CardDeluxe seems to sound better--it may be its way of suggesting that you one day invest in a better converter :-) Mark
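A minimal sketch of the timing point (my own numbers, not from the post; requires numpy): a delay far smaller than the 22.7 microsecond sample period still changes the 44.1 kHz sample values by an amount well above a dithered 16-bit noise floor, so nanosecond-scale timing is carried by Redbook-rate data.

```python
import numpy as np

fs = 44100
t = np.arange(4410) / fs
delay = 10e-9                               # 10 nanoseconds, ~1/2268 of a sample period

a = np.sin(2 * np.pi * 5000 * t)            # a 5 kHz tone
b = np.sin(2 * np.pi * 5000 * (t - delay))  # the same tone arriving 10 ns later

diff = np.max(np.abs(a - b))                # peak change in sample values caused by the shift
print(20 * np.log10(diff))                  # ~-70 dBFS, well above a ~-93 dB dithered floor
```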
  11. Raw slew rate by itself--high or low--is not reliably useful re performance/sonic info. An amplifier with a high slew rate may have elevated slew-rate distortion at lower levels vs. a slower circuit. What we are really trying to discover is: how linear is a particular amp at the fastest slewing frequency? A twin-tone IMD test (CCIF/ITU-R, 18.5 kHz and 19.5 kHz) is an appropriate diagnostic tool. If that result is good.....the amplifier's slew rate is fine. Mark
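For illustration, a minimal sketch of the twin-tone idea (the nonlinearity is a made-up stand-in for an amplifier, not a real measurement; requires numpy): generate the CCIF pair and read the 1 kHz difference product off an FFT.

```python
import numpy as np

fs, n = 192000, 96000                         # 0.5 s at 192 kHz, so every test tone sits on a bin
t = np.arange(n) / fs
x = 0.5 * np.sin(2 * np.pi * 18500 * t) + 0.5 * np.sin(2 * np.pi * 19500 * t)

y = x + 1e-3 * x ** 2 + 5e-4 * x ** 3         # hypothetical weakly nonlinear 'amplifier'

spec = np.abs(np.fft.rfft(y * np.hanning(n))) / (n / 4)
db = lambda f: 20 * np.log10(spec[int(round(f * n / fs))] + 1e-20)
print("tone at 19.5 kHz:        %6.1f dB" % db(19500))
print("2nd-order IMD at 1 kHz:  %6.1f dB" % db(1000))     # the classic CCIF difference tone
print("3rd-order IMD at 17.5k:  %6.1f dB" % db(17500))
```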
  12. InnerTuber: That Benchmark DAC1 USB would be a great choice (and outboard conversion tends to be preferred). It will stream 96kHz/24 bit via USB, *bit-perfect on native drivers*, no ASIO needed (the only device that can do it). Hard to beat the signal quality. ASIO has been suggested earlier but....it does not guarantee bit-perfect output. Benchmark tested several ASIO USB devices and found NONE of them were bit-perfect (which does not mean none exist). Consumer/pro-sumer digital stuff (a la 'M-Audio') does not always perform to claim/spec. If you 'need' a high quality signal--for whatever reason--be cautious if looking there. Unfortunately pro stuff tends to be expensive--so priorities must be considered. Likewise 'player' software such as iTunes (an evolving minefield; in v.7 of iTunes the SRC and volume control are better, CoreAudio still poor). VLC is good (OS X friendly), Foobar very good (but not for OS X). Mark
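A minimal sketch of how 'bit-perfect' can be checked in principle (file names are placeholders, and a real test would first correlate the two streams to find the capture offset; requires numpy and soundfile): a chain is bit-perfect only if a digital loopback capture matches the source sample for sample.

```python
import numpy as np
import soundfile as sf

src, fs_src = sf.read("source_96k_24bit.wav", dtype="int32")   # the file being played
cap, fs_cap = sf.read("loopback_capture.wav", dtype="int32")   # digital capture of the output

n = min(len(src), len(cap))         # captures usually carry extra leading/trailing silence
ok = (fs_src == fs_cap) and np.array_equal(src[:n], cap[:n])
print("bit-perfect over the compared span:", ok)
```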
  13. There is a set of two CDs by Janet See (flute) and Davitt Moroney (harpsichord): 'Bach--Flute Sonatas'. Very good, sold cheap on a sub-label of Harmonia Mundi called 'Classical Express'. For Scarlatti the Pierre Hantai recordings on Mirare are a good choice. They seem to be more available at Presto Classical (UK) than places I see in the US. Mark
  14. Something completely different-- Not termed a 'pre-amp' but....it is--and it would make a superb home pre. If you are into maximal SQ and have moderately deep pockets....hard to beat. By Chris Muth (respected designer of high-end custom pro gear)--used in many mastering studios (Sterling), etc.... Will take a digital signal too (built-in Troisi converter). http://www.dangerousmusic.com/monitor.html Another maker (nice remote volume): http://www.cranesong.com/AVOCET.html OTOH-- If remote/multi-channel is essential: http://www.dangerousmusic.com/stsr.html Mark
  15. Consumer and 'pro-sumer' gear sometimes makes claims that are not entirely true--such gear usually presents an output signal at most common input sample rates, but that output is not always bit-perfect--or output at the original rate. For example, the Oppo 970HD reduces all bit widths to 16 (and then the question arises: having added truncation distortion....does it dither correctly? Not all DVD players do! Goodbye high-res....). That Oppo also converts all input sample rates to 44.1 or 48 kHz. None of which says it may not have an appealing sound. Maybe the newer Oppos have changed this behavior. Per Mike, for many years now converters have been WAY oversampling (MHz, even GHz) at the input--mainly to aid in skating past filter issues. Very few converters (maybe 2-3) use 'correct' filters...... Mark
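A minimal sketch of the truncation-distortion point (my own numbers, not an Oppo measurement; requires numpy): requantize a low-level tone to 16 bits with and without TPDF dither and compare the worst spur away from the fundamental.

```python
import numpy as np

fs, n = 44100, 1 << 16
t = np.arange(n) / fs
tone = 10 ** (-80 / 20) * np.sin(2 * np.pi * 997 * t)     # a -80 dBFS tone, ~3 LSB peak at 16 bits
lsb = 2 / (2 ** 16)

bare = np.round(tone / lsb) * lsb                          # requantized with no dither
tpdf = (np.random.rand(n) - np.random.rand(n)) * lsb       # +/-1 LSB triangular dither
dithered = np.round((tone + tpdf) / lsb) * lsb

def spectrum_db(x):
    s = np.abs(np.fft.rfft(x * np.hanning(n))) / (n / 4)
    return 20 * np.log10(s + 1e-20)

hi_bins = slice(round(2000 * n / fs), n // 2)              # look above 2 kHz, away from the tone
print("worst spur, no dither: %.1f dB" % spectrum_db(bare)[hi_bins].max())      # signal-correlated harmonics
print("worst spur, dithered:  %.1f dB" % spectrum_db(dithered)[hi_bins].max())  # just a benign noise floor
```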
  16. Srobak-- I don't want to be too hard on you but....yikes. You are misapplying your experience to audio. What I described above re jitter is basic: Digital Audio 101. Maybe it will help if I repeat this: 'digital' must be viewed (from an engineering perspective) as at least semi-analog when it goes to an analog device (and uses the embedded clock), as in most consumer use. Srobak says: "It either works, or it doesn't. The slow data rates and short cable lengths do not warrant any worthwhile measurements of jitter on a digital level, and certainly not on an audible one" Here....since you deny the fundamental point--that jitter modulates the signal, adding distortion--read this: "Jitter modulates the audio signal. The result is that phase modulation sidebands are created above and below every tone in the audio signal. Worse yet, these sidebands are often widely separated from the audio tone they are modulating. Consequently, the distortion caused by jitter is far more audible than THD+N measurements would suggest. Phase modulation sidebands are not musical in nature (they are not harmonically related to the audio), and they are poorly masked (because of the wide separation from the fundamental). In short jitter induced distortion closely resembles intermodulation distortion (IMD) and is much more audible than harmonic distortion (THD). Jitter creates new “audio” that is not harmonically related to the original audio signal. This “audio” is unexpected and unwanted. It will cause a loss of imaging in the stereo signal and when severe, will create what some have called muddiness (a lot of extra low and mid frequency sound that was not in the original). Jitter induced sidebands can be measured very easily using an FFT analyzer." --John Siau, Director of Engineering at Benchmark (formerly of CBS Labs--one of the better digital engineers out there)..... Mark
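A minimal simulation of that mechanism (my own numbers; a sinusoidally wobbling clock is a textbook illustration, not a measurement of any real interface; requires numpy): sample a 10 kHz tone with a clock jittering 5 ns peak at 3 kHz and sidebands appear at 7 kHz and 13 kHz.

```python
import numpy as np

fs, n = 96000, 96000                                     # 1 second, so every tone lands on a bin
k = np.arange(n)
jitter = 5e-9 * np.sin(2 * np.pi * 3000 * k / fs)        # 5 ns peak sinusoidal clock jitter
x = np.sin(2 * np.pi * 10000 * (k / fs + jitter))        # what the jittered clock actually samples

spec = np.abs(np.fft.rfft(x * np.hanning(n))) / (n / 4)
db = lambda f: 20 * np.log10(spec[int(round(f * n / fs))] + 1e-20)
print("tone at 10 kHz:     %6.1f dB" % db(10000))        # ~0 dB
print("sideband at 7 kHz:  %6.1f dB" % db(7000))         # ~-76 dB = 20*log10(pi * 10e3 * 5e-9)
print("sideband at 13 kHz: %6.1f dB" % db(13000))        # non-harmonic, IMD-like products
```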
  17. That is wrong. The audible result of jitter is NOT clicks. Jitter modulates the signal. It creates distortion. Only extreme levels (beyond anything we are talking about) might cause the converter to break lock/click. My example above of 15ns of jitter (from an optical cable) would create sidebands at c. -70dB in relation to a 10kHz test tone. Regarding the attenuation capabilities--if any--of a particular converter....you will almost never find that specified (it is almost never tested)--and if it is, it will be in fairly meaningless numbers (for good reason)--except in expensive pro boxes. Finding an 'ADC' (I assume that is a typo)--is this an HT box?--which claims 'rejection above 100ps' (a meaningless number when unqualified) is not the issue. Assumptions of competency are often wrong when dealing with digital, and jitter attenuation is the kind of info you will only know for most converters by measuring yourself. Remember.....this is about "does it (cable-induced distortion in 'digital') exist" in this scenario--see my first post--and the answer is unequivocally 'yes'. How important it is in any particular usage is much more complex--but not trivial. I will put together and post some refs for you later today/tonight....if I get time! Mark
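A back-of-envelope check of that figure (assuming 15 ns is sinusoidal jitter and using the small-angle phase-modulation sideband, beta/2; my arithmetic, not from the post):

```python
import math

f_tone, tau = 10e3, 15e-9                       # 10 kHz test tone, 15 ns jitter amplitude
sideband_db = 20 * math.log10(math.pi * f_tone * tau)
print(sideband_db)   # ~-66.5 dB if 15 ns is the peak; ~-72.5 dB if peak-to-peak: c. -70 dB either way
```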
  18. [Quoting the previous poster:] "Do you have a source for that claim? Modern CODECs have multiple clocks and even detect the edges of the signal itself. They have their own little buffer inside where I'm sure one could verify that the signal is always bit-for-bit identical to the input. If I was feeling more energetic, I could run any DAC through my lab bench to verify accuracy rates...but it's really a lost cause. ANY errors are going to be drastically apparent as clicks and pops. You can't get partial degradation of a digital signal. I'm certainly not an expert on the subject, but I'm doing two design projects right now that involve direct hardware and software interfacing with codecs, DACs and ADCs (from both TI and Analog Devices). Digital accuracy certainly is not an issue. In fact, if one were to claim that there is data loss, then one would need to explain why the I2S or SPI standards have no problems perfectly communicating digital information in just about every digital device on the market...it's just serial data. Heck, how long has RS-232 been around? The D/A's being used in 'HT products' are identical to what you're going to find in most any overpriced exotic product. If there is any difference, it is probably going to be in the analog output section of the unit - I've been told that usually the biggest difference is the cheap DC blocking electrolytic caps in series with the output, but I suppose crappy opamps could be used too. IF really, really cheap D/A's are being used, then the audible result is most likely going to be in the noise floor (usually due to crappy dithering and anti-aliasing filters). Nevertheless, D/A's are a totally different issue than digital input buffers... In fact, I would be kinda impressed to see any digital transport showing a loss of accuracy. All you gotta do is send a known digital signal, store the output of the receiving end into a buffer, and then compare the results. I could probably write a script that automatically compares the difference just to see how long it takes until a single error pops up...then I can store the sample and listen to the audible result to see how much it matters. All I gotta say is that if accuracy were a concern, then engineers would have already come out with a solution that is 100% perfect. The fact of the matter is that the issues involving serial data transfer were solved in the 60's... It's quite possible that I could be wrong about all of this, but I would be much more interested in facts or demonstrations that I could repeat on my own. I think some of my profs would be interested too (one in particular who worked extensively on digital standards)."
Nothing in my post is unusual or controversial--just basic digital practice, principles, and stuff. Do you have a digital audio textbook?--it will be in there. OK. I'll try to be helpful :-) Here is the 'principle'--slightly expanded. Two scenarios: 1) Send a signal from a digital source to a 'digital' destination (something which only needs numbers: a recorder of some sort, say a hard drive) via optical cable, and.....no problem! 100% perfect capture! No worries. But we are not doing that. 2) Send a signal from a digital source, via optical cable, to an *analog* device (a D/A converter), and use the embedded clock (also analog)....then there is a potential problem: you have to deal with *jitter*. So, how much jitter is there? And does it matter? Optical cable can cause significant jitter--generally much more than a coax connection. Basic modeling of a short section of optical cable: 3m will cause c. 15 nanoseconds of jitter. Bend the cable a bit and jitter increases! And yes, 15ns at the converter is within the worrisome range for audibility. That was the important bit re this thread. Then you can move on to the rest of the complexities.....e.g., how good is your box's converter at rejecting jitter? At 5kHz? At 10kHz? 20kHz? etc...... Hopefully the answer to the second half of your question is now apparent as well. Mark
  19. You are sending a signal to a converter, and presumably using the embedded clock, in which case an optical cable--even in short 'home use' lengths--certainly may degrade a 'digital' signal enough to be audible. As with most things digital this subject rapidly becomes complex but the notion of perfect digital is frequently forfeited in the hardware/software reality--unless *lots* of care is taken. The scale of this particular problem will ultimately depend on the quality of the D/A converter. My very minimal exposure to HT products (brief look at a high-endish $4k receiver) did not look encouraging re D/A quality--which is not a surprise. Should anyone worry about this for watching movies? I doubt it. Mark
  20. This Epson (V750-M Pro) is reputed to rival the Nikon Coolscans for film--costs less--and does the flatbed stuff too: http://www.epson.com/cgi-bin/Store/consumer/consDetail.jsp?cookies=no&oid=63056500&com.broadvision.session.new=Yes Mark
  21. It's from the Turn Me Up! web site. If you don't like it go here: http://www.turnmeup.org/ And join up-- Mark
  22. A new initiative for those concerned re the 'loudness war': http://www.turnmeup.org/ Mark
  23. The New Yorker article 'Slippery Business: The trade in adulterated olive oil' discusses what amounts to worldwide olive oil fraud! Not exactly like French control over wine-making, AFAIK. Good link Larry - nice to see the Romans had better control over quality than the Italians do today.... Re those Romans....a few years prior to the yuppie introduction of drizzled olive oil and granite-topped kitchen counters to Slough and the Edinburgh suburbs, the Romans were importing and enjoying the extra virgin at leafy places like Chichester, Cirencester, and even northern Haltwhistle. After the Romans left, c. 410 AD--in the so-called "Dark Ages"--someone was still importing shiploads of the stuff (along with Mediterranean wines, etc.) via Tintagel in Cornwall. Then it was back to lard, sausage, beans and chips for 1400 years. Mark
  24. I use an SI (battery powered) the same way: hooked to a pair of Allisons where I often work late at night. At those low volumes they sound great. Mark