I usually try to refrain from such discussions because my belief is that audio is 100% a matter of personal preference, and hence debating whether a beef steak tastes better than a plate of spaghetti, or whether you should put cheese on your spaghetti, is a total waste of time.
But heck, since everyone seems to be against cable upgrades, I felt too tempted not to post a reply. A matter of angst?
Anyway, having said that, let's proceed to my findings.
While I do know that, in theory, cables shouldn't make that great a difference in your electronics (I am an electronics engineering undergrad, after all), the fact remains that I can hear distinct differences between cables. And I think that is what matters in the end.
Yes, you can prove to me using electrical principles that cables shouldn't make a difference, but the fact remains that I can hear a difference.
What I have noticed, however, from auditioning hifis in salons and in other audiophiles' homes, is that different hifis have different sensitivities to cable changes.
And usually (USUALLY, I said; I'll elaborate on the exceptions later), high-resolution systems are more sensitive to cables than low-resolution ones. For example, when testing speaker cables on a really high-resolution system, the differences between the cables were like night and day. However, on a low-resolution hifi, I found myself thinking "did I just imagine that change?" when trying out the different cables.
However, like I said, there are exceptions. I have auditioned high-resolution hifis where the cables made absolutely no difference to the sound. Again, a matter of system cable sensitivity.
As for my RF3s, I found them to be somewhat sensitive to interconnect changes when using a Rotel RCD971 CD player as the source. But when I swapped the source for a modified Marantz CD63, the sound differences caused by swapping interconnects became a lot more subtle. So I suppose every component plays a role in cable sensitivity.
Oh well, like I said, it's your hobby, your ears, your hifi, your room, etc. It's up to you.