The whole thing -- from the "scientific" standpoint (what you can see on an oscilloscope, for instance) -- is pretty esoteric. I well remember some of the first CDs I heard, some 30 or 40 years ago now, and they were pretty awful -- best described as harsh. Theoretically, the 44.1 kHz sampling rate should be able to reproduce anything up to 22.05 kHz, which should be ample. The problem arises with random timing noise (jitter) affecting the precise instant sampled. I'm not sure, honestly, how the more up-to-date analog-to-digital converters operate, but my suspicion is that they have a very sharp roll-off filter (12 dB per octave or a good deal steeper) for high frequencies. Digital-to-analog converters would, of necessity, need the same sort of thing. In principle, a higher sampling rate would reproduce higher frequencies, subject to the same filtering. Most high-fidelity equipment won't reproduce much above, say, 20 kHz anyway -- but some very strange intermodulation effects at much lower frequencies can happen on the way to the speaker if the higher frequencies aren't filtered out at the source.
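To make the filtering point concrete, here's a small sketch (plain Python, purely illustrative -- the tone frequencies are my own choices) of why anything above half the sampling rate has to be removed before sampling: a 25 kHz tone sampled at 44.1 kHz produces exactly the same samples as a phase-inverted 19.1 kHz tone, so once it's in the digital domain the two are indistinguishable.

```python
import math

fs = 44100            # CD sampling rate
f_hi = 25000          # a tone above the 22.05 kHz Nyquist limit
f_alias = fs - f_hi   # 19100 Hz: where the tone "folds down" to

for n in range(64):
    s_hi = math.sin(2 * math.pi * f_hi * n / fs)
    s_alias = math.sin(2 * math.pi * f_alias * n / fs)
    # sample for sample, the 25 kHz tone is identical to a
    # phase-inverted 19.1 kHz tone
    assert abs(s_hi + s_alias) < 1e-9
```

That folded-down 19.1 kHz "ghost" is exactly the kind of spurious in-band content that can then intermodulate with real program material further down the chain, which is why the filtering has to happen at the source.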
What I have heard with lossy files, however, is different -- almost always a perceived loss in low-frequency response, oddly enough, at least to my ears. It's more evident in organ music, perhaps, though also noticeable (to me) in piano music -- but the first time I noticed it was while listening to an .mp3-squashed recording of Wagner's Rheingold; the opening contrabass E flats (about 21 Hz fundamental) just weren't right at all. Honestly, I have no idea why this should be the case -- but now that I've noticed it, it's pretty consistent.
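One plausible contributing factor -- offered as a sketch, not a definitive explanation -- is that MP3's frequency resolution is quite coarse at the bottom of the spectrum. Assuming the standard MPEG-1 Layer III filterbank (32 equal-width subbands, each further split into 18 MDCT lines in a long block), the numbers work out like this:

```python
fs = 44100                        # CD sampling rate
nyquist = fs / 2                  # 22050 Hz

# MP3's polyphase filterbank splits 0..Nyquist into 32 equal subbands
subband_width = nyquist / 32      # 689.0625 Hz per subband
# each subband is further resolved into 18 MDCT lines (long block)
line_width = subband_width / 18   # about 38.3 Hz per line

fundamental = 21                  # the low E flat mentioned above, in Hz
# the fundamental is narrower than a single MDCT line, and it and its
# first several harmonics all land in the one lowest subband
assert fundamental < line_width
assert 2 * fundamental < subband_width
```

So everything distinctive about that pedal tone is squeezed into the codec's very coarsest frequency bins, where quantization decisions affect a wide swath of bass at once. Whether that's actually what I was hearing, I can't say -- but the resolution really is that coarse down there.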