It’s not just the cutoff in frequency; it also has to do with the digital sampling rate. The higher the frequency of the waveform, the less accurately it will be sampled. With the CD sample rate, you start to lose some of the quality of the analog signal’s reproduction long before 20 kHz.
It also depends on the frequency response and phase shifting of the analog hardware along the way to your ear.
Flat wrong. No, really, that is absolutely wrong. Go read up on the Nyquist–Shannon sampling theorem, to wit: every frequency below X Hz can be perfectly reconstructed from samples taken at 2X Hz. CDs sample at 44.1 kHz. Human hearing maxes out around 20 kHz (most adults are lower than that). Filter out everything over 20 kHz, sample at 44.1 kHz, and the digital representation of that sound is complete (any subsequent limitations are due to analog microphones & speakers).
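A minimal sketch of the point being made, not anyone's actual code: it samples a tone below the Nyquist limit at the CD rate and rebuilds the "analog" waveform with Whittaker–Shannon (sinc) interpolation. The 18 kHz test tone, the 0.01 s duration, and the variable names are illustrative choices; only numpy is assumed.

```python
import numpy as np

FS = 44_100          # CD sample rate, Hz
F_TONE = 18_000      # test tone, Hz (below the 22.05 kHz Nyquist limit)
DURATION = 0.01      # seconds of signal to examine

# "Analog" reference: the tone evaluated on a very fine time grid.
t_fine = np.arange(0, DURATION, 1 / (FS * 16))
analog = np.sin(2 * np.pi * F_TONE * t_fine)

# Digital samples taken at the CD rate.
n = np.arange(0, DURATION, 1 / FS)
samples = np.sin(2 * np.pi * F_TONE * n)

# Whittaker-Shannon reconstruction: one shifted sinc pulse per sample,
# summed and evaluated on the fine grid.
sinc_matrix = np.sinc((t_fine[:, None] - n[None, :]) * FS)
reconstructed = sinc_matrix @ samples

# Away from the edges (where the sinc sum is truncated), the
# reconstruction tracks the "analog" tone closely.
interior = (t_fine > 0.002) & (t_fine < 0.008)
print("max interior error:", np.max(np.abs(reconstructed[interior] - analog[interior])))
```

Running it shows the reconstruction error in the interior is small and, crucially, does not grow as the tone approaches 20 kHz, which is the substance of the Nyquist argument above; any error you do see comes from truncating the sinc sum at the clip boundaries, not from the sampling itself.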