Every place I look that discusses digital audio quality seems to talk about almost nothing but bit-rate. However, it seems to me that bit-rate only indicates the number of kilobits per second, not the actual quality.
Let's say we have a 128kbps file and we convert it to 256kbps. Effectively, we've done nothing but increase the file size, as no new information has been added. But is it possible to tell that the audio came from a file with 128kbps quality and that that's all the 'quality' it contains? If I play the 2 files over high-quality speakers, will there be a difference, however unnoticeable by humans?
(I know this is an odd question as you virtually never want to rip audio to a higher bit-rate than it was originally, but I'm curious about the true relationship between audio bit-rate and actual sound quality and this seemed to be a good way to examine the difference. :)
I think the lack of certain frequencies and the presence of certain typical distortions can be attributed to a particular codec and bitrate, so the presence of these artifacts in a high-bitrate or lossless audio file can potentially reveal that it has been converted this way to fake quality. I wish there were a program that could do this automatically, so I could just scan my music collection, find such files and compress them back to save space without losing anything... – Ivan – 2016-07-25T05:18:37.783
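The detection Ivan wishes for is plausible in principle, because lossy encoders at low bitrates typically discard the top of the spectrum first (128kbps MP3 often cuts off around 16 kHz). A minimal sketch of the idea, assuming the audio has already been decoded to a mono PCM NumPy array; the function name, cutoff frequency, and the synthetic demo signals are all illustrative, not a ready-made scanner:

```python
import numpy as np

def high_band_energy_ratio(samples, sample_rate, cutoff_hz=16000.0):
    """Fraction of spectral energy above cutoff_hz.

    A near-zero ratio in a nominally high-bitrate or lossless file
    can hint that the audio was once encoded at a low bitrate,
    since lossy codecs discard high frequencies first.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    total = np.sum(spectrum ** 2)
    if total == 0:
        return 0.0
    high = np.sum(spectrum[freqs >= cutoff_hz] ** 2)
    return high / total

# Demo with synthetic signals standing in for decoded PCM:
rate = 44100
t = np.arange(rate) / rate  # one second of audio
full_band = np.sin(2 * np.pi * 1000 * t) + 0.5 * np.sin(2 * np.pi * 18000 * t)
band_limited = np.sin(2 * np.pi * 1000 * t)  # nothing above 16 kHz

print(high_band_energy_ratio(full_band, rate))     # clearly above zero
print(high_band_energy_ratio(band_limited, rate))  # essentially zero
```

A real scanner would also need to handle windowing, averaging over many frames, and genuinely dark recordings that lack high frequencies to begin with, so a low ratio is a hint, not proof of transcoding.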
When you attempt to up-convert, the file size increases because the original 128kbps stream is re-encoded at the higher rate; in fact, there can be a slight further loss in quality from the second lossy pass. Human ears vary, and some may be able to hear the difference. – Moab – 2011-07-08T03:05:35.063