Why does the discrepancy in the number of bytes in a kilobyte exist? In some places I've seen the number 1024 (2^10) while in others it's 1000 (and the difference gets increasingly large with M, G, T, etc.).
This is not a discussion about whether it should be 1024 or 1000 (though you can discuss it in the comments) but about where/when this situation originated and/or became widespread.
As far as I know, Linux and hardware manufacturers never use the 1024 variant. That, and hearsay, make me think MS-DOS made this version common, but what are the facts?
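To make the question's point concrete, here is a small sketch (not part of the original post) showing how the gap between the decimal and binary interpretations widens with each prefix step, assuming decimal units scale by 1000 and binary units by 1024 per step:

```python
# Compare decimal (1000^n) and binary (1024^n) interpretations of
# kilo/mega/giga/tera and print the percentage discrepancy between them.
for n, prefix in enumerate(["kilo", "mega", "giga", "tera"], start=1):
    decimal = 1000 ** n
    binary = 1024 ** n
    gap_pct = (binary - decimal) / decimal * 100
    print(f"{prefix}byte: decimal {decimal}, binary {binary}, gap {gap_pct:.1f}%")
```

The gap grows from about 2.4% at the kilobyte level to about 10% at the terabyte level, which is why the discrepancy only started to matter once multi-gigabyte drives became common.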
Bits and bytes are two different things: a kilobit is still 1000 bits, but a kilobyte is 1024 bytes, and a byte is 8 bits. The reason for 1024 is so that everything stays in binary: 2^10 = 1024. – Malachi – 2014-10-14T15:00:26.037
23One representation is in binary (2^10, or 1024) while the other is in decimal (10^3 or 1000)... – Trezoid – 2011-05-23T11:04:11.507
5ditto. Plus the fact that 1024 bytes is called a Kibibyte (KiB) and not a Kilobyte (kB -> 1000 bytes). – Isaac Clarke – 2011-05-23T11:29:55.907
13If you are looking for specific people to blame, point towards hard drive manufacturers. It makes their devices look like they have more capacity when it is expressed in decimal MBs, GBs, etc. They've always done this I believe but it hasn't been until fairly recently that the gap between decimal and binary has grown wide enough to matter significantly. – LawrenceC – 2011-05-23T11:52:13.573
23@ultrasawblade: You want to blame HDD manufacturers for being one of the only groups that use the term gigabyte correctly? – paradroid – 2011-05-23T11:57:33.993
13It just always seemed to me to be an understood thing that 1K=1024 with anything computer related before 20GB or so drives became commonplace. This also roughly coincides with the time when many non-technical people started using PCs on a regular basis. Vast amounts of computer literature (technical and nontechnical) from the early 90's and before doesn't mention anything about "decimal" KBs or "kibibytes". – LawrenceC – 2011-05-23T13:38:46.843
1@ultrasawblade: That's because it is a new unit, only defined in 1999. – paradroid – 2011-05-23T14:33:43.047
6Sheesh. Who really calls it a Kibibyte anyway? Have you ever heard anyone call it a Kibibyte in ordinary conversation without someone suppressing laughter? – Robert Harvey – 2011-05-23T15:27:57.920
13@paradroid: gigabyte originally meant 1024 MB (and MB = 1024 KB, etc). It was retconned to mean 1000 because HDD manufacturers insisted on using it wrong. Admittedly, this does depend on what you consider "right", since kilo does mean 1000, but within computer science, kilobyte was always 1024 for technical reasons until it was changed in 1999. – James – 2011-05-23T15:40:01.713
7@James: This whole conversation is about how using the SI unit prefixes on binary units was wrong in the first place. – paradroid – 2011-05-23T15:42:45.743
1The government! – Wipqozn – 2011-05-23T16:23:34.310
5@paradroid: byte and bit aren't SI units. Personally I think it's just better and natural to use 1KB = 1024 bytes notation exceptionally for that area. – Grzegorz Szpetkowski – 2011-05-23T16:33:08.477
5@Grzegorz Szpetkowski They're IEC 80000 units and are subject to standard prefix rules! – AndrejaKo – 2011-05-23T17:08:25.310
3Don't you mean, "Who is to be praised for K = 1024"? – Mateen Ulhaq – 2011-05-23T17:52:57.753
2Until your message, I had never seen K = 1000 in a computer context. – None – 2011-05-24T03:27:06.583
1@Ashish So you never bought a hard drive? Sure, they're more about G and T than K or M, but it's only a few zeros more... also, data transfer rates (as in 64kbit/s or 100Mbit/s) are also multiples of 1000. – Daniel Beck – 2011-05-24T08:11:02.817
2The real reason is because memory is addressed in binary. It doesn't matter if you're storing and processing bits, trits, or decimal digits in each address. It's the addressing that creates the power-of-two sizes. @Ashish: Your networking hardware is all measured in 1000s, as are CPU clock rates, memory buses, DVD sizes, etc. – endolith – 2011-06-06T21:24:09.413
1@ultrasawblade: "It just always seemed to me to be an understood thing that 1K=1024 with anything computer related" Nonsense. See http://en.wikipedia.org/wiki/Kilobyte and http://en.wikipedia.org/wiki/Timeline_of_binary_prefixes for some examples. It's been ambiguous since the beginning of time. – endolith – 2011-07-24T22:50:38.920