I’m tired of people thinking that a gigabyte is 1024 megabytes because Microsoft says that in file explorer. A gigabyte is 1000 megabytes and a gibibyte is 1024 mebibytes.
I’m tired of people thinking this is common parlance because of some Microsoft File Explorer choice and not because that is how it was taught in Comp Sci classes all over the world for decades, even before Microsoft. I hate Microsoft as much as the next guy, but this really isn’t anything to do with them… The new IEC binary prefixes (kibi et al.) were standardized around 1999, and college CS classes were still teaching without them at all up to at least 5 years later when I attended. And the tech boom of the early 2000s was rife with “well actually, you should know” misinformation spread by news and media outlets, and “a kilobyte is actually 1024 bytes” was part of that. Then you get wrapped up in the awful, awful corporate spin from ISPs, RAM makers, and storage drive companies that will happily exploit consumer confusion to their advantage every fucking time, trying to hide behind “bits vs bytes” confusion…
I feel like the blame has shifted over to the companies that still use the binary convention nowadays. The SI standard has been around for over 25 years and yet they still use it. As a Gen Z-er myself, pretty much all the people I know who still use the binary convention do so because they saw it in their file software (Microsoft File Explorer).
Not trying to shift all the blame off the SI standard itself here, either. If the standard had instead defined a KiB as 1000 bytes and a KB as 1024 bytes, there would be no need to argue about whether “KB” means the binary or the decimal unit.
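To make the disagreement above concrete, here’s a minimal sketch (the 500 GB figure is just an illustrative example, not from the thread) of why a drive sold as “500 GB” in SI units shows up as roughly 465 “GB” in a tool that divides by 1024³ — i.e. one that computes gibibytes but labels them gigabytes:

```python
# SI (decimal) vs IEC (binary) prefixes for the same number of bytes.
SI_GB = 1000**3    # 1 gigabyte  = 1,000,000,000 bytes
BIN_GIB = 1024**3  # 1 gibibyte  = 1,073,741,824 bytes

drive_bytes = 500 * SI_GB       # what the manufacturer means by "500 GB"
shown = drive_bytes / BIN_GIB   # what a binary-prefix tool displays

print(f"{shown:.2f}")  # ~465.66 -- same bytes, different prefix
```

Neither number is “wrong”; they’re the same quantity of bytes expressed in two different units, which is exactly why the labeling matters.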
RAM manufacturers disagree. I’ve seen many sticks of RAM in my day and none of them have ever said “gibibyte” or “GiB” on them.
I will die on the hill that binary prefixes are stupid and unnecessary.
A byte is 8 bits.