Common Questions – What does 100 MB mean?

It’s one of those things that everyone knows and everyone asks: how big is 100 MB? The question is a little circular, since the answer (100 MB) is already in the question, so the better question is: what does 100 MB mean? To start with, a byte is a basic unit of measurement in computer science, and by convention a byte consists of 8 bits. A bit is the fundamental unit of measurement in computing and can take only one of two values – 0 or 1. Physically, it represents one of two stable states, such as the two positions of an electrical switch or two distinct voltage/current levels allowed by a circuit; you can also think of 1 as “on” and 0 as “off”.
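
As a quick illustration (a minimal Python sketch), a single bit has 2 possible states, so a group of 8 bits can take on 2^8 = 256 distinct patterns:

```python
# A bit holds one of two values, so n bits can represent 2**n distinct patterns.
BIT_VALUES = (0, 1)

print(len(BIT_VALUES))   # 2 states for a single bit
print(2 ** 8)            # 256 distinct patterns in one 8-bit byte
```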

So think of a byte as a collection of bits where the resulting pattern, such as “01000001”, represents something we’ve agreed to call the letter “A” in ASCII (a table of binary patterns that correspond to characters). As more bits come together, more complex and larger data can be represented. Additionally, more storage (such as a 100 gigabyte or 1 terabyte hard drive) is needed to hold that data.
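
Here is a small Python sketch showing that the bit pattern “01000001” is indeed the ASCII code for the letter “A”:

```python
pattern = "01000001"              # the 8-bit pattern from above
code = int(pattern, 2)            # interpret the bits as a binary number
print(code)                       # 65 -- the ASCII code for "A"
print(chr(code))                  # "A"
print(format(ord("A"), "08b"))    # "01000001" -- and back the other way
```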

So, for larger measurements of bytes, each step up multiplies by 1024 (a power of two), not by 1000. One kilobyte (KB) equals 1024 (2^10) bytes. One megabyte (MB) equals 1024 KB or 1,048,576 (2^20) bytes. One gigabyte (GB) equals 1024 MB or 1,073,741,824 (2^30) bytes. One terabyte (TB) equals 1024 GB or 1,099,511,627,776 (2^40) bytes.
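
Using those binary multiples, a short Python sketch answers the title question directly, showing how many bytes 100 MB works out to:

```python
KB = 1024          # 2**10 bytes
MB = 1024 * KB     # 2**20 = 1,048,576 bytes
GB = 1024 * MB     # 2**30 = 1,073,741,824 bytes
TB = 1024 * GB     # 2**40 = 1,099,511,627,776 bytes

print(100 * MB)    # 104,857,600 -- the number of bytes in 100 MB
```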

Bits, on the other hand, scale in decimal, not binary. A kilobit (Kbit) equals 1000 (10^3) bits. A megabit (Mbit) equals 1000 Kbits or 1,000,000 (10^6) bits. A gigabit (Gbit) equals 1000 Mbits or 1,000,000,000 (10^9) bits. A terabit (Tbit) equals 1000 Gbits or 1,000,000,000,000 (10^12) bits.
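
The same idea as a small Python sketch, this time with the decimal bit units, plus a line tying the two systems together by expressing 100 MB in bits (using 8 bits per byte and the binary megabyte from above):

```python
Kbit = 1000            # 10**3 bits
Mbit = 1000 * Kbit     # 10**6 bits
Gbit = 1000 * Mbit     # 10**9 bits
Tbit = 1000 * Gbit     # 10**12 bits

# 100 MB in bits: 100 * 1,048,576 bytes * 8 bits per byte
print(100 * 1024 * 1024 * 8)   # 838,860,800 bits in 100 MB
```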
