Data in computers is measured in specific units, from the smallest (the bit) up to very large units such as the petabyte. Understanding these units and being able to convert between them is fundamental to Computer Science.
Key points
Definition: Bit: the smallest unit of data, a single binary digit (0 or 1).
Exam Tip: The OCR specification uses DECIMAL prefixes, where each unit is 1,000 times the previous one (1 KB = 1,000 B, 1 MB = 1,000 KB, and so on). Using 1,024 is also acceptable, but 1,000 is the default.
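The decimal-prefix rule can be sketched in Python (the `to_bytes` helper and unit list are illustrative, not part of the specification):

```python
# Decimal prefixes (OCR default): each step up the list multiplies by 1,000
UNITS = ["B", "KB", "MB", "GB", "TB", "PB"]

def to_bytes(value, unit):
    """Convert a value in the given unit to bytes using x1,000 steps."""
    return value * 1000 ** UNITS.index(unit)

print(to_bytes(3, "MB"))   # 3,000,000 bytes
print(to_bytes(1, "GB"))   # 1,000,000,000 bytes
```

Because each step is a factor of 1,000, converting the other way (e.g. bytes to MB) is just a division by the same power of 1,000.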
Common Mistake: Confusing bits and bytes. There are 8 bits in 1 byte. File sizes are usually given in bytes; data transfer speeds are often given in bits per second.
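This distinction matters in download-time questions, where the file size (in bytes) must be converted to bits before dividing by the connection speed. A small worked sketch, using made-up figures of a 50 MB file and a 100 Mbps connection:

```python
BITS_PER_BYTE = 8

file_size_mb = 50    # file size in megaBYTES
speed_mbps = 100     # connection speed in megaBITS per second

# Convert the file size to bits: MB -> bytes -> bits
file_size_bits = file_size_mb * 1000 * 1000 * BITS_PER_BYTE

# Convert the speed to bits per second: Mb -> bits
speed_bits_per_s = speed_mbps * 1000 * 1000

download_time = file_size_bits / speed_bits_per_s
print(download_time)  # 4.0 (seconds)
```

Forgetting the factor of 8 here is exactly the bits/bytes mix-up described above, and it makes the answer wrong by a factor of eight.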
Exam Tip: When comparing file sizes, convert ALL of them to the SAME unit first. E.g. to compare 2.1 GB, 300 MB, 200,000 KB and 0.0021 TB, convert them all to MB or KB.
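The comparison above can be worked through by converting everything to MB with the decimal (×1,000) prefixes; the dictionary layout here is just one way to organise the working:

```python
# Convert every size to MB (decimal prefixes: x1,000 per step)
sizes_mb = {
    "2.1 GB":     2.1 * 1000,             # GB -> MB
    "300 MB":     300,                    # already MB
    "200,000 KB": 200_000 / 1000,         # KB -> MB
    "0.0021 TB":  0.0021 * 1000 * 1000,   # TB -> GB -> MB
}

# Sort from smallest to largest once they share a unit
for label, mb in sorted(sizes_mb.items(), key=lambda kv: kv[1]):
    print(f"{label:>12} = {mb:,.0f} MB")
```

Once every value is in MB, the ordering is obvious: 200,000 KB (200 MB) is the smallest, while 2.1 GB and 0.0021 TB are both 2,100 MB.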