
What is Endianness?

Understanding how computers store data.

15 January 2026

The basics: What is endianness?

Endianness refers to how computers store multi-byte data — like integers — in memory. Take the hexadecimal number 0x12345678. It’s made up of four bytes, but the question is: which of those bytes gets stored first — that is, at the lowest memory address?

  • Big-endian systems are the posh, logical ones. They store the most significant byte first — so the bytes go in this order: 0x12, then 0x34, 0x56, and finally 0x78. This matches how humans naturally read numbers, from left to right.
  • Little-endian systems, on the other hand, store the least significant byte first — so it’s 0x78, then 0x56, 0x34, and 0x12. This upside-down ordering was popularised by Intel processors and dominates in personal computers today.
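A quick Python sketch (not from the original post) makes the two orderings concrete — `int.to_bytes` lets you pick either convention, and `sys.byteorder` reports which one your own machine uses:

```python
import sys

value = 0x12345678

# Big-endian: most significant byte first
big = value.to_bytes(4, byteorder="big")
print(big.hex(" "))     # 12 34 56 78

# Little-endian: least significant byte first
little = value.to_bytes(4, byteorder="little")
print(little.hex(" "))  # 78 56 34 12

# Which convention does this machine use? ("little" on most PCs)
print(sys.byteorder)
```

The number itself never changes — only the order its bytes sit in memory.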

Why do both big-endian and little-endian exist?

The split goes back to the early days of computing, when different CPU designers simply chose different conventions — IBM’s mainframes went big-endian, while Intel’s microprocessors went little-endian. The term “endianness” itself was coined by computer scientist Danny Cohen in 1980, borrowing from the infamous “egg war” in Jonathan Swift’s Gulliver’s Travels, where two factions fought over which end of a boiled egg to crack: the big end or the little end. Proof that some computing disputes are centuries old!

  • Big-endian is the standard for networking protocols — it’s literally called “network byte order” — so data on the wire always has a predictable layout, whatever machines sit at either end.
  • Little-endian is widespread in everyday computing, especially on Intel and compatible processors.
  • Some processors, like ARM, can switch between both — like a chameleon adapting its colours.
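Python’s standard-library `struct` module shows how “network byte order” works in practice: the `>` prefix forces big-endian and `<` forces little-endian, regardless of the machine running the code. (A small illustrative sketch, not from the original post.)

```python
import struct

port = 8080  # 0x1F90 in hex

# ">" = big-endian (network byte order), "<" = little-endian, "H" = 16-bit unsigned
network_order = struct.pack(">H", port)
little_order = struct.pack("<H", port)

print(network_order.hex())  # 1f90 — most significant byte first
print(little_order.hex())   # 901f — least significant byte first
```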

Is one better than the other?

Not really. Endianness is just a convention — similar to whether we drive on the left or right side of the road. The trouble only comes when data is shared between systems using different conventions: the bytes arrive intact, but decoding them with the wrong order produces garbled values and hard-to-find bugs.
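Here’s a minimal Python sketch of that mix-up (an illustration, not from the original post): the same four bytes decoded with the wrong convention give a completely different number.

```python
# Four bytes written by a little-endian machine...
data = (0x12345678).to_bytes(4, "little")

# ...then decoded at the other end with the wrong convention:
wrong = int.from_bytes(data, "big")
right = int.from_bytes(data, "little")

print(hex(wrong))   # 0x78563412 — not the number that was sent!
print(hex(right))   # 0x12345678
```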

So next time your code glitches because bytes are in the wrong order, blame it on the great egg war of computing history!

Want to dive deeper into how computers work behind the scenes?

Watch the full video to explore endianness and more fascinating computer science concepts. 

For more Lesson Hacker videos, check out the Craig’n’Dave YouTube playlist HERE.

Be sure to visit our website for more insights into the world of technology and the best teaching resources for computer science and business studies.

Stay informed, stay curious!
