
Why do arrays start at zero?

15 July 2025

If you’ve ever dived into programming, you’ve probably asked yourself: why on earth do arrays start at zero instead of one? At first glance, it seems counterintuitive, but the answer lies in efficiency and logic.

Visualising arrays: the row of lockers analogy

Think of an array as a row of lockers. Each locker has a position, starting at the very beginning of the row. The first locker is zero steps from the start, the second locker is one step away, and so on. If you want to access the third locker, you count two steps from the beginning: 0, 1, 2. This is the essence of zero-based indexing—it measures the offset from the starting point.

The link between arrays and memory

Arrays in programming map directly to how memory works in a computer. When an array is created, it’s stored as a contiguous block of memory. Accessing an element at array[i] involves the computer taking the base address of the array and adding i multiplied by the size of each element. With zero-based indexing, the first element sits exactly at the base address, so the calculation needs no adjustment—simple and fast. In essence, zero-based indexing aligns perfectly with how hardware addresses memory.
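To make that concrete, here’s a minimal sketch of the address calculation, using an assumed base address and 4-byte elements (the numbers are purely for illustration):

```python
# A minimal sketch of the address calculation behind array[i].
# The base address and 4-byte element size are assumptions for illustration.
base_address = 0x1000   # where the array begins in memory
element_size = 4        # bytes per element (e.g. a 32-bit integer)

def address_of(i):
    # With zero-based indexing the first element sits exactly at the base
    # address, so the offset is simply i * element_size with no adjustment.
    return base_address + i * element_size

print(hex(address_of(0)))  # 0x1000 - element 0 is the base address itself
print(hex(address_of(2)))  # 0x1008 - two elements (8 bytes) past the start
```

With one-based indexing, the computer would have to subtract one from i before every single lookup—an extra step for no real gain.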

Why not start at one?

While starting at one might feel more intuitive, it’s not practical. Zero-based indexing is baked into the very foundation of programming languages, compilers, and hardware logic. Switching to one-based indexing would introduce unnecessary complexity and inefficiency. That’s why programmers worldwide have embraced zero-based indexing as the universal standard.

It’s not weird—it’s smart!

So, the next time you see array[0], remember it’s not just a quirk of programming. It’s a smart, efficient design choice that keeps your code running smoothly.

Want to learn more?

Check out The Lesson Hacker’s YouTube video HERE.

For more Lesson Hacker Videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Why DPI matters: The difference between screen & print quality

8 July 2025

Understanding DPI: What does it actually mean?

DPI (dots per inch) is exactly what it sounds like—it’s a measure of how many tiny dots (or pixels) fit into one inch of space. The more dots you have, the more detail your image retains.

For digital screens, 72 DPI is the standard because it keeps file sizes small and looks crisp at a normal viewing distance. But when it comes to printing, things change dramatically.

Think of it like wearing pyjamas. At home, wearing 72 DPI is fine—relaxed, comfortable, and good enough for what you need. But taking that same look to a first date? Suddenly, the details matter a lot more.

Why does print need 300 DPI (or more)?

When you print something, you’re holding it much closer to your eyes than a screen. Your brain expects more detail because it’s used to seeing sharp, high-resolution objects up close. If your image is only 72 DPI, it won’t have enough detail to look crisp—it will appear soft, pixelated, and blurry, like a sun-faded potato chip.

That’s why 300 DPI is the magic number for print. At this resolution, images retain their sharpness even when viewed up close. The higher dot density makes lines and textures look clean, rather than jagged or smudged.
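To put numbers on it, here’s a small sketch showing how a print size and a DPI target combine to give the pixel dimensions an image actually needs (the 6 × 4 inch photo is just an example):

```python
# A quick sketch of how print size and DPI translate into pixel dimensions.
def pixels_needed(width_inches, height_inches, dpi=300):
    """Return the pixel dimensions required to print at the given size and DPI."""
    return round(width_inches * dpi), round(height_inches * dpi)

print(pixels_needed(6, 4))          # (1800, 1200) - a 6x4 inch photo at print quality
print(pixels_needed(6, 4, dpi=72))  # (432, 288)   - the same photo sized for screen
```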

Imagine a giant poster—when viewed from 10 feet away, a few blurry dots don’t matter. But now think about a business card. You hold it right up to your face, and if it’s not printed at high resolution, it’ll look like it was drawn in MS Paint by a four-year-old with a potato.

The simple rule: screen vs print

If you only remember one thing, make it this:

  • 72 DPI is fine for screens. It’s optimised for digital displays, loads quickly, and keeps file sizes manageable.
  • 300 DPI (or higher) is essential for print. It preserves fine details, ensuring your artwork looks as sharp on paper as it does on screen.

What happens if you use the wrong DPI?

  • If you use 72 DPI for print, your artwork will look blurry and pixelated.
  • If you use 300 DPI for digital, your file sizes will be unnecessarily large, and it won’t look any better than a 72 DPI image.

So, always think about where your image will be seen before choosing the right DPI. If it’s just for a website or social media, 72 DPI is fine. But if it’s going to a printer, crank it up to 300 DPI to avoid a pixelated disaster.

Want to learn more about getting the best quality out of your designs? 

Check out Dave The Lesson Hacker’s YouTube video HERE.

For more Lesson Hacker videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Why do GPUs get so hot?

10 June 2025

GPUs are the powerhouses of modern computing, handling everything from gaming to video editing and complex 3D rendering. But with great power comes great heat. Ever wondered why your graphics card runs so hot? Let’s break it down.

The science behind GPU heat

Think of your GPU like a busy chef in a restaurant, constantly preparing thousands of meals at once. Each dish represents a calculation, and just like in a real kitchen, all that activity generates heat.

At the heart of it all is electricity. Every time your GPU processes data, tiny electrical signals rush through billions of transistors. But electricity is never 100% efficient—some of that energy gets lost as heat. With so many calculations happening at lightning speed, things heat up quickly.

Why GPUs run hotter than other components

Unlike your CPU, which gets short breaks between tasks, GPUs are designed for continuous heavy lifting. Whether you’re gaming, rendering 3D models, or watching high-definition videos, your GPU is working flat out, pushing itself to the limit.

To make things even trickier, modern GPUs are built with incredible density, packing more transistors into smaller spaces than ever before. It’s like squeezing too many commuters onto a packed Monday morning train; there’s no room to breathe, and the heat has nowhere to escape.

How GPUs keep their cool

This is where cooling solutions come in. Your computer’s fans work hard to move hot air away from the GPU, while heatsinks help absorb and disperse excess warmth. High-performance gaming setups even use liquid cooling to keep temperatures under control.

If your GPU ever gets too hot, it can throttle its performance to prevent damage, but ideally you want to stop this happening in the first place. Keeping your PC well-ventilated and dust-free can go a long way in helping your GPU stay cool and efficient.
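The idea behind throttling can be sketched in a few lines (the temperature threshold and clock figures below are illustrative assumptions, not how any real driver works):

```python
# A simplified sketch of thermal throttling - not real driver code.
MAX_SAFE_TEMP_C = 90    # assumed throttle threshold
BASE_CLOCK_MHZ = 2500   # assumed full-speed clock

def choose_clock(temp_c):
    """Reduce the clock speed once the GPU passes its temperature limit."""
    if temp_c < MAX_SAFE_TEMP_C:
        return BASE_CLOCK_MHZ                            # full speed while cool
    overshoot = temp_c - MAX_SAFE_TEMP_C
    return max(1000, BASE_CLOCK_MHZ - overshoot * 100)   # back off as it heats up

for temp in (70, 90, 95):
    print(f"{temp}C -> {choose_clock(temp)} MHz")
```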

 

Next time you hear your computer fans whirring into action, just remember: your GPU is working hard to deliver stunning graphics and smooth performance. Looking after it will keep your system running at its best.

Want to dive deeper into how GPUs work? Watch the full video here.

Want to learn more about computer science and the latest tech trends? Visit our website Craig’n’Dave for all the latest resources and insights.

 


What is the Dark Web?

Understanding the Dark Web: What You Need to Know

3 June 2025

What is the Dark Web? A Look Into Its Mysteries

When you hear about the Dark Web, it’s easy to imagine a place full of criminals, illegal activities, and shady dealings. But is that the whole picture? Let’s take a closer look at what the Dark Web really is, how it works, and why some people use it.

What Exactly is the Dark Web?

To understand the Dark Web, imagine the internet as a massive city. The regular web—the part of the internet where you shop, search for information, and watch videos—is like the lively downtown area. Everyone knows you’re there, from your internet provider to the websites you visit. Now, picture the Dark Web as the quieter, hidden alleyways. It’s a part of the internet where your online activities are harder to track, and you need special tools to access it—like the TOR browser (The Onion Router). When you use TOR, you’re essentially putting on an invisibility cloak, hiding your digital footprint from prying eyes.
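That invisibility cloak comes from layered encryption—the ‘onion’ in The Onion Router. Each relay on the route can peel off only its own layer, so no single relay sees both where a message came from and where it’s going. Here’s a toy sketch of the layering idea, with string wrapping standing in for real encryption:

```python
# A toy sketch of the "onion" idea behind TOR: the message is wrapped in one
# layer per relay, and each relay can only peel off its own layer.
# (Illustrative only - real TOR uses proper cryptography, not string wrapping.)

def wrap(message, relays):
    for relay in reversed(relays):       # wrap for the exit relay first
        message = f"locked-for-{relay}({message})"
    return message

def unwrap(message, relay):
    prefix = f"locked-for-{relay}("
    assert message.startswith(prefix), "this relay cannot open this layer"
    return message[len(prefix):-1]       # peel off exactly one layer

relays = ["entry", "middle", "exit"]
onion = wrap("hello", relays)
print(onion)                             # three nested layers
for relay in relays:
    onion = unwrap(onion, relay)         # each relay removes only its own layer
print(onion)                             # "hello" emerges only at the exit
```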

Is the Dark Web Dangerous?

It’s not as sinister as it may sound at first. Simply visiting the Dark Web isn’t illegal or dangerous, as it’s just another layer of the internet. However, much like any other part of the internet, there are areas of the Dark Web where illegal activities occur—such as the buying and selling of illicit items. But, that’s not all. The Dark Web is also used for good. It provides a safe space for people like journalists, government agents and whistleblowers, who need to communicate securely without the risk of surveillance or hackers.

While it’s often associated with criminal activity, not everyone who visits the Dark Web is up to no good. It’s a tool for anonymity and security in an otherwise open internet, and its uses extend far beyond shady dealings.

Stay Safe and Informed

So, is the Dark Web a dangerous place? Not if you’re careful. It’s a bit like wandering into an unfamiliar neighbourhood—there are good parts and bad parts. As long as you steer clear of the illegal corners, the Dark Web can serve as a valuable tool for privacy and secure communication.

For a deeper dive into the world of the Dark Web, check out our video here.

Want to learn more about computer science and the latest tech trends? Visit our website Craig’n’Dave for all the latest resources and insights.


How much does it cost to build a CPU?

27 May 2025

Building a CPU isn’t just complicated; it’s an engineering marvel that demands staggering resources. 

Imagine creating the most intricate pancake in the world, where every ingredient is microscopic, precision matters, and the price tag is astronomical. 

Let’s break it down to understand what goes into making these high-tech powerhouses. 

Silicon wafers: the foundation of a CPU

At the heart of every CPU is a silicon wafer. While the raw materials themselves are relatively cheap, turning them into a usable wafer is an entirely different story. The process involves cutting-edge technology and precision, with costs starting at £8,000 or more per wafer. And that’s just the beginning. The factories where CPUs are made, known as fabs, are extraordinary facilities. 

Building a state-of-the-art fab capable of producing today’s 3nm or smaller transistors can set you back over £16 billion. Why so much? Because these fabs operate on an atomic scale, even the tiniest mistake can render entire batches unusable. 

The level of cleanliness, precision, and technological advancement required is unmatched. 
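A rough back-of-the-envelope sketch shows how those wafer costs filter down to a single chip. Only the £8,000 wafer figure comes from above; the dies-per-wafer and yield numbers are assumptions for illustration:

```python
# Back-of-the-envelope: how wafer cost and yield feed into the cost of one chip.
wafer_cost = 8_000       # pounds per processed wafer (figure from the article)
dies_per_wafer = 600     # assumption: chips that fit on one wafer
yield_rate = 0.80        # assumption: fraction of dies that actually work

good_dies = dies_per_wafer * yield_rate
print(f"Wafer cost per working chip: £{wafer_cost / good_dies:.2f}")
# And that is before R&D and the multi-billion-pound fab are amortised on top.
```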

Research and development: the hidden cost

Designing a CPU isn’t a quick or cheap process. Teams of engineers spend years creating, testing, and refining each design. Simulations, prototypes, and endless troubleshooting are part of the journey, with research and development costs reaching millions of pounds for a single chip. 

It’s an investment of time, money, and expertise to push the boundaries of what’s possible. 

Why CPUs are worth every penny

When you consider the monumental effort and expense behind each CPU, it’s easier to understand their price. 

Every chip is a piece of technology more complex than most buildings, packed into a form factor small enough to fit in your hand. CPUs power everything from our laptops to supercomputers, making them one of the most essential inventions of our time. 

Curious to learn more about the fascinating world of CPUs? 

Watch the full video on our YouTube channel for an in-depth explanation. 

For more insights into computer science and to explore our resources, visit the Craig’n’Dave website today.


Are Graphics cards cheating now?

13 May 2025

In the ever-evolving world of gaming and computing, Nvidia’s latest RTX 5000 series has sparked an interesting debate: are graphics cards actually improving, or is AI doing all the heavy lifting? 

With the launch of DLSS 4 and some mind-blowing specs, we’re diving into what’s real, what’s AI-generated, and whether any of it really matters.

Are GPUs really getting better?

Nvidia’s new flagship, the RTX 5090, is an absolute beast. With 92 billion transistors, 32GB of GDDR7 VRAM, and memory bandwidth that defies belief, it’s designed to dominate 4K gaming, dabble in 8K, and obliterate your wallet at $1,999. But who is this really for? While gamers will appreciate the power, this kind of hardware is also aimed at content creators, developers, and professionals pushing the limits of rendering and AI-driven applications.

The evolution of DLSS

One of the biggest advancements in recent years has been DLSS (deep learning super sampling). When it first launched in 2018, it was a bit underwhelming—think blurry, pixelated mess. But Nvidia kept improving it, and today, DLSS 4 is a game-changer. Using transformer-based AI models, it enhances graphics, generates frames, and makes gameplay smoother than ever.

DLSS 4 includes three major features:

  • Super resolution – upscales lower-resolution images to 4K or even 8K.
  • Ray reconstruction – improves ray tracing quality using AI rather than traditional methods.
  • Multi-frame generation – creates new frames in real-time, making gameplay ultra-smooth.

This means you can enjoy high-end visuals without needing a ridiculously expensive GPU every year.
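Frame generation sounds like magic, but the core idea—producing an extra frame between two rendered ones—can be sketched with simple interpolation. DLSS 4 uses trained AI models and motion data rather than this naive per-pixel blend, so treat it purely as an illustration:

```python
# A toy sketch of frame generation: synthesise an in-between frame from two
# rendered ones by blending each pixel. Real DLSS uses AI, not plain blending.

def blend_frames(frame_a, frame_b, t=0.5):
    """Interpolate between two frames of (R, G, B) pixels."""
    return [
        tuple(round(a * (1 - t) + b * t) for a, b in zip(pa, pb))
        for pa, pb in zip(frame_a, frame_b)
    ]

frame_1 = [(255, 0, 0), (0, 0, 0)]        # tiny two-pixel "frames" for illustration
frame_2 = [(0, 0, 255), (255, 255, 255)]
print(blend_frames(frame_1, frame_2))     # [(128, 0, 128), (128, 128, 128)]
```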

Is AI ‘cheating’ in gaming?

Some critics argue that AI-generated frames aren’t ‘real’ pixels, but does it actually matter? If a game looks stunning, runs at 120fps, and feels seamless, is it important whether every frame was painstakingly rendered or if AI stepped in to assist?

It’s a bit like baking a cake—whether the icing was handmade or piped by a machine, the end result is still delicious. For most gamers, AI-powered enhancements are a blessing, allowing them to enjoy top-tier performance without breaking the bank.

The future of GPUs and gaming

One thing is clear: AI is no longer just a sidekick in gaming—it’s taking centre stage. DLSS 4 is proof that Nvidia is leaning heavily into AI-driven enhancements. But there’s a catch: multi-frame generation is exclusive to the RTX 5000 series, meaning older GPUs are slowly being left behind.

For those still clinging to older hardware, the choice is clear: embrace the upgrade cycle or accept a future as a retro gamer. Either way, gaming technology is moving faster than ever, and Nvidia’s latest advancements are redefining what’s possible.

Want to see the tech in action? Check out our full breakdown of the RTX 5000 series and DLSS 4 in our Lesson Hacker video.

 

Be sure to visit our website for more insights into the world of technology and the best teaching resources for computer science and business studies.

Stay informed, stay curious!


Why don’t computers use a different base for numbers?

The simple reason why binary beats all other number bases

Why not Base-4?

At first glance, it seems logical to ask: Why don’t computers use Base-4 instead of Base-2? After all, wouldn’t using more numbers give us more power? While it might sound appealing, the reality comes down to the fundamental way electronics work—and why binary remains unbeatable.

Electronics love simplicity

Computers are built on circuits that recognise two states: “off” and “on”. These states are easy, reliable, and practical for electronics to detect. Base-4, on the other hand, would mean handling four distinct states—imagine “off,” “one-third on,” “two-thirds on,” and “fully on.” Cool in theory, but impractical in reality. Building hardware to detect those in-between levels reliably would be both expensive and error-prone. Think of it like trying to get a light switch to dim to exactly 37%—possible, but far from practical.
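To see the trade-off, here’s a small sketch that writes the same value in Base-2 and Base-4. The Base-4 version is shorter, but every digit would require the hardware to distinguish four separate levels instead of two:

```python
# The same value written in different bases. More symbols per digit gives
# shorter numbers, but the hardware must tell more voltage levels apart.

def to_base(n, base):
    digits = []
    while n:
        n, remainder = divmod(n, base)
        digits.append(str(remainder))
    return "".join(reversed(digits)) or "0"

value = 200
print(to_base(value, 2))   # 11001000 - longer, but every digit is just off/on
print(to_base(value, 4))   # 3020     - shorter, but needs four distinct levels per digit
```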

A costly rewrite of history

Binary’s dominance dates back to the early days of computing, when switches were literal levers toggling between two positions. Switching to Base-4 today would require a complete overhaul of modern technology. Every program would need rewriting, every processor redesigning, and every programmer retraining. The cost? More than even the world’s wealthiest could cover.

Base-3 computers: A brief history

Interestingly, a ternary (Base-3) computer was once a serious contender in the 1950s. Yet, despite its potential, binary won out for its simplicity, reliability, and efficiency. The entire computing industry has been built on this foundation, and for good reason: sometimes less really is more.

The unbeatable efficiency of binary

While other number bases could theoretically work, binary remains the gold standard. Its simplicity makes it easy to implement, cost-effective, and highly reliable. If it ain’t broke, don’t fix it—or add unnecessary complexity.


Want to dive deeper? Watch our full Craig’n’Dave Lesson Hacker video.

Be sure to visit our website for more insights into the world of technology and the best teaching resources for computer science and business studies. 

Stay informed, stay curious!


Why can’t we just stick RAM directly onto the CPU?

22 April 2025

In the world of computer science, speed is everything. So, it’s easy to see why the idea of sticking RAM directly onto the CPU seems like a genius move. Zero latency, lightning-fast speeds, and no more bottlenecks—what’s not to love? But in reality, it’s not that simple. Let’s break down why we can’t just combine these two crucial components into one.

The difference between CPU and RAM

At first glance, sticking RAM onto the CPU might sound like a great way to boost performance. After all, the closer RAM is to the CPU, the faster data can be accessed, right? Unfortunately, it’s not that straightforward. The CPU and RAM are built in fundamentally different ways.

CPUs are designed to handle calculations at breakneck speed using logic gates built from fast transistors. RAM, on the other hand—specifically dynamic RAM (DRAM)—uses capacitors to store data temporarily. The catch is that these capacitors need constant refreshing to retain their information, rather like a student frantically rereading their notes during revision to make sure nothing slips away.

Why it doesn’t work together

Trying to combine CPU and DRAM onto the same chip would cause chaos in the manufacturing process. DRAM fabrication doesn’t align well with the processes used to create a CPU. Imagine trying to install a high-end GPU into a budget laptop—it just won’t fit, and forcing it could cause damage.

Even designs like Intel’s Haswell architecture used embedded DRAM (eDRAM) sparingly: just enough to boost performance without massively increasing production costs. Merging CPU and RAM completely, however, would be a manufacturing nightmare.

The speed factor: DRAM vs. SRAM

Even if we could combine the two, there’s another issue: speed. DRAM operates at a top speed of about 1 GHz, while modern CPUs can easily surpass 3 GHz. That’s like putting bicycle tyres on a Formula 1 car—you’re limiting the performance of the entire system.

To overcome this speed gap, CPUs use SRAM (Static RAM) for on-chip cache. SRAM is much faster than DRAM but comes with its own drawbacks: it’s bulkier and significantly more expensive. Sure, we could fill a CPU with SRAM, but that would come at an astronomical cost—far more than most of us are willing to pay.
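That split—a small, fast SRAM cache in front of big, slower DRAM—is the classic memory-hierarchy compromise, and a quick sketch shows why even a modest cache hit rate pays off (the access times below are illustrative assumptions, not measurements):

```python
# A tiny sketch of why a small SRAM cache in front of slow DRAM pays off.
CACHE_HIT_NS = 1      # assumed on-chip SRAM access time
DRAM_ACCESS_NS = 100  # assumed main-memory access time

def average_access_ns(hit_rate):
    """Average memory access time for a given cache hit rate."""
    return hit_rate * CACHE_HIT_NS + (1 - hit_rate) * DRAM_ACCESS_NS

for rate in (0.0, 0.9, 0.99):
    print(f"hit rate {rate:.0%}: {average_access_ns(rate):.2f} ns on average")
```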

Why we stick to separate RAM and CPUs

While combining RAM and the CPU might sound like a performance dream, the technical and cost limitations make it impractical. The current balance of DRAM for main memory and SRAM for cache strikes the best compromise between speed, cost, and practicality.

Want to know more? Check out The Lesson Hacker’s YouTube video.

For more Lesson Hacker Videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Are loot boxes gambling?

The digital debate every gamer should know

18 March 2025

What are loot boxes?

In simplest terms, loot boxes are mystery rewards players can earn or buy. Think of them as digital versions of Kinder Eggs, but instead of chocolate and toys, they contain skins, weapons, or characters to enhance your game. Sounds fun, right? Except you don’t know what you’re getting until you’ve paid. Cue disappointment when that elusive Messi card in FIFA Ultimate Team turns out to be yet another low-tier player.

The gambling argument

Critics argue that loot boxes mimic the mechanics of gambling. You pay for a chance at a desirable outcome, complete with flashy animations and dopamine-fuelled suspense. Younger players are especially vulnerable, with some spending hundreds—or even thousands—chasing rare items.

The Norwegian Consumer Council has even labelled loot boxes as “predatory mechanisms” that exploit psychological tricks to drain wallets. Sound familiar? That’s because it feels suspiciously like pulling the lever on a Vegas slot machine, except your reward is a virtual hat rather than a pile of cash.

Developers defend their treasure chests

Game developers, however, see things differently. They compare loot boxes to toys like mystery figurines, claiming they’re just “harmless fun.” Since most loot boxes don’t pay out real money, they argue they don’t qualify as gambling. But with gaming companies raking in an estimated $15 billion annually from loot boxes, it’s clear these digital party bags are more than just fun—they’re big business.

A lack of transparency

Transparency remains a sore spot. In the UK, only two of the 45 top-grossing games disclose loot boxes in their advertising, despite being required to by law. That lack of clarity leaves players unaware of the odds—or costs—involved.
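To see how quickly hidden odds turn into real money, here’s a small sketch using made-up figures (an assumed drop rate and box price, not data from any actual game):

```python
# How drop odds translate into expected spend. Both figures are illustrative.
drop_rate = 0.01     # assumption: 1% chance of the rare item per box
box_price = 2.50     # assumption: price per loot box in pounds

print(f"Expected spend for one rare item: £{box_price / drop_rate:.2f}")  # £250.00

# Chance of opening 50 boxes and still having nothing:
print(f"Chance of 50 empty boxes: {(1 - drop_rate) ** 50:.0%}")  # about 60%
```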

Are loot boxes all bad?

Not necessarily. For some players, loot boxes add excitement to gaming. Plus, self-regulation within the industry has started to improve, with guidelines for clearer disclosure. Yet, countries like Belgium and Japan have gone further, banning or regulating loot boxes to protect consumers.

So, are loot boxes gambling?

The answer depends on who you ask. Critics say yes, developers say no, and governments remain undecided. What’s clear is that players need to approach loot boxes with caution—because while you might win a cool skin, it’s easy to lose track of how much you’re spending.

Watch the full video for more insights!

Check it out on the Craig’n’Dave YouTube channel, and don’t forget to visit our website for more gaming and computer science insights.


I’ve lost control of my own DNA

18 February 2025

DNA testing kits have become a modern-day curiosity – a way to discover your ancestry, potential health risks, and even traits you didn’t know you had. But have you ever stopped to consider what happens to that genetic goldmine once you send it off? Here’s a closer look at the fascinating yet slightly alarming world of DNA testing.

Uncovering secrets with DNA kits

Companies like 23andMe have made exploring your DNA as simple as spitting into a tube. From identifying as 5% Norwegian to discovering a predisposition for male pattern baldness, the results can be both entertaining and enlightening. But the story doesn’t end with finding out why you have curly hair.

DNA testing services also promise health insights and family connections, often revealing unexpected truths. Think learning one of your parents isn’t actually your parent – a revelation that might spice up Sunday lunch discussions.

The double-edged sword of genetic data

As exciting as these insights are, they come with serious privacy implications. Your DNA isn’t just your own; it’s linked to your family too. Sending in a sample might inadvertently overshare details about your siblings, parents, and even distant cousins.

The bigger concern is what happens to this data if companies face financial trouble. For example, Atlas Biomed recently vanished from the web, leaving users wondering what became of their most sensitive information. Similarly, 23andMe’s struggles raise questions about the security of its extensive DNA database.

Can you take back control of your DNA?

If the thought of your genetic information being mishandled keeps you awake at night, you do have options. Many companies allow you to delete your data, but be aware: if it’s been anonymised and used in research, it’s out there for good.

Before diving into DNA testing, take these precautions:

  • Read the terms and conditions: know who has access to your data and how it’s used.
  • Consider the long-term impact: your results affect not just you, but your family too.
  • Decide how much you want to know: some truths might be best left undiscovered.

DNA testing kits can reveal fascinating insights, but they come with significant risks. If you’re considering taking the plunge, make an informed decision – and if you’ve already tested, explore ways to manage your data responsibly.

For more on this topic, watch the full Craig’n’Dave Lesson Hacker video linked below. 

Be sure to visit our website for more insights into the world of technology and for the best teaching resources for computer science and business studies. 

Stay informed, stay curious!


Can You Spot the AI? The Rising Challenge of AI-Generated Faces

14 January 2025

We’re diving into the uncanny world of artificial intelligence, specifically AI-generated faces. These days, it’s getting harder and harder to tell what’s real and what’s not. So, let’s explore why AI faces are so convincing, the potential risks, and what we can do about it.

The Growing Power of AI in Creating Human Faces

Imagine scrolling through your social media feed. You see familiar faces, but wait—are they all real? Thanks to advances in AI, computers are now generating hyper-realistic human faces that can fool even the sharpest eyes. A recent study from Aberdeen University showed that most of us can’t reliably distinguish between real human faces and AI-generated ones, with a 65% misidentification rate.

That’s right, most people are flipping a coin when guessing whether a face is AI-made or naturally human.

Why Are AI Faces So Hard to Spot?

It turns out that AI’s secret weapon is hyperrealism. These generated faces have perfectly balanced features and a lifelike sparkle in their eyes. For most people, this makes AI faces incredibly hard to detect. The study found that even those confident in their answers were often wrong, revealing a classic “confidence paradox” – the more convinced we are, the more likely we’re mistaken.

Surprisingly, humans are at their peak face-recognition abilities at around 31 years old, so if you’re not there yet, or you’ve passed it, spotting AI faces might feel like a superpower slipping away.

The Dark Side: Bias, Fraud, and AI

While AI-generated faces can be fun (think video games and virtual avatars), there’s a troubling side to this tech. The data used to train AI is often biased, leading to AI-generated images that skew towards white faces. This ‘whitewashing’ problem creates racial disparities, and worse yet, the potential for misuse is huge. Think identity theft, fraud, or even law enforcement misuse through facial recognition software.

What’s Next? How Do We Stay Safe?

So, what can we do? Aberdeen University is already educating schools about the risks of AI-generated images and online fraud. But there’s more to be done. We need transparency, tools to spot fake faces, and public awareness to ensure that AI technology doesn’t outpace our ability to control it.

AI is rapidly shaping our world, and we need to stay informed. As technology advances, it’s important to remain curious, sceptical and educated.  Want to know more? Watch the full video below for a deeper dive, and check out our website for more content on tech, AI, and the future.

For more Lesson Hacker Videos check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Can we really scrub the Internet clean?

7 January 2025

Exploring Ofcom’s Online Safety Reset

The internet is a vast expanse of information, entertainment, and, unfortunately, potential dangers, especially for children. With growing concerns about online safety, Ofcom has announced a major reset aimed at child safety. But can we really scrub the internet clean? 

Let’s delve into the details.

Ofcom’s Major Reset

Ofcom’s recent consultation proposes robust age checks, safer algorithms for personalised content, and more effective moderation of content accessible to children. This ambitious plan targets over 150,000 services, making it a colossal undertaking. The goal is to protect young users from harmful content, but the implementation is far from straightforward.

Tech Companies’ Current Efforts

Big tech companies are already taking steps to address these issues. Meta is implementing new safety measures on Facebook and Instagram to combat grooming, while Twitch is trying to shield young users from mature content. However, these measures often feel like playing whack-a-mole with a foam bat—inefficient and somewhat comical in the face of such a serious issue.

The Age Assurance Debate

A significant part of Ofcom’s plan involves age assurance, which has sparked a heated debate. Proposed methods like AI-powered facial scans to verify age raise privacy concerns. There’s also the risk of pushing children towards more dangerous online spaces if these methods prove too invasive or ineffective. Moreover, some parents and siblings inadvertently aid underage social media use, complicating enforcement.

Encryption and Privacy Challenges

End-to-end encryption, offered by services like Signal and WhatsApp, provides privacy but makes it difficult to detect abuse. This creates a tug-of-war between protecting children and guarding digital privacy. Ofcom is prepared to impose hefty fines on companies that fail to comply, underscoring the high stakes involved.

Our Conclusion

The challenge of cleaning up the internet is complex and vital. Striking the right balance between safeguarding young users and preserving digital liberties is crucial. Ofcom’s major reset aims to protect children but faces the massive task of overseeing a vast number of online services. While tech companies are implementing new safety measures, concerns about their effectiveness and the potential push towards riskier online spaces remain. Age verification methods raise privacy issues, and encryption complicates oversight.

So, can we really scrub the internet clean, or is it a pixelated pipe dream? One thing is certain: navigating the information superhighway safely will require ongoing efforts, evolving legislation, and continuous adaptation by tech companies. Stay tuned, stay informed, and most importantly, stay safe online.

Want to know more? Check out Dave The Lesson Hacker’s YouTube video – https://youtu.be/SaAGNg6bZDc

For more Lesson Hacker Videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


WordPress at war: The battle over open-source code

31 December 2024

In the tech world, even the most popular platforms can find themselves in unexpected conflicts. Today, we’re diving into the drama between WordPress and WP Engine, exploring how a disagreement over open-source principles has turned into a legal standoff.

The surprising power of WordPress

WordPress powers an impressive 40% of all websites. From personal blogs to corporate sites, it’s the go-to platform for millions. Part of its appeal lies in being open source, meaning the code is free for anyone to use and modify. Think of it as a collaborative coding project where everyone is invited to contribute. However, this spirit of community sharing is now being tested.

The split personality of WordPress

WordPress isn’t just one entity; it has a dual nature. On one side, there’s WordPress.org, a non-profit that offers free access to its code. On the other, we have Automattic, a for-profit company that offers paid services based on WordPress’s open-source platform. This creates an interesting dynamic where WordPress has to balance community ideals with business interests.

WP Engine vs. Matt Mullenweg: The feud begins

WP Engine is a major hosting provider for WordPress sites, helping users get their blogs, e-commerce stores, and other websites online. However, Matt Mullenweg, co-founder of WordPress, recently criticised WP Engine, accusing them of taking advantage of the open-source code without giving back enough to the community. He went as far as to call WP Engine “a cancer” on WordPress.

WP Engine was quick to respond, insisting they contribute significantly by maintaining sites, optimising performance, and providing customer support. They claim they’re already giving back to the WordPress ecosystem in many ways.

Escalating tensions: Blocking and lawsuits

In retaliation, Mullenweg took a bold step: he blocked WP Engine from using certain WordPress features. This decision caused major disruption, leaving many businesses uncertain whether their sites would continue to function properly. For companies like Tricia Fox’s in Scotland, the disruption has meant unexpected costs and hours of extra work to keep things running smoothly.

Now, the conflict has moved to the courtroom. WordPress wants WP Engine to pay for using its trademark, arguing that they profit from the WordPress brand. WP Engine, in turn, has filed a lawsuit accusing WordPress of extortion and libel.

What does this mean for the open-source community?

The heart of the issue is open source itself. WordPress aims to “democratise publishing,” but its actions against WP Engine raise concerns about whether it’s staying true to that mission. The irony of two tech giants clashing over a platform designed to be free and open is not lost on the community. It’s a reminder that as open-source projects grow, they often face challenges balancing ideals with commercial realities.

Want more tech drama?

This isn’t just a story about two companies; it’s about the evolution of one of the internet’s most influential platforms. 

For a deeper dive into this unfolding drama, watch the full video on the Craig’n’Dave YouTube channel.

If you’re interested in more insights and resources, visit the Craig’n’Dave website for exclusive content tailored for computer science enthusiasts.


One charger to rule them all?

3 December 2024

The UK’s Move Towards USB-C.

Today, we’re tackling a common conundrum: chargers. Why so many different ones? And is the UK finally moving towards a universal solution, or will things stay tangled? Let’s dive into the ongoing debate over USB-C.

Why USB-C?

USB-C wasn’t always the go-to. We’ve seen everything from mini-USBs to proprietary cables like Apple’s Lightning. But USB-C arrived with a vision: one cable to charge everything. This port is versatile, offering faster charging, quicker data transfers, and a reversible design—no more fiddling in the dark to get it right! But not all USB-C ports offer the same capabilities. For instance, a MacBook’s USB-C might support Thunderbolt technology, allowing super-fast data speeds and even external graphics card support. Your budget smartphone’s USB-C? It may only offer basic functionality. So, while the USB-C port might be universal, what it can do varies widely.

Is the UK on board?

With the EU already mandating USB-C, the UK is considering doing the same. If adopted, the standard could mean fewer chargers and less clutter. But in reality, companies are already making the switch due to global trends. Apple, for example, has dropped its proprietary Lightning cable for USB-C with the iPhone 15. So, UK regulations may not be a game-changer here.

The environmental impact

One key argument for a USB-C standard is cutting down e-waste. It’s estimated that there are 600 million unused cables lying around in the UK! A universal standard could reduce this number by lessening the need for multiple chargers. However, as people toss older cables, we might see an initial spike in e-waste.

The innovation dilemma

Standardisation could also slow innovation. Imagine if we’d settled on a single charger a decade ago—would USB-C even exist? There’s a risk that locking into one standard could stifle manufacturers from developing new technologies.

So, will USB-C rule them all?

It seems likely that USB-C will be the global standard for now, whether the UK enforces it or not. While it simplifies things, USB-C doesn’t solve every issue—charging speeds and capabilities still vary. So, don’t throw out those old chargers just yet. They might still come in handy!

Ready to learn more?

For a deeper dive into the UK’s tech scene and more tech insights, watch The Lesson Hacker’s video on this topic HERE.

Be sure to visit our website for more insights into the world of technology and for the best teaching resources for computer science and business studies. Stay informed, stay curious!


Has AI and tech ruined sport?

19 November 2024

This is a topic that sparks a lot of debate: technology in sport. Some fans say tech is ruining their beloved sports, while others believe it’s making it fairer and more exciting. Let’s unpack how technology like Hawk-Eye, VAR, and AI have transformed the world of sports, for better or worse.

Precision or frustration?

Tennis fans are in for a big change. Wimbledon’s line judges are being replaced by Hawk-Eye technology—a camera system that makes precise calls on whether a ball is in or out. While this guarantees accuracy, something vital is lost: the drama. Remember when a player would challenge a call, and the crowd would hold its breath? Now, it’s simply “The computer says it’s out.” Accurate? Yes. Thrilling? Not so much.

Football’s introduction of VAR (Video Assistant Referee) was meant to correct bad calls. But has it made the game more enjoyable? While it does improve fairness, it’s hard to ignore the frustration when a game grinds to a halt for a five-minute review over whether someone’s toenail was offside. The precision is great, but the momentum of the game? That’s often the real casualty.

Data and AI: The future of fan engagement

Beyond refereeing tech, AI and data analytics are reshaping how fans engage with sport. Companies like Opta track everything from player speed to match predictions, turning sport into a data-driven experience. While it’s a different way of connecting with the game, some fans miss the messy, emotional moments that stats can’t capture.

So, has tech really ruined sport?

Not quite. Technology hasn’t killed sport—it’s evolved it. The drama may now lie in data points and AI predictions instead of human error, but the heart of sport remains. Whether you’re shouting at a referee or a computer, the passion is still there.

Watch the full video on our channel to dive deeper into how tech is transforming sport.

For more Lesson Hacker Videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Is OpenAI really that open?

5 November 2024

What started as a non-profit dream has evolved into a tech giant worth a staggering $157 billion, raising questions about its original mission.

OpenAI’s humble beginnings

OpenAI – once a bold, altruistic initiative aiming to create artificial intelligence for the benefit of humanity – has undergone a dramatic shift. 

Back in 2015, OpenAI set out with a noble goal: to make artificial intelligence accessible, safe, and beneficial for everyone. Spearheaded by influential figures like Elon Musk, it promised to use AI for the greater good, not just to line the pockets of the wealthy. Initially a non-profit, OpenAI’s mission was simple: create AI that serves all of humanity, not just the elite.

However, the landscape changed quickly. By 2018, Elon Musk had left, citing concerns that the organisation was straying from its mission. Fast forward to today, and OpenAI’s once “open” nature seems a distant memory.

From non-profit to capped-profit

OpenAI’s transformation into a “capped-profit” organisation marked a significant departure from its non-profit roots. The company now walks a fine line between innovation and commercialisation, securing billions in funding from tech giants like Microsoft and Nvidia. While this has driven AI advancements, it’s also placed enormous pressure on OpenAI to monetise its technology, which risks sidelining its original vision.

Internal tensions and key exits

With the shift towards profit, OpenAI has faced growing internal conflicts. Safety researchers and AI developers, concerned that financial interests are outweighing ethical considerations, have begun to leave. Prominent figures, such as former CTO Mira Murati and Chief Scientist Ilya Sutskever, have stepped down, fuelling concerns about the company’s direction.

The critics speak out

One of the loudest critics has been none other than Elon Musk. From the sidelines, Musk has accused OpenAI of losing sight of its original purpose, claiming it’s now more focused on pleasing investors than safeguarding humanity from AI’s potential dangers.

OpenAI’s journey from non-profit idealism to a $157 billion behemoth leaves us wondering: has it lost sight of its mission, or is this just the price of progress? Let us know your thoughts in the comments!

Get Your Classroom Buzzing About AI!

Want to spark some lively discussions around AI? We’ve crafted some thought-provoking questions to fuel the conversation:

🤔 Why is the AI industry worth so much money, and is it another dot-com boom scenario?

💸 Should large language models be free to use, and if so, how should they fund the servers, maintenance and electricity? If not, how should revenue be raised?

🌍 In what ways could AI create a new digital divide?

🧠 Can AI advance to a stage where it no longer requires humans?

These questions are sure to get students thinking critically and debating the future of tech!

If you’re curious to learn more, watch The Lesson Hacker’s video on OpenAI and how open it really is HERE.

For more Lesson Hacker Videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!
