
Is Apple in hot water?

Is your data still safe in the UK?

2 September 2025

Apple just pulled a major privacy feature from the UK—and it wasn’t because they felt like it. The tech giant was asked by the UK government to weaken its encryption, effectively creating a backdoor to your iCloud data. Apple’s response? “Nah, we’ll just remove the whole feature instead.” But what does this mean for your privacy, and why is it such a big deal? Let’s break it down.

What is the snooper’s charter?

The Investigatory Powers Act (charmingly nicknamed the Snooper’s Charter) is a UK law that gives the government the right to demand access to encrypted data in the name of national security. Think terrorism, child abuse, and organised crime—the heavy stuff.

In theory, this law is about protecting the public. But in practice, it means the government can secretly force tech companies to create backdoors, making once-secure systems vulnerable. The problem? Encryption is designed so not even Apple can access your private data. The whole point is that your information is locked in a digital vault that only you have the key to.

Apple’s response: No vault for you

Rather than creating a secret backdoor, Apple took a different approach. They simply removed their Advanced Data Protection (ADP) feature from the UK altogether. ADP gave iCloud data an extra layer of end-to-end encryption that even Apple couldn’t crack.

By pulling the feature, Apple essentially said, “If we can’t guarantee privacy, you can’t have it.” It’s a bold move—one that’s left privacy advocates cheering and the UK government fuming.

Why does this matter?

If you were using ADP in the UK, it’s now gone. Your iCloud data is no longer as secure as it was. But the impact goes beyond just Apple users.

If the UK government wins its legal battle to force Apple (and potentially other companies) to add backdoors, it could set a global precedent. Governments worldwide might demand the same, making everyone’s data—from journalists and activists to everyday users—more vulnerable. And once a backdoor exists, it’s not just governments that will exploit it. Hackers, cybercriminals, and shady data brokers will be lining up too.

What can you do?

If you’re concerned about your privacy, you might want to look into alternative encrypted storage solutions. Or, if the UK keeps pushing for more data access, you may have to resort to smuggling USB sticks across the Channel like some kind of 21st-century data bootlegger.

Want to dive deeper?

This is just a glimpse into the ongoing battle between governments and tech companies over your privacy. 

Watch the full video on our Craig’n’Dave YouTube channel.

For more insights, resources, and lesson content, head over to our website: craigndave.org.

Stay informed, stay secure, and stay tuned.

Related posts

How a GCSE in computer science can shape your students’ future careers

A GCSE in Computer Science isn’t just a qualification—it’s a launchpad to exciting careers in tech, from gaming and robotics to cyber security and AI.
By connecting classroom learning to real-world pathways, teachers can inspire students to see the true value and future potential of their skills.

1 May 2026

How a GCSE in Computer Science can shape your future career

A GCSE in Computer Science opens the door to careers in gaming, robotics, cyber security, and beyond.
Discover how this subject can lead to exciting degrees and future opportunities in the tech world.

29 April 2026

Do we need government AI copyright laws?

AI is transforming creativity — but are we protecting the people behind the art? We explore the UK’s heated debate over AI copyright laws and what they mean for creators and innovation.

8 April 2026

How Do Map Apps Work?

Discover how your map app uses graph theory and clever algorithms to find the fastest route, even before you spot the traffic jam. It’s the smart tech behind every turn and reroute you trust.

What is Chip Binning?

Chip binning is how manufacturers sort silicon chips based on their performance, turning some into high-speed processors and others into more modest models. It’s like baking cookies—some come out perfect, others just good enough.

Meet Dodona: A powerful coding platform built for real classrooms

Discover how Dodona is transforming programming lessons with a powerful, classroom-ready platform built by educators. With the integration of Time2Code, it’s never been easier to deliver engaging, structured coding lessons while saving time and reducing hassle.


Goals version 2

Until now, the Terms goal has used the Leitner system to determine what students should complete each week. This approach […]

4 April 2026

Differentiation is dead

For decades, teachers were told that differentiation was the golden ticket. If we could just tailor the right task to […]

3 April 2026

It’s not in the mark scheme

Just because it’s not in the mark scheme doesn’t mean it’s wrong — Quicksort proves there are often multiple valid ways to reach the same correct answer.
Understanding the principles behind algorithms matters far more than memorising a single “approved” method.

27 March 2026


A new year and another new initiative

1 September 2025

The concept of learning styles—the idea that individuals learn better when taught in their preferred sensory modality (e.g., visual, auditory, kinesthetic)—has been widely popular in education. However, despite its appeal, the theory has been largely debunked by empirical research. Here’s a breakdown of the origins, popularity, and scientific critique: 

 Origins and Popularity 

  • Early Theories: The idea of learning styles can be traced back to educational psychology in the 20th century. One of the most influential models was the VARK model (Visual, Auditory, Reading/Writing, Kinesthetic), developed by Neil Fleming in the 1990s. 
  • Appeal: It resonated with educators and learners because it emphasised personalisation and seemed intuitive—people often feel they have a preferred way of learning. 

Scientific Research and Debunking 

  • Key Issue: The central claim is that matching teaching styles to a student’s preferred learning style improves learning outcomes. This is known as the “meshing hypothesis.” 
  • Major Review: In 2008, a comprehensive review by Pashler et al. in Psychological Science in the Public Interest concluded: “There is no adequate evidence base to justify incorporating learning styles assessments into general educational practice.” 

Findings

  • Studies that properly tested the meshing hypothesis (i.e., using randomised controlled trials and measuring actual learning outcomes) did not find support for it. 
  • People may have preferences, but teaching to those preferences does not improve learning outcomes. 
  • Content matters more: The best modality often depends on the subject matter (e.g., diagrams for geometry, audio for music), not the learner. 

What Actually Works 

  • Cognitive science supports strategies like: 
      • Spaced repetition 
      • Retrieval practice 
      • Interleaving (mixing different topics or skills) 
      • Dual coding (combining words and visuals) 
  • These methods are evidence-based and improve learning across the board, regardless of “style.” 

Why It Still Persists 

  • Confirmation bias: People remember when their preferred style seemed to help. 
  • Commercial interests: Many companies sell learning style assessments and training. 
  • Intuitive appeal: It feels personalised and empowering, even if it’s not effective. 

Want to know more? Watch the full video on our YouTube channel – At the chalk face.

For more educational news, check out the At the Chalk Face YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Did Deepseek change AI?

Deepseek: The Chinese AI startup shaking up Silicon Valley

26 August 2025

What is DeepSeek and why is it making waves?

DeepSeek, an AI company based in Hangzhou, China, is making headlines with its latest models, DeepSeek-V3 and DeepSeek-R1. These models aren’t just impressive in quality—they’ve been built for a fraction of the cost compared to OpenAI’s ChatGPT. Reports suggest DeepSeek trained its models for under $6 million, an astonishingly low figure in the AI industry. To put it into perspective, that’s like buying a Ferrari for the price of a second-hand scooter.

Adding to the excitement, DeepSeek’s AI assistant has surged to the top of the US App Store, overtaking ChatGPT. If there’s one thing Americans love more than AI chatbots, it’s winning—and DeepSeek seems to be doing just that.

How did they build it for so little?

The secret lies in a technique called AI distillation. Unlike traditional AI training methods that demand vast amounts of computing power, distillation allows DeepSeek to train a large model first, then extract the key knowledge into a smaller, more efficient version. Think of it as revising for an exam—not reading the entire textbook, just the essential parts.
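The soft-label idea behind distillation can be sketched in a few lines. This is a toy illustration of temperature-softened teacher outputs (the logits and temperature below are made-up numbers), not DeepSeek’s actual training pipeline: a student model is trained to match the teacher’s full probability distribution, which carries far more information than a single “right answer”.

```python
import math

def softmax(logits, temperature=1.0):
    # Higher temperature flattens the distribution, exposing how the
    # teacher ranks the "wrong" answers, not just its top pick.
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical teacher logits for three answer choices (made-up numbers).
teacher_logits = [3.0, 1.0, 0.2]

hard = softmax(teacher_logits)                   # what the teacher "says"
soft = softmax(teacher_logits, temperature=4.0)  # what the student trains on

print("hard targets:", [round(p, 2) for p in hard])
print("soft targets:", [round(p, 2) for p in soft])
```

The softened targets still rank the answers the same way, but they reveal the teacher’s relative confidence—the “essential parts” the student revises from.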

This method is incredibly cost-effective. Research teams have recreated OpenAI’s reasoning model for as little as $450 in just 19 hours. Some have even done it for $50 in 26 minutes—cheaper than a takeaway pizza. By using distillation, DeepSeek has bypassed the traditional ‘throw money at it’ strategy and delivered an AI that punches well above its weight. Even OpenAI’s CEO, Sam Altman, has hinted that they may need a new open-source strategy to keep up.

What are the drawbacks?

DeepSeek’s meteoric rise isn’t without controversy. One major concern is its hardware. Reports suggest the company may have access to far more Nvidia AI chips than US export controls should allow. If true, this raises serious questions about trade restrictions and supply chains.

Another challenge is accuracy. While AI distillation makes models faster and cheaper, it also means some information gets lost along the way. It’s like summarising a novel—you get the main ideas, but occasionally miss important details.

Are there security concerns?

With AI becoming more affordable and accessible, concerns around misuse are growing. While democratising AI leads to faster innovation, it also increases the risk of deepfakes, misinformation, and other ethical dilemmas. If DeepSeek can build a ChatGPT competitor at a fraction of the cost, what’s stopping a rogue developer from creating something far more dangerous in their garage?

DeepSeek has disrupted the AI landscape, proving that cutting-edge models don’t need billion-dollar budgets. This has left OpenAI and Silicon Valley scrambling to adapt. Will this spark a new AI arms race? Possibly. But one thing is clear—AI is evolving at breakneck speed, and the future is closer than we think.

Want to see more about this AI shake-up? Watch the full video on our YouTube channel

For more Lesson Hacker Videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


What is AI?

AI: Just fast maths pretending to be smart

12 August 2025

AI. It’s a term that gets thrown around everywhere—from science fiction films to social media posts and school corridors. But what actually is artificial intelligence, and why is it so important to understand?

At its core, AI doesn’t actually think—it just predicts. It’s essentially super-fast maths, rapidly analysing patterns to guess what should come next in a sequence. Imagine that friend who always finishes your sentences… except AI does it with slightly better accuracy.

How do transformers power AI?

No, we’re not talking about giant robots (although that would be cool). In AI, transformers are a type of deep learning model that helps machines generate human-like text. Here’s how they do it:

  • Word magic: AI doesn’t see words—it sees numbers. It converts text into numerical values that represent meaning. Kind of like the Matrix, but without the leather trench coats.
  • Attention, please! Transformers scan every word in a sentence and decide which ones are important. It’s a bit like pretending to listen in a meeting but only perking up when you hear “free snacks.”
  • Prediction time: AI makes an educated guess about the next word, refines it, and repeats the process until the sentence sounds human. The result? AI-generated essays, jokes, and sometimes suspiciously accurate emails.
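The “attention, please!” step above can be sketched with a toy example. Everything here is invented for illustration—real transformers use learned vectors with hundreds of dimensions, not two—but the mechanics are the same: score each word against the current word with a dot product, then squash the scores into weights that sum to 1.

```python
import math

def softmax(xs):
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Toy 2-dimensional "embeddings" (made-up numbers, not a real model).
sentence = ["the", "cat", "sat"]
embeddings = {"the": [0.1, 0.9], "cat": [0.8, 0.3], "sat": [0.7, 0.4]}

# Attention for the word "sat": dot-product score against every word,
# then softmax turns the scores into weights that sum to 1.
query = embeddings["sat"]
scores = [sum(q * k for q, k in zip(query, embeddings[w])) for w in sentence]
weights = softmax(scores)

for word, w in zip(sentence, weights):
    print(f"{word}: {w:.2f}")
```

With these made-up vectors, “cat” ends up weighted more heavily than “the”—the model has decided which word matters most for predicting what comes next.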

How does AI learn?

Behind the scenes, AI is powered by huge datasets and clever algorithms. These systems “learn” patterns from data, meaning they can improve their performance over time without being explicitly programmed to do so. This process is called machine learning, and it’s how many of today’s most exciting AI tools work.

AI and you

AI is already influencing your daily life, whether you realise it or not. It shapes the content you see online, helps doctors spot diseases faster, supports businesses with automation, and could even play a role in your future career. Understanding how it works is more than just useful—it’s essential.

Why AI isn’t taking over (yet)

Despite its clever tricks, AI isn’t sentient—it’s just playing a game of supercharged fill-in-the-blank. While it’s brilliant for generating text and answering questions, it still lacks genuine understanding or creativity. So, would you trust it to run the world? Probably not. But to help you write a convincing email? Absolutely.

Want to learn more?

Want to know more? Check out The Lesson Hacker’s YouTube video HERE.

For more Lesson Hacker Videos check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!

 


How do computers generate random numbers?

29 July 2025

Ah, randomness! It’s everywhere in nature—think dice rolls, quantum physics, or even your cat’s indecision. But when it comes to computers, randomness doesn’t come naturally. Why? Because computers are logical machines, designed to follow precise instructions. So, when we ask for a “random” number, they can’t just pluck one from thin air. Instead, they rely on something called a pseudo-random number generator (PRNG)—essentially, randomness with a script.

How does a pseudo-random number generator work?

Here’s how computers fake randomness step by step:

  1. The magic seed
    The process begins with a “seed” number. This seed could be almost anything—like the exact millisecond from the system clock, the temperature of your CPU, or even the quirky motion of a lava lamp (a trick famously used by Cloudflare for added unpredictability).
  2. Math happens
    Once the seed is set, it’s run through a complex mathematical formula designed to churn out seemingly random results. Picture a blender spinning at full speed, tossing numbers into a chaotic whirl.
  3. Voilà! Fake randomness
    Out comes a number that looks completely random. However, if someone knows the original seed and the formula, they can predict the outcome—like a magician pulling the same rabbit from their hat every time.
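The three steps above can be sketched with one of the classic PRNG recipes, a linear congruential generator. The constants below are the well-known ANSI C rand() values; this is an illustration of the seed-then-formula idea, not a generator you should use for anything security-sensitive.

```python
class LCG:
    """A minimal linear congruential generator: seed in, formula out."""

    def __init__(self, seed):
        self.state = seed  # the "magic seed" fully determines the sequence

    def next(self):
        # The "math happens" step: new_state = (a * state + c) mod m
        self.state = (1103515245 * self.state + 12345) % 2**31
        return self.state

rng = LCG(seed=42)
print([rng.next() % 100 for _ in range(5)])  # looks random...
```

Run it twice with the same seed and you get the identical sequence—exactly the magician-and-rabbit predictability the last step describes.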

Can computers create true randomness?

When it comes to security, like encrypting sensitive data, fake randomness isn’t enough. For truly unpredictable results, computers turn to nature for help. They measure chaotic phenomena like radioactive decay, electrical noise, or even the small, unpredictable quirks of daily life. This kind of randomness, called “true randomness,” is far more secure and impossible to predict.

So, while computers don’t naturally do random, they’ve mastered the art of faking it with clever algorithms. But when we need something truly unpredictable, we can rely on the chaos of the natural world. Or, as a simpler alternative, just watch a cat trying to decide whether to go outside.

Want to learn more?

Want to know more? Check out The Lesson Hacker’s YouTube video HERE.

For more Lesson Hacker Videos check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Can We Tell the Difference Between High Frame Rates?

22 July 2025

Ever heard gamers argue about frame rates? One insists 60 FPS is perfectly fine, another declares anything below 240 FPS is unplayable, and then there’s that one person who swears they can see the difference between 999 and 1000 FPS—probably while wearing a pair of ancient glasses. But what’s the truth behind these claims?

How the human eye processes motion

Let’s clear something up first: the human eye doesn’t see in frames per second. It’s not a digital monitor but a complex biological system. Our eyes take in continuous information, and our brains process motion at speeds that matter—but only to a point.

At frame rates below 30 FPS, motion starts to look disjointed, like an old puppet show. Jump to 60 FPS, and things feel smoother, though many gamers will still find room to complain. Push it up to 120 FPS, and you’ll notice things feel even “snappier”—but now we’re entering a realm where perception begins to blur with personal preference.
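Those frame rates translate directly into a per-frame time budget: 1000 milliseconds divided by the frame rate. A quick sketch shows why the gains shrink as the numbers climb:

```python
# Each frame's time budget is 1000 ms divided by the frame rate.
for fps in (30, 60, 120, 240):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} FPS -> {frame_time_ms:5.2f} ms per frame")
```

Going from 30 to 60 FPS halves the frame time by over 16 ms; going from 120 to 240 FPS saves barely 4 ms—which is why the jumps feel less and less dramatic.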

The limits of perception

What about 240 FPS? At this stage, individual frames become almost imperceptible, but some people—especially competitive gamers—may notice the increased smoothness in fast-paced scenarios. Beyond that? Unless you’re a fighter pilot, a mantis shrimp, or bluffing, the benefits become negligible.

It’s not just about frame rate

Frame rate is only one piece of the puzzle. Motion blur, screen technology, and input lag also influence how smooth gameplay feels. So, if you’re investing heavily in a high-performance monitor, remember this: at some point, you’re not just paying for a better gaming experience—you’re paying for bragging rights.

Does it really matter?

While high frame rates can enhance gaming for certain scenarios, they’re not always necessary for a great experience. Understanding the science of perception can help you decide when to upgrade—and when to save your money.

Want to dive deeper into the science of frame rates?

Check out Dave The Lesson Hacker’s YouTube video HERE.

For more Lesson Hacker videos check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!

 


Why do arrays start at zero?

15 July 2025

If you’ve ever dived into programming, you’ve probably asked yourself: why on earth do arrays start at zero instead of one? At first glance, it seems counterintuitive, but the answer lies in efficiency and logic.

Visualising arrays: the row of lockers analogy

Think of an array as a row of lockers. Each locker has a position, starting at the very beginning of the row. The first locker is zero steps from the start, the second locker is one step away, and so on. If you want to access the third locker, you count two steps from the beginning: 0, 1, 2. This is the essence of zero-based indexing—it measures the offset from the starting point.

The link between arrays and memory

Arrays in programming map directly to how memory works in a computer. When an array is created, it’s stored as one contiguous block of memory. Accessing an element at array[i] involves the computer taking the base address of the array and adding i multiplied by the size of each element. Starting at zero means the first element sits exactly at the base address—the index is simply the offset, with no extra subtraction needed. In essence, zero-based indexing aligns perfectly with how hardware is designed to operate.
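You can see this base-plus-offset arithmetic directly. A short sketch using Python’s ctypes module, which lays elements out contiguously just like a C array:

```python
import ctypes

# A contiguous block of four 32-bit integers, like a C array.
arr = (ctypes.c_int32 * 4)(10, 20, 30, 40)

base = ctypes.addressof(arr)          # where the block starts in memory
size = ctypes.sizeof(ctypes.c_int32)  # 4 bytes per element

# Element i lives at base + i * size: index 0 is zero steps from the start.
for i in range(4):
    value = ctypes.c_int32.from_address(base + i * size).value
    print(f"arr[{i}] at byte offset {i * size}: {value}")
```

Index 0 is read straight from the base address with an offset of zero—exactly the locker that’s “zero steps from the start”.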

Why not start at one?

While starting at one might feel more intuitive, it’s not practical. Zero-based indexing is baked into the very foundation of programming languages, compilers, and hardware logic. Switching to one-based indexing would introduce unnecessary complexity and inefficiency. That’s why programmers worldwide have embraced zero-based indexing as the universal standard.

It’s not weird—it’s smart!

So, the next time you see array[0], remember it’s not just a quirk of programming. It’s a smart, efficient design choice that keeps your code running smoothly.

Want to learn more?

Check out The Lesson Hacker’s YouTube video HERE.

For more Lesson Hacker Videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!

Related posts

How a GCSE in computer science can shape your students’ future careers

A GCSE in Computer Science isn’t just a qualification—it’s a launchpad to exciting careers in tech, from gaming and robotics to cyber security and AI.
By connecting classroom learning to real-world pathways, teachers can inspire students to see the true value and future potential of their skills.

1 May 2026

How a GCSE in Computer Science can shape your future career

A GCSE in Computer Science opens the door to careers in gaming, robotics, cyber security, and beyond.
Discover how this subject can lead to exciting degrees and future opportunities in the tech world.

29 April 2026

Do we need government AI copyright laws?

AI is transforming creativity — but are we protecting the people behind the art? We explore the UK’s heated debate over AI copyright laws and what they mean for creators and innovation.

8 April 2026

How Do Map Apps Work?

Discover how your map app uses graph theory and clever algorithms to find the fastest route, even before you spot the traffic jam. It’s the smart tech behind every turn and reroute you trust.

What is Chip Binning?

Chip binning is how manufacturers sort silicon chips based on their performance, turning some into high-speed processors and others into more modest models. It’s like baking cookies—some come out perfect, others just good enough.

Meet Dodona: A powerful coding platform built for real classrooms

Discover how Dodona is transforming programming lessons with a powerful, classroom-ready platform built by educators. With the integration of Time2Code, it’s never been easier to deliver engaging, structured coding lessons while saving time and reducing hassle.


Goals version 2

Until now, the Terms goal has used the Leitner system to determine what students should complete each week. This approach […]

4 April 2026

Differentiation is dead

For decades, teachers were told that differentiation was the golden ticket. If we could just tailor the right task to […]

3 April 2026

It’s not in the mark scheme

Just because it’s not in the mark scheme doesn’t mean it’s wrong — Quicksort proves there are often multiple valid ways to reach the same correct answer.
Understanding the principles behind algorithms matters far more than memorising a single “approved” method.

27 March 2026


Why DPI matters: The difference between screen & print quality

8 July 2025

Understanding DPI: What does it actually mean?

DPI (dots per inch) is exactly what it sounds like—it’s a measure of how many tiny dots (or pixels) fit into one inch of space. The more dots you have, the more detail your image retains.

For digital screens, 72 DPI is the standard because it keeps file sizes small and looks crisp at a normal viewing distance. But when it comes to printing, things change dramatically.

Think of it like wearing pyjamas. At home, wearing 72 DPI is fine—relaxed, comfortable, and good enough for what you need. But taking that same look to a first date? Suddenly, the details matter a lot more.

Why does print need 300 DPI (or more)?

When you print something, you’re holding it much closer to your eyes than a screen. Your brain expects more detail because it’s used to seeing sharp, high-resolution objects up close. If your image is only 72 DPI, it won’t have enough detail to look crisp—it will appear soft, pixelated, and blurry, like a sun-faded potato chip.

That’s why 300 DPI is the magic number for print. At this resolution, images retain their sharpness even when viewed up close. The higher dot density makes lines and textures look clean, rather than jagged or smudged.

Imagine a giant poster—when viewed from 10 feet away, a few blurry dots don’t matter. But now think about a business card. You hold it right up to your face, and if it’s not printed at high resolution, it’ll look like it was drawn in MS Paint by a four-year-old with a potato.

The simple rule: screen vs print

If you only remember one thing, make it this:

  • 72 DPI is fine for screens. It’s optimised for digital displays, loads quickly, and keeps file sizes manageable.
  • 300 DPI (or higher) is essential for print. It preserves fine details, ensuring your artwork looks as sharp on paper as it does on screen.

What happens if you use the wrong DPI?

  • If you use 72 DPI for print, your artwork will look blurry and pixelated.
  • If you use 300 DPI for digital, your file sizes will be unnecessarily large, and it won’t look any better than a 72 DPI image.

So, always think about where your image will be seen before choosing the right DPI. If it’s just for a website or social media, 72 DPI is fine. But if it’s going to a printer, crank it up to 300 DPI to avoid a pixelated disaster.
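As a quick sanity check before sending artwork off, you can work out how many pixels a physical print size needs at a target DPI. A minimal sketch; the business-card dimensions are just an example:

```python
# Pixels needed = physical size (inches) x DPI, rounded to whole pixels.
def pixels_needed(width_in, height_in, dpi):
    return (round(width_in * dpi), round(height_in * dpi))

# A standard 3.5 x 2 inch business card:
print(pixels_needed(3.5, 2, 72))   # (252, 144): fine on screen, mush in print
print(pixels_needed(3.5, 2, 300))  # (1050, 600): sharp enough for print
```

If your source image has fewer pixels than the print size demands, no export setting can conjure the missing detail back.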

Want to learn more about getting the best quality out of your designs? 

Check out Dave The Lesson Hacker’s YouTube video HERE

For more Lesson Hacker videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Why do GPUs get so hot?

10 June 2025

GPUs are the powerhouses of modern computing, handling everything from gaming to video editing and complex 3D rendering. But with great power comes great heat. Ever wondered why your graphics card runs so hot? Let’s break it down.

The science behind GPU heat

Think of your GPU like a busy chef in a restaurant, constantly preparing thousands of meals at once. Each dish represents a calculation, and just like in a real kitchen, all that activity generates heat.

At the heart of it all is electricity. Every time your GPU processes data, tiny electrical signals rush through billions of transistors. But electricity is never 100% efficient—some of that energy gets lost as heat. With so many calculations happening at lightning speed, things heat up quickly.
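A common first-order model for that lost energy is the dynamic power equation P ≈ C × V² × f (capacitance × voltage squared × switching frequency). The numbers below are purely illustrative, not real GPU figures:

```python
# Dynamic power model for switching logic: P ≈ C * V^2 * f.
# Capacitance, voltage, and frequency values here are illustrative only.
def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Watts dissipated as heat by switching activity."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

p_full = dynamic_power(1e-9, 1.0, 2e9)  # 2.0 W at full voltage
p_low = dynamic_power(1e-9, 0.5, 2e9)   # 0.5 W: half the voltage, a quarter of the heat
print(p_full, p_low)
```

The squared voltage term is why small undervolts can noticeably cool a card, and why chasing higher clocks always comes with a heat bill.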

Why GPUs run hotter than other components

Unlike your CPU, which gets short breaks between tasks, GPUs are designed for continuous heavy lifting. Whether you’re gaming, rendering 3D models, or watching high-definition videos, your GPU is working flat out, pushing itself to the limit.

To make things even trickier, modern GPUs are built with incredible density, packing more transistors into smaller spaces than ever before. It’s like squeezing too many commuters onto a packed Monday morning train; there’s no room to breathe, and the heat has nowhere to escape.

How GPUs keep their cool

This is where cooling solutions come in. Your computer’s fans work hard to move hot air away from the GPU, while heatsinks help absorb and disperse excess warmth. High-performance gaming setups even use liquid cooling to keep temperatures under control.

If your GPU ever gets too hot, it can throttle its performance to prevent damage, but ideally you want to stop it reaching that point. Keeping your PC well-ventilated and dust-free goes a long way towards helping your GPU stay cool and efficient.

 

Next time you hear your computer fans whirring into action, just remember: your GPU is working hard to deliver stunning graphics and smooth performance. Looking after it will keep your system running at its best.

Want to dive deeper into how GPUs work? Watch the full video here

Want to learn more about computer science and the latest tech trends? Visit our website Craig’n’Dave for all the latest resources and insights.

 


What is the Dark Web?

Understanding the Dark Web: What You Need to Know

3 June 2025

What is the Dark Web? A Look Into Its Mysteries

When you hear about the Dark Web, it’s easy to imagine a place full of criminals, illegal activities, and shady dealings. But is that the whole picture? Let’s take a closer look at what the Dark Web really is, how it works, and why some people use it.

What Exactly is the Dark Web?

To understand the Dark Web, imagine the internet as a massive city. The regular web—the part of the internet where you shop, search for information, and watch videos—is like the lively downtown area. Everyone knows you’re there, from your internet provider to the websites you visit. Now, picture the Dark Web as the quieter, hidden alleyways. It’s a part of the internet where your online activities are harder to track, and you need special tools to access it—like the TOR browser (The Onion Router). When you use TOR, you’re essentially putting on an invisibility cloak, hiding your digital footprint from prying eyes.
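The “onion” in The Onion Router refers to layered encryption: your message is wrapped in one layer per relay, and each relay can peel off only its own layer. Here’s a toy sketch of that idea, with simple text wrapping standing in for real cryptography:

```python
# Toy onion routing: wrap a message in one layer per relay, peel one per hop.
# Real TOR encrypts each layer; plain-text wrapping stands in for that here.
def wrap(message, relays):
    for relay in reversed(relays):
        message = f"[{relay}:{message}]"
    return message

def peel(message):
    """One relay removes its own layer, learning only the next hop."""
    relay, inner = message[1:-1].split(":", 1)
    return relay, inner

packet = wrap("hello", ["A", "B", "C"])
print(packet)          # [A:[B:[C:hello]]]
relay, packet = peel(packet)
print(relay, packet)   # A [B:[C:hello]]  (relay A never sees the plaintext)
```

Because each relay sees only its own layer, no single point in the chain knows both who sent the message and what it says.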

Is the Dark Web Dangerous?

It’s not as sinister as it may sound at first. Simply visiting the Dark Web isn’t illegal or dangerous in itself; it’s just another layer of the internet. However, much like any other part of the internet, there are areas of the Dark Web where illegal activities occur—such as the buying and selling of illicit items. But that’s not all. The Dark Web is also used for good. It provides a safe space for people such as journalists, government agents, and whistleblowers who need to communicate securely without the risk of surveillance or hackers.

While it’s often associated with criminal activity, not everyone who visits the Dark Web is up to no good. It’s a tool for anonymity and security in an otherwise open internet, and its uses extend far beyond shady dealings.

Stay Safe and Informed

So, is the Dark Web a dangerous place? Not if you’re careful. It’s a bit like wandering into an unfamiliar neighbourhood—there are good parts and bad parts. As long as you steer clear of the illegal corners, the Dark Web can serve as a valuable tool for privacy and secure communication.

For a deeper dive into the world of the Dark Web, check out our video here

Want to learn more about computer science and the latest tech trends? Visit our website Craig’n’Dave for all the latest resources and insights.


How much does it cost to build a CPU?

27 May 2025

Building a CPU isn’t just complicated; it’s an engineering marvel that demands staggering resources. 

Imagine creating the most intricate pancake in the world, where every ingredient is microscopic, precision matters, and the price tag is astronomical. 

Let’s break it down to understand what goes into making these high-tech powerhouses. 

Silicon wafers: the foundation of a CPU

At the heart of every CPU is a silicon wafer. While the raw materials themselves are relatively cheap, turning them into a usable wafer is an entirely different story. The process involves cutting-edge technology and precision, with costs of £8,000 or more per wafer. And that’s just the beginning. The factories where CPUs are made, known as fabs, are extraordinary facilities.

Building a state-of-the-art fab capable of producing today’s 3nm or smaller transistors can set you back over £16 billion. Why so much? Because these fabs operate at an atomic scale, where even the tiniest mistake can render entire batches unusable.
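A back-of-envelope sketch shows how wafer cost, dies per wafer, and yield combine into a per-chip silicon cost. Only the £8,000 wafer figure comes from above; the die count and yield rate are hypothetical:

```python
# Rough per-chip silicon cost: wafer cost spread over the dies that survive.
WAFER_COST = 8000       # pounds per processed wafer (figure from the article)
DIES_PER_WAFER = 600    # hypothetical candidate dies on one wafer
YIELD_RATE = 0.80       # hypothetical fraction of dies that actually work

def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    return wafer_cost / (dies_per_wafer * yield_rate)

print(round(cost_per_good_die(WAFER_COST, DIES_PER_WAFER, YIELD_RATE), 2))  # 16.67
```

A ruined batch doesn’t make the wafer any cheaper, so every percentage point of yield lost pushes up the price of the chips that do survive.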

The level of cleanliness, precision, and technological advancement required is unmatched. 

Research and development: the hidden cost

Designing a CPU isn’t a quick or cheap process. Teams of engineers spend years creating, testing, and refining each design. Simulations, prototypes, and endless troubleshooting are part of the journey, with research and development costs reaching millions of pounds for a single chip. 

It’s an investment of time, money, and expertise to push the boundaries of what’s possible. 

Why CPUs are worth every penny

When you consider the monumental effort and expense behind each CPU, it’s easier to understand their price. 

Every chip is a piece of technology more complex than most buildings, packed into a form factor small enough to fit in your hand. CPUs power everything from our laptops to supercomputers, making them one of the most essential inventions of our time. 

Curious to learn more about the fascinating world of CPUs? 

Watch the full video on our YouTube channel for an in-depth explanation. 

For more insights into computer science and to explore our resources, visit the Craig’n’Dave website today.


Are graphics cards cheating now?

13 May 2025

In the ever-evolving world of gaming and computing, Nvidia’s latest RTX 5000 series has sparked an interesting debate: are graphics cards actually improving, or is AI doing all the heavy lifting? 

With the launch of DLSS 4 and some mind-blowing specs, we’re diving into what’s real, what’s AI-generated, and whether any of it really matters.

Are GPUs really getting better?

Nvidia’s new flagship, the RTX 5090, is an absolute beast. With 92 billion transistors, 32GB of GDDR7 VRAM, and memory bandwidth that defies belief, it’s designed to dominate 4K gaming, dabble in 8K, and obliterate your wallet at $1,999. But who is this really for? While gamers will appreciate the power, this kind of hardware is also aimed at content creators, developers, and professionals pushing the limits of rendering and AI-driven applications.

The evolution of DLSS

One of the biggest advancements in recent years has been DLSS (deep learning super sampling). When it first launched in 2018, it was a bit underwhelming—think blurry, pixelated mess. But Nvidia kept improving it, and today, DLSS 4 is a game-changer. Using transformer-based AI models, it enhances graphics, generates frames, and makes gameplay smoother than ever.

DLSS 4 includes three major features:

  • Super resolution – upscales lower-resolution images to 4K or even 8K.
  • Ray reconstruction – improves ray tracing quality using AI rather than traditional methods.
  • Multi-frame generation – creates new frames in real-time, making gameplay ultra-smooth.

This means you can enjoy high-end visuals without needing a ridiculously expensive GPU every year.
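To get a feel for frame generation, here’s a deliberately crude sketch. Real DLSS uses neural networks and motion vectors, but the core idea of producing an in-between frame from two rendered ones can be shown with simple linear blending of hypothetical pixel values:

```python
# Toy frame generation: blend two rendered frames into an in-between frame.
# DLSS does this with AI and motion data; linear interpolation stands in here.
def interpolate(frame_a, frame_b, t=0.5):
    """Blend matching pixels: t=0 gives frame_a, t=1 gives frame_b."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

rendered_1 = [0, 100, 200]   # brightness of three pixels in one rendered frame
rendered_2 = [50, 150, 250]  # the same pixels in the next rendered frame
print(interpolate(rendered_1, rendered_2))  # [25.0, 125.0, 225.0]
```

The GPU only renders half the frames; the in-between ones are generated, which is exactly what the “real pixels” debate is about.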

Is AI ‘cheating’ in gaming?

Some critics argue that AI-generated frames aren’t ‘real’ pixels, but does it actually matter? If a game looks stunning, runs at 120fps, and feels seamless, is it important whether every frame was painstakingly rendered or if AI stepped in to assist?

It’s a bit like baking a cake—whether the icing was handmade or piped by a machine, the end result is still delicious. For most gamers, AI-powered enhancements are a blessing, allowing them to enjoy top-tier performance without breaking the bank.

The future of GPUs and gaming

One thing is clear: AI is no longer just a sidekick in gaming—it’s taking centre stage. DLSS 4 is proof that Nvidia is leaning heavily into AI-driven enhancements. But there’s a catch: multi-frame generation is exclusive to the RTX 5000 series, meaning older GPUs are slowly being left behind.

For those still clinging to older hardware, the choice is clear: embrace the upgrade cycle or accept a future as a retro gamer. Either way, gaming technology is moving faster than ever, and Nvidia’s latest advancements are redefining what’s possible.

Want to see the tech in action? Check out our full breakdown of the RTX 5000 series and DLSS 4 in our Lesson Hacker video.

 

Be sure to visit our website for more insights into the world of technology and the best teaching resources for computer science and business studies.

Stay informed, stay curious!


Why don’t computers use a different base for numbers?

The simple reason why binary beats all other number bases

30 April 2025

Why not Base-4?

At first glance, it seems logical to ask: Why don’t computers use Base-4 instead of Base-2? After all, wouldn’t using more numbers give us more power? While it might sound appealing, the reality comes down to the fundamental way electronics work—and why binary remains unbeatable.

Electronics love simplicity

Computers are built on circuits that recognise two states: “off” and “on”. These states are easy, reliable, and practical for electronics to detect. Base-4, on the other hand, would mean handling four distinct states—imagine “off,” “slightly on,” “mostly on,” and “fully on.” Cool in theory, but impractical in reality. Building hardware to detect those intermediate levels reliably would be both expensive and error-prone. Think of it like trying to get a light switch to dim to exactly 37%: possible, but far from practical.
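You can see the trade-off by writing the same value in both bases. Base-4 needs fewer digits, but every digit would require four reliably distinguishable voltage levels instead of two:

```python
# Write a number in an arbitrary base by repeatedly taking remainders.
def to_base(n, base):
    digits = []
    while n:
        digits.append(str(n % base))
        n //= base
    return "".join(reversed(digits)) or "0"

print(to_base(37, 2))  # 100101 -- six easy two-level digits
print(to_base(37, 4))  # 211    -- three digits, but four levels per digit
```

Shorter numbers aren’t worth much if the hardware occasionally misreads a “slightly on” as a “mostly on”.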

A costly rewrite of history

Binary’s dominance dates back to the early days of computing, when switches were literal levers toggling between two positions. Switching to Base-4 today would require a complete overhaul of modern technology. Every program would need rewriting, every processor redesigning, and every programmer retraining. The cost? More than even the world’s wealthiest could cover.

Base-3 computers: A brief history

Interestingly, a ternary (Base-3) computer (the Soviet Setun, built in 1958) was once a serious contender. Yet, despite its potential, binary won out for its simplicity, reliability, and efficiency. The entire computing industry has been built on this foundation, and for good reason: sometimes less really is more.

The unbeatable efficiency of binary

While other number bases could theoretically work, binary remains the gold standard. Its simplicity makes it easy to implement, cost-effective, and highly reliable. If it ain’t broke, don’t fix it—or add unnecessary complexity.


Want to dive deeper? Watch our full Craig’n’Dave Lesson Hacker video

Be sure to visit our website for more insights into the world of technology and the best teaching resources for computer science and business studies. 

Stay informed, stay curious!


Why can’t we just stick RAM directly onto the CPU?

22 April 2025

In the world of computer science, speed is everything. So, it’s easy to see why the idea of sticking RAM directly onto the CPU seems like a genius move. Zero latency, lightning-fast speeds, and no more bottlenecks—what’s not to love? But in reality, it’s not that simple. Let’s break down why we can’t just combine these two crucial components into one.

The difference between CPU and RAM

At first glance, sticking RAM onto the CPU might sound like a great way to boost performance. After all, the closer RAM is to the CPU, the faster data can be accessed, right? Unfortunately, it’s not that straightforward. The CPU and RAM are built in fundamentally different ways.

CPUs are designed to handle calculations at breakneck speeds and are built on fabrication processes optimised for fast logic. RAM—specifically Dynamic RAM (DRAM)—works differently: it stores each bit as a charge in a tiny capacitor. The catch is that these capacitors leak, so every cell needs constant refreshing to retain its information. It's a bit like a student frantically rereading their notes to make sure they remember everything during revision.
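That refresh requirement can be sketched as a toy model. Everything below is illustrative—the leak rate and read threshold are invented numbers—though the ~64 ms refresh interval is typical of real DRAM:

```python
# Toy model (illustrative only): a DRAM cell leaks charge over time and must
# be refreshed before the charge drops too low to read back reliably.
LEAK_PER_MS = 0.005         # hypothetical fraction of charge lost per millisecond
READ_THRESHOLD = 0.5        # below this, a stored 1 risks being misread as 0
REFRESH_INTERVAL_MS = 64    # real DRAM is typically refreshed every ~64 ms

def charge_after(ms: float, start: float = 1.0) -> float:
    """Charge remaining on a cell after `ms` milliseconds without a refresh."""
    return start * (1 - LEAK_PER_MS) ** ms

# With a refresh every 64 ms, the cell is topped back up to full charge
# before it ever decays past the read threshold.
worst_case = charge_after(REFRESH_INTERVAL_MS)
print(f"Charge just before refresh: {worst_case:.2f}")
print("Still readable?", worst_case > READ_THRESHOLD)
```

The point isn't the exact numbers—it's that DRAM only works because something is constantly topping the cells back up, overhead that pure logic circuits on a CPU never have to pay.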

Why it doesn’t work together

Trying to combine CPU and DRAM onto the same chip would cause chaos in the manufacturing process. DRAM fabrication doesn’t align well with the processes used to create a CPU. Imagine trying to install a high-end GPU into a budget laptop—it just won’t fit, and forcing it could cause damage.

Even technologies like Intel's Haswell architecture used embedded DRAM (eDRAM) sparingly—just enough to boost performance without massively increasing production costs. Merging CPU and RAM completely, however, would be a manufacturing nightmare.

The speed factor: DRAM vs. SRAM

Even if we could combine the two, there’s another issue: speed. DRAM operates at a top speed of about 1 GHz, while modern CPUs can easily surpass 3 GHz. That’s like putting bicycle tyres on a Formula 1 car—you’re limiting the performance of the entire system.

To overcome this speed gap, CPUs use SRAM (Static RAM) for on-chip cache. SRAM is much faster than DRAM but comes with its own drawbacks: it’s bulkier and significantly more expensive. Sure, we could fill a CPU with SRAM, but that would come at an astronomical cost—far more than most of us are willing to pay.
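That compromise shows up in a back-of-the-envelope average-access-time calculation. The latencies below are assumed round numbers for illustration, not specs for any real chip:

```python
# Sketch: why a small, fast SRAM cache in front of big, slow DRAM wins.
SRAM_HIT_NS = 1      # assumed on-chip SRAM cache access time
DRAM_ACCESS_NS = 60  # assumed main-memory (DRAM) access time

def average_access_ns(hit_rate: float) -> float:
    """Average memory access time: cache hit time + miss rate * DRAM penalty."""
    return SRAM_HIT_NS + (1 - hit_rate) * DRAM_ACCESS_NS

for hit_rate in (0.0, 0.90, 0.99):
    print(f"hit rate {hit_rate:.0%}: {average_access_ns(hit_rate):.1f} ns on average")
```

With a 99% hit rate, a small amount of expensive SRAM makes the average access time look almost as fast as SRAM itself—without paying SRAM prices for all of main memory.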

Why we stick to separate RAM and CPUs

While combining RAM and the CPU might sound like a performance dream, the technical and cost limitations make it impractical. The current balance of DRAM for main memory and SRAM for cache strikes the best compromise between speed, cost, and practicality.

Want to know more? Check out The Lesson Hacker's YouTube video.

For more Lesson Hacker videos, check out the Craig'n'Dave YouTube playlist.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!

Related posts

How a GCSE in computer science can shape your students’ future careers

A GCSE in Computer Science isn’t just a qualification—it’s a launchpad to exciting careers in tech, from gaming and robotics to cyber security and AI.
By connecting classroom learning to real-world pathways, teachers can inspire students to see the true value and future potential of their skills.

1 May 2026

How a GCSE in Computer Science can shape your future career

A GCSE in Computer Science opens the door to careers in gaming, robotics, cyber security, and beyond.
Discover how this subject can lead to exciting degrees and future opportunities in the tech world.

29 April 2026



Are loot boxes gambling?

The digital debate every gamer should know

18 March 2025

What are loot boxes?

In simplest terms, loot boxes are mystery rewards players can earn or buy. Think of them as digital versions of Kinder Eggs, but instead of chocolate and toys, they contain skins, weapons, or characters to enhance your game. Sounds fun, right? Except you don’t know what you’re getting until you’ve paid. Cue disappointment when that elusive Messi card in FIFA Ultimate Team turns out to be yet another low-tier player.

The gambling argument

Critics argue that loot boxes mimic the mechanics of gambling. You pay for a chance at a desirable outcome, complete with flashy animations and dopamine-fuelled suspense. Younger players are especially vulnerable, with some spending hundreds—or even thousands—chasing rare items.

The Norwegian Consumer Council has even labelled loot boxes as “predatory mechanisms” that exploit psychological tricks to drain wallets. Sound familiar? That’s because it feels suspiciously like pulling the lever on a Vegas slot machine, except your reward is a virtual hat rather than a pile of cash.
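The pay-for-a-chance mechanic becomes clearer with a little expected-value arithmetic. A minimal sketch, using made-up numbers (a hypothetical 1% drop rate and £2.50 box price, not figures from any real game):

```python
# Illustrative numbers only: expected spend to pull one specific rare item.
# If each box is an independent draw, the number of boxes needed follows a
# geometric distribution, so on average you need 1 / drop_rate boxes.
DROP_RATE = 0.01   # assumed 1% chance of the rare item per box
BOX_PRICE = 2.50   # assumed price per box in pounds

expected_boxes = 1 / DROP_RATE
expected_spend = expected_boxes * BOX_PRICE
print(f"Expected boxes opened: {expected_boxes:.0f}")
print(f"Expected spend: £{expected_spend:.2f}")

# And there's no guarantee: the chance of STILL not having the item after
# n boxes is (1 - DROP_RATE) ** n -- roughly 37% after 100 boxes here.
print(f"P(no rare item after 100 boxes): {(1 - DROP_RATE) ** 100:.0%}")
```

That unbounded, odds-driven spending—where a player can pay far more than the expected £250 and still come away empty-handed—is exactly the structure critics compare to a slot machine.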

Developers defend their treasure chests

Game developers, however, see things differently. They compare loot boxes to toys like mystery figurines, claiming they’re just “harmless fun.” Since most loot boxes don’t pay out real money, they argue they don’t qualify as gambling. But with gaming companies raking in an estimated $15 billion annually from loot boxes, it’s clear these digital party bags are more than just fun—they’re big business.

A lack of transparency

Transparency remains a sore spot. In the UK, only two of the 45 top-grossing games disclose loot boxes in their advertising, despite being required to by law. That lack of clarity leaves players unaware of the odds—or costs—involved.

Are loot boxes all bad?

Not necessarily. For some players, loot boxes add excitement to gaming. Plus, self-regulation within the industry has started to improve, with guidelines for clearer disclosure. Yet, countries like Belgium and Japan have gone further, banning or regulating loot boxes to protect consumers.

So, are loot boxes gambling?

The answer depends on who you ask. Critics say yes, developers say no, and governments remain undecided. What’s clear is that players need to approach loot boxes with caution—because while you might win a cool skin, it’s easy to lose track of how much you’re spending.

Watch the full video for more insights!

Check it out on the Craig’n’Dave YouTube channel, and don’t forget to visit our website for more gaming and computer science insights.



I’ve lost control of my own DNA

18 February 2025

DNA testing kits have become a modern-day curiosity – a way to discover your ancestry, potential health risks, and even traits you didn’t know you had. But have you ever stopped to consider what happens to that genetic goldmine once you send it off? Here’s a closer look at the fascinating yet slightly alarming world of DNA testing.

Uncovering secrets with DNA kits

Companies like 23andMe have made exploring your DNA as simple as spitting into a tube. From identifying as 5% Norwegian to discovering a predisposition for male pattern baldness, the results can be both entertaining and enlightening. But the story doesn’t end with finding out why you have curly hair.

DNA testing services also promise health insights and family connections, often revealing unexpected truths. Think learning one of your parents isn’t actually your parent – a revelation that might spice up Sunday lunch discussions.

The double-edged sword of genetic data

As exciting as these insights are, they come with serious privacy implications. Your DNA isn’t just your own; it’s linked to your family too. Sending in a sample might inadvertently overshare details about your siblings, parents, and even distant cousins.

The bigger concern is what happens to this data if companies face financial trouble. For example, Atlas Biomed recently vanished from the web, leaving users wondering what became of their most sensitive information. Similarly, 23andMe’s struggles raise questions about the security of its extensive DNA database.

Can you take back control of your DNA?

If the thought of your genetic information being mishandled keeps you awake at night, you do have options. Many companies allow you to delete your data, but be aware: if it’s been anonymised and used in research, it’s out there for good.

Before diving into DNA testing, take these precautions:

  • Read the terms and conditions: know who has access to your data and how it’s used.
  • Consider the long-term impact: your results affect not just you, but your family too.
  • Decide how much you want to know: some truths might be best left undiscovered.

DNA testing kits can reveal fascinating insights, but they come with significant risks. If you’re considering taking the plunge, make an informed decision – and if you’ve already tested, explore ways to manage your data responsibly.

For more on this topic, watch the full Craig’n’Dave Lesson Hacker video linked below. 

Be sure to visit our website for more insights into the world of technology and for the best teaching resources for computer science and business studies. 

Stay informed, stay curious!
