
Does anyone still use low-level code?

14 January 2026

In an age where everyone seems obsessed with the latest AI chatbot or shiny new high-level programming language, you might wonder: Does anyone still use low-level code? 

The short answer: Yes. 

The long answer: YEEEEEEEEEEEEEES.

While most of the tech world is busy creating chatbots that sound like they’ve just devoured Freud and downed a Red Bull, somewhere in a dimly lit corner, a humble C developer is quietly making sure your toaster doesn’t launch into orbit.

The hidden power of low-level programming

Low-level programming is far from dead. In fact, it’s the invisible force quietly running the technology you use every day. Your car, your washing machine, the plane you’re not on because you spent your money on a new GPU — all of these rely on software written in C, C++, Rust, or even intimidating assembly language. (If you’ve ever seen assembly code, you’ll know it looks like someone tried to type while fending off a raccoon.)

You might be thinking, “Isn’t AI doing the coding now? What’s the point?” Well, here’s the catch — someone still has to build the very systems that AI runs on. Think frameworks, compilers, virtual machines, and device drivers. AI agents don’t know how to manage memory in C, nor do they understand that using eval() like confetti is a bad idea.

Why learning low-level code matters

Learning low-level programming is like learning to fix an engine while everyone else is just learning to drive Teslas. Sure, a Tesla can drive itself… until it doesn’t. Then guess who they call? Not the AI coder — they call you.

If you’re fascinated by game engines, hardware drivers, or compilers, keep going. You’re not outdated — you’re underappreciated. When automation takes over many roles, your skills will remain invaluable because someone has to debug those GPIO pins robots can’t touch.

Stay low. Stay powerful. 

Curious to learn more about the importance of low-level programming?

Watch the full Lesson Hacker video to explore endianness and more fascinating computer science concepts. 

For more Lesson Hacker videos, check out the Craig’n’Dave YouTube playlist HERE.

Be sure to visit our website for more insights into the world of technology and the best teaching resources for computer science and business studies.

Stay informed, stay curious!

Related posts

Why are school exclusions rising? Causes, challenges, and solutions for teachers

Permanent exclusions in English schools have reached record highs, with nearly 11,000 pupils excluded in 2023-24—more than double the figure […]

17 January 2026

What is Endianness?

Welcome to the quirky world of endianness — a classic computing debate that’s as petty as indenting code with tabs versus spaces or whether ketchup belongs in the fridge.

15 January 2026

Should AI have morals?

Should AI always agree with us, or tell us when we’re wrong? We explore whether artificial intelligence should be kind, or correct — and why the answer really matters.

13 January 2026

What is vibe coding? Is it the future of programming?

Vibe coding lets you tell an AI what you want in plain English—and it writes the code for you. But is it genius productivity or just a confident intern with a wild imagination?

12 January 2026

Trinket is shutting down in June 2026

Time2Code uses Trinket as its online IDE for Python. Unfortunately, that service is shutting down later this year, probably in […]

9 January 2026

What does a GPU actually do?

A GPU isn’t just a graphics chip—it’s like a room full of toddlers with crayons, all scribbling at once to bring your game to life. While CPUs think carefully, GPUs colour fast.

Fail safeguarding if phone used in school?

Should schools fail an Ofsted safeguarding inspection because of mobile phones? We dig into the headlines claiming schools should fail Ofsted if pupils are seen using phones.

Should beginners use AI to code?

Should beginners use AI to help them code? It might seem like a shortcut—but relying on it too soon could stop you learning the skills you actually need.

8 January 2026

Is the Online Safety Act protecting us, or going too far?

The UK’s new Online Safety Act aims to protect young people online, but its sweeping measures are raising big questions about privacy, freedom, and access to information. Is it safeguarding the vulnerable, or simply going too far?

7 January 2026

Back

Should AI have morals?

What happens when artificial intelligence starts flattering us instead of challenging us?

13 January 2026

Artificial intelligence is evolving fast — but as it gets friendlier, should we be worried it’s losing its grip on the truth?

We’re exploring a hot topic in both computer science and ethics: Should AI be built with morals, or is it enough for it to make you feel good? 

Spoiler alert — if your chatbot applauds your worst ideas, it might be time for a software update.

Let’s start with ChatGPT, specifically the GPT-4o update. This version of OpenAI’s popular AI assistant had one job: make users happy. It did this so well, it started agreeing with everything. People shared examples of it praising clearly harmful behaviour, reinforcing conspiracy theories, and even applauding dodgy life choices. Why? Because its success was measured on positive user feedback — essentially, how many people responded with smiley face emojis.

The result? A hype man in silicon form. Warm and fuzzy? Yes. Useful? Not so much. 

Eventually, OpenAI admitted it had gone too far and rolled back the overly agreeable behaviour. But the episode raised big questions about the purpose of AI. Should it be emotionally supportive at all costs, or should it sometimes challenge us?

Then there’s Grok, Elon Musk’s “anti-woke”, “truth-seeking” AI launched via X (formerly Twitter). Despite the branding, Grok began doing something unexpected: it corrected false claims, backed up scientific consensus, and even fact-checked Musk himself. It wasn’t trying to be political — just accurate. But that honesty proved controversial, especially for users who expected Grok to reinforce their existing views. Apparently, it’s all fun and games until the AI doesn’t flatter your worldview.

So, what do we actually want from AI? Is it more important that it makes us feel good — or helps us be better?

On one hand, supportive AIs can offer comfort and validation. But when they reinforce false beliefs or encourage risky decisions, the consequences can be serious. On the other hand, AIs that challenge misinformation and offer correction might feel uncomfortable in the moment — but they can help us grow. Just like that one teacher who was a little harsh with the red pen, but made you a stronger thinker.

This is about more than software — it’s about trust, responsibility, and the future of technology in society. Because if we build AI to agree with us no matter what, we’re not building intelligence. We’re building digital yes-men. And they might just smile and nod while we walk ourselves off a cliff.

So, where do you stand? 

Should AI be polite and supportive — or truthful, even if it stings?

Watch the full video here to explore the debate in full.


What is vibe coding? Is it the future of programming?

Welcome to the “tell, don’t type” era of coding

12 January 2026

If “vibe coding” sounds like something you’d do while lounging in a beanbag with lo-fi beats and herbal tea, you’re not alone. But despite its chilled-out name, vibe coding is a seriously powerful development method—and it’s changing the way we write software.

At its core, vibe coding means using plain English to tell an AI what you want your program to do. Instead of hammering out every loop, condition, and semicolon, you type something like: “Make a form that submits user data to the backend and shows a thank-you message.” The AI interprets your request and generates the code for you—sometimes even with documentation.

This magic happens thanks to large language models like GPT, which have been trained on vast amounts of code. They break your prompt into tokens, map those to patterns they’ve seen before, and predict the most likely next tokens to generate full functions, boilerplate files, and more. Think autocomplete on steroids.

What’s more, modern AI tools like Copilot, Cursor, and Replit are context-aware. They don’t just spit out code snippets—they understand your project structure, track variables across files, and can even refactor code you’ve long forgotten you wrote.

Of course, vibe coding isn’t flawless. The AI can “hallucinate” functions that don’t exist, or write code that looks great… until it crashes. It’s like having a super-keen intern: quick, clever, but occasionally wildly overconfident.

Still, for speeding up development, brainstorming solutions, or simply avoiding another late-night regex breakdown, vibe coding is a game-changer. You bring the ideas. The AI brings the syntax.

Watch our Lesson Hacker video here to explore more.


What does a GPU actually do?

The crayon-filled truth about graphics processing.

9 January 2026

Why your graphics card is more of an art class than a supercomputer

If you’ve ever wondered what a GPU really does, you’re not alone. Graphics Processing Units often sound like the mysterious cousins of CPUs, quietly making magic happen behind the scenes of your favourite games and videos. But here’s a fun way to think about it: imagine a colouring book the size of the Eiffel Tower… and a looming deadline.

A CPU would take one look, grab a single crayon, and carefully colour inside the lines—inch by inch. Methodical, yes. Efficient? Not quite. 

CPUs are brilliant at complex, sequential tasks, like running your operating system or checking your emails. They’re your digital Swiss Army knives. But they weren’t built for speed painting.

Enter the GPU: not one person with a crayon, but a room full of toddlers—each with a crayon in hand. Shout “RED!” and suddenly hundreds of tiny hands go wild scribbling. It might not all be tidy, but the job gets done at lightning speed. That’s parallel processing in action.

GPUs are crammed with hundreds (sometimes thousands) of tiny, specialised cores designed to handle the same task simultaneously. They’re ideal for things like shading millions of pixels, calculating real-time lighting effects, or rendering dragons in ultra-high resolution at 60 frames per second.

While your CPU can do a little of everything, a GPU goes all-in on one job: graphics. It doesn’t bother with emails or spreadsheets—it’s far too busy making your game worlds look stunning (or quietly mining crypto, if you’re into that).

So next time you’re blown away by slick visuals, thank the GPU. And if something crashes? Don’t blame the hardware. Maybe just check the crayon count.

Watch our Lesson Hacker video to explore more.


Should beginners use AI to code?

8 January 2026

AI can be an amazing tool for coders—but should absolute beginners rely on it? Here’s why the answer isn’t so simple.

So, you’ve just dipped your toes into the world of coding—still coming to terms with variables, loops, and the existential dread of debugging. Then someone tells you, “Just use AI, it’ll write the code for you!” Sounds tempting, right? 

But here’s why that shiny tool might be more lightsaber than lifesaver.

Imagine giving a Jedi weapon to someone who’s only just mastered the art of stick-fighting. That’s what it’s like handing over AI code generation tools to a beginner. Yes, it’s powerful. Yes, it sounds impressive. But if you don’t yet understand the basics, there’s a real risk of slicing through your logic and confidence.

This isn’t to say you should avoid AI altogether. In fact, it can be an incredible tutor—if you use it the right way. Ask it questions. Explore its answers. Use it to understand concepts like callbacks (which, let’s be honest, sound more like something your ex never gave you). But don’t fall into the trap of copying and pasting code like you’re following a recipe from the internet—because while it might work, you won’t truly know how or why.

AI should be your sidekick, not your saviour. 

It’s brilliant when you need a quick fix or to meet a tight deadline. But if your goal is to learn how to code—really learn—then you need to do the thinking. The debugging. The failing and fixing.

Because one day, you’ll face AI-generated code that doesn’t work. And if you’ve skipped the hard stuff, you’ll be stuck—realising, with horror, that the problem isn’t the code. It’s you.


Watch our Lesson Hacker video here to explore more.


Is the Online Safety Act protecting us, or going too far?

Navigating the new Online Safety Act

The UK’s Online Safety Act has landed, and while its intentions might seem noble, the execution has raised eyebrows across classrooms, workplaces, and dinner tables alike. Designed to protect young people from harmful online content, it’s already being labelled by some as overkill — a digital bazooka to squash a fly.

So, what’s really going on? Let’s break it down.

Age checks, fines, and blocked sites

At its core, the Act requires platforms to implement strict age verification systems. Think ID scans, facial recognition, or even using your webcam to prove you’re old enough to view certain content. Non-compliant sites risk heavy fines or outright bans in the UK.

But here’s the catch: this doesn’t just affect teenagers. Adults are finding themselves locked out of music, films, and even news unless they hand over personal data to third-party verifiers. Imagine being asked to show ID just to stream a song on Spotify — it’s happening.

The VPN boom

Unsurprisingly, VPN downloads have surged. Acting like an invisibility cloak for the internet, VPNs let users bypass age restrictions and region locks. Ironically, even some MPs — the very people behind the law — have been expensing VPN subscriptions instead of submitting to verification checks.

Yet this workaround isn’t risk-free. Free VPNs, in particular, often come with hidden dangers, from data harvesting to malware. In trying to dodge surveillance, users may be stepping into something worse.

When protection becomes restriction

The ripple effects go beyond entertainment. News about conflicts in Gaza or Ukraine, LGBTQ+ support resources, and other legitimate educational content have been blocked under sweeping rules. The Act’s “better safe than sorry” approach has meant that entire conversations and communities are stifled.

It’s a balancing act: yes, protecting young people is vital, but when important voices and discussions are muted, digital freedom takes a serious hit.

Finding the balance

So, is the Online Safety Act safeguarding the vulnerable or silencing too much? 

Its double-edged nature shows us that regulation without nuance can lead to privacy risks, restricted freedoms, and frustrated users.

For teachers, students, and parents navigating these changes, the key is to stay informed and ask the hard questions: how do we balance safety and freedom online?

Watch the full Lesson Hacker video to dive deeper into the world of The Online Safety Act.


Why do we still use the QWERTY keyboard – even though it makes no sense?

The baffling history of QWERTY and why it’s here to stay

6 January 2026

Have you ever stared at your keyboard and wondered why the letters seem scattered at random—as if someone lost a bet in the 19th century? That’s the QWERTY layout for you. It’s the standard we all use, but few of us know why… or how we ended up stuck with it.

The story begins in the 1870s with Christopher Sholes, the inventor of the first commercially successful typewriter. Early typewriters had a major flaw: they jammed when nearby keys were struck too quickly in succession. So, instead of creating a logical, alphabetical layout, Sholes rearranged the keys to slow things down—not to frustrate typists, but to stop the typewriter from throwing a mechanical tantrum mid-sentence. That’s how “QWERTY” was born.

Over the years, others have tried to fix it. The Dvorak layout is one such alternative, engineered for speed and efficiency. In theory, it’s better. In practice? Not so much. 

Learning a new layout is like learning to write with your non-dominant hand while your friends roll their eyes every time they need to borrow your laptop. Studies show the performance gains are minimal at best—and honestly, who has time to re-learn how to type?

Like GCSEs and Windows updates, QWERTY has stuck around—not because it’s ideal, but because change is hard. It’s embedded in everything: your laptop, your phone, even your smart fridge. Changing it now would take a digital revolution… and most of us can’t even find the “@” symbol without squinting.

So next time your fingers fumble across the keyboard, don’t blame yourself. Blame history. And Christopher Sholes.

Watch the full Lesson Hacker video to dive deeper into the weird world of QWERTY – and laugh while you learn!


Unlocking the Craig’n’Dave Resource Centre

Everything computer science teachers need in one place

16 December 2025

A deep dive into the hidden gems of the Craig’n’Dave Resource Centre

If you know Craig’n’Dave, chances are it’s because of our videos — they’re the most popular thing we make. But what many teachers don’t realise is that behind those videos sits something even more powerful: a complete, fully editable resource ecosystem designed to help you teach computer science with confidence.

In the latest episode of At the Chalk Face, Craig and Dave open the vault to explore one of their core products — the Craig’n’Dave Resource Centre. Whether you’re brand new to CnD or you’ve been using our materials for years, this behind-the-scenes look reveals just how much sits within this 4,000-strong library.

What exactly is the Resource Centre?

The Resource Centre is a growing collection of over 4,000 editable resources covering GCSE, A-level and Cambridge IGCSE computer science. It includes:

  • Fully editable schemes of learning
  • Complete sets of student workbooks
  • Presentation slide decks (the same ones used in C&D videos!)
  • Exemplar answers and A4-formatted knowledge organisers
  • End-of-topic tests
  • Delivery calendars for multiple timetables
  • Free sample units for every course

Every resource has been designed and refined by real teachers with over 20 years’ experience in the classroom. They were created to solve the same problems computer science teachers face every day — from student engagement to lesson sequencing to time-saving.

More than lessons: The hidden treasures

Beyond the main course materials, there are features many teachers don’t know exist:

  • The Essential Algorithms & Data Structures Book — a complete, specification-aligned guide with code in three languages.
  • Telium — a brilliant end-of-Year-10 project that ties together everything students have learned.
  • Logic gate symbol packs, exam technique guides and terminology lists.
  • “Little extras” packs full of those small but essential items teachers always need.

And yes — everything is fully editable, so you can adapt it to your school’s needs.

Why teachers love it

Even if you already have schemes, resources or established lessons, the Resource Centre is perfect for refreshing your approach, boosting confidence, finding inspiration, or improving student outcomes with tried-and-tested materials.

Get your FULL ACCESS to the Resource Centre HERE.

Watch the full video

🎥 Dive deeper and see the full walkthrough here

Explore more from Craig’n’Dave

Discover all our resources, courses and teaching tools at: 👉 https://craigndave.org

Related posts

Why are school exclusions rising? Causes, challenges, and solutions for teachers

Permanent exclusions in English schools have reached record highs, with nearly 11,000 pupils excluded in 2023-24—more than double the figure […]

17 January 2026

What is Endianness?

Welcome to the quirky world of endianness — a classic computing debate that’s as petty as indenting code with tabs versus spaces or whether ketchup belongs in the fridge.

15 January 2026

Does anyone still use low-level code?

Low-level programming isn’t dead — it quietly powers the devices we rely on every day, from cars to toasters. If you love digging into game engines, compilers, or hardware drivers, your skills are more essential than ever.

14 January 2026

Should AI have morals?

Should AI always agree with us, or tell us when we’re wrong? We explore whether artificial intelligence should be kind, or correct — and why the answer really matters.

13 January 2026

What is vibe coding? Is it the future of programming?

Vibe coding lets you tell an AI what you want in plain English—and it writes the code for you. But is it genius productivity or just a confident intern with a wild imagination?

12 January 2026

Trinket is shutting down in June 2026

Time2Code uses Trinket as its online IDE for Python. Unfortunately, that service is shutting down later this year, probably in […]

9 January 2026

What does a GPU actually do?

A GPU isn’t just a graphics chip—it’s like a room full of toddlers with crayons, all scribbling at once to bring your game to life. While CPUs think carefully, GPUs colour fast.

Fail safeguarding if phone used in school?

Should schools fail an Ofsted safeguarding inspection because of mobile phones? We dig into the headlines claiming schools should fail Ofsted if pupils are seen using phones.

Should beginners use AI to code?

Should beginners use AI to help them code? It might seem like a shortcut—but relying on it too soon could stop you learning the skills you actually need.

8 January 2026


The biggest curriculum shake-up in a decade – PART 2

What the CAR review means for teachers

21 November 2025

Welcome back to the next instalment of our deep-dive into the Curriculum and Assessment Review (CAR). Part one explored broad curriculum design and assessment reform; part two gets straight to the good stuff: What the upcoming changes actually mean for computing teachers.

This new review is a hefty read. Fortunately, this series breaks it all down so you don’t have to! Here’s what matters most for computing.

Computing time is shrinking – and that’s a problem

One of the standout concerns raised in the CAR is the reduction in curriculum time:

  • Key Stage 3 has dropped from 4% to 3%.
  • Key Stage 4 has dropped from 5% to 2%.

Schools increasingly push computing into carousels, shortened timetables, or — at Key Stage 4 — only offer it to GCSE Computer Science students. Shockingly, only 10% of schools surveyed teach computing to all KS4 students, despite it being a foundation subject.

The review makes it crystal clear: every pupil should study computing until age 16, just like PE, RS and citizenship.

GCSE Computer Science is being replaced

This news has caused quite a stir: the current GCSE Computer Science will be replaced by a broader, more balanced GCSE in Computing.

But why?

  • The existing qualification is “too narrow and specialised”.
  • Students score lower in Computer Science than in almost all other subjects.
  • The gender gap hasn’t improved despite years of initiatives.

The good news: the government has confirmed that core computer science principles — including programming and algorithms — will remain. They just won’t stand alone as a full qualification anymore.

Expect a GCSE that blends computer science, IT, digital literacy, real-world applications, and modern computing concepts.

Computing won’t sit alone anymore – subjects will intertwine

The new curriculum will be the most holistic version yet. Skills will overlap between subjects, and computing will act as an engine powering others, such as maths, DT, geography, and citizenship.

The programme of study will be machine-readable and interactive, showing explicit links across subjects. Think Google Earth in geography, algorithms discussed in English when analysing bias, or spreadsheet skills needed for financial literacy.

Digital literacy and AI: Now national priorities

Two major themes run across the whole review:

Digital literacy

Defined by the government as the knowledge, behaviours, and confidence needed to use technology safely and critically. This includes:

  • online safety
  • digital footprints
  • cyber security
  • fake news and bias
  • navigating modern interfaces
  • basic operational skills that many pupils no longer have

Schools must offer explicit digital education across all key stages.

AI literacy

AI will feature throughout the curriculum, but computing is its “home”. Students will learn:

  • how AI works
  • its limitations
  • ethical implications
  • how to use and question it

Given that students already use AI outside school, this is a long-overdue update.

So what next?

The CAR review sets the stage for the biggest shift in computing education in a decade. From a rebalanced curriculum to the arrival of a new GCSE, the coming years will reshape what — and how — we teach.

For now, the key message is simple: computing is becoming broader, more relevant, and more cross-curricular than ever before.

Download our Curriculum and Assessment Review summary HERE.

 

📺 Watch our breakdown here as we talk through the key findings in our signature chalk face style.


🌐 Explore more resources, guides and updates on the Craig’n’Dave website — your home for high-quality computing education support.


The biggest curriculum shake-up in a decade

What it means for computing teachers

14 November 2025

From GCSE computing to AI qualifications: unpacking the curriculum and assessment review.

If you’ve been anywhere near education news lately, you’ll know that the Curriculum and Assessment Review (CAR) has landed — all 180 pages of it (plus a hefty 61-page government response for good measure). It’s the most significant look at the education system from ages 5 to 18 in over a decade. And if that sounds like a lot to digest, don’t worry — we have done the reading so you don’t have to.

Let’s check out what this means for computing teachers, students, and schools across the UK and break down the key takeaways.

GCSE Computing replaces GCSE Computer Science

One of the biggest announcements is the shift from GCSE Computer Science to GCSE Computing. At first glance, it’s just a word change — but it’s much more than that. The new qualification aims to better reflect the breadth of the digital world by combining computer science, IT, and digital literacy.

That means programming and algorithms will still be at the heart of the course, but there’ll be a stronger emphasis on digital skills and critical application — preparing students for a world where tech is integral to every industry.

A new qualification in AI and data science?

There’s also talk of a new Level 3 qualification in Data Science and AI. While it’s not confirmed whether this will be an A-level or T-level, it signals an exciting potential pathway for students keen to explore cutting-edge technology in more depth.

Academies to follow the national curriculum

Another big change: academies will be required to teach the national curriculum. This levels the playing field so every student receives the same core education — including computing — no matter where they are in the country.

NEA changes and assessment reform

Non-examined assessments (NEAs) will only continue where they’re essential. For computing, that means no return of coursework-style assessments at GCSE, though there’s still debate around whether they’ll remain at A level. The government has also made it clear that externally marked exams remain the fairest and most reliable assessment method, particularly in the age of AI.

A new era for digital literacy

Digital literacy will take on a far greater role, not just in computing, but across the entire curriculum. Expect more clarity on what “digital literacy” actually means, and a renewed focus on preparing students for life and work in a tech-driven world.

The changes will roll out gradually — with new programmes of study expected by 2028 and the first teaching of new GCSEs in 2029. 

But one thing’s clear: this shake-up is set to reshape computing education for the next generation.

A modern holistic curriculum

This will be the most modern and holistic National curriculum to date. No subject sits in a vacuum, and nowhere is this more true than in Computing, where so much of what we do is transferable to other subjects. It is clear, for example, that in some subjects digital methods now influence both the content and how it is taught.

Where it does, the government will include a requirement for the relevant digital content in those subjects’ programmes of study and will ensure that it aligns with the computing curriculum, to reduce the risk of duplication.

Broader still, the revised National curriculum programmes of study will prioritise core concepts in each subject and make sure they are coherent within and across subjects.

To enable this, the new National Curriculum will be online, machine-readable and interactive. It will visually represent the links within and between subject areas and give connections to prior learning, helping teachers to contextualise learning across traditional subject boundaries.

 

🎥 Want to hear Craig and Dave’s full breakdown?
Watch the video now for their insights, discussion, and a free downloadable summary of the CAR report – Curriculum and Assessment Review Summary

💻 Explore more resources, updates, and teacher support at craigndave.org

 


High expectations from the first minute

7 November 2025

In Computing, every minute counts. Setting high expectations isn’t about demanding work or creating unnecessary pressure—it’s about clarity. It means knowing exactly what you want from your students and using simple, consistent techniques to achieve it. The most effective teachers don’t leave the first few minutes of a lesson to chance. They use this time purposefully to establish routines, reinforce learning, and create a calm, focused atmosphere where students know what to do and why it matters. Here’s how you can make the beginning of every lesson count.

1. Meet and greet: the power of the doorway

The moment students arrive is your first opportunity to assert calm authority. Greet students at the door with their name, a smile and a clear expectation for how they should enter the room. This bottleneck puts you in control. Don’t allow students to just pile into the room. An orderly entry sets the tone for the rest of the lesson. Don’t hesitate to stop students from entering if they are being disruptive—this reinforces that your classroom is a place of focus and respect.

Tip: Use positive reinforcement for students who enter appropriately and calmly redirect those who don’t to leave and enter again.

2. Engagement on entry: establishing a routine

Idle time is the enemy of learning. Students should know exactly what to do the moment they walk in. Whether you call it a starter, do-now activity, or engagement on entry, the key is consistency. This routine builds a culture of focus and reduces wasted time.

Tip: “What are we doing?” should never be a question in your classroom.

3. Combat the forgetting curve with recall activities

The start of the lesson is the perfect time for retrieval practice. A well-designed recall activity helps students strengthen their memory and make connections with prior learning. However, avoid tasks with a fixed end point—some students will finish early and become disengaged.

Tip: Provide more work than there is time to complete.

4. Why Smart Revise Quiz is the perfect solution

For GCSE and A-level students, Smart Revise Quiz is a powerful tool. Its dynamic, never-ending loop of low-stakes multiple-choice questions ensures that students are always engaged. The platform uses intelligent algorithms for spacing, interleaving, and personalisation, targeting each student’s weaker areas and adapting the question order accordingly.

Tip: With Smart Revise no student finishes early and no student is left behind.

5. Preparing for the lesson ahead

Alternatively, use the start of the lesson to prime students for what’s to come. At A-level, Craig’n’Dave micro-activities are excellent for this purpose. At GCSE, every lesson includes a starter that aligns with the learning objectives, helping students transition into the right mindset.

Tip: If you use Smart Revise, it is best to stick to the routine. You can also use starter activities at any point in the lesson as class discussions or plenaries instead.

6. Inclusive and accessible activity

The 2025 Ofsted framework places a stronger emphasis on inclusion and equity. This means ensuring that all students, including those with SEND or from disadvantaged backgrounds, can access and engage with the starter activity. That’s why a low-stakes, low-barrier to entry activity is better for the start of the lesson. Inspectors will be looking for how well teachers identify and reduce barriers to learning. It is important that inclusive practices are embedded in everyday routines.

Tip: Too much challenge too soon can turn off students before they even begin.

7. Why seven minutes matters

The duration of your starter activity sends a message. Five minutes can feel rushed and unimportant. Ten minutes may seem arbitrary. But seven minutes? It feels intentional. It’s long enough to be meaningful, short enough to maintain momentum.

Tip: Odd numbers feel deliberate. Use them to your advantage.

8. Transitioning into the lesson

Once the initial activity is complete, have a clear, recognisable signal to begin the main lesson. This could be a phrase, a countdown, or a visual cue. The goal is for students to respond quickly and without repeated prompting.

Consistency breeds compliance. Familiar cues reduce friction.

9. Use consistent language and positive signals

High expectations are not just about what you do—they’re also about what you say and how you say it. The language you use in the classroom communicates your belief in students’ potential and shapes the culture of learning. When you speak with clarity, purpose, and positivity, you signal to students that they are capable, that their time matters, and that learning is serious business. Use consistent, positive phrasing that reinforces routines and expectations. For example, instead of saying, “Stop messing around,” try, “Show me you’re ready to learn.” Non-verbal cues are equally powerful. A raised hand, a countdown, or a visual timer can become familiar signals that prompt immediate responses without the need for repeated instructions.

Tip: Play the long game. Over time, these cues become part of the classroom rhythm, reducing the need for correction and increasing student autonomy.

10. Eliminate distractions before you begin

Before students enter the room, do a quick sweep. Clear up any loose pens, paper and rubbish. Ensure the computers are turned on and ready for a student to log on. If there are technical difficulties, the engagement on entry activity gives everyone else something to be working on while you diagnose the problem. Before diving into new content, ensure you have every student’s full attention. Techniques include gathering students at the front of the room, using screen-locking software to prevent off-task behaviour and waiting silently until you have 100% focus.

Tip: Own the room before you teach. Never compete with distractions: don’t talk if a student is talking; wait for perfect silence.

Final thoughts: routines build culture

High expectations aren’t about being strict—they’re about being consistent. When students know what to expect and what’s expected of them, they feel secure and ready to learn. The start of the lesson is your opportunity to build that culture, every single time.


What is a code pointer?

Why pointers are confusing, clever, and occasionally catastrophic

28 October 2025

If you’ve ever dipped your toe into C or C++ and found yourself bombarded with stars (*) and ampersands (&), you’re not alone. One minute you’re coding a game, the next you’re lost in a tangle of memory addresses, wondering why your variables are playing hide and seek.

Let’s break it down.

Imagine your computer’s memory as a giant library. Every variable you create — like int sandwich = 3; — is a book stored on a specific shelf. A pointer doesn’t hold the sandwich (value) itself. Instead, it’s more like a sticky note that says, “Sandwich is in aisle 4, second shelf from the left.” That sticky note is the memory address. 

This is what a pointer stores — not the actual value, but the location of that value.

Why bother with all this indirection? Efficiency and flexibility. Passing around a pointer instead of a full variable is faster, especially if that variable is large. And crucially, if a function needs to change your sandwich — maybe to add pickles — it can go directly to the source. Without a pointer, you’d be modifying a copy. With a pointer, you’re making changes to the original. 

Result: one nicely pickled sandwich.

But it’s not all tasty treats. Pointers come with dangers. If a pointer directs you to a part of memory that doesn’t contain valid data — or worse, doesn’t exist — you’ll hit what’s called a segmentation fault. Think of it as following a dodgy satnav that tells you to turn left… off a cliff.

Curious to learn more about the fascinating world of code pointers? 

Check out our very own Lesson Hacker’s YouTube video HERE.

For more Lesson Hacker videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


Is reading necessary?

6 September 2025

Why reading belongs in the Computer Science classroom 

In a recent article, an English teacher shared how short, focused reading sessions—just five to seven minutes long—can reignite a love of reading in disengaged students. Inspired by research from Stanford University, Erin Miller trialled one-to-one reading interventions with her Key Stage 3 students and saw a noticeable shift in their attitudes toward reading. The simplicity of the approach is striking: minimal interruption, targeted support, and a consistent routine. But what does this have to do with computer science? 

Quite a lot, actually. 

Reading for pleasure: more than just literacy 

It’s tempting to think that once students can read well enough to access the curriculum, the job is done. But reading for pleasure goes far beyond basic literacy. It’s a gateway to: 

  • Vocabulary growth: Words like concatenate, iterate, and recursion are common in programming but rare in everyday speech. Students who read widely are more likely to encounter and internalise these terms, making it easier to grasp abstract computing concepts. 
  • Improved comprehension: Understanding problem statements or even debugging messages requires stronger reading skills. 
  • Higher academic performance: OECD’s PISA studies consistently show that students who read for pleasure outperform their peers—not just in literacy, but in maths and science too. 
  • Cultural capital: Reading builds background knowledge, not only providing an opportunity to cement the curriculum in the real world but also helping students engage more meaningfully with others and with the world around them.

Is reading just for English teachers?

Absolutely not. Just as every teacher has a role in developing digital citizens for our subject, every teacher—including in Computing—should be helping students become more literate. Whether it’s understanding ethical dilemmas in AI, exploring the history of computing, or simply following a tutorial, reading is foundational. 

Fitting it in: A curriculum challenge 

Yes, the curriculum is crowded. But reading doesn’t have to be a separate activity. It can be woven into existing routines: 

  • Replace a retrieval task with a short reading and reflection. 
  • Make a main task require students to read a paragraph before the activity can be undertaken. This might sound old-school, but it helps.  
  • Encourage students to read computing-related texts and share insights with peers. 

One practical idea is to use Alan Harrison’s How to Learn Computer Science at A level. Ask students to read a chapter and prepare to discuss something they found interesting. This not only builds subject knowledge but also fosters scholarly habits. 

For students with low literacy, reading can feel like a barrier rather than a gateway. So how do we support them without defaulting to overly simplified texts or assistive tools that risk becoming crutches? Scaffold, don’t simplify. Pre-teach vocabulary, introducing key terms before reading. In computer science, words like algorithm or binary can be unpacked with visuals first. Chunk the text into short, manageable passages.

Does AI Make Reading Redundant? 

It’s true that AI can summarise texts instantly. But that’s not the point. Reading is about growth. AI can’t replicate the personal development that comes from wrestling with a challenging idea or discovering a new perspective. While tracking independent reading is harder in the age of AI, the benefits—confidence, curiosity, and competence—are worth the effort. 

Building Habits That Last 

Drawing on James Clear’s work on habit formation, we can help students make reading a regular part of their lives. Techniques like habit stacking (e.g., reading as part of homework) can make reading more automatic and enjoyable. This is where Craig’n’Dave resources help. Not only do students watch a video for homework, but the take-notes icon in the GCSE videos gives them a cue to read and write down what they see. 

Ultimately, reading isn’t a luxury; it’s a necessity. Reading widely around a subject is a vehicle for synthesising the many abstract concepts taught in isolation in class, making it one of the most powerful tools we can give our students.

Related posts

Why are school exclusions rising? Causes, challenges, and solutions for teachers

Permanent exclusions in English schools have reached record highs, with nearly 11,000 pupils excluded in 2023-24—more than double the figure […]

17 January 2026

What is Endianness?

Welcome to the quirky world of endianness — a classic computing debate that’s as petty as indenting code with tabs versus spaces or whether ketchup belongs in the fridge.

15 January 2026

Does anyone still use low-level code?

Low-level programming isn’t dead — it quietly powers the devices we rely on every day, from cars to toasters. If you love digging into game engines, compilers, or hardware drivers, your skills are more essential than ever.

14 January 2026

Should AI have morals?

Should AI always agree with us, or tell us when we’re wrong? We explore whether artificial intelligence should be kind, or correct — and why the answer really matters.

13 January 2026

What is vibe coding? Is it the future of programming?

Vibe coding lets you tell an AI what you want in plain English—and it writes the code for you. But is it genius productivity or just a confident intern with a wild imagination?

12 January 2026

Trinket is shutting down in June 2026

Time2Code uses Trinket as its online IDE for Python. Unfortunately, that service is shutting down later this year, probably in […]

9 January 2026

What does a GPU actually do?

A GPU isn’t just a graphics chip—it’s like a room full of toddlers with crayons, all scribbling at once to bring your game to life. While CPUs think carefully, GPUs colour fast.

Fail safeguarding if phone used in school?

Should schools fail an Ofsted safeguarding inspection because of mobile phones? We dig into the headlines claiming schools should fail Ofsted if pupils are seen using phones.

Should beginners use AI to code?

Should beginners use AI to help them code? It might seem like a shortcut—but relying on it too soon could stop you learning the skills you actually need.

8 January 2026


A new year and another new initiative

1 September 2025

The concept of learning styles—the idea that individuals learn better when taught in their preferred sensory modality (e.g., visual, auditory, kinesthetic)—has been widely popular in education. However, despite its appeal, the theory has been largely debunked by empirical research. Here’s a breakdown of the origins, popularity, and scientific critique: 

 Origins and Popularity 

  • Early Theories: The idea of learning styles can be traced back to educational psychology in the 20th century. One of the most influential models was the VARK model (Visual, Auditory, Reading/Writing, Kinesthetic), developed by Neil Fleming in the 1990s. 
  • Appeal: It resonated with educators and learners because it emphasised personalisation and seemed intuitive—people often feel they have a preferred way of learning. 

Scientific Research and Debunking 

  • Key Issue: The central claim is that matching teaching styles to a student’s preferred learning style improves learning outcomes. This is known as the “meshing hypothesis.” 
  • Major Review: In 2008, a comprehensive review by Pashler et al. in Psychological Science in the Public Interest concluded that: “There is no adequate evidence base to justify incorporating learning styles assessments into general educational practice.” 

Findings

  • Studies that properly tested the meshing hypothesis (i.e., using randomised controlled trials and measuring actual learning outcomes) did not find support for it. 
  • People may have preferences, but teaching to those preferences does not improve learning. 
  • Content matters more: The best modality often depends on the subject matter (e.g., diagrams for geometry, audio for music), not the learner. 

What Actually Works 

Cognitive science supports strategies like: 

  • Spaced repetition 
  • Retrieval practice 
  • Interleaving (mixing different topics or skills) 
  • Dual coding (combining words and visuals) 

These methods are evidence-based and improve learning across the board, regardless of “style.” 

Why It Still Persists 

  • Confirmation bias: People remember when their preferred style seemed to help. 
  • Commercial interests: Many companies sell learning style assessments and training. 
  • Intuitive appeal: It feels personalised and empowering, even if it’s not effective. 

Want to know more? Watch the full video on our YouTube channel – At the Chalk Face.

For more educational news, check out the At the Chalk Face YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!


What is AI?

AI: Just fast maths pretending to be smart

12 August 2025

AI. It’s a term that gets thrown around everywhere—from science fiction films to social media posts and school corridors. But what actually is artificial intelligence, and why is it so important to understand?

At its core, AI doesn’t actually think—it just predicts. It’s essentially super-fast maths, rapidly analysing patterns to guess what should come next in a sequence. Imagine that friend who always finishes your sentences… except AI does it with slightly better accuracy.
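You can see the “guess what comes next” idea with a few lines of Python. This is a toy bigram counter, not a real language model — the corpus is made up, and real systems use billions of parameters rather than a lookup table:

```python
from collections import Counter, defaultdict

# Count which word follows which in a tiny made-up corpus,
# then "predict" by picking the most common successor.
corpus = "the cat sat on the mat the cat ate the fish".split()

successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def predict_next(word):
    # Return the word most frequently seen after `word` in the corpus.
    return successors[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" most often here
```

Scale that idea up by a few billion parameters and you have the skeleton of a modern language model.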

How do transformers power AI?

No, we’re not talking about giant robots (although that would be cool). In AI, transformers are a type of deep learning model that helps machines generate human-like text. Here’s how they do it:

  • Word magic: AI doesn’t see words—it sees numbers. It converts text into numerical values that represent meaning. Kind of like the Matrix, but without the leather trench coats.
  • Attention, please! Transformers scan every word in a sentence and decide which ones are important. It’s a bit like pretending to listen in a meeting but only perking up when you hear “free snacks.”
  • Prediction time: AI makes an educated guess about the next word, refines it, and repeats the process until the sentence sounds human. The result? AI-generated essays, jokes, and sometimes suspiciously accurate emails.
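The first two steps above can be sketched in plain Python. The two-dimensional word vectors below are invented purely for illustration — real transformers learn vectors with thousands of dimensions:

```python
import math

# Hand-picked 2-D "word vectors" (real models learn these from data).
words = {"meeting": [1.0, 0.0], "agenda": [0.9, 0.1], "snacks": [0.0, 1.0]}
query = [0.1, 1.0]  # what we are currently "listening for"

def softmax(xs):
    # Turn raw scores into weights that sum to 1.
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Score each word by dot product with the query, then normalise.
scores = [sum(q * v for q, v in zip(query, vec)) for vec in words.values()]
weights = dict(zip(words, softmax(scores)))

# "snacks" gets the highest attention weight for this query.
print(max(weights, key=weights.get))
```

That dot-product-then-softmax step is the heart of the “attention” mechanism: the model pays most attention to whatever best matches what it’s currently looking for.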

How does AI learn?

Behind the scenes, AI is powered by huge datasets and clever algorithms. These systems “learn” patterns from data, meaning they can improve their performance over time without being explicitly programmed to do so. This process is called machine learning, and it’s how many of today’s most exciting AI tools work.
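Here is the smallest possible sketch of that idea: a single weight is nudged until predictions match the examples. The data and learning rate are made up, but notice that the rule “multiply by 2” is never programmed in — it emerges from the data:

```python
# "Learning" y = 2x from examples via gradient descent on one weight.
data = [(1, 2), (2, 4), (3, 6), (4, 8)]  # inputs x and targets y = 2x

w = 0.0      # start with a guess
lr = 0.01    # learning rate: how big each nudge is
for _ in range(1000):
    for x, y in data:
        error = w * x - y     # how wrong the current prediction is
        w -= lr * error * x   # nudge w to shrink the squared error

print(round(w, 2))  # close to 2.0 — learned, not hard-coded
```

Modern machine learning is this same loop with millions of weights and far more interesting data.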

AI and you

AI is already influencing your daily life, whether you realise it or not. It shapes the content you see online, helps doctors spot diseases faster, supports businesses with automation, and could even play a role in your future career. Understanding how it works is more than just useful—it’s essential.

Why AI isn’t taking over (yet)

Despite its clever tricks, AI isn’t sentient—it’s just playing a game of supercharged fill-in-the-blank. While it’s brilliant for generating text and answering questions, it still lacks genuine understanding or creativity. So, would you trust it to run the world? Probably not. But to help you write a convincing email? Absolutely.

Want to learn more?

Want to know more? Check out The Lesson Hacker’s YouTube video HERE.

For more Lesson Hacker Videos check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!

 


Why can’t we just stick RAM directly onto the CPU?

22 April 2025

In the world of computer science, speed is everything. So, it’s easy to see why the idea of sticking RAM directly onto the CPU seems like a genius move. Zero latency, lightning-fast speeds, and no more bottlenecks—what’s not to love? But in reality, it’s not that simple. Let’s break down why we can’t just combine these two crucial components into one.

The difference between CPU and RAM

At first glance, sticking RAM onto the CPU might sound like a great way to boost performance. After all, the closer RAM is to the CPU, the faster data can be accessed, right? Unfortunately, it’s not that straightforward. The CPU and RAM are built in fundamentally different ways.

CPUs are designed to handle calculations at breakneck speeds using logic processes. On the other hand, RAM—specifically Dynamic RAM (DRAM)—uses capacitors to temporarily store data. The catch is that these capacitors need constant refreshing to retain their information. This is similar to a student frantically rereading their notes to ensure they remember everything during revision.
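You can model that frantic rereading in a few lines of Python. This is a toy simulation, not real hardware — the leak rate and threshold are invented numbers chosen to make the decay visible:

```python
# Toy model of DRAM: a cell's charge leaks every tick, and the stored
# bit is lost once the charge drops below a threshold. Periodic refresh
# rewrites the cell back to full charge.
LEAK = 0.2        # charge lost per tick (made-up value)
THRESHOLD = 0.5   # below this, the stored bit is unreadable

def run(ticks, refresh_every=None):
    charge = 1.0
    for t in range(1, ticks + 1):
        charge -= LEAK
        if refresh_every and t % refresh_every == 0:
            charge = 1.0          # refresh: restore full charge
        if charge < THRESHOLD:
            return f"bit lost at tick {t}"
    return "bit survived"

print(run(10))                    # without refresh, the bit soon decays
print(run(10, refresh_every=2))   # with regular refresh, it survives
```

Real DRAM controllers do this thousands of times per second for every row of memory — which is exactly the kind of housekeeping a CPU’s logic circuits never need.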

Why it doesn’t work together

Trying to combine CPU and DRAM onto the same chip would cause chaos in the manufacturing process. DRAM fabrication doesn’t align well with the processes used to create a CPU. Imagine trying to install a high-end GPU into a budget laptop—it just won’t fit, and forcing it could cause damage.

Even designs like Intel’s Haswell architecture use embedded DRAM (eDRAM) sparingly. The goal is to use just enough to boost performance without massively increasing production costs. Merging CPU and RAM completely, however, would be a manufacturing nightmare.

The speed factor: DRAM vs. SRAM

Even if we could combine the two, there’s another issue: speed. DRAM operates at a top speed of about 1 GHz, while modern CPUs can easily surpass 3 GHz. That’s like putting bicycle tyres on a Formula 1 car—you’re limiting the performance of the entire system.

To overcome this speed gap, CPUs use SRAM (Static RAM) for on-chip cache. SRAM is much faster than DRAM but comes with its own drawbacks: it’s bulkier and significantly more expensive. Sure, we could fill a CPU with SRAM, but that would come at an astronomical cost—far more than most of us are willing to pay.
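A quick back-of-envelope calculation shows why even a small SRAM cache pays off. The latency figures below are illustrative, not from any datasheet:

```python
# Average memory access time with an SRAM cache in front of DRAM.
sram_cache_ns = 1.0    # fast on-chip SRAM access (illustrative)
dram_ns = 60.0         # much slower off-chip DRAM access (illustrative)

def average_access_ns(hit_rate):
    # Hits are served by SRAM; misses pay the SRAM check plus the DRAM trip.
    return hit_rate * sram_cache_ns + (1 - hit_rate) * (sram_cache_ns + dram_ns)

for hit_rate in (0.0, 0.90, 0.99):
    print(f"hit rate {hit_rate:.0%}: {average_access_ns(hit_rate):.1f} ns")
```

Even with these rough numbers, a 99% hit rate brings the average access close to pure SRAM speed while paying for only a tiny amount of it — which is precisely the compromise modern CPUs make.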

Why we stick to separate RAM and CPUs

While combining RAM and the CPU might sound like a performance dream, the technical and cost limitations make it impractical. The current balance of DRAM for main memory and SRAM for cache strikes the best compromise between speed, cost, and practicality.

Want to know more? Check out The Lesson Hacker’s YouTube video.

For more Lesson Hacker Videos, check out the CraignDave YouTube playlist HERE.

Visit our website to explore more cutting-edge tech-transforming news in the computer science world!
