You Have No Hard Drive – Why writing things down is the most underrated human upgrade

Think about the device you’re reading this on. Somewhere inside it, there’s a hard drive (or SSD) quietly holding terabytes of data — photos from five years ago, half-finished documents, an entire operating system. Turn it off, turn it back on, and everything is still there. Permanent. Reliable. Patient.

Now think about your brain. It doesn’t work like that. Not even close.

Your brain is a CPU with RAM, and that’s it

We like to flatter ourselves with the computer-brain analogy, but we always assign ourselves the wrong components. We imagine our memory as a vast internal archive — a biological hard drive — where experiences and ideas are filed away neatly, waiting to be retrieved.

The truth is far less flattering. Your working memory — the part that actually processes, manipulates, and reasons about information — behaves much more like RAM. Random access memory. Fast, powerful, essential for everything you’re doing right now… and volatile. It loses its contents the moment the power flickers.

You’ve experienced this a thousand times. You walk into a room and forget why you’re there. You have a brilliant idea in the shower that evaporates before you reach a towel. You leave a meeting with a clear sense of what to do and by the next morning the specifics have blurred into vague intentions. That’s not a personal failing. That’s the architecture.

Your brain is an extraordinary processor. It can recognize patterns, make leaps of intuition, synthesize information from wildly different domains. What it cannot do reliably is store. There is no save button. There is no file system. The input comes in, the processing happens, and then — if you don’t deliberately do something about it — the output dissipates.
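The RAM-versus-disk contrast can be made literal in a few lines of code. This is just a toy sketch (the note contents and filename are invented for illustration): data held only in a variable exists in the process’s volatile memory and vanishes when the process ends, while data written to disk survives and comes back exactly as stored.

```python
import json
import tempfile
from pathlib import Path

# A thought held only in working memory: it lives in this process's RAM
# and vanishes the moment the process exits -- there is no save button.
thought = {"idea": "write the quarterly summary", "context": "from Monday's meeting"}

# Externalizing the thought: serialize it to persistent storage.
note_path = Path(tempfile.gettempdir()) / "externalized_note.json"
note_path.write_text(json.dumps(thought))

# Later (a reboot, a new day, a different process), the note can be
# reloaded exactly as written -- no reconstruction, no confabulation.
recalled = json.loads(note_path.read_text())
print(recalled == thought)  # True: disk returns what was stored, bit for bit
```

The write is the whole point: nothing in the machine persists the variable for you, which is exactly the situation your brain is in.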

And what you do “remember” might not be real

Here’s the part that should genuinely unsettle you: even the memories you feel certain about are less reliable than you think.

Neuroscientists at Harvard describe memory not as replay but as reconstruction. Every time you recall something, your brain doesn’t pull a file from a folder. It reassembles the memory from scattered fragments, fills in gaps with general knowledge, and stitches it all together into something that feels seamless. As Harvard neuroscientist Venkatesh Murthy puts it: “You’re not recording. You just can’t. We don’t have the bandwidth, the bits and bytes to do it.”

In other words, your brain does exactly what we criticize AI for doing. It hallucinates.

When a large language model generates a confident but fabricated answer, we call it a hallucination — the model is predicting what should come next based on patterns, not retrieving a stored fact. Your brain does something strikingly similar. It predicts what a memory should look like based on patterns, context, mood, and suggestion. Psychologist Elizabeth Loftus has spent decades demonstrating how trivially easy it is to implant entirely false memories in people — memories they will then defend with total confidence. And psychologist Daniel Schacter’s work on “flashbulb memories” shows that even our most vivid, I-remember-exactly-where-I-was recollections are often wildly inaccurate when checked against the record.

This isn’t some edge case. This is the default mode. Your thoughts and recollections are probabilistic, not factual. You’re running a prediction engine, not a database query. The main difference between you and an AI hallucinating is that you also feel emotionally certain about your confabulations. CSIRO researchers put it neatly: our brains use learned associations to fill in the gaps and quickly respond to whatever sits before us — they guess what the correct answer might be based on limited knowledge. The technical term for this is confabulation, and it is, functionally speaking, the biological version of hallucination.

No hard drive, and the ports are terrible

So you have an unreliable processor with no persistent storage. Surely at least the input/output is good? Not really.

A 2024 study from Caltech quantified the information throughput of human behavior at approximately 10 bits per second. That’s not a typo. Your sensory systems — eyes, ears, skin — take in roughly 1 billion bits per second. But the bottleneck of conscious processing squeezes that down to about 10. For context, we get anxious when our home Wi-Fi drops below 100 megabits per second. Your brain’s behavioral output is ten million times slower than that.
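The comparison above is simple arithmetic, sketched here using the figures from the paragraph (roughly 10⁹ bits per second of sensory input, about 10 bits per second of behavioral output, and a 100 Mbit/s home connection):

```python
sensory_input_bps = 1_000_000_000   # ~1 Gbit/s across eyes, ears, skin
conscious_output_bps = 10           # Caltech (2024) estimate of behavioral throughput
home_wifi_bps = 100_000_000         # a 100 Mbit/s connection we'd complain about

# The conscious bottleneck discards all but one part in 100 million of the input.
compression_ratio = sensory_input_bps // conscious_output_bps
print(compression_ratio)  # 100000000

# And behavioral output is ten million times slower than that Wi-Fi link.
wifi_vs_brain = home_wifi_bps // conscious_output_bps
print(wifi_vs_brain)  # 10000000
```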

And it gets worse when you consider the output ports. Unless you have a brain-computer interface surgically implanted, you are limited to a handful of biological I/O channels: your hands to write or type, your voice to speak, your facial expressions and body language to gesture. That’s it. Those are the only ways to get information out of your head and into the world. On the input side, you have eyes, ears, and a few other sensors like skin for temperature and touch. These are ancient interfaces — remarkably capable in many ways, but narrow. A computer can dump its entire memory to an external drive in seconds. You have to sit there and type, one character at a time, at roughly 10 bits per second.
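To put the “one character at a time” point in numbers, here is a toy calculation of how long it would take to externalize a single megabyte of notes through the 10-bits-per-second channel, versus copying it over a nominal 5 Gbit/s USB 3 link. The figures are illustrative, not a claim about any particular person’s typing speed:

```python
megabyte_bits = 8 * 1024 * 1024   # one megabyte of notes, in bits
brain_bps = 10                    # behavioral output rate from the Caltech estimate
usb3_bps = 5_000_000_000          # nominal USB 3.0 signaling rate

# Pushing a megabyte out through the biological port, nonstop:
seconds_by_hand = megabyte_bits / brain_bps
days_by_hand = seconds_by_hand / 86_400
print(f"{days_by_hand:.1f} days")  # roughly 9.7 days of continuous output

# The same megabyte over an external drive connection:
seconds_by_usb = megabyte_bits / usb3_bps
print(f"{seconds_by_usb * 1000:.2f} ms")  # under two milliseconds
```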

We can map this directly against a computer system. A computer has high-bandwidth I/O: USB, Thunderbolt, network interfaces capable of gigabits per second. It has persistent storage that doesn’t degrade or confabulate. And it has standardized formats — text, images, structured data — that other machines can read without ambiguity. You, by contrast, have low-bandwidth biological ports, no persistent storage, and your output format is natural language — which is, to put it charitably, lossy. Subject to interpretation, emotion, ambiguity, and the limitations of your vocabulary.

The output problem

Given all of this — no hard drive, hallucinating RAM, and terrible I/O — the logical move is obvious: write things down. Externalize. Give yourself a hard drive, even if the connection cable is slow.

And yet most of us don’t. Or we do it badly and inconsistently. We sit in an hour-long meeting and walk out with nothing on paper. We have a week’s worth of thoughts rattling around in our heads and never commit any of them to a document. We treat writing things down as optional — something for students and diarists — rather than what it actually is: a critical piece of infrastructure for a system that otherwise leaks and distorts.

Research backs this up. A study published in Frontiers in Psychology found that the physical act of writing activates widespread connectivity across brain regions responsible for movement, vision, sensory processing, and memory. When you write something down, you aren’t just copying information — you’re re-processing it, forcing your brain to prioritize, consolidate, and structure. You’re running the data through the CPU a second time and this time saving the output somewhere it can persist. Somewhere it won’t be quietly rewritten by your mood, your biases, or the simple passage of time.

But here’s the deeper problem

Even when we do write things down, we’re not very good at it. This isn’t a controversial claim. Three out of four high school students score below proficient in writing. Employers report that fewer than half of new graduates can communicate adequately in writing. U.S. companies spend over $3 billion a year on writing remediation. And thanks to the Dunning-Kruger effect, most poor writers don’t even realize they’re poor writers.

We spend years learning to write in school, and we emerge… functional. We can string sentences together. But clearly expressing a complex thought — capturing an idea with enough precision that someone else (or your future self, or an AI tool) can pick it up and actually use it — that’s a different skill entirely, and one most of us never really master.

You can see this playing out in real time with how we use AI. We have been given the most powerful text-processing tools in human history, and our dominant mode of interaction with them is the chatbot: short prompts, a few sentences at most. The typical ChatGPT message is 200 to 500 characters — roughly the length of a long text message. We’re not writing detailed briefs, structured problem statements, or rich context documents and feeding them to AI. We’re pecking out casual one-liners and hoping the machine figures out what we mean. This isn’t a failure of the technology. It’s a reflection of how we’ve always communicated: in short bursts, with minimal structure, relying on the other party — now an AI — to fill in the gaps. We are, once again, asking a system to reconstruct our intent from fragments. The same pattern our own brains use. The same pattern that leads to hallucinations.

Input → Process → Output → Store

The computer science model is simple: input, process, output. But computers have two things we don’t — persistent, faithful storage between one processing cycle and the next, and high-bandwidth ports to get information in and out. Every time you think something through and don’t write it down, you’re running a computation and then throwing away the result. Worse than throwing it away, actually — you’re leaving behind a corrupted version that your brain will confidently present as the original.
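The input → process → output → store loop can be sketched as a tiny program. Everything here is hypothetical scaffolding (the function names and the log file are invented); the point is only that the “store” step is an explicit write — nothing does it for free, and skipping it throws the computation away:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical persistent store: an append-only log on disk.
LOG_PATH = Path(tempfile.gettempdir()) / "thought_log.jsonl"

def process(raw_input: str) -> str:
    """The 'CPU' step: turn raw input into a conclusion (trivially, here)."""
    return f"decision: {raw_input.strip().lower()}"

def think_and_store(raw_input: str) -> str:
    """Run one full cycle: input -> process -> output -> store."""
    output = process(raw_input)
    # Without this write, the result of the computation would be lost.
    with LOG_PATH.open("a") as f:
        f.write(json.dumps({"input": raw_input, "output": output}) + "\n")
    return output

think_and_store("Ship The Draft By Friday")

# A later cycle reloads the stored result verbatim instead of
# reconstructing it from fragments.
last_record = json.loads(LOG_PATH.read_text().splitlines()[-1])
print(last_record["output"])  # decision: ship the draft by friday
```

The contrast with biological memory is the retrieval step: the log returns what was written, not a plausible reconstruction of it.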

This matters more now than it ever has. We live in an era where the information you externalize doesn’t just sit in a notebook. It can be searched, shared, fed into AI systems, re-processed, and built upon by others. A well-written note from a meeting isn’t just a memory aid — it’s an asset. A poorly written one (or a nonexistent one) is a dead end. You’re not just writing for yourself anymore. You’re writing for every future human and machine that might need to pick up where your thinking left off.

So write it down. Imperfectly. Messily. In bullet points or paragraphs or voice memos you transcribe later. The format matters less than the act. You are not a reliable narrator of your own thoughts — and the only cable connecting your mind to the outside world is painfully slow. But a document is persistent. It doesn’t confabulate. It doesn’t degrade. It gives your extraordinary, hallucinating, 10-bits-per-second processor the two things it’s missing: a hard drive, and a check against its own confabulations.

Build yourself a hard drive. Stop trusting the RAM.