I still remember the first survival skill I learned on a Commodore 16: check how much memory was left. You had to. The machine had 16 kilobytes of RAM, and the BASIC interpreter already ate a good chunk of it before you typed a single line of code. If you didn’t watch it, your program would crash. Monitoring wasn’t a habit. It was survival.

That was 1985. Forty-odd years later, I’m sitting at a MacBook Pro with 32 gigabytes of RAM, a phone that manages background processes with more sophistication than most early operating systems could dream of, and an Apple Watch that independently negotiates power consumption based on activity context. And until a few months ago, I still had battery percentage visible on every single device, CPU and memory usage in the menu bar, and a faint anxiety that would spike every time I noticed a number trending in the wrong direction.

I was still flying a Commodore 16. In my head, at least.

The legacy of machines that needed babysitting

The generation that started computing in the 1980s and early 1990s did not have the luxury of trusting hardware. The Commodore 16, like its more famous sibling the C64, demanded constant awareness. You counted bytes. You knew that loading a sprite ate into the variable table. You learned to PEEK and POKE memory addresses not out of curiosity but out of necessity, because the machine would not save you from yourself.

DOS was no different. You managed CONFIG.SYS and AUTOEXEC.BAT with surgical attention, tuning conventional memory to squeeze one more kilobyte below the 640KB ceiling so a game or application would actually load. Windows 3.1 crashed with enough frequency that watching the system resources gauge (remember that applet in the Program Manager?) was a perfectly rational behavior. It told you when the inevitable was approaching.

A habit formed under genuine constraint became baked into identity. By the time Windows 95 arrived and things became marginally more stable, the monitoring reflex was already wired in. Opening Task Manager was second nature. Watching CPU spikes was reassuring, almost meditative. The system could do things without your awareness, and vigilance was how you stayed in control.

This is not nostalgia. It is the origin story of a dysfunction.

How a security mindset makes everything worse

Spend enough time in DFIR and threat intelligence, and you develop a specific kind of attention to your environment. You learn to notice anomalies. An unusual process, a spike in outbound traffic, a disk access pattern that doesn’t match normal behavior: these are the signals that matter in incident response. The difference between catching a compromise early and explaining a breach to a client often comes down to someone noticing something that looked slightly off.

This is valuable. It is also, when applied to everyday device use, completely exhausting and mostly useless.

I have spent years working in security architecture and CTI, building detection logic, analyzing memory dumps, and teaching others to read system behavior as a narrative of intent. That training made me hyper-aware of resource consumption not because it was operationally useful on my personal devices, but because it felt consistent with professional rigor. Monitoring everything, always, was what responsible technical people did. Or so I told myself.

The problem is that the threat model for incident response and the threat model for “I’m writing a blog post on a Tuesday afternoon” are not even remotely the same. In IR, a 40% CPU spike on a server that should be idle is a potential indicator of compromise. On my laptop, it means Chrome opened another tab. Applying the same cognitive weight to both is not rigorous, it is noise-tolerance miscalibration. And miscalibrated noise tolerance, as any detection engineer will tell you, leads to alert fatigue, and alert fatigue leads to missing what actually matters. I have seen the same pattern in enterprise settings too, where teams build dashboards that look impressive but quietly bury what matters most, as I described in The great SOC charade.

The irony is perfect: the security training that was supposed to make me more attentive was making me less attentive where it counted, by flooding my peripheral vision with irrelevant data.

What modern operating systems actually do

Here is the part where I give modern systems the credit they have earned, and that credit is long overdue.

Modern operating systems are extraordinarily sophisticated at resource management. macOS uses a combination of compressed memory, memory pressure signals, and App Nap to handle RAM in ways that make the raw “used/free” number almost meaningless as a proxy for system health. Apple’s Activity Monitor documentation explains how memory pressure and compressed memory work together to keep performance stable under load, while Apple’s own developer documentation describes App Nap as conserving battery life “by regulating the app’s CPU usage and by reducing the frequency with which its timers are fired.” When Activity Monitor shows 28GB of 32GB in use, that is not automatically a crisis, that is often the OS doing its job correctly. The number that matters is sustained memory pressure with performance impact, not high usage by itself.

Android’s Doze mode, introduced in Android 6.0 and progressively refined in Android 7.0, goes even further in its autonomy. According to the official Android platform documentation, “Doze extends battery life by deferring app background CPU and network activity when a device is unused for long periods.” The system periodically exits Doze during maintenance windows to let apps complete pending syncs, jobs, and alarms, then returns to the restricted state with increasing intervals over time. The OS has a model of your behavior. Second-guessing it with manual monitoring is roughly equivalent to standing next to an autopilot and occasionally twitching the controls because the numbers on the screen make you nervous.
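
If you would rather watch that handoff than take the documentation’s word for it, the platform ships the hooks: the dumpsys deviceidle commands below come from the official Doze testing guidance. The small Python wrapper is just convenience and assumes adb is installed with a test device connected.

```python
#!/usr/bin/env python3
"""Watch Doze happen instead of second-guessing it (adb + test device assumed).

A minimal sketch around the dumpsys deviceidle commands from the official
Doze testing guidance: simulate unplugging, force the device into deep idle,
read the state back, then hand control back to the OS.
"""
import subprocess

def adb_shell(*args: str) -> str:
    """Run a command on the connected device via adb and return its output."""
    result = subprocess.run(
        ["adb", "shell", *args], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

if __name__ == "__main__":
    adb_shell("dumpsys", "battery", "unplug")         # pretend we are off the charger
    adb_shell("dumpsys", "deviceidle", "force-idle")  # enter Doze immediately
    print("Deep idle state:", adb_shell("dumpsys", "deviceidle", "get", "deep"))
    adb_shell("dumpsys", "deviceidle", "unforce")     # let the OS manage itself again
    adb_shell("dumpsys", "battery", "reset")          # restore the real charging state
```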

The machines we use today are not Commodore 16s. They do not need constant supervision to function. Treating them as if they do is not caution, it is cargo-cult behavior: performing the rituals of an earlier technological era without the constraints that made those rituals necessary.

The cost of constant vigilance

Attention is not free. This is the thing I had to explicitly tell myself, because decades of technical culture had convinced me that monitoring was always additive, that more awareness was always better than less. It is not.

Every glance at a battery percentage is a context switch. It is small, almost invisible, but it is real. The eye moves, the brain briefly shifts register, evaluates the number, computes a projection (“if I’m at 67% now and I have two hours of meetings left…”), and then returns to whatever it was doing. The research on this is not ambiguous. Gloria Mark and colleagues documented that interrupted knowledge work leads to measurable stress and a significant resumption lag before people fully re-engage with the original task. Even brief switches carry a cost: Rubinstein, Meyer, and Evans showed that task switching reduces efficiency, and Sophie Leroy’s work on attentional residue explains why part of your mind remains stuck on the previous task after every interruption.

More importantly, those micro-interruptions are not neutral. They pull you out of flow states. They introduce low-level anxiety (“I should probably plug in soon”) that occupies background cognitive capacity even when you are not consciously thinking about it. I recognized this pattern clearly when I finally turned off the percentage display on my watch: for the first week, I kept reflexively glancing at my wrist and feeling a faint unease at not seeing the number. That unease had nothing to do with actual battery status. It was withdrawal from a monitoring habit that had become compulsive.

There is a concept in security operations that maps well here: signal-to-noise ratio. Good detection engineering is not about monitoring more, it is about monitoring the right things and aggressively suppressing everything else. A SOC that alerts on every process creation event is not more secure than one with well-tuned rules. It is just slower and more exhausted. The same principle applies to personal attention management. The question is not “what can I monitor?” but “what actually matters, and what am I monitoring out of habit?” This is also the same tradeoff behind the automation debate in security operations: tools are essential, but without tuning and human context they create the illusion of control, a point I explored in The automation trap.

There are still moments when active monitoring is absolutely the right move. If your laptop suddenly runs hot at idle, battery drain changes abruptly after an update, or you notice repeated network activity that does not fit your normal pattern, investigate. In those cases, you are not feeding a compulsion, you are responding to an exception. The same exception-driven mindset is what makes incident response effective under pressure, and it is why I keep coming back to it even when discussing everyday habits, including in my analysis of trust boundaries in incident response teams.

Letting go without going blind

I want to be clear that I am not suggesting you turn off everything and trust blindly. That would be naive, and professionally incoherent coming from someone whose work involves finding exactly the artifacts that systems quietly generate and rarely surface. There is a difference between informed detachment and willful ignorance.

The distinction I have landed on is this: monitor for exceptions, not for the continuous state. On my Mac, I no longer have a CPU usage graph in the menu bar. I do have an application that I can open when the fan spins up unexpectedly, which is the actual signal worth responding to. On my iPhone, I do not display battery percentage unless I am below roughly 20%, which is the only threshold where behavior change (finding a charger) actually makes sense. On my Apple Watch, I check battery status once in the morning and once before bed. That is sufficient.
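
For what it is worth, the same exception-driven idea fits in a few lines. The sketch below assumes macOS and its standard pmset utility: it reads the battery level once and says nothing unless it is below the threshold where behavior actually changes. The output parsing is a best-effort assumption about pmset’s format.

```python
#!/usr/bin/env python3
"""Battery check that only speaks below the threshold that changes behavior.

A minimal sketch of exception-driven monitoring on macOS: read the level once
via the standard pmset utility and stay silent unless finding a charger
actually matters. The regex is a best-effort assumption about pmset's output.
"""
import re
import subprocess

THRESHOLD = 20  # below this, knowing the number changes what you do next

def battery_percent() -> int | None:
    """Return the current battery percentage, or None if no battery is reported."""
    out = subprocess.run(
        ["pmset", "-g", "batt"], capture_output=True, text=True, check=True
    )
    match = re.search(r"(\d{1,3})%", out.stdout)
    return int(match.group(1)) if match else None

if __name__ == "__main__":
    pct = battery_percent()
    if pct is not None and pct < THRESHOLD:
        print(f"Battery at {pct}% -- time to find a charger.")
    # Otherwise: say nothing. That is the point.
```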

The time reclaimed is not enormous in absolute terms, but the quality of attention reclaimed is significant. Without a dozen small numerical distractions in my peripheral vision, I am more present in the work itself. It may sound like self-help language, but in practice it is a simple behavioral loop: remove the stimulus, reduce the compulsion. The Commodore reflex does not fire if there is nothing to trigger it.

If you grew up with machines that needed constant babysitting, you probably carry some of the same habits I did. Those habits were rational once. The machines have moved on. It may be time we do the same.

The 16 kilobytes are long gone. You can stop counting them now.