Career Dish · Real jobs, real talk

Is Cybersecurity Stressful?

~18 min read · 6 voices

We asked six cybersecurity professionals one question. None of them mentioned the hackers.

These characters are composites, built from dozens of real accounts, interviews, and community threads. The people aren't real. The experiences are.

What stresses you out most about this job?

The Alert Volume

Marlon

27 · SOC analyst (Tier 1) at a managed security services provider in Dallas, Texas · 2 years in cybersecurity

I triaged 73 alerts yesterday. Seventy-three. My shift is 6 PM to 6 AM. I cover eight client environments simultaneously, which means I'm monitoring about 14,000 endpoints across eight different SIEMs. Well, technically seven, because two of the clients share a multi-tenant Sentinel deployment. One still uses ArcSight, which tells you something about their budget priorities.

Of those 73 alerts, four required investigation beyond the initial triage. Two turned out to be authorized IT activity that hadn't been whitelisted yet. One was a user who installed a browser extension that triggered our malware detection because the extension communicates with a domain that shares hosting infrastructure with a known C2 server. That took me 40 minutes to chase down and ultimately close as benign. The fourth was an actual malicious email that made it past the client's email gateway and was opened by an employee. I contained the endpoint and escalated to the client's incident response point of contact, a guy named Philip who answered on the third ring at 2:30 AM and sounded like he'd been sleeping for approximately four minutes.

The stress isn't the 73 alerts. I can process alerts. The stress is that somewhere inside those 73 alerts, one of them might be the one that matters, and if I close it too fast because I'm moving through the queue, and it turns out to be a legitimate compromise, that's on me. My team lead, Arturo, calls it "the needle in the noise problem." You're looking for one real threat inside a haystack of false positives, and the haystack keeps getting bigger, and your shift doesn't get longer. I have the same 12 hours tonight. The alert volume is up 15 percent from six months ago because two of the clients onboarded new cloud workloads and didn't tune the detection rules. So I'm processing more alerts per hour with the same brain. And the brain is the bottleneck.

You're looking for one real threat inside a haystack of false positives, and the haystack keeps getting bigger, and your shift doesn't get longer.
— Marlon

The On-Call Rotation

Petra

36 · Senior security engineer at an e-commerce company in Portland, Oregon · 8 years in cybersecurity

I'm on call one week out of every four. That doesn't sound bad until you live it. On-call means the PagerDuty app on my phone is armed 24/7 for that week. If a critical security alert fires at 3 AM, my phone screams, and I have 15 minutes to acknowledge it and 30 minutes to begin investigation. Fifteen minutes. That's the SLA we committed to with the business. I didn't write the SLA. My predecessor did. But I inherit it every fourth week.

Last rotation, I got paged at 1:47 AM on a Wednesday because our WAF detected a potential SQL injection attempt against the product search API. I was asleep next to my partner, Kai, who also woke up because the PagerDuty alarm is designed to wake the dead. I got out of bed, opened my laptop at the kitchen table, connected to the VPN, and started investigating. The SQL injection attempt was real; someone was probing the search parameter, but the WAF had blocked it. The attacker was using automated tooling, probably SQLMap, from a residential proxy in Romania. I confirmed the WAF rules had caught every attempt, verified no data had been accessed, blocked the source IP range, and documented everything. Total time: about 90 minutes. I went back to bed at 3:20 AM. My alarm went off at 7:00 AM. I went to work.

Kai has stopped asking "was it serious?" when the phone goes off. The question now is "how long?" Because the duration determines whether the rest of the night is salvageable. Ninety minutes is recoverable. The three-hour incident response I had in October, that one ruined the next day entirely. I was useless in meetings. My manager, Beth, told me to take the afternoon off. I did. But the backlog from missing half a day of work was waiting for me the next morning. The on-call week doesn't pause the regular workload. It layers on top of it. My team has four engineers. The rotation should be one in five or one in six to be sustainable. We're at one in four because we can't fill the open headcount. We've been trying to hire a fifth engineer for nine months. The market for senior security engineers is extremely competitive and our salary band tops out at $165,000, which is not enough for Portland when Amazon and Intel are offering $190,000 to $210,000 with stock. So we stay at four. And the rotation stays at one in four. And Kai stays awake on rotation weeks because the phone might go off.

Kai has stopped asking "was it serious?" The question now is "how long?" Because the duration determines whether the rest of the night is salvageable.
— Petra

The Asymmetry

Devonte

39 · Incident response lead at a defense contractor in Northern Virginia · 11 years in cybersecurity

The asymmetry is the thing nobody outside of this field understands. An attacker needs one way in. I need to defend every way in. Every port, every endpoint, every user, every application, every vendor integration, every API, every misconfigured cloud storage bucket. I need all of them to be secure. The attacker needs one of them to be wrong. That's the math. And the math never changes.

I've done incident response for eleven years. I've investigated breaches where the initial access was a phishing email, an unpatched VPN, a stolen credential from a third-party breach, a developer who pushed an AWS access key to a public GitHub repository, and a USB drive left in a parking lot. That last one was at a defense subcontractor in 2018. Someone plugged in a USB drive they found on the ground. In a parking lot of a defense contractor. The drive contained a payload that established a reverse shell within 40 seconds of being plugged in. We found it 11 days later during a routine threat hunt. Eleven days. My colleague at the time, a woman named Candace, she's the one who spotted the anomalous outbound connection during the hunt. Without her, it might have been 30 days. Or 90. Or we might have found out when a journalist called.

The stress isn't the individual incident. I know how to run an incident. I have playbooks. I have a team of six analysts. We train for this. The stress is the knowledge that right now, right this moment, there might be something in our environment that we haven't found yet. Not because we're bad at our jobs. Because the adversary is patient and resourced and doesn't have to follow our rules. That awareness doesn't turn off. My wife, Yolanda, she says I "scan" rooms when we go to restaurants. She's joking. But she's also describing something real about how this job rewires your brain. You start seeing attack surfaces everywhere. The restaurant's point-of-sale system is running Windows 7. The hotel's WiFi has a captive portal with an expired certificate. My daughter's school uses a password of "student123" for the parent portal. I notice all of it. I can't stop noticing it. That's the cost of eleven years of knowing what's possible.

The attacker needs one way in. I need to defend every way in. That's the math. It never changes.
— Devonte

The Communication Gap

Lindsey

33 · Security awareness manager at a pharmaceutical company in Research Triangle Park, North Carolina · 6 years in cybersecurity

My job is to make 6,000 employees care about security. Not understand it. Care about it. There's a huge difference. Understanding is a PowerPoint. Caring is behavior change. And behavior change in a workforce of 6,000 people who are mostly research scientists, sales reps, and manufacturing line workers is, I'm going to be honest, often impossible and always exhausting.

I run the phishing simulation program. Every month I send a simulated phishing email to the entire company. The email is designed to look realistic. Last month's simulation impersonated a benefits enrollment reminder from HR. It had the right logo, the right sender formatting, and a link to a page that looked like our benefits portal. Fourteen percent of employees clicked the link. That's 840 people. 840 people who would have entered their credentials on a fake login page if this had been real. I know every one of their names because the system tracks it. The repeat clickers, the people who fail the simulation month after month, I know them too. There are about 200 people in this company who will click on anything that looks vaguely official. I've sent targeted training to those 200 people three times. The click rate among that group has dropped from 38 percent to 22 percent. Progress. But 22 percent of 200 is still 44 people who will hand their credentials to anyone who asks nicely.
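The arithmetic behind those figures is easy to verify. A quick sketch, using only the numbers from Lindsey's account (nothing here is real company data):

```python
# Back-of-envelope check of the phishing-simulation figures from the account above.

employees = 6000
company_click_rate = 0.14                    # last month's company-wide simulation
company_clickers = round(employees * company_click_rate)

repeat_group = 200                           # habitual clickers sent targeted training
rate_before, rate_after = 0.38, 0.22         # click rate before / after three rounds
still_clicking = round(repeat_group * rate_after)

print(company_clickers)   # 840
print(still_clicking)     # 44
```

The numbers check out: a 14 percent rate across 6,000 people is 840 credential exposures per simulated campaign, and even the improved 22 percent rate in the repeat group leaves 44 reliable clickers.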

The stress is knowing that my entire program, the training, the simulations, the awareness campaigns, all of it can be defeated by one person clicking one link on one bad day. I had a conversation with our CISO, a man named Bennett, about this. He asked me what the click rate would need to be for me to feel comfortable. I said zero. He laughed. Then he stopped laughing because he realized I meant it. Zero is the number that would make me sleep well. And zero is impossible. So I don't sleep well. I sleep adequately, on a good month, knowing that somewhere in the building there's a research scientist who will click a link that says "URGENT: Your password expires in 4 hours" because the urgency overrides everything I taught them in the 30-minute annual training module that they completed in 11 minutes because they had it playing in a background tab.

My entire program can be defeated by one person clicking one link on one bad day. I asked what click rate would make me comfortable. I said zero. Zero is impossible.
— Lindsey

The Skill Treadmill

Osman

44 · Cloud security architect at a SaaS company in Austin, Texas · 16 years in IT/security

I'm 44 years old and I study for certifications the way my son studies for his SATs. That's the part of this career nobody warned me about. In most fields, you accumulate expertise over decades and it compounds. In cybersecurity, your expertise has a half-life. The attack techniques I learned five years ago are partially obsolete. The cloud architectures I was securing three years ago have been replaced by newer services with different security models. My CISSP, which I earned in 2016 and maintain with continuing education credits, required 40 hours of professional development this year to keep active. Forty hours. That's a week of my life every year just maintaining a certification I already have.

Last year I spent about $4,800 of my own money on training and certifications. The company reimburses $3,000 per year, which is generous compared to some places. The remaining $1,800 came out of my pocket. I took the AWS Security Specialty exam, which cost $300 and required about 120 hours of study. I did an online course on Kubernetes security because half of our production workloads are running on EKS now and two years ago I couldn't spell Kubernetes. I attended two conferences, Black Hat and fwd:cloudsec, which cost about $2,000 combined after early bird registration.

My wife, Farida, she's an optometrist. She completed her residency in 2012 and her core clinical skills have been largely stable since then. She learns new techniques and technologies, sure, but the fundamental knowledge base of optometry doesn't reorganize itself every 18 months. Mine does. I've been in this field since I was 28. In that time, the primary infrastructure model has shifted from on-premises to hybrid to cloud-native. The primary threat model has shifted from perimeter-based attacks to identity-based attacks to supply chain attacks. The tools I use have changed three times over. If I stop learning for two years, I'm unemployable at my current level. Not figuratively. Literally. The job postings for cloud security architects in 2024 require knowledge of services and attack patterns that didn't exist in 2022. Farida's job postings haven't changed fundamentally in a decade. I am deeply jealous of this and she does not understand why.

If I stop learning for two years, I'm unemployable at my current level. Not figuratively. Literally.
— Osman

The Weight of Knowing

Blythe

48 · Director of information security at a regional hospital system in Milwaukee, Wisconsin · 19 years in healthcare IT/security

I know, right now, sitting here, that there are three significant security gaps in our environment that I have documented, reported to leadership, and been unable to get funding to fix. I know that our MRI machines run Windows 10 and the manufacturer won't support an upgrade until 2027. I know that our patient portal has a session management weakness that our application security vendor identified eight months ago and that our development team has deprioritized twice because they're working on features the CEO considers higher priority. I know that our backup infrastructure has not been tested for a full disaster recovery scenario in 14 months because the test requires a maintenance window that operations has declined to schedule.

I know all of this. I have written memos about all of this. I have risk acceptance forms signed by the COO for two of the three issues. The third is still in review. And the stress, the specific stress that I carry, is not about the technical problems. I know how to fix all of them. The stress is that I am the person who knows, and the organization has chosen to accept risks that I have told them are unacceptable, and if something goes wrong, I will be the person on the stand explaining what we knew and when we knew it.

I have two kids. Miles is 14. June is 11. They know I work in "hospital computers." They don't know that their mother lies awake sometimes thinking about a ransomware attack that could lock medical records for 19,000 patients and that the backup recovery plan hasn't been validated in over a year. They don't know that I've started documenting every risk acceptance decision in a personal file that I keep on an encrypted external drive at home. Not because I'm paranoid. Because I watched what happened to the CISO at that health system in Iowa who got fired after a breach and then found out the organization was blaming decisions on him that had been made by the CFO. Documentation is survival. My husband, Clayton, found the external drive once and asked what it was. I said "insurance." He thought I meant the company. I meant mine.

I am the person who knows. The organization has chosen to accept risks I've told them are unacceptable. If something goes wrong, I'll be the one explaining what we knew and when.
— Blythe

What We Noticed

The stress is almost entirely about accountability without adequate control.

Marlon is accountable for 14,000 endpoints he didn't configure. Lindsey is accountable for 6,000 people's clicking habits she can't control. Blythe is accountable for risks the organization chose to accept against her recommendation. The pattern is consistent: cybersecurity professionals are expected to prevent outcomes they don't have full authority to prevent. The gap between responsibility and authority is where the stress lives.

Nobody mentioned the adversaries.

Not one of the six people described the attackers as their primary stressor. Devonte came closest, describing the asymmetry of the defender's position. But even he framed it as a structural problem, not a personal one. The hackers are a known variable. The organizational friction, the budget constraints, the staffing gaps, the sleep deprivation, the knowledge that decays, those are the things that actually wear people down.

The job follows everyone home, but the shape changes by seniority.

Marlon's stress is about the next shift and the alerts in the queue. Petra's stress is about the phone ringing at 1:47 AM and what that does to a relationship. Osman's stress is about relevance: will his skills still be current in two years? Blythe's stress is about legacy: what happens if the thing she warned about actually happens? The more senior the person, the longer the time horizon of the worry.


Frequently Asked Questions About Cybersecurity Stress

What is the most stressful part of cybersecurity?

The most commonly cited stressors are alert fatigue, the asymmetric pressure of needing to prevent every breach while attackers only need one success, chronic understaffing and unsustainable on-call rotations, difficulty justifying security budgets to leadership, and increasing personal legal liability for security leaders. These stressors compound over time and are structural rather than task-based.

Is cybersecurity burnout common?

Yes. Industry surveys consistently report that 60 to 70 percent of cybersecurity professionals experience burnout symptoms. Burnout is most acute among SOC analysts and incident responders who work rotating shifts and handle high alert volumes. Contributing factors include 24/7 on-call expectations, the constant need to upskill, and the psychological weight of knowing that a mistake could lead to a significant breach.