Frame the problem, then land it in their world. Lead with curiosity, not conclusions.
This is the most important moment. You are here to listen. Nobody asks them this. When you do, they relax.
"What does a typical week look like for your team?"
"What's the mission? Who do you serve?"
Listen more than you talk. Let them tell you where the concerns are before you name them.
"And honestly? Ten years ago, that was a perfectly valid strategy. Hackers were human. They chose banks and big corporations because that's where the money was. You were safe because you were invisible."
Bad actors now use bots that scan the internet 24 hours a day. The bot doesn't know you are a nonprofit; it only knows your digital door is unlocked. You aren't targeted because of who you are, but because of what you are: an opportunity.
"Then came the AI factor. Today, AI allows attackers to write perfect, personalized emails to your staff in seconds. They can mimic your ED's writing style. They can clone voices for phone scams. The fakes are now too good for 'sharp eyes' alone."
Choose one or two that land for this person.
- NGO sector: #2 most targeted by nation-state actors worldwide (31%), behind only IT. (Microsoft Digital Defense Report)
- 68% of breaches involved a human element: clicking a link, reusing a password. Not a tech problem; a people problem.
- 80% of donors say that if they become aware of a breach, they will not give. (give.org) This changes the math for any organization that depends on donor trust.
- For a business, a breach costs money. For you, it costs trust. Trust is the one asset you can't buy back with insurance.
But you operate on a different currency: trust. The trust of a donor who writes a check. The trust of a volunteer who gives their time. If your data is leaked, or your email is used to scam your community, you can't write a check to fix that.
"We need to move from 'Security by Obscurity' to 'Security by Stewardship.' It's not about becoming a fortress; it's about treating your digital keys with the same care you treat your physical keys."
Seven questions that reveal more about security posture than most full audits.
Move from the snapshot to the next step. Match your close to their readiness.
If donor trust came up: 80% of donors won't give after becoming aware of a breach. Closing those windows protects the relationships that fund the mission.
The Tech: Website, email, devices, backups.
The Business: Vendor contracts, insurance, finance systems.
The People: Onboarding, policies, AI use.
The Foundations Report: Narrative summary across 14 domains. Plain English.
The Action Plan: Prioritized by effort and impact.
And a review conversation. You will never be left staring at a document without guidance.
Fixed fee: $1,200 to $1,800. No surprises, no scope creep.
"If this isn't the right moment, that's fine. I have something I'd like to leave with you."
Online: cyberwisesolutions.net · /services · /pricing
Printed: Hand it over. Physical artifacts carry weight. A printed brief on the table is proof of investment in the relationship.
The Readiness Brief demonstrates the Cyberwise ethic: a self-assessment tool before the formal engagement, so they can decide for themselves.
Every one is real, sympathetic, and understandable. None are arguments to be won. Clear, candid, calm, and supportive – always.
The Foundations Assessment covers 14 domains across people, systems, data, and emerging tools. Each one is there for a reason.
It IS: A clear, calm picture of where the organization stands today – so they can make thoughtful decisions about tomorrow. Use the word "clarity." The instrument is standardized across all clients. Only the report and action plan are bespoke.
It is NOT: A compliance audit. A pass/fail score. An assessment of staff performance. A measure of how "behind" they are. Avoid the word "audit" entirely.
Time: 30–60 minutes. Who: The person with the broadest operational view – executive director, ops manager, office administrator. Technical expertise not required.
Foundations Report: Narrative summary across all 14 domains. Plain English. Designed for leaders, board members, and non-technical staff.
Action Plan: Prioritized next steps – immediate, short-term, and long-term. Sized for small teams and real capacity.
Review Conversation: A guided walk-through of findings. They will never be left staring at a document without guidance.
Path Forward: Partner with Cyberwise for implementation support, or move independently. No pressure either way.
Pricing: $1,200–$1,800 fixed fee. Implementation Support: Guided $250/mo · Partnered $400/mo · Embedded $600/mo.
Cybersecurity without clear ownership is nobody's job. This domain asks who is actually responsible for the digital environment – and how that responsibility is understood by leadership and the board. The structures that govern every other area of an organization's work also determine how well it responds when something goes wrong digitally. If the honest answer to "who owns this?" is "kind of everyone," the practical answer is that no one does.
Data you cannot find, you cannot protect. This domain maps where organizational information actually lives in practice – not where the policy says it should live, but where it ends up. Personal devices, personal email accounts, informal cloud storage, the shared folder that predates the last three executive directors. Good data hygiene is one of the highest-return investments a small organization can make, and it costs considerably less than almost everything else on the action plan.
Every account that was ever created, every login that was ever shared, every system that was never properly locked down when someone left – this domain maps the full landscape of access across the organization. It examines how accounts are managed, whether multi-factor authentication is in place, how shared credentials are handled, and what the offboarding process actually looks like in practice versus what the policy describes. The most common attack vector in the sector is an unlocked door that nobody remembered to close.
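If you want to show rather than tell, the access review this domain describes can be made concrete in a few lines. A minimal sketch, assuming a hypothetical CSV export of accounts; the column names, addresses, and 90-day staleness threshold below are illustrative assumptions, not any real platform's export format:

```python
import csv
import io
from datetime import date

# Hypothetical account export. Real platforms (Microsoft 365, Google
# Workspace) use different column names and formats.
ACCOUNTS_CSV = """\
user,last_login,status,mfa_enabled
sandra@example.org,2024-05-01,active,yes
former.staff@example.org,2023-01-15,active,no
shared-frontdesk@example.org,2024-04-20,active,no
"""

def flag_risks(csv_text, today=date(2024, 6, 1), stale_days=90):
    """Flag accounts that look like unlocked doors: stale logins,
    missing MFA, or obviously shared credentials."""
    flags = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        last_login = date.fromisoformat(row["last_login"])
        if (today - last_login).days > stale_days:
            flags.append((row["user"], "stale login"))
        if row["mfa_enabled"] != "yes":
            flags.append((row["user"], "no MFA"))
        if row["user"].startswith("shared-"):
            flags.append((row["user"], "shared credential"))
    return flags

for user, reason in flag_risks(ACCOUNTS_CSV):
    print(f"{user}: {reason}")
```

Three questions, one pass: who has not logged in lately, who lacks MFA, which credentials are shared. The real finding in most assessments is that the organization cannot produce even this export.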
This domain looks at how devices are managed, secured, and tracked – including personally owned devices that staff use for work. The most common vulnerabilities here are the ones nobody thought to address because they never came with an official label: the volunteer's personal laptop, the outdated phone, the device that left the organization but still has access to shared systems. If you can't see it, you can't secure it.
Email is the primary vector for the vast majority of successful cyberattacks – and the back door that is most often left unlocked. This domain examines how email is configured, who has access, whether multi-factor authentication is in place, and how communication tools beyond email (Slack, WhatsApp, group texts) are used for organizational business. The gap between official policy and actual practice is often widest here. It is also where the most consequential quick wins live.
When ransomware locks every file on a network, the backup strategy is the difference between paying a ransom and restoring from yesterday. This domain examines not just whether backups exist, but whether they are automated, isolated from the main network, tested, and accessible when needed. An untested backup is an assumption. Most organizations discover the difference between assumption and reality at the worst possible moment.
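The difference between an assumption and a tested backup fits in a few lines. A minimal sketch of the back-up-then-prove-restore loop, Python standard library only, with hypothetical file names: copy the data out, restore it to a separate location, and verify byte for byte.

```python
import hashlib
import tarfile
import tempfile
from pathlib import Path

def sha256(path):
    """Checksum a file so original and restored copies can be compared."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

with tempfile.TemporaryDirectory() as workdir:
    work = Path(workdir)

    # 1. The "live" file we care about (hypothetical).
    original = work / "donors.csv"
    original.write_text("name,email\nJane Doe,jane@example.org\n")

    # 2. Back it up to a compressed archive.
    archive = work / "backup.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(original, arcname="donors.csv")

    # 3. Restore to a separate location and verify byte for byte.
    restore_dir = work / "restore"
    with tarfile.open(archive) as tar:
        tar.extractall(restore_dir)
    restored = restore_dir / "donors.csv"

    verified = sha256(original) == sha256(restored)
    print("restore verified" if verified else "RESTORE FAILED")
```

The script is not the point; the loop is. A backup only counts once a restore from it has been verified, and that verification belongs on a calendar, not in a hope.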
Financial systems and donation platforms sit at the intersection of the most sensitive data and the most significant risk of direct, immediate loss. This domain examines how financial tools are secured, who has access, how unusual transactions are verified, and whether the people handling money know what a financial scam looks like before it arrives. The most common financial attack is not a technical exploit – it is an email that looks like it came from the executive director and asks accounts payable to wire a payment.
The organization's website and domain are its public face. They are also infrastructure that attackers can use against it – by exploiting vulnerabilities in the site, by impersonating the domain in phishing campaigns, or by hijacking email sent on the organization's behalf. This domain looks at who controls that infrastructure, how access is managed, how current the software is, and whether domain security records are properly configured. Most organizations discover these gaps only after someone has already impersonated them.
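"Domain security records" has a concrete meaning here: the SPF, DKIM, and DMARC records published in DNS, which tell other mail servers what legitimately comes from the organization's domain. A sketch of what a configured domain looks like; every value below is a placeholder, and the real values come from the organization's mail provider's documentation:

```
; Illustrative placeholders only; substitute your own domain and
; your mail provider's documented values.
example.org.                       TXT  "v=spf1 include:_spf.example-provider.com -all"
selector1._domainkey.example.org.  TXT  "v=DKIM1; k=rsa; p=<provider-issued public key>"
_dmarc.example.org.                TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.org"
```

With no DMARC record published, receiving servers get no instruction to reject mail that fakes the organization's domain. Publishing these three records is often one of the quick wins the email domain refers to.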
Physical access is digital access. A device left on a desk in an unlocked office, a visitor with unescorted access to the server room, a laptop sitting in a car – these are not IT problems, they are physical ones with digital consequences. This domain examines how the organization manages physical access to technology, sensitive documents, and systems. The organizations that handle this well treat physical and digital security as the same conversation, because at the boundary they are.
Most organizations don't discover a security problem from a dramatic alarm. They discover it because a staff member noticed something odd and knew to say something – or didn't say something because they weren't sure it mattered, or were afraid of being blamed. This domain looks at whether there is a clear path from "I think something happened" to "here is what we do next." The organizations that respond best to incidents are the ones that knew the path before they needed it.
Many mission-driven organizations hold data that demands exceptional care – not merely because of regulation, but because of whose data it is. Client files, case histories, location data, medical information. The address of a survivor who has told no one else. This domain examines how that data is identified, stored, and protected, and whether the people handling it understand why it matters. The exposure of this data is not a compliance problem. For most of these organizations, it is a moral catastrophe.
The livestreaming setup. The event registration system. The tablet at the check-in desk. The A/V cart rolled out for Sunday services or the annual gala. These systems often carry sensitive data and are almost always the least scrutinized in any security conversation. This domain asks whether the technology used for programs and events is as carefully considered as the technology used for administration. For most organizations, the honest answer is: not yet.
Every vendor, contractor, or platform with access to the organization's systems is part of its security story – whether the organization thinks of them that way or not. This domain examines how external relationships are managed: who has been granted access, whether that access is still appropriate, and what agreements actually say about data handling and security responsibilities. The vendor breach is one of the most common pathways to organizational exposure, and one of the least frequently examined.
Staff across every organization are using AI tools in their daily work – often without formal policy, often without awareness of what they are sharing with those tools. Client names, financial data, case notes, grant strategies, the home address of someone whose privacy matters. This domain examines how the organization is navigating AI adoption, what guidance exists, and where unofficial tool use may be creating exposure that hasn't been named yet. The organizations that handle this well are not the ones that ban AI. They are the ones that talk about it honestly.
Deep reference for calibrating your approach. Read them as characters before you write for them.
Portrait: He has led his congregation through a building campaign, two staff transitions, and a pandemic pivot to online services. He coordinated the streaming setup himself during COVID because someone had to. He is not a technophobe. But technology is a means, not an interest. He has a part-time office administrator, a volunteer IT person named Mike who also drives a sensible SUV, and a board that ranges from deeply engaged to showing up four times a year with strong opinions about the budget. He is the one who signs the check. He is also the one who would have to make the phone call if something went wrong.
How he talks about security: In the language of stewardship – once you give him the frame. Before that, the way most people talk about their gutters. He does not use "threat surface" or "multi-factor authentication." If you use them unprompted in the first conversation, you will lose him. He has been talked down to by vendors before. He will not engage with that energy a second time.
The fear underneath: Not the breach itself. The phone call. A congregant whose personal information was exposed. A staff member asking why the church didn't have better safeguards. A story in the local community paper. A donor who quietly stops giving. The children's ministry roster – names, addresses, medical notes, pickup authorizations – is the thing that keeps him up if he lets himself think about it too hard. The counseling files sitting in a shared folder nobody has audited since the last associate pastor left.
What moves him: Story, not statistics. If you describe, plainly and specifically, what happens when donor data is exposed or when a phishing email empties the building fund, he will not forget it. He also responds to being treated as a thoughtful person who is capable of making good decisions once he has the right information. He understands the weight of trust better than most of your readers ever will.
The conversation that works: Open with a question. Ask about Mike, the streaming setup, the donor database. Listen more than you talk. Somewhere in the middle, ask one question about the counseling files or the children's ministry records, and watch his expression shift slightly. Don't press. Just name what you heard: "It sounds like you have some things in good shape and some places where you're carrying more exposure than you probably realize. That's normal. It's fixable. And it doesn't require Mike to become a cybersecurity expert."
Portrait: She has been running the organization for seven years. She was there when they grew from four staff to fourteen, when the board cycled through three chairs, and when the pandemic forced remote operations in eleven days with a technology budget that was already thin. She has a master's in social work or public policy. She thinks strategically, advocates effectively, and can read a 990 as fluently as most people read the news. Cybersecurity has been on her list for three years. Her board is starting to ask. A major funder included a data security question in last year's grant application for the first time.
How she talks about security: In terms of liability, responsibility, and compliance – usually in the context of what funders expect. She has absorbed enough sector anxiety to know she should be worried. She has not yet been given a reason to believe the solution is within reach for an organization like hers.
The fear underneath: The people her organization serves. She runs a refugee resettlement program, or a domestic violence shelter, or a community health clinic. The data she holds is not abstract – it is the address history of a woman hiding from someone dangerous, or the HIV status of a teenager who told the clinic and nobody else. She carries this. The secondary fear: if a breach becomes public, the damage to donor and foundation confidence is not recoverable on a short timeline.
What moves her: Cost clarity, before anything else. Being given a fixed, transparent price before she has to ask is not a small thing – it signals she is dealing with someone who understands how her world works. The second thing: "right-sized." She has been failed by solutions designed for organizations ten times her size. Being told that good security for her team does not look like enterprise security scaled down – that it is a fundamentally different approach built for her actual risk and her actual people – is genuinely new information. She leans in.
The question that stops her: "If your email went down right now, could your staff reach you in an emergency?" She will pause. She will not know. Don't make her feel bad about it. "Most organizations in your position can't answer that confidently. It's one of the things we look at." She books the assessment before the call ends.
Portrait: She has been with the organization for six years – longer than anyone except the executive director. She knows where every vendor contract lives, which staff members actually use the shared drive versus their personal laptop (three of them, no matter how many times she's asked), and that the reason the Wi-Fi password is still "FaithHope22" is that changing it would require a conversation with the volunteer IT person that she has been too tired to have. She is operationally brilliant in the quiet way that keeps organizations functional. She does not write strategy documents or present at board meetings. She makes those things possible by keeping everything else from falling apart.
How she talks about security: Practically and skeptically, usually at the same time. She has watched enough technology initiatives arrive with great fanfare and leave with a shrug to have earned her skepticism. When she talks about security, she talks about specific situations: the staff member who forwards everything to personal Gmail, the shared password spreadsheet on the server that technically anyone can access, the laptop a former employee might still be able to log into.
The fear underneath: Being blamed. If something goes wrong, the question will move quickly toward: who was responsible for this? Janet often carries that exposure without it ever being formally acknowledged. She has raised concerns before and been told "we'll look into it" by people who did not look into it. She has stopped raising them as loudly. She has not stopped noticing.
What moves her: Specificity. Concrete next steps. The sense that whoever she is talking to actually understands how an organization like hers operates day to day, and is not going to recommend a solution that requires three staff members and a $500-a-month platform to maintain. "Simple" does a lot of work for this persona. So does "built for small teams." She also responds strongly to being treated as a peer – she has a finely calibrated radar for condescension. If you talk down to her, even slightly, you will not get her back. If you treat her operational knowledge as the asset it actually is, she will become one of the most effective internal advocates for the work.
The conversation that works: This often happens after Sandra says "you should talk to Janet – she handles all of this." Jim arrives expecting to brief her. Instead, he listens for ten minutes while she describes, with quiet precision, exactly what is not working. He takes notes. He says: "Everything you've just described is exactly what the assessment is designed to surface. What you already know is going to make this faster and more useful." She feels, possibly for the first time in a while, like the right person got her message.
Portrait: He is a deacon, or a board member, or a congregant with a day job in IT at a mid-size company and a genuine desire to contribute. He set up the church's network in 2019. He got them onto Microsoft 365. He configured the email. He is the first call when anything breaks, and he takes those calls without complaint. He drives a sensible SUV and has strong opinions about the router brand the organization should have bought instead of the one they bought. He means well. Genuinely. This is the most important thing to hold onto in any conversation involving him.
The structural problem: He is operating under a set of assumptions he absorbed from his day job – an environment with a dedicated IT department, a security team, and a procurement budget not measured in hundreds of dollars. He applies those assumptions to the church with great enthusiasm and genuine care. The result is a security posture that looks more sophisticated than it is and protects against fewer things than anyone realizes. This is not malice. It is the natural result of applying enterprise assumptions to a context they were not designed for.
The fear underneath: Being exposed. Having his limitations surfaced in front of leadership he has invested in for years. His identity as the competent, reliable person who handles this is real and earned – and the prospect of that identity being complicated by an outside assessment is genuinely threatening. He may not say he is opposed. But he can slow or derail the process through lukewarm endorsement, inaction, or the kind of quiet skepticism that makes leadership uncertain.
What moves him: Being brought in as a partner, not sidelined. "We want to build on what you've already put in place" does more work than almost any other sentence. He also responds to evidence: specific, non-accusatory examples of where well-intentioned setups create real exposure. Not "your setup is wrong" – but "here's a pattern we see often in organizations like this one, and here's what it looks like when it becomes a problem."
The pivot that works: "You've clearly done a lot with limited resources. The thing I find with organizations like this is that most of the real exposure isn't in the infrastructure at all – it's the people layer. And that's the part that's nobody's actual job. That's what the assessment is designed to look at." He can hear that. It is not a challenge to his competence. It is an acknowledgment that the problem is bigger than any one person, and that he has been carrying more than his share of it. Mike with a sense of ownership becomes an asset. Mike on the outside of the process becomes friction.
One rule, always: Never write content that ridicules or diminishes this persona, even obliquely. The humor in the IT Guy piece lands on the industry and on Jim himself, never on Mike. Hold that line.
Portrait: She is a retired attorney, or a community banker, or the kind of businessperson who joined this board because she genuinely believes in the mission and has spent the last three years discovering that governance is a different skill than management. She reads the financial statements. She chairs the compensation committee. She takes her fiduciary responsibility seriously enough that it occasionally makes the executive director nervous. Cybersecurity landed on her radar because a foundation asked about it in a grant application, or because something similar happened to another organization she knows about and the story spread through her professional network.
How she talks about security: In the language of governance, liability, and institutional reputation. She uses words like "exposure," "due diligence," and "fiduciary responsibility." She does not use technical terms – and she will quietly discount you if you use them without immediate translation. She wants to understand risk in terms she already knows how to think about. She wants to know what the responsible governing response is.
The fear underneath: Personal liability and reputational damage to the institution. She understands, at a level her fellow board members sometimes do not, that a serious breach does not stay contained. It follows the organization into funder conversations, into staff recruitment, into the community's perception of the institution's competence. And in some circumstances, it follows board members personally.
What moves her: Brevity and credibility, in that order. If you can tell her the most important thing about the organization's security posture in two sentences, she will trust you with the next ten minutes. If you cannot, she will wonder whether you actually understand the problem or just know a lot about it. She is very good at distinguishing between the two. She responds to the governance frame specifically: cybersecurity is not just an operational matter. It is a board-level stewardship responsibility. Most organizations have never heard it described that way. Most board chairs, when they hear it, sit up slightly.
The conversation that works: Give her three things to know and one thing to do. She will ask two questions. They will both be good ones. Answer them directly, without hedging. She then tells Sandra they should move forward. The sentence that opens the door: "Every organization that takes data security seriously got there because someone at governance decided it was a fiduciary responsibility, not just an IT problem. That shift usually happens because of a conversation like this one."