A lead CMMC assessor with experience on both sides of the table explains why most organizations fail before assessment day even starts, and what it actually takes to pass.
Most defense contractors will interact with a CMMC assessor exactly once during their certification journey. That single interaction determines whether they can compete for DoD contracts or get sent back to square one. Understanding how assessors think is one of the highest-leverage things a contractor can do.
Norris Carden is a lead CMMC assessor at Sentar, a C3PAO (one of the authorized organizations that conduct CMMC assessments for defense contractors). But what makes Norris's perspective especially useful is that he didn't start on the assessment side. He spent six years in CMMC consulting, helping organizations build their documentation and get assessment-ready, before self-funding the training to become a Certified CMMC Assessor (CCA).
"I think consulting makes you a better assessor, and being an assessor can make you a better consultant." — Norris Carden
That dual perspective is exactly why his take on what goes wrong is worth paying attention to.
Norris's role as a lead assessor means he's the one running the assessment engagement from start to finish. He reviews the documentation, manages the assessment team, and makes the final determination on whether an organization meets the requirements.
But unlike many assessors, Norris came up through the consulting side first. He spent years helping organizations write their SSPs, build their evidence packages, and prepare for the very evaluations he now conducts. That means when he flags a problem, it's not theoretical. He's seen the same mistakes from both sides of the table.
Here's where most organizations misunderstand the assessment process entirely: they think their system is being assessed. It's not. It's the SSP.
Your System Security Plan is the document the assessor evaluates. Not your Active Directory setup. Not your Microsoft 365 configuration. Not your tooling stack. The SSP.
"We don't care what you use. One of my coworkers says all we do is call balls and strikes. I take that a little bit further and say, I don't criticize your pitch count, your pitch selection. I just call a ball or a strike." — Norris Carden
That means if the objective says "are you doing X?" and your SSP doesn't say it clearly, it's a miss. It doesn't matter how good your actual security posture is if the document doesn't reflect it.
The most common version of this failure? Teams write an implementation statement for a requirement but never look at the individual objectives underneath it.
"They never took the time to go look at the 800-171 assessment guide and realize, 'Hey, it says objective A, objective B, objective C. I need to address each one of those.'" — Norris Carden
Norris sees this constantly. Objectives that say "identify" or "define" get answered with vague statements instead of specifics.
"I can't tell you the number of SSPs I've seen that just say, 'We sync our time.' But don't specify, because it says 'identify' is the objective. They don't specify what is the time source that all of my systems go to." — Norris Carden
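To make the "identify" objective concrete, here is a minimal sketch of the kind of check a team could run before assessment day. The host names, the time-source value, and the inventory dictionary are all hypothetical stand-ins; real data would come from your own configuration exports.

```python
# Hypothetical pre-assessment check: does every system point at the single
# time source the SSP names? All values below are illustrative only.
DOCUMENTED_TIME_SOURCE = "time.example.mil"  # the source your SSP identifies

host_ntp_config = {
    "dc01": "time.example.mil",
    "file01": "time.example.mil",
    "web01": "pool.ntp.org",  # drifts from the documented source
}

def find_time_source_drift(documented, config):
    """Return hosts whose configured NTP source differs from the documented one."""
    return {host: src for host, src in config.items() if src != documented}

drift = find_time_source_drift(DOCUMENTED_TIME_SOURCE, host_ntp_config)
for host, src in drift.items():
    print(f"{host}: points at {src}, but the SSP says {DOCUMENTED_TIME_SOURCE}")
```

The point isn't the script itself; it's that "identify" means a single named answer ("time.example.mil") that your environment can be checked against, not a statement like "we sync our time."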
And then there's the other side of the problem: teams who over-document. They throw in extra context, irrelevant details, and unnecessary references that actually create more work for the assessor.
"If you throw in extra stuff in your SSP to supposedly address an objective, you just made me have to ask you about that. It may be irrelevant, but because you said it, I have to ask about it sometimes." — Norris Carden
The good news? Norris has seen the other side, too. Organizations that come in well-prepared make the process look completely different. "The difference is night and day," he says. Here's how they do it.
The single biggest shift teams need to make is moving from requirement-level responses to objective-level responses. Every NIST 800-171 requirement has multiple assessment objectives. Each one needs its own clear answer.
"If it says, 'What is the time source?' Put down the time source. If it says 'identify,' identify. If it says to do this, explain how you're doing that. That's really the biggest thing." — Norris Carden
The 800-171A assessment guide spells out every objective an assessor will evaluate. The CMMC assessment guide covers the same ground in slightly more accessible language. Both are available. There's no reason to guess.
One of the clearest signals of readiness is how a team explains their implementation. Norris breaks it down with a simple example.
Saying "Yes, we do X" is not enough. That tells the assessor nothing about how you actually do it. The strong version looks like this: "Yes, we do X using tools A, B, and C. Refer to documents X, Y, and Z. Here is an artifact that proves it."
"That's awesome, because you just laid out that yes, you're doing it, how you're doing it, evidence that you're doing it, and where I can refer to if I need to for more information." — Norris Carden
This is the difference between documentation that survives assessment and documentation that stalls it.
Not everything needs an external reference, and not everything should live inside the SSP. The rule of thumb is straightforward.
If you can identify, define, describe, or explain something directly within the SSP objective statement without needing another document, do that. It's faster for the assessor and cleaner for you.
But if the information is fluid (like a list of approved applications that changes regularly), keep it in an external document so you don't have to rewrite your SSP every time.
"Business-wise, there's no reason to have 14 different documents when five will work. Keep it neat, clean, tight, well-informed." — Norris Carden
One document can serve multiple requirement families. An acceptable use agreement, for instance, can be relevant to access control, identity management, and audit requirements all at once.
Norris flags one specific trap that catches teams repeatedly: treating Active Directory as their authoritative source for authorized users. The problem is that AD only confirms an account exists, not that it went through a proper authorization process.
"If Billy Bob creates a new account for his friend, it's in Active Directory. Does that mean automatically it's an authorized user? It didn't go through your process of validating and authorizing that the user has a reason to be there." — Norris Carden
A separate, maintained list of authorized users (and privileged users) that reflects an actual authorization process is far stronger than pointing at a directory that any admin could modify.
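One way to keep that separate list honest is to periodically reconcile it against the directory. The sketch below is illustrative only: both sets are hypothetical stand-ins for an AD account export and the maintained authorized-user list your SSP would reference.

```python
# Illustrative reconciliation: flag accounts that exist in the directory
# but never went through the authorization process. Account names are
# made up for the example.
ad_accounts = {"asmith", "bjones", "billybobs_friend", "svc_backup"}
authorized_users = {"asmith", "bjones", "svc_backup"}

def unauthorized_accounts(ad, authorized):
    """Accounts present in the directory but absent from the approved list."""
    return sorted(ad - authorized)

for account in unauthorized_accounts(ad_accounts, authorized_users):
    print(f"Review needed: {account} is in AD but not on the authorized list")
```

Run on a schedule, a check like this turns the authorized-user list from a stale document into evidence of a working authorization process, which is exactly what the objective is asking for.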
There's a principle everyone hears when prepping for assessment: don't volunteer information the assessor hasn't asked for. Norris points out that teams follow this rule in conversation but completely ignore it when writing their SSP.
"They're throwing out all sorts of extra stuff unnecessarily. Don't make the assessor ask questions that the assessor doesn't need to ask." — Norris Carden
If the objective asks whether you've defined your time source, you don't need a policy document explaining why you chose that time source. Just name it.
💡Writing an SSP that actually passes assessment is harder than most teams expect.
BEMO helps defense contractors structure their documentation around the exact objectives assessors evaluate, so nothing gets missed and nothing unnecessary slows you down.
The consequences of showing up unprepared are immediate and costly.
It starts at the scoping call, where the assessor reviews your documentation for the first time. Norris has seen organizations get flagged within minutes.
"It took that company less than five minutes to review stuff, and probably took less than that because I figured it out in 30 seconds of them showing me the SSP." — Norris Carden
From there, the outcomes are limited. The assessor can hint ("I'm not seeing" or "I would expect to see"), but they can't advise or tell you how to fix it. You go back and start again.
If you make it to assessment day and things go poorly early on, there's the option to convert the assessment into a mock. It follows the same process, with no consulting or advising, but it doesn't count against you. It's a real option, and it's fully approved by DoD. But it means rescheduling, rebuilding, and spending more time and money.
And for teams doing self-assessments right now, Norris warns that the same rules apply. If you find something, you put it in a POA&M, and you only have six months to correct it. Years ago, consulting clients would give themselves unrealistic timelines. That doesn't fly anymore.
The bigger picture? Contracting timelines are shrinking.
"Sometimes three to six months between bid going out and the contract being awarded. If you're not ready, you're not gonna be there." — Norris Carden
Norris has already seen it happen in practice: companies being told they can't see proposal details until they're CMMC ready.
And for organizations that have already earned Level 2? The work isn't over. Norris recommends beginning to implement NIST 800-171 Revision 3 requirements now, using DoD's guidance for the objectives they define. GSA has already adopted Revision 3 for its own CUI protection requirements.
"CMMC is about protecting the data. It's not about continuity, keeping your business functioning. They don't tell you to run a backup. So if you're hacked and you lose everything, as long as their data is secure, you've met the CMMC compliance requirement." — Norris Carden
Norris's lesson comes down to this: readiness isn't about the tools you use or the size of your IT team. It's about whether your documentation clearly, specifically, and completely answers every objective an assessor will evaluate.
That kind of preparation doesn't happen by accident. It takes someone who understands how assessors think, how documentation needs to be structured, and how to build an environment where security practices and evidence collection are part of the process from day one, not bolted on at the end.
💡BEMO is the managed compliance provider built for this.
From gap assessment to implementation to audit day, BEMO coordinates the entire process so you can stop worrying about evidence gaps and start earning the contracts that depend on certification.
The assessor evaluates your System Security Plan (SSP), not your technology stack. As Norris explains in this guide, the assessor doesn't care whether you use Active Directory or Google Workspace. What matters is whether your SSP clearly addresses every assessment objective with specific, evidence-backed responses. Your system supports the SSP, but the document itself is what's on trial.
Can you fail at the scoping call? Not technically, but you can be told you're not ready to proceed. The scoping call is where assessors review your documentation for the first time. If the SSP doesn't address objectives or is full of gaps, the assessor may recommend you go back and rebuild before scheduling assessment days. Norris has seen teams get flagged within seconds of showing their documentation.
Each NIST 800-171 requirement contains multiple assessment objectives (A, B, C, and so on). Writing one implementation statement for the requirement as a whole is not enough. Each objective needs to be addressed individually. If an objective says "identify," you need to specifically identify that thing. If it says "define," you need a clear definition. Missing even one objective means you won't pass on that control.
Certification isn't the finish line. You need to continuously monitor your environment, submit annual attestations, and prepare for a reassessment every three years. Norris also recommends starting to implement NIST 800-171 Revision 3 requirements now, since DoD is actively working on the transition and GSA has already adopted Revision 3 for its own CUI protection standards.
Can an assessment be converted into a mock? Yes. If it becomes clear early in the assessment that the organization isn't performing well, the assessment can be converted into a mock assessment. This is fully approved by DoD. A mock follows the same process with no consulting or advising, but it doesn't count as an official result. It gives the organization a chance to identify gaps and reschedule a formal assessment later.