In this module
AD5.7 Secure Score as a Weekly Health Check
Figure AD5.7 — Secure Score measures configuration against Microsoft's recommendations. A stable or rising score means your controls are maintained. A dropping score means something changed. The score measures what's configured, not how effective it is — your Monday review incidents tell you about effectiveness.
Navigating Secure Score
Navigate to security.microsoft.com → Exposure management → Secure score. The overview page shows: your current score, the maximum achievable score (you can't reach 100% on E3 because some recommendations require E5 features), your score by category (Identity, Data, Device, Apps), and the improvement actions ranked by point value.
Your Monday check: compare the current score to last week. If it's the same or higher, your controls are maintained. If it dropped, click "History" to see which improvement actions lost points — something was disabled or changed. Investigate the change: was it intentional (a CA policy temporarily disabled for troubleshooting) or accidental (someone modified a compliance policy)?
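The week-over-week comparison can be scripted rather than eyeballed, since Secure Score keeps one snapshot per day. A minimal sketch, assuming the Microsoft.Graph.Security module (the same cmdlets used later in this module) and that snapshots are returned newest first:

```powershell
Connect-MgGraph -Scopes "SecurityEvents.Read.All"

# Pull today's snapshot plus the previous seven days (newest first).
$scores   = Get-MgSecuritySecureScore -Top 8
$today    = $scores[0]
$lastWeek = $scores[-1]

$delta = $today.CurrentScore - $lastWeek.CurrentScore
Write-Host "This week: $($today.CurrentScore) / $($today.MaxScore)"
Write-Host "Last week: $($lastWeek.CurrentScore) / $($lastWeek.MaxScore)"

if ($delta -lt 0) {
    # A drop means a control changed - the History tab shows which one.
    Write-Host "Score dropped by $([math]::Abs($delta)) points - check History" -ForegroundColor Yellow
} else {
    Write-Host "Score stable or rising (+$delta)"
}
```

If the drop is confirmed, the History tab in the portal remains the fastest way to see which improvement action lost points.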
The improvement actions list: Each action shows: the point value (how much it improves your score), the category (Identity, Data, Device, Apps), the status (Completed, Not completed, In progress, Risk accepted), and the implementation steps. You don't need to implement every action — some require E5 licensing you don't have, some have high blast radius you've decided to defer, and some are low-value relative to the effort.
After deploying AD1-AD4, review the remaining improvement actions. The highest-value actions you haven't implemented yet are your next improvement priorities. Common remaining actions on E3: enable audit logging (typically already on), configure session timeout policies, enable modern authentication for Exchange Online (typically already on), and implement role-based access control for admin accounts.
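To rank the remaining actions by value without clicking through the portal, you can pull the control profiles via Graph. A sketch, assuming the Microsoft.Graph.Security module; cross-check the status column in the portal, since completion state lives in the daily score snapshots rather than in the profiles themselves:

```powershell
Connect-MgGraph -Scopes "SecurityEvents.Read.All"

# Control profiles describe every improvement action and its maximum point value.
Get-MgSecuritySecureScoreControlProfile -Top 200 |
    Sort-Object MaxScore -Descending |
    Select-Object -First 10 Title, MaxScore, ControlCategory |
    Format-Table -AutoSize
```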
For actions you've decided not to implement (E5-only features, high-blast-radius configurations you've deferred), set the status to "Risk accepted" with a justification. This stops them from appearing as "Not completed" and accurately reflects your conscious decision rather than an oversight.
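The portal is the usual place to set "Risk accepted," but the state can also be patched via Graph. This is a sketch only: it assumes the Update-MgSecuritySecureScoreControlProfile cmdlet and body fields match your module version (the Graph PATCH requires a vendorInformation object), and the profile ID shown is a hypothetical placeholder, not a real control name:

```powershell
Connect-MgGraph -Scopes "SecurityEvents.ReadWrite.All"

$params = @{
    # "Ignored" is Graph's closest equivalent to the portal's "Risk accepted".
    State   = "Ignored"
    Comment = "Requires E5. Compensating controls: Intune compliance (AD3), Defender AV profiles (AD3.6)."
    VendorInformation = @{
        Provider = "SecureScore"
        Vendor   = "Microsoft"
    }
}
# "ExampleControlId" is a placeholder - use the real profile ID from
# Get-MgSecuritySecureScoreControlProfile.
Update-MgSecuritySecureScoreControlProfile `
    -SecureScoreControlProfileId "ExampleControlId" -BodyParameter $params
```

Verify the resulting status in the portal before relying on this; the justification comment is what an auditor will read.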
Using the Secure Score API for reporting
For your quarterly report, pull the Secure Score programmatically:
Connect-MgGraph -Scopes "SecurityEvents.Read.All"
$score = Get-MgSecuritySecureScore -Top 1
Write-Host "Current Secure Score: $($score.CurrentScore) / $($score.MaxScore)"
Write-Host "Percentage: $([math]::Round($score.CurrentScore / $score.MaxScore * 100, 1))%"
Write-Host "Date: $($score.CreatedDateTime)"
# Category breakdown
$score.ControlScores | ForEach-Object {
Write-Host "$($_.ControlCategory): $($_.Score) / $($_.MaxScore)"
}

Run this monthly and record the score in your metrics spreadsheet. The quarterly report shows the trend: "Secure Score improved from 42% (pre-deployment baseline) to 67% after AD1-AD4 deployment and has maintained at 65-68% for the last 3 months."
What Secure Score doesn't tell you
Secure Score measures configuration, not effectiveness. A score of 70% means 70% of Microsoft's recommended configurations are implemented — it doesn't mean 70% of attacks are blocked. Some of the most impactful security controls (user awareness, incident response procedures, monitoring cadence) aren't measured by Secure Score.
Use Secure Score as ONE input to your security posture assessment, not the only input. Your Monday review incidents tell you about real threats. Your DLP Activity Explorer tells you about data protection effectiveness. Your compliance rate tells you about device health. Secure Score tells you about configuration completeness. Together, they provide a comprehensive view.
Don't chase the number. A tenant with a Secure Score of 60% and an effective Monday review that catches incidents early is more secure than a tenant with a Secure Score of 85% where nobody checks the incident queue. Configuration is necessary but not sufficient — monitoring makes it operational.
The top improvement actions for E3 environments
After deploying Modules AD1-AD4, your remaining improvement actions fall into three categories:
Quick wins (implement this week). Actions worth 3+ points that take under 15 minutes: enable unified audit logging (if not already on), set the idle session timeout for M365 apps (reduces the window for unattended session abuse), and configure the admin consent workflow for OAuth applications (preventing users from granting consent to malicious apps without admin approval). Note that risk-based sign-in policies live in Entra ID Protection, which requires Entra ID P2; on plain E3 (P1 only), treat that action as deferred.
Navigate to entra.microsoft.com → Applications → Enterprise applications → Consent and permissions → Admin consent settings. Enable "Users can request admin consent to apps they are unable to consent to." Set the reviewer to yourself. This prevents the consent phishing attack (AD2.4) at the source — users can't grant permissions to external apps without admin review.
Medium effort (implement this month). Actions requiring policy creation or testing: configure password protection (ban common passwords in Entra ID), enable sign-in frequency controls in conditional access (limit session lifetimes), configure admin MFA separately from user MFA (admin accounts should use phishing-resistant MFA like FIDO2 or certificate-based auth if possible), and restrict Azure AD admin portal access to admin roles only.
# Check admin consent workflow status
Connect-MgGraph -Scopes "Policy.Read.All"
Get-MgPolicyAdminConsentRequestPolicy | Select-Object IsEnabled, ReviewerType,
@{N="Reviewers";E={$_.Reviewers.Query}} | Format-List

Deferred / E5-required (document as risk accepted). Actions requiring E5 licensing: enable Defender for Endpoint (requires E5 or add-on), enable Defender for Cloud Apps (requires E5), configure auto-labeling policies (requires E5 Information Protection), enable Endpoint DLP (requires E5). Mark these as "Risk accepted" with your justification and compensating controls.
Automating Secure Score tracking
For hands-off monthly tracking, create a PowerShell script that pulls the score history and appends it to a CSV file:
Connect-MgGraph -Scopes "SecurityEvents.Read.All"
# Get the last 30 days of Secure Score history
$scores = Get-MgSecuritySecureScore -Top 30 | Sort-Object CreatedDateTime
# Display trend
$scores | ForEach-Object {
$pct = [math]::Round($_.CurrentScore / $_.MaxScore * 100, 1)
Write-Host "$($_.CreatedDateTime.ToString('yyyy-MM-dd')): $($_.CurrentScore)/$($_.MaxScore) ($pct%)"
}
# Export to CSV for quarterly report
$scores | Select-Object @{N="Date";E={$_.CreatedDateTime.ToString("yyyy-MM-dd")}},
CurrentScore, MaxScore,
@{N="Percentage";E={[math]::Round($_.CurrentScore / $_.MaxScore * 100, 1)}} |
Export-Csv -Path "C:\SecurityScripts\SecureScoreHistory.csv" -NoTypeInformation -Append
Write-Host "`nExported to SecureScoreHistory.csv"

Run this monthly and the CSV builds your trend data automatically. The quarterly report includes: "Secure Score: 67% (stable over 3 months, up from 42% pre-deployment baseline)." The trend line demonstrates sustained security posture, not just a one-time configuration effort.
Common Secure Score changes and what they mean
Score increased without you doing anything: Microsoft updated the scoring methodology or removed a recommendation you hadn't addressed. Check what changed in the History tab — it may reveal that a previous "not completed" action was reclassified.
Score decreased after a Microsoft update: Microsoft sometimes adjusts scoring weights or adds new recommendation categories. A small drop (1-3 points) after a Microsoft update is normal. Check the release notes in the M365 Admin Center message centre for scoring changes.
Score decreased after your own changes: You (or another admin) disabled a control. The History tab shows exactly which improvement action lost points and when. This is the most actionable finding — investigate whether the change was intentional (troubleshooting) or accidental (misconfiguration), and re-enable if needed.
Score doesn't change despite implementing an action: Some improvement actions take 24-48 hours to be reflected in the score. If the action was implemented correctly (verified in the portal), wait 48 hours and check again. If it still doesn't change, the scoring system may not have detected the implementation — verify the implementation method matches what the improvement action specifies.
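After the 48-hour window, you can confirm whether the scoring system registered a specific action by inspecting that control in the latest snapshot. A sketch, assuming the Microsoft.Graph.Security module; the "*MFA*" filter is an illustrative example, so substitute a pattern matching the control you implemented:

```powershell
Connect-MgGraph -Scopes "SecurityEvents.Read.All"

# The latest daily snapshot carries a per-control score breakdown.
$latest = Get-MgSecuritySecureScore -Top 1
$latest.ControlScores |
    Where-Object { $_.ControlName -like "*MFA*" } |
    Select-Object ControlName, Score, ControlCategory
```

If the control's score is still zero, re-check that your implementation method matches what the improvement action's instructions specify.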
Walkthrough: implementing the top E3 quick win
The highest-value E3 quick win is typically "Turn on audit log search" — worth 5-7 points. While audit logging is usually on by default, verify it's enabled and properly configured:
Navigate to purview.microsoft.com → Solutions → Audit. If the audit search page loads, logging is enabled. If you see a banner saying "Start recording user and admin activity," click it to enable.
Verify via PowerShell:
Connect-ExchangeOnline
Get-AdminAuditLogConfig | Select-Object UnifiedAuditLogIngestionEnabled
# Should return: True

The second quick win: "Require MFA for administrative roles" — which you've already deployed as CA001 in Module AD1. Navigate to the Secure Score improvement action → if it shows "Not completed" despite CA001 being active, click "Resolved through third party" or verify that CA001 targets the "Directory roles" condition correctly. Sometimes the Secure Score detection doesn't recognize CA-based MFA enforcement as meeting the recommendation — you may need to check the specific conditions the improvement action expects.
After implementing quick wins, monitor the score for 48 hours and verify the increase. Record the before and after scores in your log: "Secure Score: 64% → 69% after enabling audit log + admin consent workflow." This demonstrates quantifiable security improvement in your quarterly report.
Comparing your score to industry benchmarks
The Secure Score overview page includes a "Comparison" section showing how your score compares to similar-sized organizations in your industry. This benchmarking data comes from anonymised scores across all M365 tenants.
If your score is above the industry average, communicate this to management: "Our Secure Score of 67% exceeds the industry average of 52% for organizations of similar size." This validates your security investment relative to peers.
If your score is below average, the comparison identifies the gap — and the improvement actions list tells you exactly what to do about it. The gap itself becomes a business case: "Our score of 45% is below the industry average of 52%. The top 3 improvement actions would close the gap with an estimated 2 hours of implementation work and no additional licensing cost."
Don't obsess over the exact percentile — the comparison data is useful for context but not for precise benchmarking. Use it as a directional indicator: are you roughly in line with peers, significantly ahead, or significantly behind? The direction matters more than the exact position.
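The comparison data is also exposed programmatically. A sketch, assuming the AverageComparativeScores property is populated for your tenant (the Basis values, such as "AllTenants", come from the Graph secureScore schema and may vary):

```powershell
Connect-MgGraph -Scopes "SecurityEvents.Read.All"

$latest = Get-MgSecuritySecureScore -Top 1
$pct = [math]::Round($latest.CurrentScore / $latest.MaxScore * 100, 1)
Write-Host "Your score: $pct%"

# Each entry is an average over a comparison cohort (e.g. all tenants,
# similar seat counts).
$latest.AverageComparativeScores | ForEach-Object {
    Write-Host "$($_.Basis) average: $($_.AverageScore)"
}
```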
Scenario: Your Secure Score shows an improvement action worth 9 points: "Turn on Microsoft Defender for Endpoint." You're on E3 and don't have Defender for Endpoint licensing. Should you mark this as "Risk accepted" or leave it as "Not completed"?
Option A: Leave as "Not completed" — it's a valid recommendation and you might upgrade to E5.
Option B: Mark as "Risk accepted" with justification: "Requires E5 licensing. Current endpoint protection via Intune compliance policies (AD3) and Defender Antivirus configuration profiles (AD3.6). E5 upgrade business case under development." Review annually when licensing is reassessed.
The correct answer is Option B. Leaving it as "Not completed" makes it appear as an oversight rather than a conscious decision. "Risk accepted" documents that you've evaluated the recommendation, understand the gap, and have compensating controls in place. The justification references the specific controls you DO have (AD3). When your manager or an auditor sees the improvement actions list, "Risk accepted with justification" demonstrates mature risk management — "Not completed" just looks like you haven't gotten to it yet.
Try it: Review your Secure Score and prioritize improvements
Navigate to security.microsoft.com → Exposure management → Secure score. Record your current score.
Click "Improvement actions" and sort by "Score impact" (highest first). Review the top 5 actions you haven't implemented:
1. Is the action achievable with E3? (If not, mark as "Risk accepted" with justification)
2. What's the blast radius? (High blast = plan carefully before implementing)
3. Is it already covered by another control? (Some actions overlap with configurations you've already made)
For the top 2-3 achievable actions, create a plan to implement them in the next 2 weeks. Each improvement action has step-by-step instructions — follow them, test, and verify the score increases.
Run the PowerShell score query and record the baseline. Run it again monthly for trend tracking.
You're reading the free modules of M365 Security: From Admin to Defender
The full course continues with advanced topics, production detection rules, worked investigation scenarios, and deployable artifacts.