5.6 Embedded Copilot in Entra, Purview, and Defender for Cloud
Domain 3 — Manage Incident Response: "Investigate incidents by using agentic AI, including embedded Copilot for Security." Copilot's embedded experiences extend across multiple security products beyond Defender XDR and Sentinel.
Introduction
Beyond Defender XDR and Sentinel, Copilot has embedded experiences in Entra ID, Microsoft Purview, and Defender for Cloud. Each embedded experience provides product-specific AI assistance that accelerates the investigation and management tasks unique to that product.
Copilot in Microsoft Entra ID
The Entra ID embedded experience assists with identity risk assessment and conditional access troubleshooting — two of the most time-consuming identity security tasks.
User risk assessment. When viewing a risky user in Entra ID Protection, Copilot summarises the risk signals: what triggered the risk detection, the risk level and contributing factors, the user’s recent sign-in pattern, and whether the risk aligns with known attack patterns (AiTM, credential stuffing, password spray). This replaces the manual process of clicking through multiple sign-in log entries and risk detection reports.
The assessment answers the triage question: “Is this user actually compromised, or is the risk detection a false positive?” Copilot evaluates the evidence and provides a reasoning chain — not just a conclusion, but the evidence supporting the conclusion. The analyst validates the reasoning against the raw sign-in data.
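Validating Copilot's reasoning chain against the raw detections can be done with a short Sentinel query. This is a sketch, assuming the Entra ID Protection data connector is enabled and using the standard AADUserRiskEvents schema; the user is hypothetical:

```kql
// Sketch: pull the raw risk detections for one user so the analyst
// can compare them against Copilot's summarised reasoning.
AADUserRiskEvents
| where TimeGenerated > ago(7d)
| where UserPrincipalName == "j.morrison@northgateeng.com"  // hypothetical user
| project TimeGenerated, RiskEventType, RiskLevel, RiskState,
          IpAddress, Location, DetectionTimingType
| order by TimeGenerated desc
```

If the detections, IPs, and timing here match what Copilot described, the reasoning chain holds; if not, the discrepancy itself is the next investigation lead.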
Conditional access troubleshooting. When a user reports they cannot access a resource, Copilot can analyse the conditional access evaluation: which policies applied to the sign-in, which conditions were evaluated (location, device compliance, MFA status, risk level), and which policy caused the block. Performed manually, this diagnosis takes an administrator 10-15 minutes of policy review; Copilot provides the answer in seconds.
For security operations, conditional access troubleshooting matters during incident response: when you change conditional access policies to contain a threat (block access from risky IPs, require MFA for all sign-ins), you may inadvertently block legitimate users. Copilot helps identify which users are affected by the policy change and why.
Sign-in log analysis. Copilot can summarise a user’s sign-in activity: “This user signed in from 3 countries in the last 24 hours, with 2 high-risk sign-ins and 5 MFA challenges. The sign-ins from the UK match the user’s historical pattern. The sign-ins from the US and Russia are anomalous.” This is the same analysis you would perform manually (Module 1), but delivered in seconds.
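The manual version of that summary is a straightforward SigninLogs aggregation. A sketch, with a hypothetical user and the standard Sentinel SigninLogs schema:

```kql
// Sketch: sign-ins per country with risk and MFA outcomes for one
// user over 24 hours — the manual equivalent of Copilot's summary.
SigninLogs
| where TimeGenerated > ago(24h)
| where UserPrincipalName == "j.morrison@northgateeng.com"  // hypothetical user
| summarize SignIns = count(),
            HighRisk = countif(RiskLevelDuringSignIn == "high"),
            MfaChallenges = countif(AuthenticationRequirement == "multiFactorAuthentication")
    by Country = tostring(LocationDetails.countryOrRegion)
```

The query produces the per-country counts; the judgement about which countries match the user's historical pattern is the part Copilot (or the analyst) layers on top.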
Copilot in Microsoft Purview
The Purview embedded experience assists with compliance investigation and data protection tasks.
DLP alert investigation. When investigating a DLP alert (Module 3.3), Copilot summarises the alert context: what sensitive data was detected, the confidence level, the policy that triggered, and whether the data was blocked or delivered. For complex DLP alerts involving multiple SIT matches across multiple files, Copilot’s summary saves significant triage time.
Audit log investigation. Copilot can generate audit log search queries from natural language descriptions: “Show me all file downloads by j.morrison from the Finance Reports SharePoint site in the last 48 hours.” Copilot generates the appropriate search parameters for the Purview audit log interface — not KQL (which is for Sentinel), but the Purview portal’s own search syntax.
eDiscovery assistance. When building content search queries in eDiscovery (Module 3.8), Copilot can translate investigation questions into the KQL-like search syntax that eDiscovery uses: “Find all emails from j.morrison to external recipients containing attachments with the word ‘confidential’ between March 18 and March 20.”
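For that investigation question, the generated condition string might look like the following. This is a sketch of eDiscovery's Keyword Query Language; the year is an assumption for illustration, since the scenario gives only month and day:

```kql
from:"j.morrison" AND hasattachment:true AND "confidential"
  AND sent>=2025-03-18 AND sent<=2025-03-20
```

As with generated KQL elsewhere, the analyst should review the conditions before running the search — a too-broad date range or missing participant filter can return thousands of irrelevant items.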
Communication compliance. For organisations using Purview’s communication compliance features, Copilot assists with reviewing flagged communications — summarising why a message was flagged, what policy it matched, and the recommended disposition (escalate, dismiss, or remediate).
Copilot in Microsoft Defender for Cloud
The Defender for Cloud embedded experience assists with cloud security alert investigation and posture management.
Security alert explanation. When investigating a Defender for Cloud alert (Module 4.8), Copilot explains the alert in cloud context: what the alert means, what the attacker is likely trying to achieve, how this fits in the cloud kill chain, and what remediation actions are recommended. For cloud-specific alert types that analysts encounter less frequently (container escape indicators, suspicious ARM operations, anomalous storage access), the contextual explanation is particularly valuable.
Remediation guidance. For security recommendations (CSPM findings from Module 4.4), Copilot provides step-by-step remediation guidance in natural language: “To fix the recommendation ‘Virtual machines should have endpoint protection,’ navigate to the VM in the Azure portal, install the MDE extension, and verify the agent reports to Defender for Cloud.” This guidance is more accessible than the documentation-style remediation steps in the native recommendation.
Attack path explanation. Copilot can explain attack paths (Module 4.4) in natural language: “This attack path shows that an attacker could exploit CVE-2024-21410 on your internet-facing web server to gain access, then move laterally to your SQL server through unrestricted network access, and exfiltrate customer data. The easiest fix is to patch the CVE — this eliminates the entry point.” The natural language explanation makes attack paths accessible to non-technical stakeholders (management, compliance officers) who need to understand the risk without interpreting graph visualisations.
The embedded experience matrix
| Product | Key Copilot Capabilities | Primary Use Case |
|---|---|---|
| Defender XDR | Incident summary, alert explanation, guided response, script analysis, KQL generation | Incident investigation + response |
| Sentinel | KQL generation, analytics rule assist, hunting queries, workbook queries | Detection engineering + hunting |
| Entra ID | User risk assessment, CA troubleshooting, sign-in analysis | Identity risk management |
| Purview | DLP alert summary, audit search, eDiscovery query, communication review | Data protection investigation |
| Defender for Cloud | Alert explanation, remediation guidance, attack path explanation | Cloud security operations |
| Intune | Device compliance assessment, policy analysis | Endpoint management |
Entra ID: conditional access what-if analysis
One of the most practical Copilot capabilities in Entra ID is explaining conditional access evaluation outcomes. When a sign-in is blocked by conditional access and the user contacts the helpdesk, Copilot can answer: “Why was this sign-in blocked?” by analysing the conditional access policies that applied.
The manual process: Navigate to the sign-in log, find the specific sign-in event, open the Conditional Access tab, review each policy that evaluated, identify which policy blocked the sign-in, and determine which condition failed (location, device compliance, MFA, risk level). For environments with 20+ conditional access policies, this analysis takes 10-15 minutes.
The Copilot process: Ask “Why was j.morrison blocked from accessing SharePoint at 14:30 today?” Copilot analyses the sign-in event, identifies the blocking policy, and explains: “The sign-in was blocked by policy ‘Require compliant device for SharePoint.’ The user’s device (LAPTOP-NGE042) was not marked as compliant in Intune because the compliance policy ‘Windows security baseline’ failed — the device does not have BitLocker enabled.”
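The same answer can be recovered from Sentinel's sign-in data. A sketch, assuming the SigninLogs table with its standard ConditionalAccessPolicies array; the user is hypothetical:

```kql
// Sketch: for a blocked sign-in, expand the conditional access
// evaluation results and surface the policy that failed.
SigninLogs
| where TimeGenerated > ago(1d)
| where UserPrincipalName == "j.morrison@northgateeng.com"  // hypothetical user
| where ConditionalAccessStatus == "failure"
| mv-expand Policy = ConditionalAccessPolicies
| where tostring(Policy.result) == "failure"
| project TimeGenerated, AppDisplayName,
          BlockingPolicy = tostring(Policy.displayName)
```

This query identifies the blocking policy but not the failing condition (the BitLocker detail in the example above); that second step is where the Copilot explanation saves the most time.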
This capability is valuable during incident response when you change conditional access policies for containment (e.g., blocking access from all untrusted locations) and need to quickly assess the impact on legitimate users.
Entra ID: identity risk investigation workflow
Beyond individual risk assessments, Copilot supports a complete identity risk investigation workflow:
Prompt 1: “List all users currently flagged as high risk in Entra ID Protection with their risk factors.”
Copilot returns a summary of all high-risk users, their risk level, the triggering detections (anomalous sign-in, leaked credentials, impossible travel), and when the risk was first detected.
Prompt 2: “For user j.morrison who has high risk due to anomalous token activity, explain what this detection means and what investigation steps I should take.”
Copilot explains: token activity anomaly indicates that a session token was used from a different IP than the original authentication IP, which suggests AiTM attack or token theft. Investigation steps: check the sign-in log for the authentication event and compare the IP with subsequent activity IPs, check for inbox rule creation, and check MailItemsAccessed from the anomalous IP.
Prompt 3: “Generate a KQL query to check if j.morrison had any inbox rules created or MailItemsAccessed events from IPs different from their sign-in IP in the last 48 hours.”
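A sketch of the kind of query Copilot might generate for Prompt 3, assuming Sentinel's SigninLogs and OfficeActivity tables (note that in some workloads OfficeActivity's ClientIP includes a port and may need parsing before comparison):

```kql
// Sketch: inbox-rule creation and MailItemsAccessed events whose
// client IP differs from the user's sign-in IPs in the last 48 hours.
let user = "j.morrison@northgateeng.com";  // hypothetical UPN
let signinIPs = SigninLogs
    | where TimeGenerated > ago(48h)
    | where UserPrincipalName == user
    | distinct IPAddress;
OfficeActivity
| where TimeGenerated > ago(48h)
| where UserId == user
| where Operation in ("New-InboxRule", "Set-InboxRule", "MailItemsAccessed")
| where ClientIP !in (signinIPs)
| project TimeGenerated, Operation, ClientIP, OfficeWorkload
```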
This three-prompt sequence is a complete identity risk investigation workflow that takes 5 minutes with Copilot versus 25 minutes manually.
Purview: insider risk investigation assistance
For organisations with Purview Insider Risk Management (Module 3.4-3.5), Copilot provides contextual assistance within the IRM workflow.
Alert context summary. When reviewing an insider risk alert, Copilot summarises the behavioural indicators that contributed to the risk score: “This user’s risk score escalated from Low to Critical over 5 days. Contributing indicators: 500 files downloaded from SharePoint (10x daily average), USB device connected for the first time in 90 days, 500 files copied to USB drive, and personal Dropbox accessed via browser. The sequence matches the departing employee data theft pattern.”
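The “10x daily average” baseline comparison IRM performs can be roughly approximated on the Sentinel side. A sketch over OfficeActivity; the 10x threshold mirrors the example above and is illustrative, not an IRM default:

```kql
// Sketch: flag users whose most recent daily SharePoint download
// count exceeds 10x their 30-day average (threshold illustrative).
OfficeActivity
| where TimeGenerated > ago(30d)
| where Operation == "FileDownloaded"
| summarize Daily = count() by UserId, Day = startofday(TimeGenerated)
| summarize AvgDaily = avg(Daily), arg_max(Day, Daily) by UserId
| where Daily > 10 * AvgDaily
| project UserId, LatestDay = Day, Downloads = Daily, AvgDaily
```

IRM's own scoring is richer (sequencing, USB, cloud upload indicators), but this shows the baseline-deviation idea behind the summary.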
Investigation question generation. Copilot suggests specific investigation questions based on the alert type: “For a departing employee data theft alert, consider investigating: (1) What files were downloaded — are they labelled as confidential? (2) Were the files on the USB also uploaded to cloud storage? (3) Has the user’s manager been notified per your IRM policy? (4) Does the user have access to any other sensitive data stores?”
Evidence summary for HR coordination. When preparing to coordinate with HR (Module 3.5), Copilot generates a sanitised summary suitable for non-technical stakeholders: “An employee in the Finance department who submitted their resignation on March 1 has exhibited data handling behaviour that deviates significantly from their historical pattern. Over the past 5 days, the volume of files accessed and copied to external media is 10 times their normal activity. The specific files and their classification are available for review by authorised investigators.”
Defender for Cloud: recommendation prioritisation
Defender for Cloud often presents hundreds of security recommendations. Copilot helps prioritise them by explaining the risk context of each recommendation in natural language.
Prompt: “I have 142 open security recommendations. Which 5 should I fix first and why?”
Copilot analyses the recommendations considering severity, the number of affected resources, whether the recommendation appears in an attack path, and whether it maps to critical compliance controls. It returns a prioritised list with explanations: “Priority 1: ‘Machines should have endpoint protection’ — 23 VMs affected, appears in 3 attack paths, maps to CIS 8.1. Priority 2: ‘Storage accounts should restrict network access’ — 12 storage accounts publicly accessible, 4 contain sensitive data per data-aware scanning.”
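The weighting behind such a prioritisation can be sketched in KQL. The data and weights below are entirely illustrative — they are not Copilot's actual scoring model, which is not documented:

```kql
// Sketch: score recommendations by severity, blast radius, and
// attack-path membership, then keep the top 5 (weights illustrative).
datatable(Recommendation:string, Severity:int, AffectedResources:int, AttackPaths:int)
[
    "Machines should have endpoint protection",          3, 23, 3,
    "Storage accounts should restrict network access",   3, 12, 1,
    "Subscriptions should have a security contact email",1,  1, 0
]
| extend Score = Severity * 10 + AffectedResources + AttackPaths * 20
| top 5 by Score desc
```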
This prioritisation capability is what makes Copilot valuable for CSPM operations: it translates a large, undifferentiated list of findings into a focused, justified remediation plan.
Cross-product entity enrichment pattern
A powerful pattern that works across all embedded experiences is entity enrichment — taking an entity from one product and enriching it with data from another product through Copilot.
Example: You are investigating a Defender for Cloud alert about suspicious VM activity. The alert shows the VM was accessed by identity admin@northgateeng.com. You ask Copilot: “What do we know about admin@northgateeng.com across all security products?”
Copilot queries multiple plugins simultaneously: Entra ID (sign-in history, risk level, role assignments), Defender XDR (recent alerts involving this identity), Sentinel (historical alerts and incidents), and Purview (recent audit log activity). The response is a comprehensive entity profile that would take 15 minutes to assemble manually from four different portals.
This cross-product enrichment is one of Copilot’s most time-saving capabilities because it eliminates the portal-switching that consumes a significant portion of investigation time. Instead of navigating to Entra ID, then Defender XDR, then Sentinel, then Purview, you ask one question and get a unified answer.
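When the relevant data connectors feed a single Sentinel workspace, a crude single-pane approximation of this enrichment is possible with the `search` operator. A sketch; Copilot's plugin-based enrichment is richer (live Entra ID risk state, role assignments), but this illustrates the idea:

```kql
// Sketch: search every connected log source for one identity and
// count hits per table to build a quick cross-product footprint.
search in (SigninLogs, SecurityAlert, OfficeActivity, AuditLogs)
    "admin@northgateeng.com"
| where TimeGenerated > ago(7d)
| summarize Events = count() by Table = $table
```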
Practical scenario: Copilot across products during a BEC investigation
To illustrate how the embedded experiences work together, consider this investigation flow:
An Entra ID Protection alert flags j.morrison as high risk. The analyst works through the embedded experiences in sequence:
1. Entra ID — “Summarise the risk for j.morrison.” Copilot explains: anomalous token activity from a new IP, MFA satisfied via AiTM proxy.
2. Defender XDR — the analyst opens the correlated incident; Copilot provides the incident summary: phishing email delivered, credential harvested, inbox rule created, data exfiltrated.
3. Purview — “Were any DLP alerts triggered for j.morrison in the last 24 hours?” Copilot confirms: 15 credit card numbers sent to an external address; the action was Audit (delivered).
4. Defender for Cloud — “Did j.morrison’s identity access any Azure resources during the compromise window?” Copilot queries the AzureActivity plugin and confirms: no Azure resource access detected.
The entire cross-product check — spanning Entra ID, Defender XDR, Purview, and Defender for Cloud — took 4 minutes using embedded Copilot in each portal. Without Copilot, the same checks would require navigating to each portal separately, writing or configuring searches in each, and manually collating the results — approximately 25 minutes.
Copilot in Intune: device compliance and configuration
Intune manages device compliance and configuration policies. When a security investigation involves a device, Copilot can query the Intune plugin to provide device context: is the device compliant? What compliance policies apply? What configuration profiles are deployed? When was the device last checked in?
Investigation use case: A Defender for Endpoint alert fires for a device running outdated antivirus definitions. Before investigating the alert, the analyst asks Copilot: “What is the compliance status of device DESKTOP-NGE042 in Intune?”
Copilot responds: “DESKTOP-NGE042 is marked as non-compliant. The failing compliance policy is ‘Windows Security Baseline v2’ — the device has not applied the latest antivirus definition update (last update: 14 days ago). The device last checked in 3 hours ago. The primary user is j.morrison@northgateeng.com.”
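The manual equivalent of that device query is available in KQL if Intune diagnostics are exported to a Log Analytics workspace — an assumption this sketch depends on, along with the standard IntuneDevices export schema:

```kql
// Sketch: latest compliance snapshot for one device, assuming the
// IntuneDevices table is populated via Intune diagnostics export.
IntuneDevices
| where DeviceName == "DESKTOP-NGE042"  // hypothetical device
| summarize arg_max(TimeGenerated, *) by DeviceName
| project DeviceName, CompliantState, LastContact, PrimaryUser, OS
```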
This context immediately explains the alert: the device has outdated antivirus definitions because the compliance policy is failing. The investigation shifts from “is this device compromised?” to “why is the compliance policy failing, and is the outdated antivirus a risk factor for the current alert?”
Time savings across embedded experiences
| Task | Product | Manual | Copilot |
|---|---|---|---|
| User risk assessment | Entra ID | 10-15 min | 1 min |
| CA troubleshooting | Entra ID | 10-15 min | 30 sec |
| DLP alert summary | Purview | 5-10 min | 30 sec |
| Audit log search | Purview | 5 min (query build) | 30 sec |
| Alert explanation | Defender for Cloud | 5-10 min (doc lookup) | 15 sec |
| Recommendation priority | Defender for Cloud | 20-30 min | 2 min |
| Attack path explanation | Defender for Cloud | 10 min | 30 sec |
| Device compliance check | Intune | 5 min | 30 sec |
| Cross-product entity enrichment | All products | 15-20 min | 1 min |
Try it yourself
If Copilot is available in your Entra ID portal, navigate to Entra ID → Protection → Risky users and check whether Copilot provides risk summaries for any listed users. If your environment has risky sign-ins, Copilot's summary should explain the risk factors. If no risky users exist (common in lab environments), review the Microsoft documentation for the Entra ID Copilot experience to understand the capability.
What you should observe
The Copilot panel in Entra ID provides a contextual summary of the selected user's risk. For risky users, it explains: what triggered the risk, the confidence level, recent anomalous activity, and whether the risk pattern matches known attack types. For healthy users, the summary confirms the absence of risk signals.
Knowledge check
Check your understanding
1. A Defender for Cloud attack path is too complex for non-technical management to understand from the graph visualisation. How can Copilot help?