6.5 Custom Logs and API Ingestion
By the end of this subsection, you will understand when to use custom log tables and how the Logs Ingestion API works, and you will be able to choose the right custom ingestion approach for a given scenario.
Not every data source has a built-in connector. Custom applications, SaaS platforms without native Sentinel integration, and proprietary systems require custom ingestion.
Three approaches: choose by scenario
| Approach | Complexity | Best for | How it works |
|---|---|---|---|
| Logs Ingestion API | Low | Apps that can POST JSON (webhooks, scripts, custom apps) | App sends JSON to a DCR endpoint; DCR validates schema and writes to custom table |
| Codeless Connector Platform (CCP) | Medium | Productized connectors for SaaS REST APIs | ARM template defines polling behavior, authentication, and data mapping |
| Azure Functions | Medium-High | Complex polling + transformation from APIs without webhook support | Function runs on schedule, polls API, transforms, sends to Logs Ingestion API |
Logs Ingestion API — the most common path
The Logs Ingestion API accepts JSON payloads over HTTPS and writes them to your workspace through a DCR. This is the foundation for all custom data ingestion.
How it works:
- Create a custom table in your workspace (e.g., CustomWebApp_CL) and define its schema: column names and data types.
- Create a DCR that maps the incoming JSON fields to the table columns and optionally transforms the data.
- Create an Entra ID app registration for authentication (client ID + secret).
- POST JSON from your application to the DCR endpoint URL with a bearer token.
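The steps above can be sketched in Python using only the standard library. Every ID, URL, and the stream name below is a hypothetical placeholder; substitute values from your own data collection endpoint (DCE), DCR, and app registration.

```python
# Hedged sketch: sending custom events to the Logs Ingestion API.
# All IDs, URLs, and the stream name are hypothetical placeholders.
import json
import urllib.parse
import urllib.request

TENANT_ID = "contoso-tenant-id"            # hypothetical
CLIENT_ID = "app-registration-client-id"   # hypothetical
CLIENT_SECRET = "app-registration-secret"  # hypothetical
DCE_URL = "https://my-dce.eastus-1.ingest.monitor.azure.com"  # data collection endpoint
DCR_ID = "dcr-0123456789abcdef0123456789abcdef"               # DCR immutable ID
STREAM = "Custom-CustomWebApp_CL"                             # stream declared in the DCR

def ingestion_url(dce: str, dcr_id: str, stream: str) -> str:
    # Logs Ingestion API path: {DCE}/dataCollectionRules/{id}/streams/{stream}
    return f"{dce}/dataCollectionRules/{dcr_id}/streams/{stream}?api-version=2023-01-01"

def get_token() -> str:
    # Client-credentials flow against Entra ID; the ingestion API
    # uses the https://monitor.azure.com audience.
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://monitor.azure.com/.default",
    }).encode()
    req = urllib.request.Request(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token", data=body)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def send_events(events: list[dict]) -> None:
    # Payload is a JSON array; the DCR maps fields to the custom table's columns.
    req = urllib.request.Request(
        ingestion_url(DCE_URL, DCR_ID, STREAM),
        data=json.dumps(events).encode(),
        headers={"Authorization": f"Bearer {get_token()}",
                 "Content-Type": "application/json"},
        method="POST")
    urllib.request.urlopen(req)  # a successful call returns 204 No Content
```

In production you would typically use the `azure-monitor-ingestion` and `azure-identity` SDK packages instead of raw HTTP, but the sketch shows what is actually on the wire.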
Example: custom web application sending auth failures
Your internal web application logs authentication failures. You want to detect brute force against this app the same way you detect it against Entra ID.
The application sends a JSON payload:
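A payload along these lines would work; the field names here are illustrative (your DCR maps whatever fields you send to the custom table's columns):

```json
[
  {
    "TimeGenerated": "2025-01-15T08:30:00Z",
    "SourceIP": "203.0.113.88",
    "Username": "jsmith",
    "EventType": "AuthFailure",
    "AppName": "HRPortal"
  }
]
```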
Once ingested, query it like any standard table:
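A query along these lines (reconstructed to match the sample results below; the table and column names follow the example above and are illustrative) surfaces brute-force patterns:

```kusto
CustomWebApp_CL
| where EventType_s == "AuthFailure"
| summarize FailCount = count(), Users = dcount(Username_s) by SourceIP_s
| where FailCount > 10
```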
| SourceIP_s | FailCount | Users |
|---|---|---|
| 203.0.113.88 | 47 | 1 |
| 198.51.100.33 | 12 | 8 |
The _s suffix on columns indicates string type (a custom table convention). This query is a ready-made analytics rule: the same detection logic from Module 2.5, applied to a non-Microsoft application.

Without custom ingestion, your internal applications are invisible to Sentinel. An attacker who brute-forces your HR portal, your VPN concentrator's web interface, or your customer-facing app generates no Sentinel alert. Custom tables close this gap: apply the same KQL detection patterns you learned in Modules 2 and 4 to any data source.
Codeless Connector Platform (CCP)
CCP creates a full Sentinel data connector — with a connector page in the gallery, health monitoring, and configuration UI — without writing code. It uses an ARM template to define API polling.
When to use: You want a reusable, maintainable connector for a SaaS platform with a REST API. The platform does not support webhooks (you must poll). You plan to share the connector with the community or use it across multiple workspaces.
When NOT to use: For one-off data sources, scripts, or applications that can POST JSON, the Logs Ingestion API is simpler and requires no ARM template knowledge.
Azure Functions
An Azure Function runs on a schedule (e.g., every 5 minutes), calls a third-party API, transforms the response, and sends the transformed data to the workspace via the Logs Ingestion API.
When to use: The data source has a REST API but no webhook support. The transformation is too complex for a DCR alone (e.g., enriching events with external data, flattening nested JSON, deduplicating).
Example scenario: Your threat intelligence provider has a REST API that returns new IOCs. An Azure Function polls every 5 minutes, transforms the response into ThreatIntelligenceIndicator format, and ingests it. Now Sentinel matches these IOCs against all incoming data automatically.
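The transformation step inside such a Function might look like the sketch below. The feed shape and the output field names are hypothetical (not a real provider schema or the exact ThreatIntelligenceIndicator layout); the point is the flattening work that a DCR alone would struggle with.

```python
# Minimal sketch of the transform step an Azure Function might run between
# polling a TI provider and calling the Logs Ingestion API.
# Feed shape and output field names are hypothetical.
from datetime import datetime, timezone

def transform(feed_response: dict) -> list[dict]:
    """Flatten a nested feed response into one flat record per observable."""
    records = []
    observed = datetime.now(timezone.utc).isoformat()
    for item in feed_response.get("indicators", []):
        for value in item.get("values", []):  # one record per observable value
            records.append({
                "IndicatorType": item.get("type", "unknown"),
                "IndicatorValue": value,
                "Confidence": item.get("confidence", 0),
                "TimeObserved": observed,
            })
    return records

# Hypothetical feed response with nested observables:
sample = {"indicators": [
    {"type": "ip", "confidence": 80, "values": ["203.0.113.88", "198.51.100.33"]},
    {"type": "domain", "confidence": 60, "values": ["bad.example.net"]},
]}
rows = transform(sample)  # three flat records, ready to POST in one batch
```

In a real Function the timer trigger would call `transform` on each poll and hand the resulting list to the Logs Ingestion API client.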
Try it yourself
1. Internal web app (webhooks): Logs Ingestion API. The app sends JSON directly to the DCR endpoint. No infrastructure needed beyond the DCR and app registration.
2. SaaS HR platform (REST API, no webhook): Azure Function or CCP. The Function polls the API on a schedule and forwards via Logs Ingestion API. If this is a long-term connector you will maintain across workspace rebuilds, CCP is cleaner. For a quick deployment, the Function is faster to set up.
3. Legacy database (CSV hourly): Azure Function. The Function reads the CSV from a file share or blob storage on a schedule, parses it, transforms to JSON, and sends via Logs Ingestion API. Alternatively, use AMA with a custom text log collection (file-based collection).
Check your understanding
1. Your custom web application can POST JSON to an HTTPS endpoint. Which ingestion approach requires the least infrastructure?
2. Why is custom log ingestion important for security detection coverage?