Firecrawl sends webhook events at each stage of a job’s lifecycle, so you can track progress, capture results, and handle failures in real time without polling.
Quick Reference
| Event | Trigger |
|---|---|
| crawl.started | Crawl job begins processing |
| crawl.page | A page is scraped during a crawl |
| crawl.completed | Crawl job finishes and all pages have been processed |
| batch_scrape.started | Batch scrape job begins processing |
| batch_scrape.page | A URL is scraped during a batch scrape |
| batch_scrape.completed | All URLs in the batch have been processed |
| extract.started | Extract job begins processing |
| extract.completed | Extraction finishes successfully |
| extract.failed | Extraction fails |
| agent.started | Agent job begins processing |
| agent.action | Agent executes a tool (scrape, search, etc.) |
| agent.completed | Agent finishes successfully |
| agent.failed | Agent encounters an error |
| agent.cancelled | Agent job is cancelled by the user |
| monitor.page | A monitored page scrape finishes |
| monitor.check.completed | Monitor check finishes and page-level changes are available |
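Because every event type follows the `family.stage` naming pattern above, a receiver can route on the stage suffix rather than enumerating all sixteen types. A minimal dispatch sketch in Python (the handler behavior and return strings are illustrative, not part of Firecrawl's API):

```python
def handle_webhook(event: dict) -> str:
    """Route a Firecrawl webhook event by its `type` field (illustrative)."""
    event_type = event.get("type", "")
    family, _, stage = event_type.partition(".")
    if stage == "page":
        # Per-page events carry scraped content in the data array.
        return f"{family}: received {len(event.get('data', []))} page(s)"
    if stage in ("completed", "failed", "cancelled"):
        return f"{family} finished ({stage})"
    return f"{family} update: {event_type}"
```

Note that `monitor.check.completed` has a two-part stage, so it falls through to the generic branch here; handle it explicitly if you need its summary counts.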
Payload Structure
All webhook events share this structure:

| Field | Type | Description |
|---|---|---|
| success | boolean | Whether the operation succeeded |
| type | string | Event type (e.g. crawl.page) |
| id | string | Job ID |
| data | array or object | Event-specific data (see examples below) |
| metadata | object | Custom metadata from your webhook config |
| error | string | Error message (when success is false) |
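A receiver can normalize these shared fields before handing the event to job-specific logic. A sketch in Python, assuming only the structure in the table above (the function and key names are illustrative):

```python
def parse_webhook(payload: dict) -> dict:
    """Extract the common fields from a Firecrawl webhook payload (sketch)."""
    if not payload.get("success", False):
        # The error field is populated when success is false.
        raise RuntimeError(payload.get("error") or "unknown webhook error")
    return {
        "type": payload["type"],          # e.g. "crawl.page"
        "job_id": payload["id"],
        "data": payload.get("data"),      # array or object, event-specific
        "metadata": payload.get("metadata", {}),
    }
```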
Crawl Events
crawl.started
Sent when the crawl job begins processing.
crawl.page
Sent for each page scraped. The data array contains the page content and metadata.
crawl.completed
Sent when the crawl job finishes and all pages have been processed.
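Since crawl.page events arrive incrementally and crawl.completed signals the end, a common pattern is to accumulate pages per job ID and finalize on completion. A sketch under the payload structure above (the class and its bookkeeping are illustrative; the same pattern applies to batch scrape events):

```python
class CrawlCollector:
    """Accumulate pages from crawl.page events until crawl.completed arrives."""

    def __init__(self):
        self.pages = {}   # job ID -> list of scraped page payloads
        self.done = set() # job IDs whose crawl.completed has arrived

    def on_event(self, event: dict) -> None:
        job_id = event["id"]
        if event["type"] == "crawl.page":
            # The data array holds the page content and metadata.
            self.pages.setdefault(job_id, []).extend(event.get("data", []))
        elif event["type"] == "crawl.completed":
            self.done.add(job_id)
```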
Batch Scrape Events
batch_scrape.started
Sent when the batch scrape job begins processing.
batch_scrape.page
Sent for each URL scraped. The data array contains the page content and metadata.
batch_scrape.completed
Sent when all URLs in the batch have been processed.
Monitor Events
monitor.page
Sent as each monitored page scrape finishes. This event is emitted from the scrape worker path, so it arrives before the full monitor check is reconciled.
monitor.check.completed
Sent when a monitor check finishes. The data object contains check status and summary counts. Page-level results are only sent through monitor.page events or returned from the monitor check API.
success is true when the check completed without page errors. For partial or failed checks, success is false and error may contain a message.
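Given that success reflects whether the check completed without page errors, a handler can branch on it directly. A minimal sketch (the summary wording is illustrative):

```python
def summarize_monitor_check(event: dict) -> str:
    """Summarize a monitor.check.completed event (illustrative)."""
    if event.get("success"):
        return "check passed"
    # For partial or failed checks, error may contain a message.
    return f"check had errors: {event.get('error') or 'no message provided'}"
```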
Extract Events
extract.started
Sent when the extract job begins processing.
extract.completed
Sent when extraction finishes successfully. The data array contains the extracted data and usage info.
extract.failed
Sent when extraction fails. The error field contains the failure reason.
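Since extraction ends in exactly one of completed or failed, a handler can treat these as a result-or-exception pair. A sketch assuming the fields described above (the handler itself is illustrative):

```python
def on_extract_event(event: dict):
    """React to extract lifecycle events (illustrative)."""
    if event["type"] == "extract.completed":
        return event.get("data")  # extracted data and usage info
    if event["type"] == "extract.failed":
        # The error field contains the failure reason.
        raise RuntimeError(event.get("error", "extraction failed"))
    return None  # extract.started: nothing to collect yet
```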
Agent Events
agent.started
Sent when the agent job begins processing.
agent.action
Sent after each tool execution (scrape, search, etc.).
The creditsUsed value in action events is an estimate of the total credits used so far. The final, accurate credit count is only available in the completed, failed, or cancelled events.
agent.completed
Sent when the agent finishes successfully. The data array contains the extracted data and total credits used.
agent.failed
Sent when the agent encounters an error. The error field contains the failure reason.
agent.cancelled
Sent when the agent job is cancelled by the user.
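Because per-action creditsUsed values are only estimates, a consumer should prefer the figure from the terminal event. A sketch in Python (the exact location of creditsUsed inside data is an assumption here; only the estimate-vs-final distinction comes from the note above):

```python
def track_credits(events: list) -> int:
    """Return the best-known credit count for an agent job (sketch)."""
    estimate = 0
    for event in events:
        data = event.get("data")
        # Assumed shape: creditsUsed lives directly on the data object.
        credits = data.get("creditsUsed") if isinstance(data, dict) else None
        if credits is None:
            continue
        if event["type"] in ("agent.completed", "agent.failed", "agent.cancelled"):
            return credits  # terminal events carry the accurate total
        estimate = credits  # agent.action carries a running estimate
    return estimate
```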
Event Filtering
By default, you receive all events. To subscribe to specific events only, use the events array in your webhook config:
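A sketch of such a config, expressed as a Python dict (the url and metadata keys match the fields echoed back in each payload; the surrounding request shape is an assumption, and the endpoint URL is a placeholder):

```python
webhook_config = {
    "url": "https://example.com/firecrawl-webhook",  # your receiver endpoint (placeholder)
    "metadata": {"env": "prod"},                     # echoed back in each event's metadata field
    # Subscribe only to per-page and completion events for crawls:
    "events": ["crawl.page", "crawl.completed"],
}
```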

