GET /v1/crawlers/{crawler_hash}

Get Crawler Details

Retrieve detailed information about a specific crawler including input parameters and output fields

This endpoint retrieves detailed information about a specific crawler using its hash ID. It helps you understand each crawler's availability, cost, parameters, and potential issues before using it.

Headers

Key            Value               Required
Authorization  Token YOUR_API_KEY  Yes
Content-Type   application/json    No
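Since only the Authorization header is required, the call can be sketched with the standard library alone. This is a minimal sketch, not the official client: the base URL below is a placeholder you must replace with your API's real host.

```python
import json
import urllib.request

BASE_URL = "https://api.example.com/v1"  # placeholder; substitute your API's base URL


def build_request(crawler_hash: str, api_key: str) -> urllib.request.Request:
    """Build the GET /v1/crawlers/{crawler_hash} request with the auth header."""
    return urllib.request.Request(
        f"{BASE_URL}/crawlers/{crawler_hash}",
        headers={"Authorization": f"Token {api_key}"},
    )


def get_crawler(crawler_hash: str, api_key: str) -> dict:
    """Fetch and decode the crawler details JSON."""
    with urllib.request.urlopen(build_request(crawler_hash, api_key)) as resp:
        return json.load(resp)
```

Call `get_crawler("4734d096159ef05210e0e1677e8be823", "YOUR_API_KEY")` to retrieve the fields documented below.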

Response Field Explanations

id
string
Unique identifier for the crawler
Example: c106a44a98044ef18acc59986ae10967
name
string
Human-readable name (e.g., "Google Maps Search Export")
Example: Google Maps (1)
slug
string
URL-friendly identifier
Example: ""
is_public
boolean
Whether the crawler is publicly accessible
Example: true
is_available
boolean
Whether the crawler is available for runs (false if paused)
Example: true
is_premium
boolean
Requires premium subscription or extra credits
Example: false
has_issues
boolean
Currently experiencing issues (runs paused until fixed)
Example: false
account
object | null
Required account sync details (null if no authentication needed)
Example: null
input
array
Array of configurable parameters (task-level and squid-level)
Example: []
result
array
Array of output field names this crawler collects
Example: []
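The status flags above combine naturally into a pre-flight check before creating tasks. A minimal sketch, using an illustrative response shaped like the fields documented here:

```python
# Illustrative response based on the field reference above (values are examples).
crawler = {
    "id": "c106a44a98044ef18acc59986ae10967",
    "name": "Google Maps Search Export",
    "is_public": True,
    "is_available": True,
    "is_premium": False,
    "has_issues": False,
    "account": None,  # null means no account sync is needed
    "input": [],
    "result": [],
}


def is_runnable(crawler: dict) -> bool:
    """A crawler can be run only if it is available and has no open issues."""
    return crawler["is_available"] and not crawler["has_issues"]


def needs_account_sync(crawler: dict) -> bool:
    """Non-null 'account' means cookies must be synced before running."""
    return crawler["account"] is not None
```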

Understanding Input Parameters

input[].level
string
Parameter level: "task" (per URL/task) or "squid" (overall run behavior)
Example: ""
input[].is_params
string
For task-level: true indicates this parameter is used per task submission
Example: ""
input[].is_details
string
For squid-level: true indicates this controls overall behavior or features
Example: ""
input[].function
string
If true, this is an optional add-on feature with extra credit costs
Example: ""
input[].credits_per_function
string
Extra credits charged per row when this optional function is enabled
Example: ""
input[].name
string
Parameter name to use when configuring the crawler
Example: Google Maps (1)
input[].type
string
Data type of the parameter (string, int, boolean, etc.)
Example: ""
input[].required
string
Whether this parameter is mandatory
Example: ""
input[].default
string
Default value if parameter is not provided
Example: ""
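These fields can be consumed programmatically, for example to separate task-level from squid-level parameters and to estimate the extra per-row cost of optional functions. A sketch under assumptions: the sample parameter list below is hypothetical, and since credits_per_function is documented as a string it is coerced to a number here.

```python
def split_params(input_spec: list) -> tuple:
    """Separate task-level parameters from squid-level ones."""
    task = [p for p in input_spec if p.get("level") == "task"]
    squid = [p for p in input_spec if p.get("level") == "squid"]
    return task, squid


def optional_function_cost(input_spec: list) -> float:
    """Extra credits charged per row if every optional function is enabled."""
    return sum(
        float(p.get("credits_per_function") or 0)
        for p in input_spec
        if p.get("function")  # truthy 'function' marks an optional add-on
    )


# Hypothetical input spec, for illustration only.
sample_input = [
    {"level": "task", "name": "url", "type": "string", "required": True},
    {"level": "squid", "name": "max_results", "type": "int", "default": "20"},
    {"level": "squid", "name": "enrich", "function": True,
     "credits_per_function": "0.5"},
]
```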

Parameters

crawler_hash
string
Required
The unique identifier (hash/id) of the crawler
Example: 4734d096159ef05210e0e1677e8be823
Pro Tip

Check the 'has_issues' flag before creating tasks. If true, the crawler is experiencing problems and runs are paused.
Note

Parameters with 'function: true' are optional add-ons that cost extra credits per row (specified in 'credits_per_function').
Warning

Some crawlers require account synchronization (see 'account' field). You'll need to sync cookies before running these crawlers.
Pro Tip

Use the 'result' array to know exactly what fields will be returned in your data. This helps you plan your data processing pipeline.
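For instance, the declared field names can seed a CSV header before any data arrives. A minimal sketch (the field names shown are hypothetical):

```python
import csv
import io


def rows_to_csv(result_fields: list, rows: list) -> str:
    """Write rows using the crawler's declared 'result' fields as the header."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=result_fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)  # missing fields are left blank; extras are dropped
    return buf.getvalue()
```

Passing the crawler's `result` array as `result_fields` keeps the output schema stable even if individual rows are missing fields.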