GET /v1/crawlers

List Crawlers

Retrieve a list of all available crawlers and their capabilities

This endpoint returns a paginated list of all available crawlers on the lobstr.io platform. Each crawler is designed to extract specific types of data from different sources.

Use this endpoint to discover which crawlers are available for your data collection needs before creating squids or adding tasks.

Headers

| Key | Value | Required |
| --- | --- | --- |
| Authorization | Token YOUR_API_KEY | Yes |
| Content-Type | application/json | No |
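The headers above can be set with the Python standard library. This is a minimal sketch: the base URL `https://api.lobstr.io` is an assumption (the page only documents the `/v1/crawlers` path), and `YOUR_API_KEY` is a placeholder for your real key.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: substitute your real lobstr.io API key

# Build the request with the required Authorization header.
# NOTE: the host below is assumed; check your dashboard for the actual base URL.
req = urllib.request.Request(
    "https://api.lobstr.io/v1/crawlers",
    headers={
        "Authorization": f"Token {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Uncomment to perform the call (needs network access and a valid key):
# with urllib.request.urlopen(req) as resp:
#     body = json.load(resp)
#     print(body["total_results"])
```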

Response Structure

total_results
integer
Total number of crawlers available
Example: 25
limit
integer
Number of results per page
Example: 50
page
integer
Current page number
Example: 1
total_pages
integer
Total number of pages available
Example: 1
result_from
integer
Starting index of results in current page
Example: 1
result_to
integer
Ending index of results in current page
Example: 25
data
array
Array of crawler objects with comprehensive details
next
string | null
Full URL to the next page (null if on last page)
Example: null
previous
string | null
Full URL to the previous page (null if on first page)
Example: null
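Because `next` is a full URL that becomes `null` on the last page, collecting every page reduces to following that field until it is exhausted. A sketch, using an illustrative in-memory payload shaped like the fields above (not real API output):

```python
def iter_pages(fetch, first_url):
    """Yield each page's payload, following `next` until it is null."""
    url = first_url
    while url is not None:
        page = fetch(url)
        yield page
        url = page["next"]

# Illustrative two-page responses keyed by URL; a real `fetch` would
# perform the authenticated GET request instead of a dict lookup.
pages = {
    "/v1/crawlers?page=1": {"page": 1, "next": "/v1/crawlers?page=2",
                            "data": [{"id": "a"}]},
    "/v1/crawlers?page=2": {"page": 2, "next": None,
                            "data": [{"id": "b"}]},
}

all_rows = [row
            for page in iter_pages(pages.get, "/v1/crawlers?page=1")
            for row in page["data"]]
```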

Crawler Object Fields

data[].id
string
Unique identifier (hash) for the crawler
Example: 4734d096159ef05210e0e1677e8be823
data[].name
string
Human-readable name of the crawler
Example: Google Maps Search Export
data[].slug
string
URL-friendly identifier
Example: google-maps-search-export
data[].is_public
boolean
Whether the crawler is publicly available
Example: true
data[].is_available
boolean
Whether the crawler is currently available for use (false if paused)
Example: true
data[].is_premium
boolean
Requires premium subscription or extra credits
Example: false
data[].has_issues
boolean
Currently experiencing issues (runs paused until fixed)
Example: false
data[].max_concurrency
integer
Maximum number of concurrent tasks allowed for this crawler
Example: 20
data[].credits_per_row
number
Credits charged per result row
Example: 0.5
data[].account
object | null
Required account sync details (null if no authentication needed)
Example: null
data[].icon
string
Base64-encoded SVG icon for UI display
Example: data:image/svg+xml;base64,PHN2ZyB4bWxucz...
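The `is_available`, `has_issues`, and `is_premium` flags let you filter the crawler list client-side before creating squids. A minimal sketch over an illustrative payload (the helper name and sample objects are ours, not part of the API):

```python
def usable_crawlers(crawlers, include_premium=False):
    """Keep only crawlers that can actually run right now."""
    return [
        c for c in crawlers
        if c["is_available"]            # not paused
        and not c["has_issues"]         # runs are not blocked by an incident
        and (include_premium or not c["is_premium"])
    ]

# Illustrative entries mirroring the crawler object fields above.
sample = [
    {"slug": "google-maps-search-export",
     "is_available": True, "has_issues": False, "is_premium": False},
    {"slug": "paused-crawler",
     "is_available": False, "has_issues": True, "is_premium": False},
]

print([c["slug"] for c in usable_crawlers(sample)])
```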
Pro Tip

Cache the list of crawlers locally as it doesn't change frequently. This reduces unnecessary API calls and improves your application's performance.
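The caching advice above can be sketched as a small TTL wrapper. This is one possible design, not a lobstr.io SDK feature; the `fetch` callable stands in for an authenticated `GET /v1/crawlers` call.

```python
import time

class CrawlerCache:
    """Serve a cached crawler list for `ttl` seconds to avoid repeated API calls."""

    def __init__(self, fetch, ttl=3600):
        self._fetch = fetch      # callable that hits GET /v1/crawlers
        self._ttl = ttl
        self._cached = None
        self._expires = 0.0

    def get(self):
        now = time.monotonic()
        if self._cached is None or now >= self._expires:
            self._cached = self._fetch()
            self._expires = now + self._ttl
        return self._cached

# Usage: the fake fetcher records each call so we can see the cache working.
calls = []
cache = CrawlerCache(lambda: calls.append(1) or {"data": []}, ttl=60)
cache.get()
cache.get()  # served from cache; the fetcher runs only once
```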
Note

Each crawler has a unique hash identifier that you'll use when creating squids or configuring data collection tasks.
Pro Tip

Check the crawler parameters endpoint to see what configuration options are available for each crawler before starting your scraping operations.