# Usage Examples

Practical examples for using the Search API.

## Basic Search

Find domains related to a topic:
```bash
curl -X POST "https://scrape.evomi.com/api/v1/scraper/search" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "best CRM software",
    "max_urls": 10
  }'
```

## Multiple Queries

Search multiple topics in one request:
```bash
curl -X POST "https://scrape.evomi.com/api/v1/scraper/search" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": ["CRM software", "sales automation tools", "customer management platforms"],
    "max_urls": 30
  }'
```

The response includes which query matched each result:
```json
{
  "success": true,
  "queries": ["CRM software", "sales automation tools", "customer management platforms"],
  "discovered_count": 30,
  "results": [
    {
      "domain": "www.salesforce.com",
      "url": "https://www.salesforce.com/crm/",
      "title": "What is CRM? CRM Defined - Salesforce.com",
      "query": "CRM software"
    },
    {
      "domain": "www.hubspot.com",
      "url": "https://www.hubspot.com/products/crm",
      "title": "Free CRM Software - HubSpot CRM",
      "query": "CRM software"
    }
  ],
  "credits_used": 3.0,
  "credits_remaining": 9997.0
}
```

## Region-Specific Search

Get results from a specific region. The region also determines the proxy location:
```bash
curl -X POST "https://scrape.evomi.com/api/v1/scraper/search" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "job boards",
    "max_urls": 15,
    "region": "de-de"
  }'
```

### Available Regions
| Region Code | Description |
|---|---|
| `us-en` | United States (English) |
| `uk-en` | United Kingdom (English) |
| `de-de` | Germany (German) |
| `fr-fr` | France (French) |
| `es-es` | Spain (Spanish) |
| `it-it` | Italy (Italian) |
| `jp-jp` | Japan (Japanese) |
| `au-en` | Australia (English) |
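The region codes above can be kept in a small lookup so an invalid value fails fast before a request spends credits. A minimal sketch; `REGIONS` and `normalize_region` are illustrative helpers, not part of the API:

```python
# Supported region codes, mirroring the table above.
REGIONS = {
    "us-en": "United States (English)",
    "uk-en": "United Kingdom (English)",
    "de-de": "Germany (German)",
    "fr-fr": "France (French)",
    "es-es": "Spain (Spanish)",
    "it-it": "Italy (Italian)",
    "jp-jp": "Japan (Japanese)",
    "au-en": "Australia (English)",
}


def normalize_region(region: str) -> str:
    """Lowercase a region code and check it against the supported set."""
    code = region.strip().lower()
    if code not in REGIONS:
        raise ValueError(
            f"unsupported region {region!r}; expected one of {sorted(REGIONS)}"
        )
    return code
```

Calling `normalize_region("DE-DE")` returns `"de-de"`, while an unknown code raises before any request is sent.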
## Async Mode

For large searches, process in the background:
```bash
curl -X POST "https://scrape.evomi.com/api/v1/scraper/search" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "cloud computing providers",
    "max_urls": 100,
    "async": true
  }'
```

Check task status:

```bash
curl "https://scrape.evomi.com/api/v1/scraper/search/tasks/YOUR_TASK_ID" \
  -H "x-api-key: YOUR_API_KEY"
```

## Python Example
```python
import requests

API_KEY = "YOUR_API_KEY"
BASE_URL = "https://scrape.evomi.com/api/v1/scraper/search"


def search_domains(query, max_urls=20, region="us-en"):
    """Search for domains related to a query."""
    headers = {
        "x-api-key": API_KEY,
        "Content-Type": "application/json",
    }
    payload = {
        "query": query,
        "max_urls": max_urls,
        "region": region,
    }
    response = requests.post(BASE_URL, json=payload, headers=headers)
    return response.json()


# Example usage
result = search_domains("e-commerce platforms", max_urls=10)
for item in result.get("results", []):
    print(f"{item['domain']}: {item['title']}")
```

## Python with Multiple Queries
```python
import requests

API_KEY = "YOUR_API_KEY"
BASE_URL = "https://scrape.evomi.com/api/v1/scraper/search"


def search_multiple(queries, max_urls=50):
    """Search for multiple queries at once."""
    headers = {
        "x-api-key": API_KEY,
        "Content-Type": "application/json",
    }
    payload = {
        "query": queries,  # Pass a list of queries
        "max_urls": max_urls,
    }
    response = requests.post(BASE_URL, json=payload, headers=headers)
    return response.json()


# Search for multiple competitors
queries = [
    "email marketing tools",
    "marketing automation software",
    "newsletter platforms",
]
result = search_multiple(queries, max_urls=30)
print(f"Found {result['discovered_count']} domains")
for item in result.get("results", []):
    print(f"  {item['domain']} (from: {item['query']})")
```

## Python Async Handling
```python
import requests
import time

API_KEY = "YOUR_API_KEY"
BASE_URL = "https://scrape.evomi.com/api/v1/scraper/search"


def search_async(query, max_urls=100):
    """Start an async search and poll for results."""
    headers = {
        "x-api-key": API_KEY,
        "Content-Type": "application/json",
    }
    # Start async search
    payload = {
        "query": query,
        "max_urls": max_urls,
        "async": True,
    }
    response = requests.post(BASE_URL, json=payload, headers=headers)
    task_id = response.json().get("task_id")
    if not task_id:
        return response.json()

    # Poll for results
    check_url = f"{BASE_URL}/tasks/{task_id}"
    while True:
        time.sleep(2)  # Wait 2 seconds between checks
        result = requests.get(check_url, headers=headers).json()
        if result.get("status") == "success":
            return result
        elif result.get("status") == "failure":
            raise Exception("Search failed")


# Usage
result = search_async("SaaS companies", max_urls=50)
print(f"Found {result['discovered_count']} domains")
```

## Combined with Scraper
Search for domains, then scrape them:
```python
import requests

API_KEY = "YOUR_API_KEY"
SEARCH_URL = "https://scrape.evomi.com/api/v1/scraper/search"
SCRAPER_URL = "https://scrape.evomi.com/api/v1/scraper/realtime"

headers = {
    "x-api-key": API_KEY,
    "Content-Type": "application/json",
}

# Step 1: Search for domains
search_payload = {
    "query": "productivity apps",
    "max_urls": 10,
}
search_result = requests.post(SEARCH_URL, json=search_payload, headers=headers).json()

# Step 2: Scrape each discovered URL
for item in search_result.get("results", []):
    scrape_payload = {
        "url": item["url"],
        "mode": "request",
    }
    scrape_result = requests.post(SCRAPER_URL, json=scrape_payload, headers=headers).json()
    print(f"Scraped {item['domain']}:")
    print(f"  Title: {scrape_result.get('title', 'N/A')}")
    print()
```

## Common Use Cases

### Competitor Research
```bash
curl -X POST "https://scrape.evomi.com/api/v1/scraper/search" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": ["project management software", "task management tools"],
    "max_urls": 20
  }'
```

### Market Research
```bash
curl -X POST "https://scrape.evomi.com/api/v1/scraper/search" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "artificial intelligence startups",
    "max_urls": 50,
    "region": "us-en"
  }'
```

### Local Business Discovery
```bash
curl -X POST "https://scrape.evomi.com/api/v1/scraper/search" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "web design agencies",
    "max_urls": 30,
    "region": "uk-en"
  }'
```

## Tips
- **Start small**: test with low `max_urls` values to understand results before scaling.
- **Use multiple queries**: different search terms may find different domains.
- **Choose the right region**: use the appropriate region for locally relevant results.
- **Combine with scraping**: use discovered URLs with the Scraper API for complete data extraction.
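Multiple queries often surface the same domain more than once, and when feeding results into the Scraper API a dedup step avoids paying to scrape the same site twice. A minimal sketch; `unique_domains` is an illustrative helper and the sample list below uses made-up entries in the response shape shown earlier:

```python
def unique_domains(results):
    """Keep the first result per domain, preserving order."""
    seen = set()
    unique = []
    for item in results:
        if item["domain"] not in seen:
            seen.add(item["domain"])
            unique.append(item)
    return unique


# Illustrative sample in the shape of the "results" array above
results = [
    {"domain": "www.salesforce.com", "url": "https://www.salesforce.com/crm/", "query": "CRM software"},
    {"domain": "www.salesforce.com", "url": "https://www.salesforce.com/crm/", "query": "sales automation tools"},
    {"domain": "www.hubspot.com", "url": "https://www.hubspot.com/products/crm", "query": "CRM software"},
]
print([r["domain"] for r in unique_domains(results)])
# → ['www.salesforce.com', 'www.hubspot.com']
```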