The Node.js package and full docs are available on NPM.
Install the Package
Download and add the package using the command line.
npm install @darkvisitors/sdk
Initialize the Client
Create a new instance of DarkVisitors with your access token.
import { AgentType, DarkVisitors } from "@darkvisitors/sdk"
const darkVisitors = new DarkVisitors("your-access-token")
Generate a Robots.txt
Select which AgentTypes you want to block, and pass a string specifying which URLs are disallowed (e.g. "/" to disallow all paths).
const robotsTxt = await darkVisitors.generateRobotsTxt([
AgentType.AIDataScraper,
AgentType.Scraper,
AgentType.IntelligenceGatherer,
AgentType.SEOCrawler
], "/")
Do this periodically (e.g. once per day), then cache and serve robotsTxt from your website's /robots.txt endpoint.
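As a sketch of one way to do that, the example below assumes an Express server (Express isn't part of the SDK) and a simple in-memory cache refreshed once per day with setInterval; adapt the scheduling and storage to your own stack.

import express from "express"
import { AgentType, DarkVisitors } from "@darkvisitors/sdk"

const darkVisitors = new DarkVisitors("your-access-token")

// In-memory cache of the generated robots.txt text
let cachedRobotsTxt = ""

async function refreshRobotsTxt() {
    cachedRobotsTxt = await darkVisitors.generateRobotsTxt([
        AgentType.AIDataScraper,
        AgentType.Scraper,
        AgentType.IntelligenceGatherer,
        AgentType.SEOCrawler
    ], "/")
}

await refreshRobotsTxt()
setInterval(refreshRobotsTxt, 24 * 60 * 60 * 1000) // regenerate once per day

// Serve the cached text from the /robots.txt endpoint
const app = express()
app.get("/robots.txt", (req, res) => {
    res.type("text/plain").send(cachedRobotsTxt)
})
app.listen(3000)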
Make an HTTP request to the REST API from any codebase or programming language.
Generate a Robots.txt
Call the API to generate a new robots.txt periodically (e.g. once per day), then cache and serve the result.
URL: https://api.darkvisitors.com/robots-txts
HTTP Method: POST
Headers:
    Authorization: A bearer token with your project's access token (e.g. Bearer 48d7-fc44-4b30-916b-2a59).
    Content-Type: This needs to be set to application/json.
Body:
    agent_types: An array of agent types. Allowed agent types include: AI Agent, AI Assistant, AI Data Scraper, AI Search Crawler, Archiver, Developer Helper, Fetcher, Headless Agent, Intelligence Gatherer, Scraper, SEO Crawler, Search Engine Crawler, Security Scanner, Undocumented AI Agent, Uncategorized.
    disallow: A string specifying which URLs are disallowed. Defaults to / to disallow all URLs.
Example
curl -X POST https://api.darkvisitors.com/robots-txts \
-H "Authorization: Bearer ${ACCESS_TOKEN}" \
-H "Content-Type: application/json" \
-d '{
"agent_types": [
"AI Data Scraper",
"Scraper",
"Intelligence Gatherer",
"SEO Crawler",
],
"disallow": "/"
}'
Serve the Response
The response body is a robots.txt in text/plain format. You can use this as is, or append additional lines to include things like sitemap directives. Cache and serve this text from your website's /robots.txt path.
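For illustration, here is a minimal sketch of that flow in Node.js 18+ (which provides a global fetch), run inside an ES module or async function; the sitemap URL and the ACCESS_TOKEN environment variable are assumptions for the example.

// Call the API (see the URL, headers, and body described above)
const response = await fetch("https://api.darkvisitors.com/robots-txts", {
    method: "POST",
    headers: {
        "Authorization": `Bearer ${process.env.ACCESS_TOKEN}`, // your project's access token
        "Content-Type": "application/json"
    },
    body: JSON.stringify({
        agent_types: ["AI Data Scraper", "Scraper", "Intelligence Gatherer", "SEO Crawler"],
        disallow: "/"
    })
})

// The body is plain text; optionally append extra directives such as a sitemap
let robotsTxt = await response.text()
robotsTxt += "\nSitemap: https://example.com/sitemap.xml" // placeholder URL for illustration

// Cache robotsTxt (in memory, on disk, at your CDN, etc.) and serve it from /robots.txt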
The Shopify integration is in the works. If you want early access, please contact us.
The Python package is in the works. If you want early access, please contact us.
The PHP package is in the works. If you want early access, please contact us.