BlockScrapers
Defend your site against scrapers. Automatically detect and block scrapers & other bad bots. Stop paying tens of thousands for Bot Management Suites full of features you don't use.
No DNS Changes
Set-Up in 5 minutes
Your First 1 Million Requests
FREE
Then $1 per 200k requests
No DNS changes: No DNS changes or additional infrastructure needed. Integrates directly with your existing AWS CloudFront. Set up in 5 minutes or less.
Eliminate the cruft: We just block scrapers, without the bells and whistles others bundle in. That lets us charge orders of magnitude less and still block scrapers in minutes.
Admin Console: Set up custom rules for allowing or blocking traffic. One-click support for good bots and common third-party integrations (Googlebot, Bingbot, Stripe, AWS SNS, etc.).

Types of bots
Machine Learning: Bots scrape massive amounts of images, videos, and text for training data.
Price Scraping: Bots scrape your prices, inventory lists, and product views.
Scoreboards: Bots scrape your live scoreboards, leaderboards, and tournament schedules.
DDoS: Bots can crash your site or slow it to a crawl.
Forums & Blogs: Bots scrape your users' posts and details, and submit mass spam.
Password Stuffing: Bots use your site to validate millions of stolen credit card numbers and passwords.
Customer Testimonials “We had multiple, aggressive bots scraping our leaderboards and monetizing the data on their own sites. Ryan stepped in and shut them down. Our data is now only present on our site. We've had Ryan's code running for 3 years now and it's incredible how many bots it still blocks.” - Jim Northcott, Head of Engineering, BlueGolf
How it works
1. A JS tag collects browser signals. Scrapers can behave differently from common browsers, and we pick up a few key differences in the frontend.
2. AWS logs & JS signals are analyzed. Scrapers can also behave differently when initiating requests to your backend. We detect these differences as well.
3. We identify the bad bots. Once all the signals are aggregated, our advanced algorithms can pick out even small numbers of bad bots among millions of human requests.
4. We block the bad bots with AWS WAF. Blocks are enforced with AWS Web Application Firewall, so flagged traffic never reaches your origin.
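As a rough illustration of steps 1–3, detection can be thought of as scoring a request's combined frontend and backend signals. The signal names, weights, and thresholds below are invented for this sketch and are not BlockScrapers' actual model:

```python
# Illustrative only: a toy version of signal aggregation.
# Real detection combines many more signals than this.

def classify_request(signals: dict) -> str:
    """Score one request from frontend (JS) and backend (log) signals."""
    score = 0
    # Frontend signals collected by the JS tag.
    if signals.get("webdriver"):                   # navigator.webdriver set by automation tools
        score += 3
    if not signals.get("js_ran"):                  # the JS tag never executed (headless fetcher)
        score += 2
    # Backend signals taken from CloudFront access logs.
    if signals.get("requests_per_min", 0) > 120:   # far faster than human browsing
        score += 2
    if signals.get("ua_mismatch"):                 # User-Agent disagrees with other fingerprints
        score += 2
    return "bot" if score >= 3 else "human"

print(classify_request({"webdriver": True, "js_ran": True}))       # bot
print(classify_request({"js_ran": True, "requests_per_min": 20}))  # human
```

A request flagged "bot" would then be fed into step 4, where the block is pushed out to AWS WAF.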
Comparison with existing solutions
Other solutions (Cloudflare, Akamai, Imperva, Reblaze, etc.) proxy every request in real time, which is expensive, adds latency, and requires heavy infrastructure or DNS changes. BlockScrapers just processes your logs, which significantly lowers the price while providing a similar level of defense. BlockScrapers requires no DNS changes or proxying infrastructure.
Frequently Asked Questions
What about clouds other than AWS? We are gathering interest in other clouds (e.g. Azure, GCP). Contact us to express your interest. If you are using physical on-premises servers, we can set up AWS CloudFront and AWS WAF in front of them.
Can I whitelist specific requests? Of course! You can whitelist requests by URL, domain, IP address, or any HTTP header field in the Dashboard.
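For a sense of how such rules compose, here is a minimal sketch of a whitelist matcher. The rule format here is hypothetical, invented for this example; the Dashboard expresses rules through its own UI:

```python
# Hypothetical whitelist rules for illustration only.

def is_whitelisted(request: dict, rules: list[dict]) -> bool:
    """Return True if any rule matches the request."""
    for rule in rules:
        field, value = rule["field"], rule["value"]
        if field == "url" and request["url"].startswith(value):
            return True
        if field == "ip" and request["ip"] == value:
            return True
        if field == "header" and request.get("headers", {}).get(rule["name"], "") == value:
            return True
    return False

rules = [
    {"field": "url", "value": "/webhooks/stripe"},                    # allow Stripe callbacks
    {"field": "header", "name": "User-Agent", "value": "Googlebot"},  # allow a good bot
]

print(is_whitelisted({"url": "/webhooks/stripe/evt_1", "ip": "1.2.3.4"}, rules))  # True
```

A whitelisted request skips bot scoring entirely, which is how integrations like Stripe webhooks or AWS SNS callbacks avoid being mistaken for scrapers.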
What about robots.txt? Robots.txt is equivalent to politely asking scrapers not to scrape. Many will just ignore it and scrape anyway. BlockScrapers enforces these rules and blocks misbehaving bots.
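To see the difference concretely: robots.txt only states which paths a crawler should avoid, and compliance is entirely voluntary. Python's standard-library parser makes this visible (the paths below are made up for the example):

```python
from urllib import robotparser

# A typical robots.txt asking all crawlers to stay out of /prices/.
rules = """
User-agent: *
Disallow: /prices/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A polite bot checks the rules first and backs off...
print(rp.can_fetch("PoliteBot", "/prices/latest"))  # False
# ...but nothing stops an impolite bot from requesting the URL anyway,
# which is why the rules need to be enforced server-side.
```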