Because they had been doing the work on the website themselves and had installed a number of plugins that we wouldn’t normally use, it seemed likely that there was a conflict somewhere that was slowing everything down. The normal test is to turn plugins off until the website starts working to find what is causing the conflict. We also increased server resources, which seemed to fix the problem at the time. But the next morning, the website was crashing again.
Other websites on the same server had no issues, so this problem was restricted to this website – and it turned out to be Facebook causing the issue.
You don’t need to be technical to follow this guide, but a few key ideas will make everything click:
.htaccess File (The Security Guard): A file that sits outside your kitchen. Before anyone can talk to the Chef (PHP, which cooks each page to order), they must pass the Guard. The Guard can reject bad orders before they ever waste the Chef’s time.

The dashboard takes a minute to load. Images aren’t appearing. Customers may be seeing the “503 Service Unavailable” error page.
You look for the obvious culprit – a spike in real traffic. But analytics show only a handful of people on the site.
This is the Silent Redline: your server’s engine is screaming at maximum capacity with no one in the driver’s seat.
Standard troubleshooting – turning plugins off, increasing server resources – gets you nowhere.
The true cause is often hiding in your server logs under an innocent-sounding name: meta-externalagent or facebookexternalhit.
This is the Meta Crawler – Facebook’s automated robot. Its normal job is to visit your site so that when someone shares your link on Facebook or Instagram, a preview image and description appear. Under ordinary circumstances, it’s a helpful visitor.
But thanks to a quirk in how modern e-commerce sites work, this bot can accidentally turn into a self-inflicted denial-of-service attack.
Identifying a “Bot Bomb” requires looking past the surface of your website and into the server’s raw data. If you think you may have the same problem, here is how you identify it using cPanel.
The irony is that it was probably the updates the client did that triggered the Facebook bot to show more interest in their website and “re-spider” all the pages, including all the variations on products in their online shop. The Facebook bot was giving the website too much love!
On a website with product filters that let customers narrow results by things like height, weight and colour, selecting two filters feels like one action to a human user. To a bot, every combination of filters creates a unique, brand-new web address (URL):

- yoursite.com/products/ – easy
- yoursite.com/products/?filter_weight=2kg – harder
- yoursite.com/products/?filter_weight=2kg&filter_length=2m&colour=blue – very hard

The Facebook bot doesn’t understand these are just filters on the same products. It sees thousands of “new” pages it hasn’t indexed yet and decides to crawl all of them – simultaneously.
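To see how quickly this multiplies, here is a small sketch. The filter names and values are illustrative, not taken from any real site:

```shell
# Illustrative filter values - real shops often have many more options.
weights="1kg 2kg 5kg"
lengths="1m 2m 3m"
colours="red blue green"
count=0
for w in $weights; do
  for l in $lengths; do
    for c in $colours; do
      # Each combination is a distinct URL as far as a crawler is concerned
      echo "yoursite.com/products/?filter_weight=$w&filter_length=$l&colour=$c"
      count=$((count + 1))
    done
  done
done
echo "$count crawlable URLs from just three filters"  # 3 x 3 x 3 = 27
```

Three filters with three values each already produce 27 distinct URLs; every extra filter or value multiplies the bot’s to-do list again.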
Every time the bot hits one of those filtered URLs, the website must:

- wake up PHP and load WordPress
- query the database for products matching that exact combination of filters
- build the whole page from scratch and send it back
When a human does this once, the server barely notices. When a bot fires off 50 of these requests in a single second across 10 different filter combinations, the website collapses under the load.
This is compounded by a WordPress feature called WP-Cron. By default, WordPress checks its chore list (sending emails, running updates) every time a visitor arrives. When the bot hammers your site with complex filter requests, WordPress responds by triggering its chore list hundreds of times per minute – on top of an already-overwhelmed hosting server.
The result: the server is so busy starting new chores and calculating complex searches that it has no capacity left to serve actual customers.
Before changing any settings, you need to see the evidence. Here’s where to look.
If your hosting gives you SSH/Terminal access, the command top -c shows everything the server is currently processing. In a bot bomb scenario, you’ll see a long list of entries like:
lsphp:/home/username/public_html/index.php
lsphp:/home/username/public_html/index.php
lsphp:/home/username/public_html/index.php
Ten or twenty of these running simultaneously confirms the server is being flooded. If wp-cron.php appears repeatedly, the Cron Death Spiral is also in play.
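You can turn that visual check into numbers. The sketch below runs against a saved sample of the process list; on a real server you would capture the output of `ps aux` (or `top -c`) into the file instead:

```shell
# Sample snapshot of the process list (illustrative data only)
cat > /tmp/proc_sample.txt <<'EOF'
lsphp:/home/username/public_html/index.php
lsphp:/home/username/public_html/index.php
lsphp:/home/username/public_html/index.php
lsphp:/home/username/public_html/wp-cron.php
EOF

# How many PHP workers are running at once?
php_workers=$(grep -c '^lsphp' /tmp/proc_sample.txt)
# How many of them are wp-cron - the tell-tale of the Cron Death Spiral?
cron_procs=$(grep -c 'wp-cron.php' /tmp/proc_sample.txt)
echo "PHP workers: $php_workers, wp-cron processes: $cron_procs"
```

A double-digit worker count on a site with only a handful of real visitors is the smoking gun.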
In cPanel, find the “Visitors” or “Raw Access Logs” section. This is a diary of every person or robot that has touched your site in the last 24 hours.
Look for the User Agent column. Search for:
- meta-externalagent
- facebookexternalhit

What a bot attack looks like in the logs: request after request to URLs full of ?, &, and filter_ symbols rather than a simple page like /about-us/.

Every website has a robots.txt file (found at yoursite.com/robots.txt) that tells bots where they are and aren’t allowed to go.
In cPanel File Manager, check your public_html folder. If you can see robots.txt by visiting the URL in a browser, but there is no actual file in File Manager, you have a “virtual” robots.txt – WordPress is generating it on the fly using PHP every single time a bot checks it.
If an aggressive bot checks your rules 50 times a second, that’s 50 unnecessary page-cooking requests wasting CPU on something that should be instant.
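Before moving on to the fix, it is worth quantifying the bot traffic. The sketch below uses made-up sample lines; in practice you would download the real log from cPanel’s Raw Access Logs and point the same grep commands at that file:

```shell
# Sample access-log lines (illustrative data only)
cat > /tmp/access.log <<'EOF'
1.2.3.4 - - [10/Jan/2025:09:00:01] "GET /products/?filter_weight=2kg HTTP/1.1" 200 "-" "facebookexternalhit/1.1"
1.2.3.4 - - [10/Jan/2025:09:00:01] "GET /products/?filter_weight=5kg&colour=blue HTTP/1.1" 200 "-" "meta-externalagent"
5.6.7.8 - - [10/Jan/2025:09:00:05] "GET /about-us/ HTTP/1.1" 200 "-" "Mozilla/5.0"
EOF

# Total hits from Meta's crawlers
bot_hits=$(grep -cE 'facebookexternalhit|meta-externalagent' /tmp/access.log)
# How many of those were to filtered URLs
filter_hits=$(grep -E 'facebookexternalhit|meta-externalagent' /tmp/access.log | grep -c 'filter_')
echo "Meta bot hits: $bot_hits (on filtered URLs: $filter_hits)"
```

If the vast majority of bot hits land on filter_ URLs, you have confirmed the pattern described above.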
Once you’ve confirmed the Meta Crawler is hammering your filtered URLs, work through these three steps in order. The goal is to stop the bot at the front gate before it clogs up PHP.
Why: If WordPress is virtually generating your robots.txt, every bot check wastes CPU. A physical file is handed over instantly – PHP never wakes up.
How:
In public_html, create a new file named robots.txt (all lowercase) and paste in the following:

User-agent: *
Disallow: /?filter_*
Disallow: /page/*?filter_*
User-agent: facebookexternalhit
Disallow: /?filter_*
Disallow: /page/*?filter_*
User-agent: meta-externalagent
Disallow: /?filter_*
Disallow: /page/*?filter_*
Result: Bots get the rules instantly as a plain text file. PHP stays asleep.
Why: This is the most critical step. Even with a good robots.txt, a misbehaving bot may ignore it. The .htaccess file enforces the block at the server level – WordPress is never involved, making it virtually free in terms of CPU.
How:
Open the .htaccess file in public_html and add:

# Block Facebook/Meta bots from hitting filtered URLs
RewriteEngine On
RewriteCond %{QUERY_STRING} filter_ [NC]
RewriteCond %{HTTP_USER_AGENT} (facebookexternalhit|meta-externalagent) [NC]
RewriteRule ^ - [F,L]
Result: If a request comes in from the Facebook bot AND the URL contains a filter, the server returns a 403 Forbidden and drops the connection immediately – before PHP or WordPress ever see it. This costs almost nothing in CPU.
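If you want to reason about what the rule will and won’t block before deploying it, the logic can be sketched in plain shell. This mirrors the two RewriteCond lines: both the query string and the user agent must match before the request is refused (the function name is just for illustration):

```shell
# Mirror of the .htaccess logic: block only when the query string
# contains "filter_" AND the user agent is one of Meta's crawlers.
should_block() {
  ua="$1"; query="$2"
  case "$query" in
    *filter_*) ;;            # query string matches - keep checking
    *) echo allow; return ;;  # no filter in the URL - never blocked
  esac
  case "$ua" in
    *facebookexternalhit*|*meta-externalagent*) echo block ;;
    *) echo allow ;;
  esac
}

should_block "facebookexternalhit/1.1" "filter_weight=2kg"   # bot on a filtered URL
should_block "Mozilla/5.0"             "filter_weight=2kg"   # real shopper using filters
should_block "meta-externalagent"      ""                    # bot previewing a normal page
```

Crucially, the bot can still fetch clean product URLs, so link previews on Facebook and Instagram keep working.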
Why: By default, WordPress triggers its background tasks (emails, update checks) on every page request. During a bot attack, this runs hundreds of times per minute. Moving this to a proper scheduled task stops the self-inflicted load.
How – Part A: Disable the default behaviour
Open wp-config.php in public_html and add this line just above /* That's all, stop editing! Happy publishing. */:

define('DISABLE_WP_CRON', true);
How – Part B: Set up a proper scheduled task
In cPanel, open Cron Jobs and create a new job. Enter */10 for the minute field and * for all the others, then use this command (replacing yourusername with your actual cPanel username):

php -q /home/yourusername/public_html/wp-cron.php >/dev/null 2>&1
Result: Scheduled tasks (emails, updates, backups) continue to run perfectly – just once every 10 minutes in a controlled way, rather than hundreds of times per minute driven by bot traffic.
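If the */10 syntax is unfamiliar, it simply means “every minute divisible by 10”. You can expand it yourself to see exactly when the job will fire:

```shell
# Minutes of the hour on which a "*/10 * * * *" cron job fires
fires=""
for m in $(seq 0 59); do
  [ $((m % 10)) -eq 0 ] && fires="$fires $m"
done
echo "Job runs at minutes:$fires"  # 0 10 20 30 40 50
```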
Within minutes of completing all three steps, CPU usage should drop from a flat 100% to a healthy 2-5%. You haven’t lost any functionality, and you haven’t blocked Facebook from previewing your main product pages – you’ve simply stopped an aggressive robot from redlining your engine on pages it was never meant to index.
Once the immediate crisis has passed, it’s worth doing some housekeeping to prevent a recurrence.
If the site still feels a little slow after the bot is dealt with, you may have accumulated database bloat – old data left behind by plugins you’ve long since removed. A plugin called Query Monitor can show you the size of your “Autoloaded Options” in the database. If it’s over 1MB, your server is working too hard just to start up on each page load. A developer can clean this up in under an hour.
A slow website is rarely caused by just one thing. In this case, it was an aggressive bot finding a weak spot – the combination of unprotected filter URLs and a virtual robots.txt that wasted CPU on every check.
The three-step fix addresses each layer:
| Step | What It Does | Where to Do It |
|---|---|---|
| Physical robots.txt | Tells bots the rules instantly, without involving PHP | cPanel File Manager |
| .htaccess Guard | Blocks filter requests from Facebook bots at the server gate | cPanel File Manager |
| Disable WP-Cron | Stops background tasks running out of control during bot traffic | wp-config.php + cPanel Cron Jobs |
Together, these turn your server from an open door into a high-performance machine that reserves its energy for the people who matter most: your customers.