A drop in web traffic, whether it happens slowly or all at once, can be a real problem. Fortunately, some drops are easy to reverse. Often, a few simple checks can reveal why you're losing traffic, and a few simple tweaks can win it back.
These are some of the most common reasons why your traffic may be dropping off — plus what you can do to identify where you’re losing traffic and how you can improve your numbers.
1. Lost Links
When determining a page’s search ranking, Google takes into account external backlinks to your site. The more reputable the sites linking to yours, the more likely Google is to consider your site trustworthy, which can help push it up in the search rankings.
Backlinks are just one of the many variables Google’s algorithm uses to determine search rankings. But they can have a major impact on how your site ranks.
As a result, losing these links — or breaking them by accident — can negatively impact your search rankings, causing traffic to drop off.
There are a handful of paid tools that can help you track lost links. With this info, you can try to reclaim some of those links by asking the sites that dropped them to reinstate the backlink.
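If you'd rather spot-check a few referring pages yourself, a short script can tell you whether they still link to your domain. Here's a minimal sketch in Python; the referring URLs and target domain are placeholders for whatever your backlink tool exports.

```python
# Minimal sketch: check whether pages that used to link to you still do.
# The referring URLs and target domain below are placeholders.
from urllib.request import Request, urlopen

TARGET_DOMAIN = "example.com"  # your site's domain (placeholder)
REFERRING_PAGES = [            # URLs exported from your backlink tool (placeholders)
    "https://partner-blog.example/post-about-us",
    "https://industry-news.example/roundup",
]

def still_links_to(page_url: str, domain: str) -> bool:
    """Fetch a referring page and check whether it still mentions the domain."""
    req = Request(page_url, headers={"User-Agent": "backlink-check/0.1"})
    try:
        html = urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")
    except Exception as err:
        print(f"Could not fetch {page_url}: {err}")
        return False
    return domain in html  # crude check; a real audit would parse the <a href> tags

for url in REFERRING_PAGES:
    status = "still links" if still_links_to(url, TARGET_DOMAIN) else "link missing"
    print(f"{url}: {status}")
```

A quick substring check like this won't replace a dedicated backlink tool, but it can flag which referring pages are worth a closer look.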
2. Search Algorithm Updates
Google is constantly tweaking the algorithm behind the company’s search engine. These changes aren’t always announced formally, but even the smallest adjustments can still have a major impact on how your site pages rank — or if they show up in a search at all.
Following Google’s updates is a good way to know whether a traffic drop is due to a recent algorithm change. If your traffic has dropped off, check whether Google has pushed a major algorithm update in the past few days.
You can also use an algorithm-tracking tool or service to follow unannounced updates to the algorithm. Many SEO and online marketing publications also track Google’s updates and may provide you with an idea of what’s changed and how to respond.
3. Competitor Site Upgrades
Your site is often in direct competition with other businesses for traffic. This is especially true if you’re advertising to a specific geographic audience — like people in a particular town or county.
If your competitor improves their site — optimizing for search, securing more external links, or offering more for visitors — it could boost their traffic and negatively impact yours.
If your traffic drops significantly and you still can’t explain it after looking elsewhere, check how your competition is faring. If you have a map of regional competitors, for example, look at the traffic received by competitors operating in the same area you’re trying to target.
If you’ve both seen traffic drops, that could mean an algorithmic change has made it harder to draw traffic. On the other hand, if your traffic is going down while theirs is going up, it could be a sign they’re trying something different — and it’s working.
You won’t have as much data on your competition’s traffic as you will on your own pages. The data you can collect, however, will be valuable in understanding why your site’s traffic patterns have changed.
4. Tracking and HTML Errors
Broken or missing pages are sometimes the cause of major traffic drop-offs. If you’re experiencing one, start by checking your crawl and HTML errors; a large number of errors could be what’s costing you traffic.
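A quick way to surface broken pages is to check the HTTP status codes your key URLs return. This is a rough sketch, assuming you have a short list of URLs to test (in practice you might pull them from your sitemap):

```python
# Minimal sketch: spot broken pages by checking HTTP status codes.
# The URL list is a placeholder; in practice, pull it from your sitemap.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

PAGES_TO_CHECK = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/old-landing-page/",
]

for url in PAGES_TO_CHECK:
    req = Request(url, method="HEAD", headers={"User-Agent": "crawl-check/0.1"})
    try:
        status = urlopen(req, timeout=10).status
    except HTTPError as err:      # 4xx/5xx responses raise HTTPError
        status = err.code
    except URLError as err:       # DNS failures, timeouts, etc.
        print(f"{url}: request failed ({err.reason})")
        continue
    flag = "" if status < 400 else "  <-- check this page"
    print(f"{url}: {status}{flag}")
```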
Sometimes, a traffic drop can also be due to a missing tracking code.
If a tracking code is removed from your site, your analytics platform won’t record new information about the traffic your pages receive. This can give the impression that your traffic is way down.
If you notice there are suddenly no sessions being tracked in Google Analytics or in your web analytics platform of choice, that may be a sign that a tracking code is missing or that another tool has stopped working.
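If you suspect the tracking code itself is the problem, you can confirm whether the snippet is still in your page’s HTML. This sketch assumes your site uses Google Analytics’ gtag.js snippet; the page URL and measurement ID are placeholders:

```python
# Minimal sketch: confirm an analytics tag is still present in a page's HTML.
# The URL and measurement ID are placeholders; swap in your own values.
from urllib.request import Request, urlopen

PAGE_URL = "https://example.com/"
MEASUREMENT_ID = "G-XXXXXXXXXX"  # your Google Analytics measurement ID (placeholder)

req = Request(PAGE_URL, headers={"User-Agent": "tag-check/0.1"})
html = urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")

has_gtag_script = "googletagmanager.com/gtag/js" in html  # the gtag.js loader script
has_measurement_id = MEASUREMENT_ID in html               # your property's ID

if has_gtag_script and has_measurement_id:
    print("Tracking snippet found.")
else:
    print("Tracking snippet missing or incomplete -- analytics may be under-reporting.")
```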
5. Mistakes in Your Robots.txt
A robots.txt file is a text file that instructs web bots on how to crawl your site. These bots are, in many cases, search engine crawlers scanning the pages on your site for indexing purposes.
It’s possible to block web robots from crawling certain pages with the robots.txt file. Often, web developers or webmasters do this to keep bots away from pages that aren’t live or that they just don’t want appearing in a search.
Mistakes in a robots.txt file can cause crawlers to skip over important pages. As a result, those pages may drop out of the index or appear in search results without a description, and either outcome can seriously reduce the traffic a page receives.
If you’ve noticed a major drop-off in traffic — or you’ve recently launched a page and it doesn’t seem to be receiving sessions from web searches — you may want to quickly check your robots.txt to see which pages crawlers may be skipping over.
You can also quickly test your site’s robots.txt using tools like Google’s robots.txt Tester utility.
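If you’d rather check programmatically, Python’s standard library includes a robots.txt parser. This sketch tests whether a few important pages are crawlable; the site URL and page paths are placeholders:

```python
# Minimal sketch: check which of your important pages a robots.txt file blocks.
# The site URL and page paths are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"
IMPORTANT_PAGES = ["/", "/blog/", "/products/new-launch/"]

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for path in IMPORTANT_PAGES:
    url = SITE + path
    # can_fetch() reports whether the given user agent is allowed to crawl the URL.
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```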
Fix Dwindling Site Traffic
Falling web traffic can be alarming — but sometimes all you’ll need is a simple tweak to get back on track.
Adjusting your site for updates to the algorithm or patching up errors in your code, for example, can make a big difference in traffic.
In some cases, you may need to do a little more to grow your web traffic. That might mean getting in touch with websites that mention your brand to recover a handful of lost links, or upgrading your site to keep pace with recent changes your competitors have made and make it more appealing than their web presence.
Author Bio: Eleanor Hecks
Eleanor Hecks is editor-in-chief at Designerly Magazine. She was the creative director at a prominent digital marketing agency prior to becoming a full-time freelance designer. Eleanor lives in Philadelphia with her husband and pup, Bear.