How HTTP status codes, and network and DNS errors affect Google Search

This page describes how different HTTP status codes, network errors, and DNS errors affect Google Search. We cover the top 20 status codes that Googlebot encountered on the web, and the most prominent network and DNS errors. More exotic status codes, such as 418 (I'm a teapot), aren't covered. All issues mentioned on this page generate a corresponding error or warning in Search Console's Page Indexing report.
HTTP status codes are generated by the server that's hosting the site when it responds to a request made by a client, for example a browser or a crawler. Every HTTP status code has a different meaning, but often the outcome of the request is the same. For example, several status codes signal redirection, and the outcome of the request is the same for each: a redirect.
Search Console generates error messages for status codes in the 4xx–5xx range, and for failed redirections (3xx). If the server responded with a 2xx status code, the content received in the response may be considered for indexing. However, a 2xx (success) status code doesn't guarantee indexing.
The following table contains the HTTP status codes most often encountered by Googlebot, and an explanation of how Google handles each status code.
HTTP status codes

2xx (success)
Google considers the content for indexing. If the content suggests an error, for example an empty page or an error message, Search Console will show a soft 404 error.
200 (success)
Google passes on the content to the indexing pipeline. The indexing systems may index the content, but that's not guaranteed.
201 (created)
202 (accepted)
Googlebot waits for the content for a limited time, then passes on whatever it received to the indexing pipeline. The timeout is user agent dependent, for example Googlebot Smartphone may have a different timeout than Googlebot Image.
204 (no content)
Googlebot signals the indexing pipeline that it received no content. Search Console may show a soft 404 error in the site's Page Indexing report.
3xx (redirection)
Googlebot follows up to 10 redirect hops. If the crawler doesn't receive content within 10 hops, Search Console will show a redirect error in the site's Page Indexing report. The number of hops Googlebot follows is user agent dependent; for example, Googlebot Smartphone may have a different value than Googlebot Image.
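The hop-capped redirect following described above can be sketched as follows. This is an illustrative model only, not Googlebot's actual implementation; `fetch` is a hypothetical stand-in for an HTTP client that returns a status code and a Location target.

```python
MAX_HOPS = 10  # the cap described above; per-crawler values may differ

def resolve_redirects(url, fetch, max_hops=MAX_HOPS):
    """Return the final URL, or None if the chain exceeds max_hops
    (the case Search Console reports as a redirect error).

    `fetch(url)` is assumed to return (status_code, location_or_None).
    """
    for _ in range(max_hops):
        status, location = fetch(url)
        if status // 100 != 3 or location is None:
            return url      # non-redirect response: content received here
        url = location      # follow the hop; content from this URL is ignored
    return None             # too many hops: reported as a redirect error
```

For example, a three-hop chain resolves to its final URL, while a URL that redirects to itself never yields content and is treated as a redirect error.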
Any content Googlebot received from the redirecting URL is ignored, and the final target URL's content is considered for indexing. For robots.txt files, learn how Google handles a robots.txt that returns a 3xx status code.
301 (moved permanently)
Googlebot follows the redirect, and the indexing pipeline uses the redirect as a strong signal that the redirect target should be canonical.
302 (found)
Googlebot follows the redirect, and the indexing pipeline uses the redirect as a weak signal that the redirect target should be canonical.
303 (see other)
304 (not modified)
Googlebot signals the indexing pipeline that the content is the same as last time it was crawled. The indexing pipeline may recalculate signals for the URL, but otherwise the status code has no effect on indexing.
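A server can support this by answering conditional requests. The sketch below shows the decision that lets a recrawl receive a cheap 304 instead of the full body; the handler shape and ETag value are illustrative assumptions, not a prescribed API.

```python
def conditional_response(request_headers, etag, body):
    """Return (status, headers, body) for a GET request.

    Answers 304 (not modified) when the client's cached copy, identified
    via If-None-Match, still matches the current ETag.
    """
    if request_headers.get("If-None-Match") == etag:
        return 304, {"ETag": etag}, b""   # empty body: content unchanged
    return 200, {"ETag": etag}, body      # full response otherwise
```

A client that sends the current ETag gets a body-less 304; any other request gets the full 200 response.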
307 (temporary redirect)
Equivalent to 302.
308 (moved permanently)
Equivalent to 301.
While Google Search treats these status codes the same way, keep in mind that they're semantically different. Use the status code that's appropriate for the redirect so other clients (for example, e-readers, other search engines) may benefit from it.

4xx (client errors)
Google's indexing pipeline doesn't consider URLs that return a 4xx status code for indexing, and URLs that are already indexed and return a 4xx status code are removed from the index. Any content Googlebot received from URLs that return a 4xx status code is ignored.
400 (bad request)
All 4xx errors, except 429, are treated the same: Googlebot signals the indexing pipeline that the content doesn't exist. The indexing pipeline removes the URL from the index if it was previously indexed. Newly encountered 404 pages aren't processed. The crawling frequency gradually decreases.
Don't use 401 and 403 status codes for limiting the crawl rate. The 4xx status codes, except 429, have no effect on crawl rate. Learn how to limit your crawl rate.
401 (unauthorized)
403 (forbidden)
403 (forbidden)
404 (not found)
410 (gone)
411 (length required)
429 (too many requests)
Googlebot treats the 429 status code as a signal that the server is overloaded, and it's considered a server error.
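Because 429 signals overload rather than a missing page, it's the appropriate response when a client exceeds a request budget. The sketch below is a minimal fixed-window rate limiter that answers 429 with a Retry-After header; the window size, limit, and handler shape are arbitrary illustrative choices.

```python
import time

class RateLimiter:
    """Fixed-window rate limiter: allow `limit` requests per `window` seconds."""

    def __init__(self, limit=100, window=60):
        self.limit, self.window = limit, window
        self.counts = {}  # client -> (window_start, request_count)

    def check(self, client, now=None):
        """Return (status, headers): 200 if allowed, 429 if throttled."""
        now = time.time() if now is None else now
        start, count = self.counts.get(client, (now, 0))
        if now - start >= self.window:
            start, count = now, 0           # window expired: reset the budget
        count += 1
        self.counts[client] = (start, count)
        if count > self.limit:
            retry = int(start + self.window - now) + 1
            return 429, {"Retry-After": str(retry)}  # hint when to come back
        return 200, {}
```

Once the window expires, the client's budget resets and requests succeed again.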
5xx (server errors)
5xx and 429 server errors prompt Google's crawlers to temporarily slow down crawling. Already indexed URLs are preserved in the index, but are eventually dropped if the errors persist. Any content Googlebot received from URLs that return a 5xx status code is ignored. For robots.txt files, learn how Google handles a robots.txt that returns a 5xx status code.
500 (internal server error)
Googlebot decreases the crawl rate for the site. The decrease in crawl rate is proportionate to the number of individual URLs that are returning a server error. Google's indexing pipeline removes from the index URLs that persistently return a server error.
502 (bad gateway)
503 (service unavailable)
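During planned maintenance, answering 503 with a Retry-After header tells crawlers the outage is temporary, instead of letting them index an error page. A minimal sketch, assuming a hypothetical maintenance flag and a 30-minute hint:

```python
def maintenance_response(in_maintenance, retry_after_seconds=1800):
    """Return (status, headers) for an incoming request.

    503 marks the outage as temporary; Retry-After hints when to retry.
    Note that URLs returning 503 persistently are eventually dropped,
    so don't leave maintenance mode on longer than necessary.
    """
    if in_maintenance:
        return 503, {"Retry-After": str(retry_after_seconds)}
    return 200, {}
```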
soft 404 errors

A soft 404 error occurs when a URL returns a page telling the user that the page doesn't exist, along with a 200 (success) status code. In some cases, it might be a page with no main content, or an empty page.
Such pages may be generated for various reasons by your website's web server, your content management system, or the user's browser.
It's a bad user experience to return a 200 (success) status code but then display or suggest an error message or some kind of error on the page. Users may think the page is a live, working page, but then are presented with some kind of error. Such pages are excluded from Search. When Google's algorithms detect that the page is actually an error page based on its content, Search Console will show a soft 404 error in the site's Page Indexing report.
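The anti-pattern and its fix can be contrasted in a small sketch. The `PAGES` store and handler shape are hypothetical; the point is only the status code returned for a missing page.

```python
PAGES = {"/about": b"<h1>About us</h1>"}  # hypothetical page store

def soft_404_handler(path):
    """Anti-pattern: an error body with a 200 status code."""
    body = PAGES.get(path, b"<h1>Page not found</h1>")
    return 200, body   # crawlers see "success" even for a missing page

def correct_handler(path):
    """Fix: missing pages get a real 404 status code."""
    if path in PAGES:
        return 200, PAGES[path]
    return 404, b"<h1>Page not found</h1>"
```

With the first handler, a request for a missing URL looks successful to a crawler; with the second, the 404 status code matches the error content.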
Fix soft 404 errors

Depending on the state of the page and the desired outcome, you can solve soft 404 errors in multiple ways:
Try to determine which solution would be the best for your users.
The page and content are no longer available

If you removed the page and there's no replacement page on your site with similar content, return a 404 (not found) or 410 (gone) response (status) code for the page. These status codes indicate to search engines that the page doesn't exist and the content should not be indexed.
If you have access to your server's configuration files, you can make these error pages useful to users by customizing them. A good custom 404 page helps people find the information they're looking for, and also provides other helpful content that encourages people to explore your site further. Here are some tips for designing a useful custom 404 page:

Make sure your custom 404 page has the same look and feel (including navigation) as the rest of your site.
Custom 404 pages are created solely for users. Since these pages are useless from a search engine's perspective, make sure the server returns a 404 HTTP status code to prevent having the pages indexed.
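The key point, serving a helpful custom page while still returning the 404 status code, can be sketched with Python's stdlib http.server. The paths and HTML here are illustrative placeholders.

```python
from http.server import BaseHTTPRequestHandler

# Illustrative custom 404 body: helpful links, same navigation as the site.
CUSTOM_404 = (b"<html><body><h1>Page not found</h1>"
              b'<p>Try our <a href="/">home page</a> or '
              b'<a href="/search">search</a>.</p></body></html>')

class Handler(BaseHTTPRequestHandler):
    PAGES = {"/": b"<h1>Home</h1>"}  # hypothetical site content

    def do_GET(self):
        body = self.PAGES.get(self.path)
        if body is None:
            self.send_response(404)   # real 404 code, user-friendly body
            body = CUSTOM_404
        else:
            self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet
```

Because the status code is 404, search engines won't index the custom page, while users still get the helpful content.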
If your page has moved or has a clear replacement on your site, return a 301 (permanent redirect) to redirect the user. This will not interrupt their browsing experience, and it's also a great way to tell search engines about the new location of the page. Use the URL Inspection tool to verify whether your URL is actually returning the correct code.
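A sketch of that redirect logic, assuming a hypothetical table of moved paths:

```python
MOVED = {"/old-pricing": "/pricing"}  # hypothetical moved-URL mapping

def redirect_response(path):
    """Return (status, headers): 301 for moved paths, 200 otherwise."""
    target = MOVED.get(path)
    if target is not None:
        # 301 is a strong signal that the target should be canonical.
        return 301, {"Location": target}
    return 200, {}
```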
If an otherwise good page was flagged with a soft 404 error, it's likely that it didn't load properly for Googlebot, it was missing critical resources, or it displayed a prominent error message during rendering. Use the URL Inspection tool to examine the rendered content and the returned HTTP code. If the rendered page is blank, nearly blank, or the content has an error message, it could be that your page references many resources that can't be loaded (images, scripts, and other non-textual elements), which can be interpreted as a soft 404. Reasons that resources can't be loaded include blocked resources (blocked by robots.txt), having too many resources on a page, various server errors, or slow loading or very large resources.
Network and DNS errors have quick, negative effects on a URL's presence in Google Search. Googlebot treats network timeouts, connection resets, and DNS errors similarly to 5xx server errors. In case of network errors, crawling immediately starts slowing down, as a network error is a sign that the server may not be able to handle the serving load. Since Googlebot couldn't reach the server hosting the site, Google also hasn't received any content from the server. The lack of content means that Google can't index the crawled URLs, and already indexed URLs that are unreachable will be removed from Google's index within days. Search Console may generate an error for each respective issue.
These errors happen before Google starts crawling a URL or while Google is crawling it. Because the errors may occur before the server can respond, there's no status code that can hint at issues, so diagnosing these errors can be more challenging. To debug timeout and connection reset errors:
The error may be in any server component that handles network traffic. For example, overloaded network interfaces may drop packets, leading to timeouts (inability to establish a connection) and reset connections (an RST packet sent because a port was mistakenly closed).
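A quick probe can help tell these failure modes apart from your own machine. This is a debugging sketch, not a diagnosis tool; the host, port, and timeout are placeholders to adjust for your server.

```python
import socket

def probe(host, port=443, timeout=5):
    """Attempt a TCP connection and classify the outcome.

    Returns 'ok', 'timeout', 'reset', or 'error: <ExceptionName>'.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "ok"
    except socket.timeout:
        return "timeout"    # e.g. dropped packets, overloaded interface
    except ConnectionResetError:
        return "reset"      # e.g. RST from a mistakenly closed port
    except OSError as exc:  # refused connections, DNS failures, etc.
        return f"error: {exc.__class__.__name__}"
```

Run the probe both from inside your network and from outside it; a connection that succeeds locally but times out externally points at a firewall or load-balancer rule rather than the server itself.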
DNS errors are most commonly caused by misconfiguration, but they may also be caused by a firewall rule that's blocking Googlebot's DNS queries. To debug DNS errors, do the following:

Check your firewall rules and make sure both UDP and TCP requests are allowed.
Check that your A and CNAME records are pointing to the right IP addresses and hostname, respectively. For example:
dig +nocmd example.com a +noall +answer
dig +nocmd www.example.com cname +noall +answer
dig +nocmd example.com ns +noall +answer
example.com. 86400 IN NS a.iana-servers.net.
example.com. 86400 IN NS b.iana-servers.net.
dig +nocmd @a.iana-servers.net example.com +noall +answer
example.com. 86400 IN A 93.184.216.34
dig +nocmd @b.iana-servers.net example.com +noall +answer
...
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-02-04 UTC.