Let us take you on a fantastic journey through Google Webmaster Tools that will help both beginner and intermediate SEOs on their next site audit: discover problems, spot trends, find opportunities, and uncover search penalties applied to your site. A huge shout-out and thank-you to the very talented Gyi Tsakalakis, SEO Director at EPL Digital & AttorneySync, for all of his insights!
What You’re Looking For:
- Messages – unnatural link messages, Googlebot can't access your site, etc.
- Crawl Errors – Do you have a large number of server errors? Not-found errors?
- Search Queries – How are we trending?
- Sitemaps – URLs submitted vs Indexed? Close? Way off? Major change? Warnings?
Questions to ask:
- Anything “jumping off the page?”
- Which “version” of the domain is in the GWT account (i.e. root? www? sub-domain?).
What You’re Looking For:
- Are there any messages or alerts?
- Were there messages that were deleted? Alerts? Who are the users/owners?
- Site messages can reveal a lot about what's been going on. Assuming, of course, they haven't been deleted.
Major messages to watch for:
- Googlebot can’t access your site
- Disavowed links updated
- No manual spam action found (indicates a reconsideration request was submitted).
- Increase in not found errors
- Possible outages
- Your site was hacked
Any structured data being reported? If so, what kind, and is it appropriate? Any errors?
A couple I usually check:
- Person – authorship? Are the pages good candidates for authorship, or is it just "sitewide" authorship spam?
- Publisher mark up?
- Breadcrumbs – implemented correctly? broken links?
- Reviews – make sure not review spam, marked up correctly, no errors.
- Local Address
Google is expanding the elements you can mark up; they recently added local business and articles. If you aren't getting the crawls and indexation you want on your eCommerce site, go in and mark up your product pages so Google knows where your data is located on the page.
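To make that concrete, here's a hedged sketch of schema.org Product markup in JSON-LD; every name, SKU, and price below is a placeholder, and you should verify the exact properties against schema.org before shipping:

```html
<!-- Illustrative schema.org Product markup (JSON-LD); all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "A placeholder product used to illustrate the markup.",
  "sku": "EX-123",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Once the markup is live, the structured data report should start showing the new item types within a few crawls.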
Big area to check! Check for:
- Duplicate / Missing / Long / Short meta descriptions
- Duplicate / Missing / Long / Short title tags
- Non-indexable content – Intentional? If not, why?
Anything demoted? Why? You should very rarely use this feature; use it only when you're getting a sitelink for a stale or off-topic page.
Another very informative GWT report!
How is it trending? Is GWT linked to GA? Can we get a bigger time sample from GA?
Make sure you understand Filters (i.e. what impression/click/pos data you’re looking at). Is your site intended for visitors in a certain location? Looking for trends here (not specific target numbers).
Use with change tab + filters to see relevant trends.
In my opinion, one of the best uses for GWT is identifying pages with SERP CTR problems and coming up with solutions to improve SERP CTR. Dan Shure has some good info on this if you’re looking for more in-depth information.
Pages with a lot of impressions, decent positions, and low click-through rates? Why? Thin content? Are the titles/meta descriptions compelling? Should the page continue to exist? Does it need an update?
TIP: Mine keyword data at the page level. GWT will show Impressions and clicks by page. VERY USEFUL for developing content strategy/themes.
(Wish they would add avg pos at this level of detail)
Also, good place to look for pages that “shouldn’t be there” (i.e. hacked site, params, etc).
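The CTR triage above can be sketched in a few lines, assuming you've exported the page-level report to rows with impressions, clicks, and average position (the field names and thresholds below are my own assumptions, not GWT's export format):

```python
def low_ctr_pages(rows, min_impressions=1000, max_position=10.0, max_ctr=0.02):
    """Flag pages with many impressions and decent average position but poor CTR.

    Each row is a dict with 'page', 'impressions', 'clicks', and 'position'
    keys (illustrative names, not GWT's actual export headers).
    """
    flagged = []
    for row in rows:
        impressions = row["impressions"]
        if impressions == 0:
            continue
        ctr = row["clicks"] / impressions
        if (impressions >= min_impressions
                and row["position"] <= max_position
                and ctr <= max_ctr):
            flagged.append((row["page"], round(ctr, 4)))
    # Worst CTR first: these titles/descriptions deserve attention soonest.
    return sorted(flagged, key=lambda item: item[1])
```

Tune the thresholds to your site's traffic; the point is to surface pages where a better title or meta description could win clicks you're already ranking for.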
Links to Your Site
Another “valuable data here” alert.
Who links the most
You can find out more about what's "going on" with a site here than in most other places. Important to keep in mind that it's not a comprehensive list (you need to cross-reference with Moz, Ahrefs, and Majestic).
Gives a really good sense of penalty risk. Also useful for checking for negative SEO attacks (yes, it’s a thing).
Your most linked content
Another great section to view internal link structure, but also how people are linking. Are the majority of your links going to your home page? Deep pages? Blog posts? Helps answer the who, what, why, and how of your pages' link-earning power.
How your data is linked
General insight to what Google thinks your pages are about.
Helpful for understanding overall site organization. I prefer Screaming Frog for this analysis.
Again, helpful to see your site how Google sees it.
ALERT: This is an important one; if there's anything in here, it needs to be addressed promptly.
Hopefully, it always says: No manual webspam actions found.
If you do get a manual webspam action, make sure you understand which type and what Google recommends in terms of addressing the problem.
If you work with someone to remedy manual webspam action, make sure they have experience getting these issues resolved. If their first recommendation is “disavow all the links,” they don’t know what they’re talking about.
What is indexed? Anything indexed that shouldn’t be? Anything blocked that we want indexed?
As we add new pages, is Google adding them to the index? Is our indexed page count flat-lining? Going down?
Another one of those, “how does Google understand my site,” tools. Also good for mining new keyword variants that you might not have thought of.
I usually use other means (robots.txt, meta robots tags, etc.) for URL removal. One exception: site hacks. It's a good way to block entire hacked directories quickly.
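For reference, the meta robots approach mentioned above is a single tag in the page's head:

```html
<!-- Keeps this page out of the index while still allowing crawlers to follow its links -->
<meta name="robots" content="noindex, follow">
```

Unlike the URL removal tool, this is a durable, per-page signal rather than a temporary takedown.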
Essential to identifying site problems. Especially when checking for:
- Server errors
- Missing pages
- Stuff getting crawled that shouldn’t be.
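A minimal sketch of triaging a crawl-errors export along those lines; the (url, status) tuple format and bucket labels are my own assumptions, not GWT's export format:

```python
from collections import Counter

def summarize_crawl_errors(errors):
    """Bucket crawl-error rows by type so the big problems stand out.

    Each row is a (url, status_code) pair; the category labels are illustrative.
    """
    buckets = Counter()
    for url, status in errors:
        if 500 <= status <= 599:
            buckets["server error"] += 1
        elif status in (404, 410):
            buckets["missing page"] += 1
        else:
            buckets["other"] += 1
    return dict(buckets)
```

A large "server error" bucket points at infrastructure; a large "missing page" bucket usually means redirects are overdue.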
Make sure you fix as many of these errors as possible; be sure to use our Htaccess wizard to build a set of 301 redirects.
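If you're writing redirects by hand instead, a 301 in Apache's .htaccess looks roughly like this (all paths are placeholders; test on a staging copy before deploying):

```apache
# Redirect an individual retired URL to its replacement (301 = moved permanently)
Redirect 301 /old-page.html /new-page.html

# Or redirect an entire retired directory with mod_rewrite
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

Always redirect to the closest relevant page, not blindly to the home page.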
General “Google site EKG.” Looking for major increases or decreases in any of:
- Pages crawled per day
- Kilobytes downloaded per day
- Time spent downloading a page
Fluctuations here aren't necessarily dispositive of a major issue, but Google's crawlers are telling you "something happened" when there are big swings in these metrics.
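The "big swings" check can be automated against a daily series of any of these metrics; a minimal sketch (the 50% threshold is an assumption, tune it to your site):

```python
def big_swings(series, threshold=0.5):
    """Return (day_index, pct_change) for day-over-day changes above the threshold.

    `series` is a list of daily values, e.g. pages crawled per day.
    """
    swings = []
    for i in range(1, len(series)):
        prev = series[i - 1]
        if prev == 0:  # avoid division by zero on dead days
            continue
        change = (series[i] - prev) / prev
        if abs(change) >= threshold:
            swings.append((i, round(change, 2)))
    return swings
```

Anything flagged here is worth lining up against your deploy log: a theme change, a robots.txt edit, or an outage often explains the swing.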
Fetch as Google
This is really how Google "sees" your pages. Check for server header errors and code anomalies (i.e. whether the code in the fetch matches the code you intend to show).
It can also be used to notify Google of major site changes (theme changes, major section moves/updates, etc.), though it shouldn't be used regularly for this purpose.
Robots.txt analysis, the root of so many major site problems.
If you see a robots.txt that blocks crawlers from your entire site, well, there's your SEO problem…
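For reference, the classic culprit is a two-line robots.txt that tells every compliant crawler to stay out of the whole site:

```text
User-agent: *
Disallow: /
```

This often ships by accident when a staging-site robots.txt gets pushed to production.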
Also useful for keeping out your site’s trash (stuff that shouldn’t be crawled/indexed).
Some people have dismissed sitemaps as less useful these days. I still think they're very helpful in auditing site performance, especially when trying to determine what should be in or out of the index. Some webmasters will "kill pages" but fail to remove them from the sitemap, which can confuse the crawler.
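For reference, a minimal XML sitemap entry looks like this (the URL and date are placeholders); when you retire a page, the matching url block should be removed too:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page.html</loc>
    <lastmod>2014-01-01</lastmod>
  </url>
</urlset>
```

Comparing this file's URL count against GWT's "submitted vs. indexed" numbers is where the audit signal comes from.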
I tend to tell people to stay out of here unless they're an advanced SEO or working with a large e-commerce site.
I’ve seen more sites “hacked” recently. This is where you can identify whether Google has found a hack. However, I wouldn’t rely on this section as your “hacked/not hacked” gauge. By the time Google finds the hack, it might be too late to head it off at the pass.
Additional Webmaster Tools Resources:
Glimpse into possible future of GWT: http://www.mattcutts.com/blog/webmaster-feature-requests/