A Guide to Auditing Your Google Webmaster Tools


Let us take you on a fantastic journey through Google’s Webmaster Tools that will help both beginner and intermediate SEOs on their next site audit: discovering problems, spotting trends, finding opportunities, and uncovering search penalties applied to your site. A huge shout-out and thank you to the very talented Gyi Tsakalakis, SEO Director at EPL Digital & AttorneySync, for all of his insights!

Site Dashboard

Start at the GWMT Dashboard for a snapshot

What You’re Looking For:

  • Messages – unnatural link warnings, “Googlebot can’t access your site,” etc.
  • Crawl Errors – Do you have a large number of server errors? Not-found (404) errors?
  • Search Queries – How are we trending?
  • Sitemaps – URLs submitted vs. indexed: close? Way off? A major change? Warnings?

Questions to ask:

  • Anything “jumping off the page?”
  • Which “version” of the domain is in the GWT account (i.e. root? www? sub-domain?).

Site Messages

Check your messages in GWMT!

What You’re Looking For:

  • Are there any messages or alerts?
  • Were there messages that were deleted? Alerts? Who are the users/owners?
  • Site messages can reveal a lot about what’s been going on. That is, of course, if they haven’t been deleted.

Major messages to watch for:

  • Googlebot can’t access your site
  • Disavowed links updated
  • No manual spam action found (indicates a reconsideration request was submitted).
  • Increase in not found errors
  • Possible outages
  • Your site was hacked

Search Appearance

Structured Data

Any structured data being reported? If so, what kind, and is it appropriate? Any errors?

A couple I usually check:

  • Person – authorship? Are pages good candidates for authorship, or is it just “sitewide” authorship spam?
  • Publisher markup?
  • Breadcrumbs – implemented correctly? Broken links?
  • Reviews – make sure they aren’t review spam, are marked up correctly (see the example below), and show no errors.
  • VideoObject
  • Local Address
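
If you’re not sure what clean markup looks like, here’s a minimal sketch of a schema.org review in JSON-LD to compare against (the business, reviewer, and rating values are made up for illustration):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Review",
  "itemReviewed": {
    "@type": "LocalBusiness",
    "name": "Example Law Firm",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main St",
      "addressLocality": "Chicago",
      "addressRegion": "IL"
    }
  },
  "author": { "@type": "Person", "name": "Jane Doe" },
  "reviewRating": { "@type": "Rating", "ratingValue": "5", "bestRating": "5" }
}
</script>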

Data Highlighter

Use the Data Highlighter in Webmaster Tools

Google is expanding the elements you can mark up; they recently added local businesses and articles. If you aren’t getting the crawling and indexation you want on your eCommerce site, go in and mark up your product pages to make sure Google knows where your data is located on the page.
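
If you’d rather add the markup directly than rely on the Data Highlighter, a minimal product snippet in JSON-LD looks roughly like this (the product name and price are placeholders):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>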

HTML Improvements

Big area to check! Look for the following (there’s an example of clean tags after the list):

  • Duplicate / missing / long / short meta descriptions
  • Duplicate / missing / long / short title tags
  • Non-indexable content – intentional? If not, why?
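
As a rough guide, keep titles around 50–60 characters and meta descriptions around 150–160 so they don’t get truncated in the SERPs. Here’s an example of clean tags (the firm name and phone number are placeholders):

<title>Chicago Personal Injury Lawyer | Example Law Firm</title>
<meta name="description" content="Example Law Firm represents injury victims throughout Chicago. Free consultations – call (555) 555-0100 to speak with a lawyer.">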

Sitelinks

Anything demoted? Why? You should very rarely use this feature – only if you’re getting a sitelink for a stale or off-topic page.

Search Traffic

Search Queries

Another very informative GWT report!

Top Queries

How is it trending? Is GWT linked to GA? Can we get a bigger time sample from GA?

Make sure you understand filters (i.e. what impression/click/position data you’re looking at). Is your site intended for visitors in a certain location? You’re looking for trends here, not specific target numbers.

Use the “With change” view plus filters to see relevant trends.

In my opinion, one of the best uses for GWT is identifying pages with SERP CTR problems and coming up with solutions to improve SERP CTR. Dan Shure has some good info on this if you’re looking for more in-depth information.

Top Pages

Are there pages with a lot of impressions, decent positions, but low click-through rates? Why? Are they thin? Are the titles/meta descriptions compelling? Should the page continue to exist? Does it need an update?

TIP: Mine keyword data at the page level. GWT will show Impressions and clicks by page. VERY USEFUL for developing content strategy/themes.

(Wish they would add avg pos at this level of detail)

Also, this is a good place to look for pages that “shouldn’t be there” (i.e. hacked pages, parameter URLs, etc.).

Links to Your Site

Another “valuable data here” alert.

Who links the most

You can find out more about what’s “going on” with a site here than in most other places. It’s important to keep in mind that it’s not a comprehensive list (you need to cross-reference with Moz, Ahrefs, and Majestic).

Gives a really good sense of penalty risk. Also useful for checking for negative SEO attacks (yes, it’s a thing).

Your most linked content

Another great section – it shows not only which of your pages attract the most links, but also how people are linking. Are the majority of your links going to your home page? Deep pages? Blog posts? It helps answer the who, what, why, and how of your pages’ link-earning power.

How your data is linked

General insight into what Google thinks your pages are about.

Internal Links

Helpful for understanding overall site organization. I prefer Screaming Frog for this analysis.

Again, helpful to see your site how Google sees it.

Manual Actions

ALERT: This is an important one – if there’s anything in here, it needs to be addressed promptly.

Hopefully, it always says: No manual webspam actions found.

If you do get a manual webspam action, make sure you understand which type and what Google recommends in terms of addressing the problem.

If you work with someone to remedy manual webspam action, make sure they have experience getting these issues resolved. If their first recommendation is “disavow all the links,” they don’t know what they’re talking about.

Google Index

Index Status

What is indexed? Anything indexed that shouldn’t be? Anything blocked that we want indexed?

As we add new pages, is Google adding them to the index? Has our indexed page count flat-lined? Is it going down?
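
A quick, rough cross-check on index status is a site: search in Google itself (example.com stands in for your own domain):

site:example.com            (rough count of pages Google has indexed)
site:example.com/blog/      (indexed pages within a single directory)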

Content Keywords

Another one of those “how does Google understand my site” tools. It’s also good for mining new keyword variants that you might not have thought of.

Remove URLs

I usually use other means (robots.txt, meta robots tags, etc.) for URL removal. One exception: site hacks – it’s a good way to block entire hacked directories quickly.
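
For routine cases, those “other means” look something like this – a meta robots tag on a page you want out of the index, or a robots.txt rule for a directory you don’t want crawled (the /private/ path is just an example):

<meta name="robots" content="noindex, follow">

User-agent: *
Disallow: /private/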


Crawl

Crawl Errors

Essential for identifying site problems, especially when checking for:

  • Server errors
  • Missing pages
  • Stuff getting crawled that shouldn’t be.

Make sure you fix as many of these errors as possible; be sure to use our Htaccess wizard to build a set of 301 redirects.
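
If you’re writing the redirects by hand on an Apache server, the .htaccess rules look roughly like this (the old and new paths are hypothetical):

# Send a removed page to its closest replacement
Redirect 301 /old-page/ http://www.example.com/new-page/

# Or redirect an entire renamed directory with mod_rewrite
RewriteEngine On
RewriteRule ^old-directory/(.*)$ /new-directory/$1 [R=301,L]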

Crawl Stats

General “Google site EKG.” Looking for major increases or decreases in any of:

  • Pages crawled per day
  • Kilobytes downloaded per day
  • Time spent downloading a page

Fluctuations here aren’t necessarily dispositive of a major issue, but Google’s crawlers are telling you “something happened” when there are big swings in these metrics.

Fetch as Google

This is really how Google “sees” your pages. Check for server header errors and code anomalies (i.e. whether the code in the fetch matches the code you intend to show).
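
Fetch as Google is the authoritative view, but a quick supplementary check of the server headers is to request the page yourself with a Googlebot user agent via curl (the URL is a placeholder):

curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://www.example.com/some-page/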

It can also be used to notify Google of major site changes – theme changes, major section moves/updates, etc. – though it shouldn’t be used regularly for this purpose.

Blocked URLs

Robots.txt analysis, the root of so many major site problems.

If you see:

User-agent: *
Disallow: /

Well, there’s your SEO problem…

Also useful for keeping out your site’s trash (stuff that shouldn’t be crawled/indexed).
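
A more surgical robots.txt keeps the junk out without blocking the whole site – something along these lines (the paths and parameter are hypothetical):

User-agent: *
Disallow: /cgi-bin/
Disallow: /search/
Disallow: /*?sessionid=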

Sitemaps

Some people have dismissed sitemaps as less useful these days. I still think they’re very helpful in auditing site performance, especially when trying to determine what should be in or out of the index. Some webmasters will “kill” pages but fail to remove them from the sitemap, which can confuse crawlers.
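
A quick look inside the XML file itself often turns these up – every <loc> should be a live, canonical, indexable URL (the URL and date below are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/practice-areas/</loc>
    <lastmod>2014-01-15</lastmod>
  </url>
</urlset>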

URL Parameters

I tend to tell people to stay out of here unless they’re an advanced SEO or working with a large e-commerce site.

Security Issues

I’ve seen more sites “hacked” recently. This is where you can identify whether Google has found a hack. However, I wouldn’t rely on this section as your “hacked/not hacked” gauge. By the time Google finds the hack, it might be too late to head it off at the pass.

Additional Webmaster Tools Resources:

A glimpse into the possible future of GWT: http://www.mattcutts.com/blog/webmaster-feature-requests/

