Google’s Webmaster tools have become invaluable, but the more they add, the more opportunity there is for improvement.
At Danny Sullivan’s upcoming SMX show in Seattle, the intimate setting should offer ample opportunity to talk with some of the people that you’ve been reading about for years now. Googlers like Matt Cutts, Vanessa Fox, and others will also be on hand, soliciting feedback from attendees, as well as providing information.
Here’s what I plan to mention so far, but feel free to chime in with more…
- When viewing your “Webmaster dashboard” (the list of all your domains), the ones that contain errors should be easy to identify. Currently the only way to view errors is to log in and out of each domain, which is time-consuming and unnecessary. I seem to remember that errors used to show here a long time ago. Am I wrong, or were they actually removed?
- When viewing your 404 (not found) errors, this statement is just not helpful: “Googlebot found these pages either in your Sitemap or by following links from other pages during a discovery crawl”. Google, why in the world would you not tell us which one it is, and, if it is a link, tell us where the bad link is located?
- No single addition has been more helpful to me than the new links tab. In the external links view, clicking on the number of external links shows you every page that is linking to your site. Obviously, the usefulness of this tool would be vastly improved if it also showed the link text.
- When viewing Statistics – Query stats, the column on the left, “Top search queries”, is great for quickly identifying phrases you may not even know your site is ranking for. The ability to download the table is very useful too. Raising the limit from its current 20 up to 100 or 1000 would be a big improvement.
- In the same area, the column on the right, “Top search query clicks”, is also very useful, and it’s interesting to note the discrepancies between the two columns. This lets you investigate other possible reasons why no one is clicking on your result (like perhaps a poor description tag?). Increasing this limit to a hundred or a thousand would also be much more practical, and greatly appreciated.
- This one is my pet peeve – since we can upload a Sitemap to tell you about all of our pages, and we can verify ownership through various authentication methods, why can we not efficiently change a domain name without losing PageRank, rankings, inbound links, and trust? Since Google can easily take away all of the existing ranking and trust when a domain is sold or repurchased from a registrar, how hard could it be to keep them intact when a domain name is legally changed? (I mentioned this issue at Pubcon Vegas 2006, and again before Rand Fishkin’s interview with Vanessa Fox. In her interview, she seemed to indicate it may be a future possibility.)
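In the meantime, the usual workaround for a domain move is a site-wide 301 (permanent) redirect, which signals to search engines that the move is intentional. Here’s a rough sketch in Apache `.htaccess` syntax – `oldexample.com` and `newexample.com` are placeholder names, and your server setup may differ:

```apache
# Hypothetical example: send every URL on the old domain to the
# same path on the new domain with a 301 (permanent) status.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldexample\.com$ [NC]
RewriteRule ^(.*)$ http://newexample.com/$1 [R=301,L]
```

Even with this in place, it can take time for rankings to carry over, which is exactly why a verified, official way to declare a domain change in Webmaster Tools would be so welcome.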
Google has long sought input on ways to improve its tools, and has been very responsive to suggestions. Be sure to take the opportunity during Q&A, or any other chance you get, to make your suggestions.
If you’re not attending, feel free to write your ideas here, and I’ll make sure to mention them. Of course, if they’re brilliant ideas, I’ll have to take the credit 😉