Look how old this is!
I post at SearchCommander.com now, and this post was published 15 years 3 months 16 days ago. This industry changes FAST, so blindly following the advice here *may not* be a good idea! If you're at all unsure, feel free to hit me up on Twitter and ask.

At the 2008 SMX Advanced show in Seattle, one catchy phrase I heard stuck in my head: “cache dates are the new PageRank”.

Whoever said it in their presentation was spot on, and subsequent research tells me that Aaron Wall (the guy who literally “wrote the book” on SEO) first wrote about it nearly two full years ago.

Do a search for your chosen key phrase and look at the cache dates of those that outrank you. Are their dates newer than yours? I’ll bet they are in many cases.

In fact, look at the cache date for anything on the front page of Google, and I’ll bet its cache date will surprise you, despite the age of the page itself.
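If you’d rather check a whole list of pages than eyeball results one at a time, here’s a rough Python sketch of the same check. It assumes Google’s cache page still carries its “as it appeared on …” banner text and that you fetch politely; there’s no official API for this, so treat it purely as an illustration.

```python
import re
import urllib.request

def google_cache_date(page_url):
    """Fetch Google's cached copy of a URL and scrape out the snapshot date.

    Assumption: the cache banner still reads '... as it appeared on <date>.'
    Google offers no official API for this, so this is illustrative only.
    """
    cache_url = "https://webcache.googleusercontent.com/search?q=cache:" + page_url
    req = urllib.request.Request(cache_url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req) as resp:
        html_text = resp.read().decode("utf-8", errors="replace")
    match = re.search(r"as it appeared on (.*?)\.", html_text)
    return match.group(1) if match else None

# Hypothetical page to check -- compare its cache date against competitors'.
print(google_cache_date("example.com/some-old-page"))
```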

A recent launch of a new website spurred the research for this article, and we began to see improved results before we had even made it all the way through this process.

Another website, with over 18,000 pages, hasn’t been so lucky yet, but we’re pulling out all the stops listed here to make it happen in a timely manner.

Obviously, the key to fresher cache dates is getting more frequent visits from the spiders, and these are all the ways I can think of to increase that frequency. If you have other suggestions, BELIEVE me I want to hear about them, so don’t you dare leave without adding a comment!

Sitemap.xml
Of course, getting an account at Google Webmaster Tools and submitting an XML site map is one of the most common methods for getting your pages indexed, and for being able to measure how frequently Google crawls them.

While some people disagree entirely, I DO still like to use them for the information I can get from Google Webmaster Tools. I also recommend having multiple site maps if necessary, rather than one giant massive list.

In my experience it’s been far easier to get the majority of URLs indexed with several small site maps (without duplicate URLs) than by zipping up one really large one with thousands of pages.

Also, something I learned just recently is that Google prefers that you keep your direct image URLs completely OUT of your XML site map. Some free site map creators don’t allow you to filter out certain file types, and frankly, they’re just not adequate.
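To make the multiple-site-map idea concrete, here’s a minimal Python sketch that splits a URL list into several small sitemap files, keeps direct image URLs out, and ties everything together with a sitemap index. The chunk size and file names are arbitrary choices of mine, not anything Google requires.

```python
# Sketch: split a big URL list into several small sitemaps plus an index.
# The chunk size, file names, and domain are placeholder assumptions.
IMAGE_EXTENSIONS = (".jpg", ".jpeg", ".gif", ".png")
CHUNK_SIZE = 1000  # well under the sitemap protocol's 50,000-URL per-file limit

def write_sitemaps(urls, base="https://www.example.com"):
    # Keep direct image URLs out of the sitemap entirely.
    pages = [u for u in urls if not u.lower().endswith(IMAGE_EXTENSIONS)]
    index_entries = []
    for i in range(0, len(pages), CHUNK_SIZE):
        name = f"sitemap-{i // CHUNK_SIZE + 1}.xml"
        with open(name, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in pages[i:i + CHUNK_SIZE]:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write("</urlset>\n")
        index_entries.append(f"{base}/{name}")
    # A sitemap index file points Google at each of the smaller maps,
    # so you only have to submit one URL in Webmaster Tools.
    with open("sitemap-index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for loc in index_entries:
            f.write(f"  <sitemap><loc>{loc}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```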

Add New Content Regularly – NOT Just One Big Batch
If this is a brand-new website, then of course you’re going to add most of your content at the beginning, but do hold something back so that you can continue to grow steadily.

The rate of your growth is going to have to be comparable to your industry competition, so there’s no “magic answer” to the question of frequency, but I’ve found that no matter how small the website, adding at least a page a month can make a drastic improvement.

In more competitive industries, you’ll often see multiple authors adding content on a near-daily basis, so how often you should post for your industry is admittedly all over the map.

Each time you add content (assuming you have a good process and site structure), you’re telling the search engines that you’re YET AGAIN adding to your own authority in your chosen field.

The more often you post, the more often they’ll visit, and the faster your content will get ranked. I expect this post to be crawled and indexed within 4 or 5 minutes, and back when I was posting more often, I could usually get a new post indexed in under 60 seconds.

Update Content on Old Pages Dynamically
Simply by placing some code on old pages, you can continuously use your own RSS feeds to refresh the content throughout your site.

There are likely pages on your site where the content hasn’t changed for years, and frankly they don’t get cached or visited all that often, if ever. (Just look at some of your oldest pages buried in the index and view their cache dates!)

Assuming you’re adding content regularly, the chances are good that you’re doing it the right way. For example, millions are using a content management system like WordPress, which will provide them with RSS feeds.

Don’t be lazy and just add the main blog feed site-wide – do more. Take advantage of your own subject-specific RSS feeds to provide laser-targeted fresh content to any page on your site, using the feeds from not only your main blog, but from your categories, and even your individual tags.

Regularly freshening the content of your pages will get you spidered and cached more often. If you don’t know how to add RSS to your pages, you might like my RSS tool that allows you to manipulate the feed and display it exactly as you wish.
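For illustration, here’s a rough sketch of the technique in Python using the feedparser library. A WordPress site would more likely do this with PHP or a plugin (or the tool above), so the feed URL and function name here are purely hypothetical.

```python
# Sketch: render the latest items from one of your own category feeds as an
# HTML fragment, for inclusion on an otherwise static (and stale) page.
import html
import feedparser  # pip install feedparser

def fresh_links_fragment(feed_url, limit=5):
    """Return a small <ul> of recent post links from the given RSS feed."""
    feed = feedparser.parse(feed_url)
    items = []
    for entry in feed.entries[:limit]:
        title = html.escape(entry.title)
        items.append(f'  <li><a href="{entry.link}">{title}</a></li>')
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

# A subject-specific category feed keeps the injected content laser-targeted.
# This URL is a hypothetical WordPress category feed, not a real endpoint.
print(fresh_links_fragment("https://www.example.com/category/lawn-tips/feed/"))
```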

Deep Link to Yourself Intelligently
The easiest place to get a well anchored deep text link to the content of your choice is from someone you have total control over – yourself.

As you add articles and information to your website, don’t miss opportunities to link to other relevant areas of your site.

To be entirely smart about it, do a “site:domain.com key phrase” search at Google, and find pages that ALREADY rank for that phrase in your own little kingdom, and then use those as the targets of your own internal link campaign.

While you’re at it, review the title tags and descriptions of the pages you’re linking to and if necessary, make adjustments, because you probably wrote them a long time ago before you knew nearly as much as you know now.

In WordPress this can be accomplished automatically too, with a “Related Posts” plugin (I really love this one – Yet Another Related Posts Plugin). This will increase the spider activity throughout your site, keeping that old content fresh and in the “mind” of the search engines.

Make Static Sitemaps – Several if Necessary

Sometimes Googlebot is stubborn and simply will NOT index the links you want in your XML site map as fast or as often as you like. This is a good time to go back to basics with a static HTML site map, or even more than one.

By making a site map available, like I’ve recommended in my SEO 101 Top 10 for years now, and ensuring that it’s linked from the footer of your home page, you’re going to be giving the spiders easy access to every single link on your site.

If your site is hundreds or thousands of pages, then you’ll have to add multiple sitemaps, and strategically interlink these static maps so that each has fewer than 100 links. Even though it’s commonly believed that Google can handle 150, I usually prefer to keep them under 100.

For example, let’s say you have 2000 movie theaters across the country and want each page to be indexed. Multiple site maps would be necessary to ensure that all of your theater links get crawled, so you could have one site map showing all the states, and then on each state’s map list its individual theaters.
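Here’s a minimal sketch of that two-level structure in Python: one top-level map of states linking to per-state maps of theaters, each page kept under 100 links. All of the file names and paths are invented for the example.

```python
# Sketch: build a two-level static HTML sitemap (states -> theaters),
# keeping each page under 100 links. Names and paths are illustrative.
def write_html_sitemaps(theaters_by_state):
    # theaters_by_state: {"oregon": [("Portland 10", "/theaters/or/portland-10"), ...]}
    top_links = []
    for state, theaters in sorted(theaters_by_state.items()):
        filename = f"sitemap-{state}.html"
        # A state with more than 99 theaters would need a further split;
        # this sketch just truncates to keep the page under 100 links.
        links = "\n".join(
            f'  <li><a href="{url}">{name}</a></li>' for name, url in theaters[:99]
        )
        with open(filename, "w") as f:
            f.write(f"<html><body><h1>{state.title()} Theaters</h1>\n"
                    f"<ul>\n{links}\n</ul></body></html>\n")
        top_links.append(f'  <li><a href="/{filename}">{state.title()}</a></li>')
    # The top-level map is the one to link from your home page footer.
    with open("sitemap.html", "w") as f:
        f.write("<html><body><h1>Theaters by State</h1>\n<ul>\n"
                + "\n".join(top_links) + "\n</ul></body></html>\n")
```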

Linkvertising – Buy Inbound Links
Getting inbound deep links for free from other frequently crawled sites is obviously better, but it’s probably not going to happen as fast as you need it to when you’re just trying to get indexed.

Sometimes Google needs a push (shove, kick, beating) when it comes to visiting all the pages, and when you find this is the case, you have to do what’s necessary. Often that means buying some links!

Google says it’s not okay to buy links for the purpose of improving your ranking but there’s certainly nothing wrong with buying links to improve your traffic flow both from visitors and spiders. Think of it as advertising.

There are many credible link brokers out there, and oftentimes a jumpstart is all it takes. Running some paid deep links for just a few months is not that expensive, and it can do wonders for increasing your exposure to the spiders.

It is certainly possible to buy nofollowed links, and the search engines will still follow those links for indexing and caching purposes, bringing you visitors who are genuinely interested in what you’ve got.

Here are some recommended link brokers that will provide nofollowed links upon request – Text Link Ads, Linkworth and Link XL. On the other hand, something like Linkvana will not offer you nofollowed links, so be careful about “overdoing it” in those cases.

Submit a Press Release
There’s a blast from the past, right? The fact that you’ve launched a new website or redesigned your old one is certainly newsworthy, so get off your duff and take advantage of this genuine news by submitting a press release at a decent level with PR Leap or PRWeb.

Press releases are still a highly effective way of telling the world and the search engines, “hey, look at me”, and by including DEEP links to new sections or to infrequently cached sections of your website, you’re going to be drawing the attention of not only new viewers and customers, but also of the spiders, to those “forgotten” (or new) areas of your website.

Update Member Profile Pages
If you belong to any member organizations like your local Chamber of Commerce, then you probably have a profile page where you can write a biography, and often you may have included a link to your homepage.

Over the past few years, many of the organizations that you belong to have likely enhanced your ability as a member to add deep links to your website from your profile. These can be hugely valuable, but are often overlooked by business owners.

If you’re a landscaper, and you’re a member of the National Landscapers Society (is there one?), then it makes perfect sense that you would want multiple deep links on your profile page, going to your yard maintenance category for example, and your tree trimming section, your lawn tips area, and so on.

Talk to your employees and business peers, and look through your memberships. What can you improve upon? At the same time you’re taking advantage of these new deep linking opportunities, be sure to examine your original anchor text.

Even if they’ve not allowed you to place more than one link before, you still might be able to improve upon the text you’re using for your inbound link, or even send it to a new section.

Create Something Free
There are hundreds of credible software directories out there with plenty of inbound links themselves, and lots of delicious link juice to pass around to worthy businesses who are offering something of true value for free.

One tactic we’ve used in the past has recently been revived with good success: creating a free product, like a toolbar or a screensaver, and then submitting it to the hundreds of free software directories out there for distribution.

This used to be an incredibly tedious process that we had all but abandoned – that was until Michelle McPherson came out with her 30 minute backlinks process about a year ago.

Another thing that makes this so great is that since the rules are different for every directory, when the links appear, some will use your chosen anchor text, and some won’t. Some will link only to your software PAD file, others will link to your home page, and some will link to your chosen giveaway page.

If you create a good giveaway page for your free software, like sort of a mini site map with plenty of good text links to your most important areas, then this one tactic can bring you a surprising amount of activity not only from spiders but from humans as well.

All of the links and subsequent spider activity created by the process will appear completely natural in nature because they ARE natural, and that’s a good thing! 30 Minute Backlinks just makes the process a lot easier, and it’s 100% white hat.

Add New Articles and Update Your Bios
Remember all those crusty old article directory accounts that you set up tediously when you started doing this a long time ago? Well it’s time to get back in there and examine your bio links, and perhaps adjust them either for anchor text or landing pages to new areas that you wish to have crawled.

Adding a couple of new articles to each of the directories, with well-anchored deep links to important deep pages, is well worth your effort to get a fresh crawl from the spiders and increase activity – and for anyone who says otherwise, I’d have to agree to disagree.

Get Links to Some New URLs
This should go without saying, but it often gets overlooked in the strategies of day-to-day link building. As you add new areas to your site, be sure to add or adjust your link building tactics, whether those be by exchange, content trades or whatever, to include these new sections.

Remember, your inbound link profile should be well dispersed throughout your site, and not all pointing at your home page. Pointing links at these new or older sections can make a world of difference quickly.

I’ve been using the SEOmoz Pro Tools quite a bit this past month, including their new Linkscape, and I’m finding that in many cases we’re actually doing a poor job with diversification. It takes a lot of work to make it look natural!

Twitter & Get SEOcial
Make it easy for your visitors to share what they see on every page of your site by socializing those areas with popular social networking icons.

Take the 45 minutes necessary to go out and register your company name at all of these social networks – even if you don’t intend to use them all, it’s far better than someone else grabbing your name.

Social network applications like Twit This make it incredibly easy to share your content throughout the Web, and ensure that every post you make appears not only on your own website but on other websites too. This creates inbound links, and the ‘bots will come more often.

Yes, I know that due to the nofollow tag, these links may not always help your ranking, but they DO cite references to deeply categorized pages, and therefore will increase that spider activity in a hurry.
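If you’d rather roll your own than use a plugin, here’s a tiny sketch that builds a “tweet this” link using Twitter’s public web intent URL. That URL pattern is my assumption for doing this today, not how Twit This itself works.

```python
# Sketch: build a "tweet this" share link for any page on your site.
# The intent URL is Twitter's public web-intent pattern, nothing plugin-specific.
from urllib.parse import urlencode

def tweet_link(page_url, text):
    """Return a share URL that pre-fills a tweet with the page link and text."""
    params = urlencode({"url": page_url, "text": text})
    return "https://twitter.com/intent/tweet?" + params

# Hypothetical deep page -- drop this anchor next to the content itself.
href = tweet_link("https://www.example.com/deep/page", "Worth a read:")
print(f'<a href="{href}">Tweet this</a>')
```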

Trade Some Content
Write industry-relevant articles and offer them to other websites if they’ll include a deep link to your site in the body of the text. Having your content appear on other websites not only builds inbound links, but increases spidering activity as well – and of course, the more often you’re spidered, the fresher your cache dates will be.

UGC – User Generated Content
When you write posts that inspire comments, or come right out and ask people questions, the content on that page will change each and every time someone posts a new comment.

A couple of good examples of pages that are cached frequently and crawled often on my site are one about fixing your Comcast email problems, and another about a terrible health insurance company that I worked with briefly back in 2002.

Both of these posts have inspired more comments from readers than I can even believe, and even though the posts are years old, the cache date is seldom more than a week old at any given time.

What can you do with a crusty old buried webpage that gets indexed week after week? I’ll leave that to your imagination 😉

Other Ideas?
So if cache dates are the (not so) new PageRank now, what other creative ways can you think of that will provide us all with more ‘bot activity? I’m sure there are more, but this is all I can think of here, so please, feel free to share your own ideas in the comments below.

If you like what you've seen here, would you please share this?