Google just keeps on keeping on, working to make your search experience better, and last month (April 2012), it updated 53 aspects of its search algorithm. Google reps say that they make over 500 adjustments per year and 2012 has been a busy year for Google updates, and we’re not even at the halfway point yet. Their thrust is to make the end user experience better, to provide better results that are fresh and of good quality. That’s what I want when I search for something, don’t you?
So, what does that mean for you as a webmaster? Let’s go over some of the updates — at least the ones that will affect your pages in search the most:
- Pagination update: This change affects web pages that are part of a set. For example, an ebook online, spread out across several web pages. It seems that in some cases many of those pages were showing up on the same SERPs (search engine results pages). So, Google tweaked that part of the algorithm to provide less sameness and more uniqueness in the results.
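One way site owners were signaling paginated sets to Google around this time was with rel="prev"/rel="next" link elements in the page head. A minimal sketch — the domain, filenames, and titles below are all made up:

```html
<!-- Page 2 of a hypothetical three-page ebook -->
<head>
  <title>My Ebook – Chapter 2</title>
  <!-- Tell crawlers this page belongs to a paginated series -->
  <link rel="prev" href="https://example.com/ebook/page1.html">
  <link rel="next" href="https://example.com/ebook/page3.html">
</head>
```

With those hints in place, search engines can treat the series as one logical document instead of ranking every page of it separately.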
- Anchors bug fixed: Not sure if this means “anchor text” or just page anchors. Google isn’t clear on the meaning, but I’m guessing it has more to do with anchor text than page anchors. (Anchor text is any word or phrase that is hyperlinked.)
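For readers new to the distinction, here’s what the two terms look like in plain HTML (the URL and id are invented for illustration):

```html
<!-- Anchor text: the clickable words of a hyperlink -->
<a href="https://example.com/seo-guide">beginner's SEO guide</a>

<!-- Page anchor: a named jump target within a page... -->
<h2 id="chapter2">Chapter 2</h2>
<!-- ...and a link that jumps to it -->
<a href="#chapter2">Jump to Chapter 2</a>
```

Anchor text matters for SEO because it tells search engines what the linked page is about; page anchors are just in-page navigation.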
- More domain diversity: Rather than returning myriad results from one domain, this fix will attempt to provide information from more unique domains. For example, if you search “seo bounce rate” right now, you’ll see my picture in the 2nd and 3rd positions (still… remember our test?). This fix is supposed to surface different domains, rather than the same domain repeated down the page. Makes sense to me.
- Improvements to how search terms are scored: This changes how terms are scored, which affects whether your page shows up in the SERPs at all and, if it does, where it’s placed.
- More text from the beginning of your page in snippets: This kicks in when the text at the beginning of your page is relevant to the search query. So, basically, Google may throw out your well-crafted description and use the beginning of your page as the SERPs description, or “snippet.”
- Smoother ranking change for fresh results: For queries where freshness matters, newer content will be ranked higher than old stuff. Google wants fresh. So, all things being equal, the newer content will push old content down or completely out of the SERPs.
This is why it’s so vital to create new content on a regular basis. If you’re not doing this, you’re not going to have a great showing in the SERPs. Google has also tweaked the algo to recognize fresh content better, and if it’s low-quality content, it won’t get preference over a higher-quality document, no matter how fresh it is. Squeeze pages are web spam to Google. Keep that in mind. You’ll find it very hard to get a squeeze page ranked these days.
- Better query interpretation: A boost for the system to understand a query better, no matter how crappily we type it into the search box.
- News results improvements: For breaking news, you’ll likely see two results with a thumbnail image at the top of the SERPs page when you search for an important current event.
- Concise and informative titles: Google wants to change our titles to make them smarter than we originally type them. This one doesn’t make me very happy. If I craft a good SEO title, it should appear in the SERPs. I don’t really like Google reversing a Marcello decision, but that’s me. Writers are very sensitive to anyone messing with their work. 🙂
- Lots more spelling corrections: People don’t know how to spell. It’s a fact, and I’ve seen this so much on the Web. Well, not to worry. If you’re Ella the Bad Spellah, Google’s got your back.
- Fewer autocomplete predictions leading to low-quality content: Hmm… Google isn’t going to point you to as much garbage information anymore. We’ll see how that works out.
- Increase base index size by 15%: Yep… The Big Dog just got bigger.
- A new indexing tier: There used to be a supplemental index, but it went away some time ago — or at least the label that used to appear on the results pages did. Apparently, Google still maintains this shadow index where documents that don’t really make it to the SERPs live, and it has now gained a new tier. This is where lower-quality pages live. Google says it did this to “…support continued comprehensiveness in search results.”
- Sub-sitelinks expanded: Results for big sites may show no description at all, just expanded sitelinks. So, a result for a site like Best Buy would show department links instead of a snippet of description. Google has also improved the way these sitelinks are ranked.
- Keyword stuffing classifier improvement: Oh, come on… We all know that keyword stuffing is a bad, bad thing, but now, Google has a better way of classifying the stuffage. Nuked from the planet! Fire, fire, fire!
- More authoritative results: This could hurt you if you have a small site. Authority doesn’t just come from who you are and what your site is about, though it can. Search Engine Land, for example, has more authority than this blog. Why? Because those folks are a) known for SEO and b) have more pages on their site. (No matter how much I write, I’ll never catch up, either. Those folks are prolific!) “Authority” in this case, though, means more pages. The more quality content you provide that is focused on a particular topic, the higher your site’s authority. Greater authority will always rank better in search, which is a good reason why you shouldn’t try to compete for generic keywords. You’ll never rank that way.
- Another step toward rewarding high-quality sites: Here’s what Google says about that —
“We also want the ‘good guys’ making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available ‘above the fold.'”
And here we are again… Making more CONTENT available above the fold. Don’t make people scroll down to see the good stuff. Google likes white hat SEO because it not only provides great content, but also helps spiders figure out what a page is about, gives end users easier navigation, and, done right, makes the site work faster. Google loves that. This is HUGE. Google wants to crush web spam, which includes “black hat” techniques, and as time goes on, it will be harder and harder to “game the system,” so why bother?
- Updates to the Rich Snippets Testing Tool: Now, if you have a web page that you want to add rich snippets markup to, you can check it before it goes live on the Web by pasting the HTML into the tool. That’s very cool. Plus, as I mentioned the other day, you don’t have to tell Google about your author status now — it just detects it. My photo showed up again after I got all the updates fixed and the page was crawled. I didn’t have to let Google know. One less step in getting author status. Awesome!
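To give a feel for what you’d paste into the tool, here’s a hypothetical fragment using rel="author" plus schema.org microdata for a review. The profile URL, names, and ratings are all invented:

```html
<!-- rel="author" points at the writer's Google+ profile (URL is made up) -->
<link rel="author" href="https://plus.google.com/123456789012345678901">

<!-- Microdata marking up a review, using the schema.org vocabulary -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Widget Pro 3000 review</span>
  by <span itemprop="author">Jane Doe</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```

The testing tool then shows you roughly how that markup could render as a rich snippet in the SERPs before you publish.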
So, that’s some of the updates, not all 53, and some of them are way more important than others to us, as consultants and searchers. If you want to see the whole picture, you can visit the Official Google Webmaster Central Blog and check them out.
The April 2012 Google Updates big takeaways are:
- Don’t use black hat tactics, like keyword stuffing
- Use better anchor text and better ALT tags for images
- Keep adding fresh content to your site, and pump up the volume!
- Be one of the “good guys” and you’ll get not only an all-day lollipop, but better search placement, too.
- Continue to craft great SEO titles and descriptions, but don’t freak out because Google might change them.
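The anchor-text and ALT-tag advice above boils down to one word: be descriptive. A quick before-and-after sketch (filenames and figures are invented):

```html
<!-- Vague: tells search engines nothing about the target -->
<a href="https://example.com/report.pdf">click here</a>
<img src="chart.png" alt="image">

<!-- Descriptive: anchor text and ALT text both describe the content -->
<a href="https://example.com/report.pdf">2012 bounce-rate report (PDF)</a>
<img src="chart.png" alt="Line chart of bounce rate falling from 70% to 45%">
```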
Well, I lied about the lollipop, but don’t be a bad egg in the Big Dog’s eyes. Think of your audience first. Give them cool stuff all the time, and use white hat SEO. It will help you get all that cool, free, targeted traffic that only search engines can provide.
Much better than a lollipop.