Archive for May, 2011



May 20, 2011

Friday’s SEO Site of the Week

We’re always looking for ways to make this business easier. And of all the things that need to be easier, link acquisition would have to be right up there at the top. Face it. Link acquisition sucks. Particularly if you are determined to get good-quality, white hat, traffic-driving, useful, and interesting links—to say nothing of links that pass SEO juice.

So when I tell you that the SEO site of the week this week is YouTube, you have to think about it this way:

Where else can you post a 2-minute video of a cat chasing its shadow that can potentially turn into thousands of links overnight?

Seriously. This is some mighty mojo. Of course, it’s not that easy. You have to do things just right to get any link value out of YouTube. You have to

  • produce video clips that are relevant to your website (tutorials, reviews, interviews, etc)
  • make videos that are high enough quality to suit the intended audience (doesn’t mean you need Spielberg, just that you need to think through a script, use fairly decent equipment, and edit the thing)
  • upload videos that are in some way attractive (funny is good, but useful, interesting, odd, or even seductive can be good)
  • make sure it’s embeddable (because when somebody embeds your vid, that’s what makes it a link)

If you’re lucky enough or good enough, your vid might even go viral. And that’s a good way to gather lots and lots of links. Let’s give ’em a matched trio of wizard hats.

May 19, 2011

5 Ways Google Analytics Can Help You with SEO

Pop Quiz:

What’s the most important, totally free thing you can (and should) do before starting an SEO campaign? Give up?

Analyze your traffic data.

And that means either sifting through densely packed visit logs until your brain explodes, or turning to one of the many available log data analysis packages. Some are easy to use, some are more difficult, some are sophisticated and deep, some barely scratch the surface. Some are very, very expensive. Some are free.

The best of the lot—for ease of use, depth of data, flexible reporting, customizability, conversion tracking, and more—also happens to be one of the free ones: Google Analytics (GA). If you haven’t already installed GA on your website, you should do it right now. Today. Because every day without GA is a day lost to the mists of information purgatory.

Here are 5 things you can learn from GA that will definitely boost your chances of search marketing success.

  1. Traffic patterns.  This is basic. The number of people visiting your site from day to day (even hour to hour) is crucial information. How else will you know when something works? Or doesn’t? Any serious SEO manager should always have a clear idea of how many people visit their sites and how that number changes over time.
  2. Visitor identification. Knowing just who these visitors are is pure gold. GA can tell you where they come from, down to the country, region, and city; whether or not this is the first time they’ve visited; how many pages they looked at; how long they stayed; which browser they used; even what language they speak. Understanding visitor demographics puts you in control of your marketing messages.
  3. Traffic sources. How did your visitors find you? GA can tell you if they clicked a link from another site, or used a search engine, or came from a paid placement ad, or typed your domain straight into the address box. You’d be surprised at how many SEO pros can’t tell you the percentage of visits that come through, say, Bing. You, however, should know this.
  4. Pages visited. When people visit your site, you probably want them to do more than scan your home page and split. You want them to go inside. You want them to find something they care about. You want them to perform some act that benefits them, and hopefully, you. So, you need to know which pages they hit, how often they hit them, and how long they stay there.
  5. Keywords used. And here is the real treasure. Before you optimize a webpage around particular keywords and key phrases, you must make yourself aware of the keywords that are already drawing traffic. Why? Because if they are the wrong keywords—keywords that are less than relevant, keywords that are not conducive to sales conversion, or even keywords that place your business in a negative light—you can use your SEO mojo to fix it. If the keywords driving traffic to your site are perfect, but you want more out of them, well, you can’t grow it if you don’t know it.

But that’s not all! There’s more! If you act now, you will also receive the ability to set and track specific goals! AND filter out visits that might be unduly inflating your numbers! And more, much more!

And the best part is, of course, that it’s free. Google Analytics can be installed on almost any website, usually in a matter of minutes. All you need is a Google Account—and if you have Gmail, you already have one.

May 18, 2011

Keyword Density: Does It Matter?

If you’ve ever looked into SEO as a possible solution to your website traffic shortage, you’ve probably come across the concept of “Keyword density.” This is usually expressed in a statement something like:

Is your keyword density optimal? Get your keyword density to the magic number xx% and riches will rain down on you!

Run a Google search on “keyword density” and you’ll get back about 1.7 million results, many of them tools designed to analyze your density so you can optimize it and become like an SEO God.

Okay. Keyword density refers to the ratio of a particular keyword to the total number of indexable words on a given page. So, in the sentence:

Try Seafood Charlie’s deluxe seafood package–just right for Mom!

The density of the keyword “seafood” would be 2 out of 10 words, or 2/10, or 20%. Supposedly, this tells Google, Bing, et al. that the page is about “seafood” and convinces them to rank it higher than a similar page with only a 10% density for that particular keyword.
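
If you’d rather let a script do the counting, here’s a rough sketch in Python of that same arithmetic (the keyword_density function and the sample sentence are just for illustration; a real tool would strip out markup, navigation, and stop words before counting anything):

import re

def keyword_density(text, keyword):
    # crude keyword density: occurrences of the keyword / total words, as a percentage
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

sentence = "Try Seafood Charlie's deluxe seafood package - just right for Mom!"
print(round(keyword_density(sentence, "seafood"), 1))   # prints 20.0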

Only, the story goes that too high a keyword density means you are “over-optimized” and that counts against you. The gurus tell you that there is a sweet spot of keyword density. I’ve heard anywhere from 2% to 9% pretty often.

So, is it true? Do search engines really care about the ratio of keyword to text in any given page?

Um, well, maybe not so much. At least not in the simple way usually implied. Sure, Google’s famous 200+ element algorithm probably figures in density somewhere, in some fashion, to some particular end. But let’s be clear here:

There is no “optimum” keyword density percentage.

What there probably is:  a red flag that arises if the density is high enough to raise suspicions of manipulation. And the keyword density number that triggers that red flag will not be predictable at all because:

  • it will be a different value for different keywords
  • it will be a different value for different industries
  • it will be a different value for different page text word counts
  • it will be a different value for different contextual situations
  • it will depend on whether there is other evidence of manipulation on the same page

Which brings up the real reason keyword density is a bit of a straw man in a stuffed shirt chasing a red herring.

Google (and Bing too, unless they’re really behind the curve) has grown smart enough to parse context. And not by counting words. Modern algorithms understand context in much the same way you do. They scan headlines and subheads, they note emphasized text, they determine what a paragraph is actually about by reading the sentence subjects and relating them to the surrounding words. They pick out the key concepts and relate them to synonyms appearing in the immediate vicinity. Relate them to images. Relate them to the anchor text of links pointing to that page.

But mostly, they DON’T GIVE A DAMN if the keyword density of a page is 2% or 10% or 35%, just as long as the page content makes sense and isn’t full of deceitful practices.

Proof? The number 1 page in Google for the search term “SEO” is:

http://en.wikipedia.org/wiki/Search_engine_optimization with a keyword density of 0.77%. That’s pretty low by any standard.

The good news? It turns out that if you write reasonably well, with your subject matter firmly in mind, with a keyword or two on a sticky note at the bottom of your monitor, you should naturally come up with a lovely keyword density. No really.

http://www.speedoflightenterprises.com/seo.html, with a density of 2.88% was not tweaked in any way to enhance the density. (I know because I wrote it, and I DO NOT CARE about keyword density, not one little bit.)

Write naturally. Write for a human audience. Know what you’re talking about. And it doesn’t hurt to understand what the humans are most likely to type into Google when they want to read what you’ve written.

May 17, 2011

SEO Comics: CEO View of SEO, part 4

SEO Comics

May 16, 2011

SEO and the Web Filter Bubble

I’ve mentioned this sort of thing before: customized, personalized search results based on what engines and websites think you want to see. Which means that everybody’s search results are different. Which means that you can never know just exactly where you will show up for any individual person’s search. And so far, there’s not a damn thing you, as a search marketing professional, can do about it.

This is what will kill SEO. And the web. And probably everything else we hold dear.

Brought to you by TED. The smartest place on the planet.

May 13, 2011

Friday’s SEO Site of the Week

This week, we’ll profile one of SEO’s hidden gems. This isn’t a site as much as it’s a program. A free program. Free as in Free Beer. And it happens to be one of the most useful little free programs ever.

Xenu’s Link Sleuth.

You wouldn’t know it from looking at the project page linked to above. Wow. This page looks like it was made in Mrs. Marcola’s Second Grade HTML Fun Day class. Might have been. But if you snicker and bail the minute you see this page, you’ll miss out. Because what Link Sleuth is, is nothing less than your own personal pet spider.

Interested yet? Maybe it will help if I list a few things a personal pet spider might be good for….

Xenu’s Link Sleuth will

  • check your entire site for broken pages
  • check your entire site for broken links
  • show you exactly how a search spider treats your navigation
  • point out duplicated content
  • point out duplicated page titles
  • look for local “orphan” files
  • check a nearly unlimited number of links
  • do all of this very, very fast.
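
For the truly curious, here’s a bare-bones sketch in Python of what a pet spider does with a single page (this is not how Xenu itself is built, just the general idea): fetch the page, pull out the links, and flag anything that doesn’t answer with a healthy status code. The check_links function and the example.com URL are placeholders.

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    # collects href values from <a> tags
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    # fetch one page and report the status of every link found on it
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    for href in parser.links:
        link = urljoin(page_url, href)        # resolve relative links
        if not link.startswith("http"):
            continue                          # skip mailto:, javascript:, etc.
        try:
            status = urlopen(Request(link, method="HEAD")).getcode()
        except HTTPError as err:
            status = err.code                 # e.g. 404 for a broken link
        except URLError:
            status = "unreachable"
        print(status, link)

check_links("http://example.com/")            # placeholder starting page

A real crawler keeps going from there: it queues every internal link it finds and repeats the process until it has walked the whole site.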

Now, to be fair, this is an old, old free program that has not been updated in a long, long time. There may well be better free spiders out there; I haven’t looked in a while. And there are probably much better link checkers available, if you want to spend money on them.

I’m personally very fond of “free.”

I’ll give it several beers.

May 12, 2011

10 Step SEO: Sitemaps, part 2

Let’s put this 10 Step thing to bed, then, shall we? Last week we talked about HTML sitemaps, the kind that live on your site and are linked to from your pages and that actual real people might even be able to access and use. This week, we’ll delve into their more esoteric cousin: the XML sitemap.

In 2005, Google unveiled what they called the “Sitemaps Protocol.”  The idea was to create a single format for building a sitemap file that all (or at least “most”) search engines could use to find and index pages that might be otherwise difficult to crawl.  This protocol uses XML as a formatting medium. It’s simple enough to code by hand, but robust enough to support dynamic, database-driven systems.

At first, only Google crawled sitemap.xml files, but they encouraged webmasters to create and publish them by opening a submission service. You would build an XML sitemap, upload it to your web server, then submit the URL to Google via their webmaster interface. The Goog would crawl it, and—in theory—follow all the links and index all your pages.

It actually worked rather well. Pretty soon, all the web pros were calling the system “Google Sitemaps” and uploading and submitting like crazy. With so many sitemaps installed on so many websites, it wasn’t long before the other major engines adopted the protocol.

Are XML sitemaps a magic bullet?

No. Don’t be silly. But they are useful additions to a website’s structural navigation, especially for complex architectures that may be resistant to spider crawls. We’ve used them on many sites and find that a valid XML sitemap can lead to a faster, more accurate indexing.

So what is this thing?

It’s really a pretty simple construction. You could easily make one without any understanding of XML at all. The Sitemaps Protocol specifies a text file, with the extension “.xml,” using this template:

<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
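        <!-- one <url> entry per page; only <loc> is required, the rest are optional -->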
        <url>
                <loc>http://example.com/</loc>
                <lastmod>2006-11-18</lastmod>
                <changefreq>daily</changefreq>
                <priority>0.8</priority>
        </url>
</urlset>

Every page on your site that you want crawled gets its own entry between <loc></loc> markers, inside its own <url> block. You do not have to set every parameter. This would be a valid sitemap for a site with a home page and three internal pages:

<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
                <loc>http://example.com/</loc>
        </url>
        <url>
                <loc>http://example.com/page.html</loc>
        </url>
        <url>
                <loc>http://example.com/page2.html</loc>
        </url>
        <url>
                <loc>http://example.com/page3.html</loc>
        </url>
</urlset>

The other parameters—lastmod, changefreq, and priority—are nice ideas, but ideas we’ve never seen have any effect. So use ’em or don’t. You can write an XML sitemap with any text editor. Just be sure to save it with “utf-8” encoding and with the name sitemap.xml. (To save in “utf-8” encoding in Notepad, click “Save As” and choose UTF-8 from the Encoding pull-down at the very bottom of the dialog box.)

And wait! It can be even simpler! The Sitemaps Protocol also accepts a plain text file containing nothing but a simple list of URLs, like this:

http://example.com/
http://example.com/page.html
http://example.com/page2.html
http://example.com/page3.html

(The file would be named “sitemap.txt” instead of “sitemap.xml” and also must be “utf-8” encoded.)

And wait again! Even simpler than that! There are a host of online tools that will turn a list of URLs into an XML sitemap, or even spider your site for you and produce the sitemap file from that.
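
Or roll your own. Here’s a minimal sketch in Python that turns a list of URLs into a protocol-shaped sitemap.xml (the write_sitemap function and the example.com URLs are placeholders, and a real version might want more thorough escaping of special characters in each address):

from xml.sax.saxutils import escape

def write_sitemap(urls, filename="sitemap.xml"):
    # one <url>/<loc> entry per address, wrapped in the standard <urlset>
    lines = ['<?xml version="1.0" encoding="utf-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("        <url>")
        lines.append("                <loc>%s</loc>" % escape(url))
        lines.append("        </url>")
    lines.append("</urlset>")
    # the protocol calls for utf-8, so set the encoding explicitly
    with open(filename, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

write_sitemap(["http://example.com/",
               "http://example.com/page.html",
               "http://example.com/page2.html",
               "http://example.com/page3.html"])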

There are just a couple of rules to be mindful of:

  • Sitemap files cannot be over 10 MB (uncompressed)
  • Sitemap files can be compressed as a gzip file
  • The maximum number of URLs per file is 50,000
  • Multiple sitemaps can be linked together with a sitemap index file (a “master sitemap”)
  • Sitemaps should not contain duplicate URLs
  • Sitemaps should be referenced in your robots.txt file using this notation:
    • Sitemap: <sitemap_location>
      (of course, “sitemap_location” would be the actual URL address of your sitemap file)

When you have the file ready, you should use one of the many XML sitemap verification services. An invalid sitemap won’t help much.

Should you submit the file to search engines?

You can. If your site is brand new, it might help. But if you’ve done it right—complete with an entry in the robots.txt file—you really shouldn’t have to. Google, Bing, and Yahoo all know where you live.

Other sitemap resources

Sitemaps.org
Wikipedia on Sitemaps
Google’s List of Sitemap Generators
XML Sitemap Validator