Is Normal Google Search Net Neutral?

When it comes to net neutrality, Google’s position has shifted rather considerably over the last five or so years. What was an initial CEO declaration of fighting the good fight has devolved into a compromise with Verizon.

Google would probably stand by their organic, unpaid listings as being in the spirit of an open web. But are they? Given the massive conduit to web activity that Google Search is, should we hold it to the same scrutiny and openness standards to which we hold large telecom companies?

I believe the comparison between Google’s search results pages and ISP traffic-tiering is fair: both provide the means of access and a filtering of what is actually accessed.

The Google Traffic Filter

Google’s listings create a bottleneck in positive terms: a limited set of visible first-page results, selected via an ever-evolving algorithm. With Google, the user initiates the search, but because search results are limited per page (i.e. Google can’t show everything at once), Google is filtering traffic in its own way.

So the question is whether the Google algorithm is analogous to the direct, negative-terms bandwidth throttling that can come with a lack of net neutrality.

Davids & Goliaths in the SERPs

The core of the issue is whether or not the little guy is getting muscled out. An abolition of net neutrality would favour (in addition to the telecom companies, of course) the companies that can afford to pay to have their content seen first, or seen exclusively. It would seem obvious that this would be the case with paid listings through Google’s AdWords pay-per-click program, but what about organic listings?

Some search engine optimization is difficult, and even the parts that aren’t rocket science can still consume significant time and labour. PageRank (which correlates with rankings) comes from links, and links are meant to come naturally from high-quality content. Surely the biggest and best companies are more likely to be able to provide this.

Even if a little guy were able to develop an extremely link-worthy page suited to and optimized for a particular keyword, a strong site (in SEO terms) to house the content is still extremely important for rankings.

The Big, The Old, and The Trusted

It takes money to make money, and it takes PageRank to make PageRank. Of course, you can find other ways to attract links, just as you can find ways to make money without significant investment capital. But the advantage for the future is heavily connected to the success of the past.
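
For a rough sense of why this compounds, the originally published PageRank formula (which Google’s live algorithm has long since outgrown) defines a page’s PageRank in terms of the PageRank of the pages linking to it:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1 through Tn are the pages linking to A, C(T) is the number of outbound links on page T, and d is a damping factor (around 0.85 in the original paper). A link passes more weight when it comes from a page that already has PageRank to spare, which is exactly the rich-get-richer dynamic at work here.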

The black-hole-site phenomenon, whereby well-ranking sites become the default destination for new links, is not uncommon. Any time you have sought a quick reference to link to and chosen a conveniently located Wikipedia page, you have thrown more mass into the black hole — just as I did starting in the fifth word of this post.

Sites that already rank are more accessible and more likely to attract more links. Little guys without links, not so much.

Add to this the fact that the age of a site is a clear and established ranking factor. There is an element of correlation here; the longer a site has been on the web, the more likely it is to have attracted links.

But age is also directly relevant, as it is a marker of established trust. The newest sites face a kind of probation that older sites have no need to contend with.

Paying for Your Dues

Another irony is that, despite paid listings being just about the only place for a new little guy to get noticed, it’s the big organic players who have the cash, will, and good sense to pay for them.

AdWords functions on a keyword-bidding system, and a new little guy has his work cut out for him trying to outbid a major, established competitor, but at least he can play the game. For some keywords and their niches, an organic listing will be flat-out impossible to achieve, and the only chance for a Google-driven click will be a bought one. Not very neutral.

Google’s Goals?

We have to ask what Google’s goals are likely to be, and whether they ought to be given the responsibility that has come with their power. If their goal is to provide the best and most relevant content on the web to their users, then their current algorithm is potentially best overall.

But this is because the biggest and most powerful companies are the ones who are likely to have the means to produce the best content, or deliver products at the best prices, or have been around long enough to be established and trusted.

Consider the result of something akin to the Walmart effect on small businesses. Why couldn’t telecom companies adopt a similar rationale? If the favoured companies can offer better content or better deals, they might be preferred not only corporately but by users themselves. People will go through a biased web just as they still shop at Walmart.

Yellow Pages directories, aside from their advertising, list and link to businesses alphabetically, online and offline. Abuse aside (come shop at A.A. Adelson & Sons Butchers!), this may be fairer, but it is less useful for finding the best businesses.

Indeed, local listings may be the only semi-neutral space left. Searches by proximity are the fairest, and companies can still stand out from the crowd by earning positive reviews, something big companies cannot necessarily buy their way into, even if price is a large part of a satisfying consumer experience. But on much of the web, location isn’t an issue, and as such, neither is proximity.

Net neutrality will continue to be a hot issue as it slowly erodes, and Google will continue to be a major player in the debate. But whether we praise Google for fighting for an open web on those terms, or chide them for selling out, remember that Google’s organic search hasn’t exactly been net neutral all along.

Longtail Locations: Conquering the SEO Landscape

The longtail is one of those beautiful concepts in SEO. It’s an elegant thought: once search volumes are small enough (longtail enough) for a specific term, that term seemingly disappears into unimportance. In a game where quantity is so important, the battles are fought over the juicier prizes. But when you simply count up these longtail phrases, you realize how many there are, and what massive power they carry in aggregate.

Where else does the longtail shape appear?

The local search market is serious, and it is only going to get more so as populations creep ever closer to 100% saturation of regular, competent Internet users, and as more and more of life gets wired. The classic [big city + keyword] combo isn’t going anywhere, and even keyword searches without specified locations are starting to be treated as local searches.

But what about small cities and towns? And then the longtail thought: how many…? If each tiny town is not so much worth the general ambitious SEO’s time, what about considering them in aggregate?

The only sites that really seem able to operate on this level are large directories like the Yellow Pages, which have reason to have not only pages for each town, but pages for each town and its adjoining business categories. Is this the only way?

As has been the case throughout history, the conqueror tends to have better technology than the conquered. There’s a curious phenomenon whereby if you check smaller and smaller cities for their local websites, they tend to get crappier and crappier. Web saturation is often lower in these areas, and there tends not to be the demand to push things forward. This creates a strange situation in which the urban SEO is so embroiled in his own conflicts that no time is spent mopping up easier territory, because it isn’t juicy enough. Isn’t it, though? Is there a way to reach enough of it at once that it certainly is?

Firstly, if the state of web use isn’t impressive in a tiny town now, it might very well be in the future. This is what you call “getting in early”. Can the web possibly stagnate to a halt in a given region, never quite catching up? I say, plant your seeds, and win some easy battles. Secondly, if you find a system that works, why not multiply it out? The longtail adds up. How many [longtail phrases + longtail locations] keywords are there?
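
To put purely hypothetical numbers on it: if a niche has 500 workable longtail phrases and there are 2,000 small towns worth covering, that is

    500 phrases x 2,000 locations = 1,000,000 keyword combinations

none of which is worth fighting over on its own, and all of which add up.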

Centralized vs Decentralized

Let’s say we’re going to go for the longtail locations. Which is better? To follow the Yellow Pages structure of one major site with pages for just about everything, or to build local sites for each? The more finely grained you go in what you target at the per-site level, the more actual sites you have to develop, multiplying out work that is much easier to do page by page within a single powerhouse. On the other hand, going fine-grained not only gives a more targeted user experience and potentially engenders a feeling of local pride, but can also serve to differentiate you and avoid a head-to-head with Yellow Pages-style sites that is hard to win.

We know how Yellow Pages works. The next level down is the city directory. I live in Oxford, where Daily Info dominates the classifieds market. The site doesn’t come off as super au courant despite a Facebook and Twitter presence, but in a way that adds to the charm, given that a printed weekly classifieds edition has been running since 1964. The online version is well populated with reviews, and its fresh information has no equal within the city. Making sites like this means one site per city: no small task in terms of work, and still potentially not targeted enough to carry the highest credibility.

The highest credibility comes with a local listing for a specific area, for a specific niche. I grew up in a suburban area outside Montreal known as the West Island. Enter West Island Restaurants, an aptly named site fulfilling this specific function. It is not as long-running as sites like Yellow Pages or Yelp!, yet given its focus, it has been attracting increasing numbers of reviews. How much more likely are people to comment and leave reviews on a site that seems like it’s made for them and their community? But building sites to cover the breadth of longtail locations, without some large-scale automation, is just about impossible. A site for every niche in every city might be just too much, but it could potentially have the most targeted, participation-inciting appeal. Plus, do people in a city want to check a different site for everything to do with their city or town? Or is a site like Daily Info a better compromise for retention and a greater ability to cross over niches?

An SEO can try to pick a small subset of location+category combinations from which to build sites, but then the heart of the longtail concept is lost, and research on where to focus will ultimately, most likely, make it a medium-tail project at its longest. That’s not necessarily a bad thing, but it’s something to consider vis-a-vis the longtail location discussion.

Local SEO vs Organic SEO

I have to be clear about what kind of SEO I’m talking about here, especially given the recent changes to Google’s local search listings. While local and organic results are increasingly blended, we can still consider them as two relatively separate strategies offering relatively separate results, even if they share the same results page.

The introduction of Google Places in shaping local search results points to a muscling out of directories that don’t offer much beyond basic listing information. Google provides its own basic information for those businesses that have been picked up on the Google Places radar, and doesn’t want to push traffic seeking such information elsewhere when it can address that need itself.

That said, the main way these Google Listings offer support to other people’s sites is in connecting users to external reviews on trusted sources. With this in mind, if offering a locally-tuned site is more likely to generate reviews, this might be the only way to grab local traffic. As with Daily Info, an offline presence may be necessary. Just as competing with Yellow Pages is tough, competing with Google as a Yellow Pages competitor is that much more daunting. More specific is the only way to go.

A very important question that needs to be answered is how well Google can identify local city directories or city+category directories as authorities whose reviews are worth linking to from its Places results. If you build a small-town directory, even if it’s full of reviews, will Google even notice? The review traffic from Places listings, assuming it doesn’t go away, might be important enough that a lack of assurance that Google will notice local directories and their reviews could be a deal-breaker. It’s certainly easier to draw from reviews on the big sites like Yelp! than to track down the little guys, even if they might have more to say.

Also, as it stands now, Places results are still mixed with organic results, and a city/town-targeted site has some immediate credentials for organic relevance, at least.

An obvious tough part is attracting link popularity, for which efforts are much more easily pooled and distributed on a single mammoth site. But because the locations are as targeted as they are, they have potential for natural links and rankings that can snowball into other keywords. The organic competition is weaker, and the Places results are, so far, not as populated. As Andrew Shotland points out, there are still relatively few localified listings for small towns.

This is not to say that small local domination will be a walk in the park. Local listings are evolving quickly, and even if people take a while to sign their businesses up to Places, it doesn’t mean Google won’t take care of it for them. Every existing local site will be older than yours, but there is a lot less design quality, usability, and SEO know-how out there, so these are winnable fights.

The biggest obstacle, whether you choose the city directory option or the city+category option, is the development work. Business listing data is expensive and often requires significant cleanup and management of multiple sources. Scraping listings has ethical (legal?) considerations and can still be subject to disorganization and inaccuracy. If the longtail principle is followed, this could mean developing hundreds if not thousands of sites. Even with automated processes, that’s still a whack of a hosting investment, and it requires significant planning. Building would have to be streamlined, always at the risk of a cookie-cutter look suggesting a network. Awareness of a network, on Google’s part or on searchers’ part, could hurt performance.

So what do you think? Can a value-in-aggregate local SEO strategy mirror the organic SEO strategy in targeting the longtail of a given keyword niche? Does the ever-changing landscape of Google’s local listings make the endeavor entirely too risky to justify the expense? Is there anything unethical about even trying?

International SEO: Ranking At Home and Abroad

When a company targets different national markets offline, it often has a separate strategy for each one, including distinct branding, packaging, and messaging for each market.

Well, the online world is no exception. When it comes to ranking in the SERPs in different languages and countries, you need a separate SEO strategy for each of the countries you’re targeting.

Google’s Global Faces

It’s no secret that Google strives to personalize and localize search results. And when it comes to national or linguistic markets, Google geo-targets search results at two levels.

First, there are the country-specific versions of Google, such as Google.ca and Google.co.uk. These are the default engines for users from their respective countries, and will serve up more localized search results.

For example, just as SERPs on Google.com are different from those on Google.co.uk, French results on Google.fr will be different from those on the French version of Google.ca.

Then there are the US and international versions of Google.com. Basically, if you’re in Canada and manually navigate to Google.com, you’ll be forwarded to Google.ca, but you can then choose to click through to Google.com if you want.

But this doesn’t mean that you’ll get “US” or “International” or “objective” search results. Instead, you’ll get a blend of results from abroad and the country you’re in.

So if you’re targeting multiple countries, you need to monitor your rankings in several places. First, you’ll have to rank on the versions of Google specific to the countries you’re targeting. Second, you’ll also have to rank on Google.com for searches done from those countries.

The Elements of Ranking Abroad

There are several criteria that Google looks at to determine how relevant a site/page is to a specific national market. The more of them that you can meet, the better your site will rank in your targeted markets.

Site Domain: TLD vs ccTLD

One of the first places that search engines look to determine a site’s national relevance is its Top Level Domain (TLD). A TLD is simply the extension that appears at the end of a domain, such as .com, .net, etc. The more relevant your TLD is to a national market, the more likely it is that your site will rank on searches from that country.

There are two kinds of TLDs to choose from: general TLDs and country specific TLDs. General TLDs include .com, .org, .net, .edu, etc. These are better for ranking a site internationally, such as your parent brand or a multinational site.

Then there are Country Code TLDs (ccTLD), such as .ca for Canada, .co.uk for the UK, and .de for Germany. These are ideal for ranking within a specific country (but will limit your ranking potential abroad).

Site IP Address

Another factor that search engines use to determine the national relevance of a site is its IP address. All websites are hosted on a server somewhere, and all servers have an IP address which indicates where that server is located. So if ranking in a specific country is important to you, you should consider hosting your country specific site within that country.

Onsite Content (is King)

So we’ve all heard that content is king. Well, in SEO, content is king, queen, and pope all rolled into one.

Onsite content is the most fundamental part of SEO. And this goes beyond just page copy. It also includes page titles and meta descriptions. So if you’re targeting different linguistic markets, you will need page titles, meta descriptions, and page copy (such as product descriptions) in each of those languages.

Backlink Profile

Getting targeted backlinks to your site is also a big part of ranking well. The more backlinks you have, the better your site will rank overall. And when it comes to ranking on targeted terms, it helps to get links featuring targeted anchor text.

Well, just as Google judges your site, it also judges the sites linking back to you, looking at their TLD, IP address, and onsite content. For example, a backlink from a .co.uk site that’s hosted in the UK will boost your rankings in the UK more than a link from a .com, or from a .co.uk that’s hosted in the US.

So for every market you target, you’re also going to need a separate linkbuilding/link-baiting campaign. This will involve obtaining backlinks from sites that have TLDs, IP addresses, and onsite content that all correspond to the country you’re trying to rank in.

Building a Site that Ranks Internationally

So now it’s time to build different sites for different countries. But you don’t have bottomless pockets. Well, you have three general options for each country you’re targeting:

  • separate sites on separate TLDs and ccTLDs,
  • country specific subdomains under a primary TLD,
  • country specific subdirectories on a primary TLD.
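
To make the three options concrete, here is roughly what each structure might look like for a hypothetical brand targeting the UK (the domain is made up):

    example.co.uk        – a separate site on its own ccTLD
    uk.example.com       – a country specific subdomain under the primary TLD
    example.com/uk/      – a country specific subdirectory on the primary TLD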

Each of these has its pros and cons, and the one that’s right for your business will depend on factors such as budget, available IT resources, and just how valuable any given market is to your business.

Separate Sites with Separate Domains

The optimal option for ranking in any given country is to build a unique site on the relevant ccTLD just for that country. Granted, this path also requires the biggest investment, both to set up and to maintain.

So, this option is probably best for large ecommerce entities that are targeting several large national markets at once (i.e. the return will justify the investment) and have sufficient marketing budgets to do so.

Pros

  • ccTLD – best chance for ranking locally.
  • IP Address – can host each site in target country.
  • Conversions – users confident they found local business.
  • Linkbuilding – local sites will be more willing to link to your local ccTLD.
  • Interlinking – can interlink each separate site to one another.

Cons

  • Maintenance – multiple sites to develop & maintain.
  • Investment – as many times the SEO work as you have sites/ccTLDs.

Subdomains for Each Target Market

One cost-effective alternative to maintaining separate sites, each with its own ccTLD on its own server, is to use country specific subdomains (e.g. uk.domain.com). It can be more challenging to get a subdomain to rank locally than a ccTLD, but there are some economies of scope to be had.

For starters, site maintenance is much simpler. Also, each subdomain can have a country specific IP address. Lastly, every link pointing to each subdomain will also increase the overall ranking of the parent TLD.

Pros

  • Maintenance – only one site to develop, host, and maintain.
  • IP Address – each site can be hosted in the country you’re targeting.
  • Backlinks – every backlink to each subdomain benefits entire TLD as a whole.
  • Interlinking – can interlink each country subdomain with the others.
  • Usability – can group content by language instead of country.

Cons

  • Domain – your TLD must compete against ccTLDs in local SERPs.
  • Conversions – users will not have as much confidence as they would in a ccTLD.
  • Linkbuilding – more difficult to get links from local sites.

Subdirectories for Each Target Market

Finally, the least SEO-friendly option (but the most cost-effective one) is having country specific subdirectories (e.g. domain.com/uk). This option might make sense for (1) targeting smaller markets that can’t justify a bigger investment, or (2) smaller ecommerce portals that must remain focused on their products and services rather than their IT infrastructure.

Pros

  • Maintenance – only one site to develop, host, and maintain.
  • Backlinks – every backlink benefits entire TLD.
  • Usability – can group content by language instead of country.

Cons

  • Domain – your TLD must compete against ccTLDs in local SERPs.
  • Conversions – users will not have as much confidence as they would in a ccTLD.
  • Linkbuilding – more difficult to get links from local sites.
  • IP Address – entire site will be hosted in just one country.

Different Sites for Different Markets

Of course, before choosing what kind of site architecture you’re going to go with, you have to do a cost-benefit analysis and ask yourself questions such as: How valuable are the different markets you’re targeting? What is your budget per market? What is the priority order in which you should optimize your different sites?

While some markets will represent a source of revenue, they won’t necessarily generate enough to warrant a significant investment. So you might end up choosing to develop comprehensive ccTLD sites to target the US, Canada, and the UK, but opt for subdomains or subdirectories (on your .com site) to target other countries.

Essentially, you should approach international SEO just like you would any other international marketing effort. Start by examining the market opportunity that’s there, the value it offers your business, and then invest your IT and marketing resources in a way that makes good business sense.

The Frustration of Facebook Ads

With 500 million Facebook users globally, half of whom log in every day, it’s no surprise that Facebook has become an integral part of online marketing. In fact, according to a recent ComScore report, Facebook now controls nearly 25% of the online market share. In an effort to begin turning a profit, Facebook introduced Facebook Ads in 2007. In the last quarter alone, the big FB provided advertisers with more than 297 billion display ad impressions. Facebook Ads generated revenue exceeding $500 million for Facebook in 2009, a number expected to double in 2010.

You would expect such a highly used tool to be fairly intuitive. Sadly… this is far from the case. When compared to other ad tools, Facebook is almost comically difficult to use, especially for business/enterprise users.

To help illustrate this point, I attempted to set up a business profile for the purpose of creating ads. Along the way I will take the opportunity to vent my frustrations with Facebook Ads in general.

Step 1. Create a business profile. Set up an ad or Fan page:

That’s right! Facebook places a baffling roadblock between you and your business account. You need to either create an ad or a Fan page before you can take a peek at the backend. The reason for this is unclear to me, but if your goal is a business profile, you need to do one of the two in order to continue. This really set the tone for the remainder of the process.

I took the Ad route for this attempt:

Step 1a. Design an Ad:

If you’ve created PPC ads on other platforms, you’ll find one noticeable difference: capitalization. I filled out the destination URL and Title with little problem, but when it came to the Body Text I learned about Facebook’s “capitalization guidelines”.

From the Facebook site:

“You can only capitalize the first letter of the sentence along with all proper nouns, such as an individual or location’s names, days of the week and months, and cities, states, and countries.”

If you don’t follow these rules, you’ll see a warning telling you to fix your capitalization.

I made the adjustments and tried to submit my ad except this time I received an error telling me I needed to choose an image… Now, I understand the purpose of having an image on an ad. I understand why it may not be a good idea to run an ad without one, but try to keep in mind the purpose of this exercise is to create a business account! Just to be clear, you need to have a dummy image handy if you plan on making a business account. Amazing. The process keeps getting longer…

Ok, so I designed my ad… now I need to set the targeting info:

Step 1b: Targeting

I put Location and Demographic info in, then moved on to “Likes & Interests”. As you type a keyword, Facebook will make suggestions related to it. The main problem with this tool is that if you don’t like any of the suggestions, you have to continue on with guesses. There is no basic listing of suggestions; the only way you can see them is in real time as you type.

Additional Rant: There is also the problem of the search box not allowing you to copy and paste. This is a pretty huge problem that only compounds itself when you’re running several campaigns that need modifying. With no ability to copy & paste, a quick 15-minute campaign modification turns into hours!

Step 1c: Campaigns, Pricing and Scheduling

This step has the most long-term implications of any step. Of course, this was unknown to me when I first embarked on this adventure.

This is where I discovered one of the biggest problems of forcing people to create an ad before creating an account. The currency you choose at this step becomes the default currency for the account you create. That, in and of itself, is not a huge issue. The problem comes when you decide to change it later on.

Rant Begins: When you change the currency of your account, Facebook actually creates a separate account under your business profile. Why do they do this? I haven’t a clue. But this second account doesn’t copy the billing information from the first, so if you start any campaigns on the new account, they won’t launch. You’re required to re-enter your billing information on the second account. Facebook never indicates this is the case when you try to sign up. Again, keep in mind I am still just creating a test ad for the purpose of signing up in the first place!

The next big problem with this process is in the schedule section. By default, Facebook sets ‘Today’ as the starting time for your campaign. You may think that’s not a big deal because it’s just a test ad, right? Well, it is a big deal.

Rant initiated: Once you’ve created the ad you can begin the process of creating a business profile. Part of this process involves putting in your billing information. The second you’ve done this, guess what happens to the test ad you created? Yep, it’s launched.
This brings me to one final annoyance of setting up a business profile: Birthdays.

It may not seem like a huge deal, but it shows how little effort Facebook has put into Facebook Ads. There really is no reason for Facebook to know the birthday of the person running the ads. It seems obvious that the only reason the field is there is that they’ve reused the signup form used for personal profiles, where birthdates are required by law. Is it really too much effort to remove this requirement?

Facebook Ads accounts for over 80% of Facebook’s revenue. It’s been three years since its launch, and I think we’re overdue for some much-needed usability improvements.

Google’s Recent Local Listings Transformation… In Point Form

By now you may have heard about the significant changes to Google’s results pages for local search queries, and the mixed reviews from the SEO community. There are some great explanations and analyses put forward from prominent SEO writers from which I’ve drawn up a point form summary for the “tl;dr” type. Links to their articles will be at the bottom of the post, but for now, here’s the rundown for those in a hurry.

General Info

  • There is a new method of search called Google Places, clickable at the left of the SERP (near image search, blog search, book search, etc.). You should claim your listing ASAP.
  • Title tags and meta descriptions are often being used in SERPs, organic-style. Time to make those meta descriptions as clickable as possible, with local searchers using these new listings in mind.
  • This will engage businesses more directly, and allow more revenue to be driven to Google from the organic space.

The 7 Pack

  • The “7-pack” (list of local results above organic results) has been largely replaced with local (Google Places) results that are now blended with organic results.
  • Local listings no longer just shove down organic results like the 7-pack did.
  • There is much more information in local listings than before, including quotes from actual reviews and thumbnails. So make sure your business’s Google Places listing has a picture associated with it.
  • The 7-pack showed how many total reviews a listing had. Now you not only have a review quote included in the listing, but links to the pages on third-party sites that show the reviews, with the number of reviews for that business on that site. If you own or work with a site that has the potential to get reviews into these SERP listings, it’s time to promote that feature.

Directories

  • SERPs are now juicy enough that clicks to directories are largely discouraged unless the user wants reviews, and page 2 is that much less likely to be reached.
  • Speculation is that this is a strong step in cutting out search engine middlemen (directories), keeping users from having to, essentially, search twice. So far directories are still ranking organically, but this may change. Even if they don’t, clickthroughs to directories will probably go down with full listings now in the SERPs.

Local vs Organic Results

  • A local listing is now as valuable as an organic listing, giving businesses a chance to squeeze into some otherwise saturated spaces.
  • Supposedly the total of organic listings and local listings hasn’t changed.
  • There are some variations in listings and layout combinations, sometimes depending on how many local results are included. Sometimes local first, sometimes organic first.
  • It might be hard for businesses without websites to do as well in the new listings.
  • For some search queries without a specified location (e.g. “restaurants” on its own), Google is starting to assume local search intent and showing local results anyway.

Local Maps

  • The map has been moved to the right sidebar, above the PPC ads. This could be beneficial in attracting eyes to that part of the page. More likely, however, it will damage click-through rates and hike CPC. It also follows you in a fixed position when you scroll down, blocking PPC ads.
  • Supposedly map spam has been better addressed, which the title tags and meta descriptions in the results should help with. At this point it’s still a point of worry for SEOs who are used to local/map results being far more spammy than organic results.

To be updated, corrected, and refined as the situation evolves and more details emerge!

SEO for Enterprise Level Publishers

We’ve all heard/read the phrase: Content is King. But when kingdoms get too big, they get spread too thin and end up either crumbling or imploding.

Well, enterprise level publishers face a double-edged sword when it comes to SEO for similar reasons. On the one hand, the more content they have, the more they can give Google to chew on, and the more short-, medium-, and long-tail keywords they can rank on. On the other hand, Google can easily be confused by the site structure, discover duplicate content issues, and end up penalizing (or outright banning) that publisher.

So how do enterprise level publishers get the most out of their content while avoiding getting slapped by Google? Well, it all starts onsite.

SEO & Digital Publishing

Online publishers are driven by (1) reader acquisition, (2) converting those readers into repeat users, and (3) retaining those users so that they can continue to grow. Well, SEO is an integral part of both the acquisition and conversion process.

Specifically, a well planned SEO strategy will help enterprise publishers deal with:

  • Optimizing onsite content for targeted terms
  • Dealing with Duplicate Content
  • and getting Restricted Content to rank in the SERPs

Page Structure

The first place that all good SEO should start is on each page. Specifically, every page should have unique and targeted meta info. This includes:

  • Page Titles: <title>Insert 65 Characters</title>
  • Meta Descriptions: <meta name="description" content="Insert 150 characters." />
  • Meta Keywords: don’t even bother; the big search engines stopped indexing these in 2005, so all they do is tell your competitors what you’re trying to rank for.

By unique, I mean that no two pages should have the same Title or Meta Description. By targeted, I mean that you shouldn’t just stuff in whatever keywords you think users are searching for; rather, you should use tools like the Google AdWords Keyword Tool to figure out which relevant keyword combinations get the highest volume of searches.

Duplicate Content

There are two reasons why duplicate content is a bad thing from an SEO perspective. First, it confuses search engines, and they’re not sure which page to include in their index. Second, search engines can see it as spam, and penalize/ban your site from the SERPs altogether.

On major sites with multiple categories and tags (especially blogs), duplicate content tends to appear in three different places:

  • Index Page: if your index page features a content feed of the latest articles/posts (like on a blog), then you’ll probably have some content overlap between your index page and category pages.
  • Article/Post Pages: if your index or blog page features the latest articles/posts in full, then you’ll have duplicate content problems between those articles/posts and the other places they appear in full on your site.
  • Categories/Tags: if you have articles that fall into multiple categories/tags, then you will most certainly have duplicate content issues across several category/tags pages.

There are four steps you can take to ensure that these duplicate content issues do not affect your rankings in any way.

Step 1: Content Teaser Excerpts

First off, the only place where an article should appear in its entirety is on the actual article page. Every other page that might list that article (e.g. blog page, index page, category page, or tag page) should only feature a teaser from that article.

Ideally, your teasers should be completely unique, and not appear in the article itself. However, many sites choose to just feature the first 150-300 characters from the article.
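
As a rough sketch of the “first N characters” approach (assuming plain-text article content; the function and parameter names are made up, and a hand-written unique teaser is still the better option), the truncation logic is as simple as:

    # Hypothetical helper: build a teaser from the start of an article,
    # cutting at a word boundary so the excerpt doesn't end mid-word.
    def make_teaser(article_text: str, max_chars: int = 300) -> str:
        if len(article_text) <= max_chars:
            return article_text
        cut = article_text[:max_chars].rsplit(" ", 1)[0]
        return cut + "..."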

Step 2: Titles & Meta Descriptions

As we mentioned above, make sure that every page has a unique and targeted page title and meta description. This is your first opportunity to tell search engines how each page is unique. This will be particularly important for category pages, where content can be duplicated several times over.

Step 3: Unique Static Content

Give every page where content might be duplicated some unique static content that appears at the top. This should include (1) its own, unique H1 tag (hint: related to your title tag), and (2) a descriptive paragraph that will appear between that H1 tag and the content feed that may be duplicating content from other areas of your site.

Step 4: NoIndex Duplicate Content

If your site produces a lot of content across many categories and tags, unique page titles, meta descriptions, H1 tags, and intro paragraphs may not be enough. In this case, you will want to block the really redundant pages. You can do this in two ways:

  • by adding these pages to your robots.txt file
  • or by adding <meta name="robots" content="noindex" /> to the page source

So how do you know which pages to exclude from the index? Generally, it is best to exclude pages that are there for usability or navigation rather than for search engines (a sample robots.txt sketch follows this list):

  • tag pages (but keep category pages),
  • author pages/feeds (unless you have high profile authors you want to rank for),
  • date-based archive pages.
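
For the robots.txt route, a minimal sketch might look like the following, assuming WordPress-style URL patterns (adjust the paths to your own site’s structure):

    User-agent: *
    # Tag pages and author pages are there for navigation, not search engines;
    # category pages are left crawlable.
    Disallow: /tag/
    Disallow: /author/
    # Date archives (e.g. /2010/) can be listed here too, or handled with the
    # noindex meta tag shown above.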

Restricted Content – First Click Free

Many enterprise level publishers feature content that is for registered users only. But if your content is restricted, how do you get it into the SERPs so that you can attract new registered users?

For publishers that feature restricted content, Google offers a service called First Click Free:

Implementing Google’s First Click Free (FCF) for your content allows you to include your restricted content in Google’s main search index. […] First Click Free has two main goals:

1. To include high-quality content in Google’s search index, providing a better experience for Google users who may not have known that content existed.

2. To provide a promotion and discovery opportunity for webmasters of sites with restricted content.

To implement First Click Free, you need to allow all users who find a document on your site via Google search to see the full text of that document, even if they have not registered or subscribed to see that content. The user’s first click to your content area is free. However, once that user clicks a link on the original page, you can require them to sign in or register to read further.

So through FCF, enterprise publishers can help ensure that all of their restricted content is indexed. Then, once a user finds that content in the SERPs, they can access it, but they will have to become registered users themselves if they want to access additional content.
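
As a rough illustration of the logic involved (a hypothetical sketch, not Google’s prescribed implementation; the function and variable names are made up), the decision usually comes down to a referrer and user-agent check:

    from urllib.parse import urlparse

    # Sketch of a First Click Free-style gate.
    GOOGLE_HOSTS = ("google.com", "google.ca", "google.co.uk")  # extend per target market

    def show_full_article(referrer: str, user_agent: str, is_registered: bool) -> bool:
        if is_registered:
            return True   # registered users always see the full article
        if "Googlebot" in user_agent:
            return True   # let Google crawl and index the restricted content
        host = urlparse(referrer or "").netloc.lower()
        if any(host == g or host.endswith("." + g) for g in GOOGLE_HOSTS):
            return True   # the first click from a Google SERP is free
        return False      # everyone else hits the registration wall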

Big, Bad SEO

As with all things in business, size can be both a pro and a con. On the organizational side of things, large companies have more resources but are slower to react to changes in the marketplace. On the SEO side of things, enterprise level publishers have more content they can use to rank on more terms and attract links, but the bigger your sitemap, the easier it is for search engines to get lost.

A little bit of onsite SEO, however, can go a long way toward ensuring that you rank on as many terms as possible and don’t get penalized in the process. What it comes down to is making sure that every page is as unique as possible (think page title, meta description, and H1 tag), that there is as little duplicate content as possible, and that if there is content behind a registration wall, you let Google in to index it.

Enterprise publishers who take all these steps will not only avoid penalties, but over time they will see an incredible amount of their organic traffic coming through on older pieces of content. That is, after all, what having the size advantage is all about.

Optimizing WordPress for Search & Social at Wordcamp Montreal

Today we gave a presentation at Wordcamp Montreal on Optimizing WordPress for Search and Social. The goal of this presentation was to explore how to optimize WordPress blogs/sites for both Search and Social Media.

First we looked at the fundamentals of onsite SEO, including page structure and duplicate content, and which themes and plugins can help you address these issues. Then we considered integrating WordPress with Facebook Connect, and the plugins that can help with that integration.

Below is our presentation, as well as some links to the themes and plugins that we discussed during the presentation.

Optimizing WordPress for Search & Social

SEO Themes & Plugins for WordPress

Thesis Theme — I actually use this theme to power my personal blog. It isn’t a free theme, but it offers many user-friendly SEO and web design options that make it well worth the $87 it costs.

Genesis Theme — We haven’t tested this theme out ourselves yet, but Chris Brogan uses it, so it can’t be that bad. Also, before Chris moved to Genesis, he used Thesis. So if it’s good enough for Chris, it’s probably good enough for the rest of us. Genesis sells for $60.

All in One SEO Pack — This is a free SEO plugin for WordPress. It will get you about three quarters of what Thesis or Genesis will.

MembersWing Plugin — This plugin allows you to implement Google’s First-Click-Free, which allows Google to index restricted content behind a registration wall. There are both free and paid versions, but we haven’t tested either of them out ourselves just yet.

Facebook Connect WordPress Plugins

Here is the list of plugins we discussed during our presentation, for anyone who wants to integrate Facebook Connect with WordPress.

  • Simple Facebook Connect — I’ve personally tested this one out, and it gets you just about everything that Facebook Connect has to offer. However, it doesn’t always play well with every theme.
  • Faux Facebook Connect — This plugin allows users to comment on a WordPress blog using their Facebook credentials.
  • Gigya Socializer — This one is designed to help increase site registration and engagement by using a number of social ID APIs, including Facebook Connect, MySpaceID, Twitter and OpenID.

This, of course, isn’t an exhaustive list. If you want to shop around a bit more, you might also want to check out this list of 12 Facebook Plugins for Bloggers.

Yellow Pages on Local SEO

Meet John Fanous and Bill Aver of the Yellow Pages Group. We caught up with John and Bill at SES Toronto 2010 and sat down with them to chat about Local SEO.

Yellow Pages is now offering Local SEO services in Canada. John explained to us that the company is in the business of helping businesses be found, and that their Local SEO services are designed to help businesses be found online. Bill and John also elaborated on how this Local SEO offering doesn’t represent a conflict of interest vis-a-vis their print directory.

Jeff Jones on SEO in 2010

Another person we caught up with at SES Toronto 2010 was Jeff Jones, the Senior Product Manager at gShift Labs. We sat down with Jeff to chat about how SEO is evolving, and what it’s going to mean for strategy in 2010.

Jeff cited two main factors that will affect SEO in 2010. The first was site load times — particularly how Google has come out to say that a site’s load times will affect how well it ranks. The second factor he touched on was Google Caffeine — specifically how Google will now include so many more kinds of content in search results, such as Twitter, blogs, etc. Jeff also touched on personalized search, and speculated on what it’s going to mean for brands who are working to rank on competitive terms.

Jim Hedger on Local & Mobile Search

During SES Toronto 2010, we also caught up with Jim Hedger (@jimhedger). In this first part of our interview with Jim, he shares with us what he sees as two recent developments in the world of search that are impacting both online and brick & mortar retailers.

The first development had more to do with user behavior than with the search algorithm itself: mobile. As Jim points out, mobile is changing the game by making local search more relevant. As a result, the second major development in the world of search is local search. The increasing importance of local search (1) opens up competition to localized SMBs, and (2) offers both SMBs and big brands a new opportunity to drive foot traffic.