Is Normal Google Search Net Neutral?

When it comes to net neutrality, Google’s position has shifted rather considerably over the last five or so years. What was an initial CEO declaration of fighting the good fight has devolved into a compromise with Verizon.

Google would probably stand by their organic, unpaid listings as being in the spirit of an open web. But are they? Given the massive conduit to web activity that Google search is, should we hold it to the same scrutiny and openness standards to which we hold large telecom companies?

I believe the comparison between Google’s search results pages and ISP traffic tiering is fair. Both provide the means of access and a filtering of what is actually accessed.

The Google Traffic Filter

Google’s listings form a bottleneck in positive terms: a limited set of visible first-page results, selected by an ever-evolving algorithm, determines what gets seen. The user initiates the search, but because results are limited per page (Google can’t show everything at once), Google is filtering traffic in its own way.
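To make that filtering concrete, here is a minimal sketch of ranked results truncated to a first page; the scores and the page size of ten are illustrative assumptions, not Google’s actual parameters.

```python
# Minimal sketch: however many pages match a query, only the top handful
# survive the cut to page one. Scores and page size are invented values.

def first_page(results, page_size=10):
    """Rank candidate results by score and keep only what fits on page one."""
    ranked = sorted(results, key=lambda r: r["score"], reverse=True)
    return ranked[:page_size]

candidates = [{"url": f"site-{i}.example", "score": 1.0 / (i + 1)} for i in range(500)]
visible = first_page(candidates)
print(f"{len(candidates)} matching pages, {len(visible)} visible on page one")
```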

So the question is whether or not Google’s algorithm is analogous to the direct, negative-terms bandwidth throttling that can come with a lack of net neutrality.

Davids & Goliaths in the SERPs

The core of the issue is whether or not the little guy is getting muscled out. An abolition of net neutrality would favour (in addition to the telecom companies, of course) the companies that can afford to pay to have their content seen first, or seen only. It would seem obvious that this is the case with Google’s paid listings through the AdWords pay-per-click program, but what about organic listings?

Some search engine optimization is difficult, and even the parts that aren’t rocket science can still consume significant time and labour. PageRank (which correlates with rankings) comes from links, and links are meant to come naturally from high-quality content. Surely the biggest and best companies are more likely to be able to provide this.
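For readers who want the mechanics behind that claim, here is a toy power-iteration version of PageRank on an invented four-site link graph; the damping factor of 0.85 follows the original PageRank paper, but the sites and links are purely illustrative.

```python
# Toy PageRank by power iteration on an invented four-site link graph.
# The damping factor 0.85 follows the original paper; the graph is made up.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Sum the rank flowing in from every page that links to p.
            inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * inbound
        rank = new_rank
    return rank

links = {
    "bigbrand.example": ["littleguy.example", "wikipedia.example"],
    "wikipedia.example": ["bigbrand.example"],
    "littleguy.example": ["bigbrand.example"],
    "blog.example": ["bigbrand.example", "wikipedia.example"],
}
print(pagerank(links))  # the site with the most inbound links accumulates the most rank
```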

Even if a little guy were able to develop an extremely link-worthy page, suited to and optimized for a particular keyword, housing that content on a strong site (in SEO terms) is still extremely important for rankings.

The Big, The Old, and The Trusted

It takes money to make money, and it takes PageRank to make PageRank. Of course, you can find other ways to attract links, just as you can find ways to make money without significant investment capital. But the advantage for the future is heavily connected to the success of the past.

The black-hole-site phenomenon is not uncommon: sites that already place well become a natural destination for new links. Any time you have sought a quick reference to link to and chosen a conveniently located Wikipedia page, you have thrown more mass into the black hole, just as I did starting at the fifth word of this post.

Sites that already rank are more accessible and more likely to attract more links. Little guys without links, not so much.
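This rich-get-richer dynamic is easy to simulate. The sketch below uses a simple preferential-attachment model, my own simplification rather than anything Google publishes, in which each new link chooses its target in proportion to the links a site already has.

```python
import random

# Preferential-attachment simulation: each new link picks an existing site
# with probability proportional to the links it already has.
# The parameters (100 sites, 1000 links) are arbitrary illustrative choices.

random.seed(1)
link_counts = [1] * 100          # every site starts with a single token link
for _ in range(1000):
    target = random.choices(range(100), weights=link_counts, k=1)[0]
    link_counts[target] += 1

print("links held by the top five sites:", sorted(link_counts, reverse=True)[:5])
print("links held by the median site:", sorted(link_counts)[50])
```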

Add to this the fact that the age of a site is a clear and established ranking factor. There is an element of correlation here; the longer a site has been on the web, the more likely it is to have attracted links.

But age is also directly relevant, as it is a marker of established trust. The newest sites face a kind of probation that older sites have no need to contend with.

Paying for Your Dues

Another irony is that, despite paid listings being just about the only place for a new little guy to get noticed, it’s the big organic players who have the cash, will, and good sense to buy them.

AdWords functions on a keyword-bidding system, and a new little guy has his work cut out for him trying to outbid a major, established competitor, but at least he can play the game. For some keywords and their niches, an organic listing will be flat-out impossible to achieve, and the only chance at a Google-driven click will be a bought one. Not very neutral.
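To show why the auction still tilts toward deep pockets, here is a deliberately simplified second-price-style keyword auction; the real AdWords auction also weighs quality scores and other signals, so the bids and mechanics below are illustrative assumptions only.

```python
# A deliberately simplified second-price keyword auction. Real AdWords
# auctions also weigh quality scores and other signals; the bids here are
# invented for illustration.

def run_auction(bids):
    """bids: advertiser -> max bid per click. Winner pays the runner-up's bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price_paid = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price_paid

bids = {"BigBrandCo": 4.50, "LittleGuy": 1.25}
winner, price = run_auction(bids)
print(f"{winner} wins the top slot and pays {price:.2f} per click")
```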

Google’s Goals?

We have to ask what Google’s goals are likely to be, and whether they ought to be given the responsibility that has come with their power. If their goal is to provide the best and most relevant content on the web to their users, then their current algorithm is potentially best overall.

But this is because the biggest and most powerful companies are the ones who are likely to have the means to produce the best content, or deliver products at the best prices, or have been around long enough to be established and trusted.

Consider the result of something akin to the Walmart effect on small businesses. Why couldn’t telecom companies adopt a similar rationale? If the big players can offer better content or better deals, they might be favoured not only corporately but by users themselves. People will go through a biased web just as they still shop at Walmart.

Yellow Pages directories, aside from their advertising, link to businesses online and offline alphabetically. Abuse aside (come shop at A.A. Adelson & Sons Butchers!), this may be more fair, but less useful in finding the best businesses.

Indeed, local listings may be the only semi-neutral space left. Searches by proximity are the fairest, and companies can still stand out from the crowd by earning positive reviews, something big companies may not find easy to compete on, despite price being a large part of a satisfying consumer experience. But on much of the web, location isn’t an issue, and as such, neither is proximity.
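As an aside on what ranking by proximity amounts to, here is a minimal sketch that orders hypothetical local listings purely by great-circle distance from the searcher; the coordinates and business names are invented.

```python
import math

# Ranking hypothetical local listings purely by distance from the searcher.
# Coordinates and business names are invented for illustration.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

searcher = (45.5017, -73.5673)  # a made-up searcher location
listings = [
    ("Corner Butcher", 45.52, -73.58),
    ("Big Box Grocer", 45.45, -73.65),
    ("Family Deli", 45.51, -73.56),
]
for name, lat, lon in sorted(listings, key=lambda b: haversine_km(*searcher, b[1], b[2])):
    print(name, round(haversine_km(*searcher, lat, lon), 1), "km away")
```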

Net neutrality will continue to be a hot issue as it slowly erodes, and Google will continue to be a major player in the debate. But whether we praise Google for fighting for an open web on those terms, or chide them for selling out, remember that Google’s organic search hasn’t exactly been net neutral all along.
