Google Indexing Geo Specific Content

A question came up on cofounderslab about how to deal with pages that change content based on geolocation, and in particular how Google indexes that content.

The problem is trying to serve geo-specific content on the same URLs, which breaks URL semantics. The "R" in URL stands for Resource, and in the case given the city is the resource: when I searched, the site returned city-specific content. If the resource changes based on the user's location, the URL no longer identifies a single resource, and the semantics break. There are two ways to deal with this: subdirectories and subdomains.

You can redirect the user to a subdirectory, i.e. /[city here]

You can hit some gotchas with cities that share a name, but this way the Google index sees each URL as unique content, which it is. Yelp is an example of the subdirectory approach: if I visit Yelp at their main domain, I get redirected to a city-specific path.
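The redirect logic can be sketched in a few lines. This is a minimal illustration, not Yelp's actual implementation: `geolocate` here is a hypothetical stand-in for a real GeoIP lookup (e.g. a MaxMind database), and the IPs and city names are made up for the demo.

```python
# Sketch: canonicalise geo-specific content onto city subdirectories.
# geolocate() is a placeholder; a real site would resolve the visitor's
# city from their IP using a GeoIP database.

def geolocate(ip):
    # Hypothetical lookup table standing in for a GeoIP database.
    demo = {"203.0.113.7": "austin", "198.51.100.2": "boston"}
    return demo.get(ip, "new-york")  # fall back to a default city

def canonical_url(ip, base="https://example.com"):
    """Return the city-specific URL this visitor should be redirected to."""
    city = geolocate(ip)
    return f"{base}/{city}"

print(canonical_url("203.0.113.7"))  # https://example.com/austin
```

The point is that the geolocation only chooses *which* URL to send the user to; each city's content then lives permanently at its own URL, which is what the crawler indexes.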

Craigslist uses subdomains: if I go to their main website, I get redirected to my local city's subdomain.

The thing to remember is that a web resource should remain fixed at a single URL. Each method has its ramifications, but as Yelp and Craigslist show, both work well.

I've used both methods. The subdomain route is a bit more work because it involves DNS and some way to manage it that isn't manual, but it's easily doable, e.g. attach DNS to a database (PowerDNS is wonderful at this).
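For the database-backed DNS approach, a minimal sketch of a PowerDNS configuration using its generic MySQL backend might look like the following. The hostnames, database name, and credentials are placeholders; consult the PowerDNS gmysql backend documentation for the expected table schema.

```ini
; pdns.conf — minimal sketch, assuming the generic MySQL (gmysql) backend.
; All values below are placeholders, not a production config.
launch=gmysql
gmysql-host=127.0.0.1
gmysql-dbname=dns
gmysql-user=pdns
gmysql-password=secret
```

With this in place, adding a new city subdomain becomes a row inserted into the records table rather than a manual zone-file edit, which is what makes the subdomain route manageable at scale.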

DMOZ Corruption

Having tried for several years to get into DMOZ and been rejected every time, I always wondered why. There are literally dozens of sites doing exactly the same thing as me, etc., and they are listed. These sites have a lot of clout and a lot of money, whereas I am a one-man band with little cash.

Imagine my surprise when I came across Corrupt DMOZ Editor. I may just have been a bit naive in assuming that DMOZ was the real deal, but this blog has given me food for thought.

Google and various other large sites use DMOZ to seed their search or for their own directories. So if there is any truth in this and DMOZ is rife with corruption, then anyone using the data is perpetuating the problem. Putting on my cynical head, I can see how DMOZ could be easily corrupted. What I would like to know is whether there is any truth in it or not. If so, I can forget about DMOZ.

I actually admire what DMOZ is trying to do. I am not sure it is as relevant as it was, but I do believe that there is a place online for a human-edited directory. DMOZ needs to change, though. It is becoming less relevant as time goes by, and it would be a shame to see it all go to waste.

The more I dig into this topic on Google, the more dismayed I feel. It looks like a lot of the top editors are corrupt and are disabling the accounts of the lower editors who are not. To paraphrase Charles Adams, "bad editors drive out the good".

Here are a few more interesting entries on the topic of DMOZ.

This guy's site got banned after he refused to pay a bribe!

Removed from DMOZ for not paying Bribe

The following DMOZ editor's account was disabled when he tried to do something about the guy's website above.

Editor Banned for doing the right thing

Time for ODP to Close (Danny Sullivan)

ODP Founder Comments (Danny Sullivan)

Lords of ODP

Checking Page Rank

I tend to check the PageRank on some of my sites once in a while, and I can never remember which site I used the last time. That meant farting around looking for one that doesn't involve a captcha or try to sell me something while I am using it. With this in mind I wrote my own PageRank checking tool, so now I no longer need to go hunting.
I am seriously considering creating a whole suite of tools to check things like:
Yahoo and Google backlinks
indexed pages
Alexa ranking
Quantcast ranking
etc.
When I want to see any one of these things, it would be nice to see them all.
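The suite idea boils down to running a set of pluggable checkers and collecting one report. Here is a minimal sketch of that structure; the checkers below are stubs with made-up numbers, since the real ones would each need to query or scrape the respective service.

```python
# Sketch of a metrics suite: each checker is a function taking a URL and
# returning one value. The checker bodies here are stubs standing in for
# real lookups (backlink counts, indexed pages, rankings).

def check_all(url, checkers):
    """Run every registered checker against a URL and collect the results."""
    report = {}
    for name, fn in checkers.items():
        try:
            report[name] = fn(url)
        except Exception as exc:  # one failing service shouldn't kill the run
            report[name] = f"error: {exc}"
    return report

# Stub checkers with hypothetical values, for illustration only.
checkers = {
    "google_backlinks": lambda url: 120,
    "indexed_pages": lambda url: 45,
    "alexa_rank": lambda url: 98000,
}

print(check_all("http://example.com", checkers))
```

Keeping each metric behind its own small function means adding Quantcast or anything else later is just one more entry in the dictionary.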