It’s been very interesting following the drama with Trulia. In case you missed it, here is a quick recap.
The other day, I noticed Trulia had fallen out of the SERPs (Search Engine Result Pages). Eric Bramlett did some research and noticed that a blog set up on a subdomain of trulia.com had been hacked. He put two and two together and assumed that because the hack injected hidden text, Google had applied a penalty. To be honest, I would have immediately thought the same thing. But I recently consulted with an agent whose WordPress blog had also been hacked. Like Trulia's, the blog resided on a subdomain. And while yes, it did result in Google penalizing the subdomain, the main website was not affected. This led me to think Trulia's low ranking and their hacked blog were two separate issues.
Trulia Confirmed
Rudy stopped by both my blog and Eric's blog to confirm that the two issues were unrelated and that Trulia was looking into the issue – kudos to Rudy for his open and quick response. Eric then dug a little more into the issue and offered some additional insight as to what was happening with Trulia's rankings. It turns out that only a specific type of page was dropping in ranking:
http://www.trulia.com/TX/Austin/
http://www.trulia.com/CA/Los_Angeles/
http://www.trulia.com/FL/Miami/
http://www.trulia.com/NV/Las_Vegas/
I took a quick look at these pages to see if there was anything that would cause Google to cry foul. That's when it struck me: this may not be a penalty, but more of a result of Google refining their algorithm.
Gaming the System with Property Listings
Many in the Real Estate/SEO world know that MLS listings can be a gold mine when it comes to Search Engine Optimization. After all, if your local board has 10,000 listings, and you pull these listings into your site via IDX, then you suddenly have 10,000 indexed pages for Google! Remember, each listing will link to a details page. For years, webmasters have creatively used IDX to feed Google content. But about a month ago, something happened. I suddenly saw Google dramatically de-indexing these automated pages that had nothing but IDX listing results. Seemingly overnight, sites went from having thousands of pages indexed in Google to only having a few hundred.
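Just to make that mechanism concrete, here is a rough sketch of the "instant pages" pattern in Python. The feed format, field names, and template are all invented for illustration; this is the general technique, not anyone's actual code.

# Hypothetical sketch: every record in an IDX feed becomes its own thin
# details page. The feed format and field names here are made up.
from string import Template

DETAILS_TEMPLATE = Template(
    "<html><head><title>$address, $city Real Estate</title></head>"
    "<body><h1>$address</h1><p>$beds bed, $baths bath, listed at $price</p></body></html>"
)

def build_listing_pages(idx_feed):
    # One standalone page per listing. A 10,000-listing board yields
    # 10,000 near-identical pages for Google to crawl.
    pages = {}
    for listing in idx_feed:
        url = "/listings/{mls_id}.html".format(**listing)
        pages[url] = DETAILS_TEMPLATE.substitute(listing)
    return pages

sample_feed = [
    {"mls_id": "A123", "address": "123 Ocean Dr", "city": "Miami",
     "beds": 2, "baths": 2, "price": "$450,000"},
]
print(build_listing_pages(sample_feed))

Multiply that by every market a site covers and the page count explodes, which is exactly the kind of automated, near-duplicate inventory that appears to be getting de-indexed now.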
Now take a look at one of these "optimized" pages that have suddenly been dropped from Google's index. For example, let's look at the Miami page – http://www.trulia.com/FL/Miami/ . Now mentally take out the left search form. Generally speaking, Google really doesn't care about forms. And let's assume that Google is cracking down on generic listing data (after all, everyone and their brother has IDX data). What's left on that page? Specifically, what's left that would be seen as valuable to search engines? Not much, right? Especially when you compare it to the content on other sites such as www.kevintomlinson.com or www.southbeachrealestateblog.com.
Additionally, take a look at the footer of Trulia's Miami page and compare it against http://www.trulia.com/NV/Las_Vegas/ or any other city, for that matter. It's nothing but generic content that has been injected with automated city references.
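Here is an equally rough sketch of that footer pattern: one generic block of copy with the geographic tokens swapped in programmatically. The wording below is invented; only the technique is the point.

# Hypothetical example of boilerplate copy with automated city references.
GENERIC_FOOTER = (
    "Looking for {city} real estate? Browse {city}, {state} homes for sale, "
    "compare {city} home prices, and find {city} real estate agents."
)

def city_footer(city, state):
    # The same sentence for every market; only the city and state change.
    return GENERIC_FOOTER.format(city=city, state=state)

print(city_footer("Miami", "FL"))
print(city_footer("Las Vegas", "NV"))  # identical copy, different city name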
Automation Is Not The Answer
Given the fact that Trulia serves up information for so many different markets, generic information that dynamically pulls in geographic references may seem like the right approach. Heck, I talk to agents every day who want this same "instant content" approach, and they only have to worry about a handful of areas in their market. But the reality is that Google still rewards people who take the time to offer real value that is truly specific to the area, not automated plug-and-play content.
My Advice for Trulia
Trulia already has tons of original content associated with local areas via Trulia blogs. But interestingly enough, they do not pull any of this content into their city pages. Why not? Trulia: allow agents to associate content with specific areas and populate that content within the city pages. Yes, people will try to game the system and spam you. But given your programming resources, you can easily develop an algorithm to catch 99.9% of that. Remember, don't bite the hand that feeds you. Reward those agents who supply you with good solid content. Do this and Google will reward you as well. It seems like a win-win for everyone.
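For what it's worth, here is a very rough sketch of the kind of first-pass filter I have in mind (nothing Trulia actually runs, and the thresholds are arbitrary): flag agent submissions that are too thin, stuffed with links, or dominated by a single keyword, and send those to a human for review.

import re
from collections import Counter

# Words too common to count as keyword-stuffing evidence.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "in", "to", "for", "is", "on", "with"}

def looks_spammy(text, max_links=3, max_keyword_share=0.10, min_words=50):
    # Returns True if the submission should be held for human review.
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    if len(words) < min_words:
        return True  # too thin to add real local value
    if len(re.findall(r"https?://", text)) > max_links:
        return True  # link dumping
    top_count = Counter(words).most_common(1)[0][1]
    if top_count / len(words) > max_keyword_share:
        return True  # one keyword dominates the copy
    return False

A filter this simple obviously won't catch 99.9% on its own, but it shows the general shape: layer a few cheap checks, then have humans review whatever gets flagged.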
Great advice, Brad, and kudos to Rudy and Eric for their quick response. Frank Schulte-Ladbeck just wrote about this at http://tinyurl.com/d2eb66 and suggested we provide better access to our content-rich market snapshots (http://www.homefinder.com/GA/Atlanta/local-real-estate) when searching listings. Roger that! And next… how about using a nationwide news media network to capture hyper-local real estate content? Relevance and integration!
Wow. I’m so glad that you (with a heavy hand) make me stay on the straight and narrow.
No spamming for me.
Remember, don’t bit the hand that feeds you. Reward those agents who supply you with good solid content.
Good advice. Why wouldn’t they be doing that anyway?
Trulia seems to be the large-scale example that you need to provide quality content. Listings are great, and yes, buyers are searching for them, but without the context of good content, listings don't add much value (hence search engines thrashing Trulia).
Good content takes time to develop, but the rewards are huge 🙂
Interesting theory Brad. We’re pretty confident it’s not a penalty, rather a technical issue. We’re still looking into it.
Rudy
Social Media Guru
If I am reading you right, you are saying that Trulia is getting penalized for duplicate content, where a ton of Trulia's pages are getting dinged because they have lots of generic and similar data?
Google actually doesn’t penalize for duplicate content, they just send those pages to the supplementary indexes. I am actually really, really impressed with just how good Google is at originating content, specifically blog articles, to the original author and not a spam or resyndication blog.
But I 100% agree with you that automation is not the answer.
And while I am changing the subject slightly from Trulia to a more general listing-website conversation, I was always under the impression that red flags would go up in Google when a site goes from 100 pages, or even a thousand pages, to 20k+ pages overnight.
I believe it is hard to discern automated good content from automated spam content. But I could be wrong.
I remember reading somewhere on Google Webmasters about building a website's page count gradually. I can't remember now where I found that; I will have to look.