You may have heard this week that Google made a decent-sized change to its ranking system, called PageRank. What you probably didn’t hear is some of the specifics of what was changed. One of the biggest items they took a look at was site scraping. Site scraping is the process by which someone copies much or all of the content of your site and puts it on their own.
Turns out that over the past year, many web sites that were scraping content had employed strategies that got them ranking higher than the content originators. This obviously was getting frustrating. So Google changed the system, and it’s already having an effect on the rankings of these so-called “syndication sites.” (Not referring to Trulia and Zillow here, but I can see where the terminology would be quite confusing.)
However, this has some very interesting implications for agents and brokers who want to get listings in their entire MLS indexed by Google. By changing the ranking for sites that scrape content, Google has almost certainly also changed the same system that evaluates sites with content similar to another site’s. This looks to squarely affect those with MLS listings, since Google probably views them all in a similar way. Keep in mind that if you subscribe to an IDX service and the service looks largely the same from site to site, there is a good chance this change will affect your indexing. (If you have questions on this, please feel free to leave comments below and we’ll be happy to answer them.)
Read about Google’s Site Scraping Algorithm Change at Matt Cutts’ blog. Matt is the head of the Webspam team at Google.
We have yet to see much of an impact on any sites that Tribus maintains. However, if you have a site with thousands of listings indexed by Google, please comment if you’re seeing any dramatic change in the number of pages Google has for you. (If you don’t know how to do this, go to Google and type in site:yourdomain.com. You’ll be able to see the total number of pages Google knows about on your site.)
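If you’d rather script that check than type it by hand, here’s a minimal sketch that simply builds the site: search URL you can open in a browser. The domain here is a placeholder; swap in your own.

```python
from urllib.parse import quote_plus

# Placeholder domain; replace with the site you want to check.
domain = "yourdomain.com"

# Build the Google search URL for a site: query, e.g.
# https://www.google.com/search?q=site%3Ayourdomain.com
url = "https://www.google.com/search?q=" + quote_plus("site:" + domain)
print(url)
```

Open the printed URL in a browser and Google will report roughly how many pages it has indexed for that domain. (Note that this count is an estimate, and automated scraping of the results page is against Google’s terms of service, so this sketch stops at building the URL.)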