The Erics and I have been trying to figure out why Google has been holding out on us for coming on two months now. We tripped some trigger, obviously, but we can’t figure out which. The upshot is that we’re losing between 500 and 1,000 hard clicks a day that we would otherwise have gotten — almost all of them with a uselessly high bounce rate. The other end of the stick is that, because we’re seeing nothing but serious visitors, our pageviews and time on site are way up.
And at the other other end of the stick, there is Technorati, the gift that keeps on giving. Our rank is 615 as I write this, as high as it has ever been. I don’t know how many more links we will need to make it into the Top 500 weblogs — but I would love to find out.
Technorati Tags: real estate, real estate marketing
Doug Quance says:
Something is going on at Google, fer sure.
I used to see my posts in Google Alerts… but not anymore. Haven’t seen one in a long time.
At least Bloodhound is still pulling a PageRank of 6. 🙂
July 12, 2008 — 7:08 am
Matt McGee says:
Greg,
Deactivate the plugin that creates and updates your XML sitemap.
Delete sitemap.xml off your server.
If you use Webmaster Central, log in and remove the sitemap from there, too. Get rid of any and all mentions of your XML sitemap.
Wait a week and see what happens. If nothing’s changed, you’ve only wasted about 5 minutes.
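If it helps, here’s a throwaway sanity check (purely illustrative, not a required step): a couple of lines of PHP that confirm the retired sitemap URL now returns a 404 once the plugin is off and the file is deleted. The URL below is just the default spot most sitemap plugins write to; swap in wherever yours actually lived.

<?php
// Throwaway sanity check (illustrative): confirm the retired sitemap URL now 404s.
// The path below is an assumption; use whatever URL your sitemap plugin published.
$url     = 'https://www.bloodhoundrealty.com/BloodhoundBlog/sitemap.xml';
$headers = get_headers( $url );
echo $headers[0] . "\n"; // expect something like "HTTP/1.1 404 Not Found" after cleanup

If it still comes back 200, something (the plugin, a rewrite rule, a cached copy) is regenerating it.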
July 12, 2008 — 5:21 pm
Greg Swann says:
> Deactivate the plugin that creates and updates your XML sitemap.
Done, along with the other steps. Assuming results improve, do I just leave it off? What if nothing changes?
July 12, 2008 — 6:34 pm
Matt McGee says:
Yep, leave it off. Your blog shouldn’t need an XML Sitemap (very few sites actually need them). Googlebot should be crawling you just fine, and I suspect something in the sitemap isn’t to the bot’s liking and is causing posts not to get crawled/indexed.
I helped a friend with her (similarly authoritative) blog which had the same problem you’re having, and a week after we zapped the XML sitemap, she was back in business. I wrote about it here if you’re interested:
http://www.smallbusinesssem.com/xml-sitemaps-the-most-overrated-seo-tactic-ever/1193/
I’m guessing this’ll solve your situation. If nothing changes, just keep digging. You might also ask for assistance in the Google Webmaster Central Help Group on Google Groups.
July 12, 2008 — 8:45 pm
Greg Swann says:
Interesting. I never actually completed the hookup. When I signed up for the Google Webmaster account, it wanted an XML sitemap. I played with a few, didn’t like them, didn’t worry about it. Later I added that plug-in, but never plugged it all the way in with Google Webmaster. It’s plausible to me that Google might be having an allergic reaction to it anyway, particularly since so much of what BHB does is driven by PHP. In indexable pages, that PHP would come through as pure HTML, but I don’t know how it would look to the plug-in. In any case, we’ve gotten used to being spidered all the time, 2,000+ pages a week. We actually have a decent measure right now: on the order of 600 pages have been spidered since we added John Rowles to the roster. I’ll let you know what happens as a result of killing the plug-in.
July 12, 2008 — 11:25 pm
Barry Cunningham says:
Wow, Matt… this is interesting. Greg, you have to follow up and let us all know if this worked!
July 13, 2008 — 8:28 am
Eric Bramlett says:
As Greg & I have discussed in email, I really think it’s a duplicate content issue. Until recently, BHB had its archive & category pages linked to and indexed from every post on the site. Those pages are 100% duplicate content, and because BHB is a strong site, it can confuse the bots as to which page is the original source of the content. By “noindex, following” those pages, we tell the bots what the original source is, and it hopefully helps them index & rank the content that richly deserves exposure in the SERPs. (And hey, we didn’t even need to do a cloaked redirect in order to fix the dupe content issue.)
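For anyone who wants to see what that looks like under the hood, here’s a minimal sketch, assuming a standard WordPress setup (my illustration, not necessarily the exact code or plugin BHB uses): a small hook in the theme’s functions.php that prints a noindex,follow robots meta tag on archive, category and paged listings, so the bots can still follow the links but leave the duplicate pages out of the index.

<?php
// Minimal sketch, assuming a standard WordPress theme; not BHB's actual code.
// Print a robots meta tag on archive/category/paged views so bots follow the
// links but don't index the duplicate listings.
function bhb_noindex_archives() {
    if ( is_category() || is_tag() || is_date() || is_author() || is_paged() ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
}
add_action( 'wp_head', 'bhb_noindex_archives' );

The “follow” half is the important part: it keeps link equity flowing through the archives to the individual posts while the archive pages themselves stay out of the index.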
July 13, 2008 — 10:07 am
Matt McGee says:
What’s interesting to me is that some recent posts are in the index, while others are not. And there doesn’t seem to be any rhyme or reason to it. This post we’re commenting on is in the index, but going back a couple months, posts like
https://www.bloodhoundrealty.com/BloodhoundBlog/?p=3122
https://www.bloodhoundrealty.com/BloodhoundBlog/?p=3113
https://www.bloodhoundrealty.com/BloodhoundBlog/?p=3078
are not in the index. This is so similar to what I experienced with the other blogger I was helping that I have to think killing the XML sitemap will help.
Eric — I’m with you on the potential dupe issue, and archive pages (/blog/page2/, /blog/page3/, etc.) should definitely be noindexed.
Keep us posted on things, Greg.
July 13, 2008 — 3:11 pm
Todd says:
Has this blog’s sitemap been “fine-tuned” to exclude all of the “bloodhound” + “dog” + “K9” Google traffic? Doing so would cull the high bounce rates from those out-of-context search-result hits.
July 14, 2008 — 7:05 am