Wow! What a debacle I just went through with one of my client’s websites — designed in WordPress — and getting it indexed in Google.
The client's site had been taken down for a few months, so there was no content. His previous site was a template site and didn't have much content either. We decided to rebuild it on the WordPress platform (as I do with all my client sites), since I'm able to get some really good Google 'love' for targeted keywords pretty quickly. I've actually had one client go from zero to the first page of Google in two weeks for 13 of 30 keywords, and onto the first page for 25 of the 30.
To do this, I obviously need good on-page content, but I also need the right plugins in place. I use the XML Sitemap plugin and PC Robots.txt to be sure I'm creating the right stuff that Google likes for proper indexing. I have never had an issue with these plugins on any client site.
Now for the fun part …
I set this site up as a ‘working’ site as a subsite of this site (dynamicmarketingpartners.com/subsite) and installed WordPress there. I set the Privacy to ‘do not index’ so that the subsite didn’t register in Google. I did the whole install, completed the site, and then moved it over to the client’s actual site using ManageWP (an awesome program to manage multiple WordPress blogs). Everything looked good.
We were running a comprehensive SEO program with lots of content and lots of links being built. That typically produces near-immediate results for clients (or at least plenty of positive upward movement). For this client, nothing was happening, and I was confused. I thought it could be several things: maybe I had left the robots.txt file open and didn't block the subsite, so I was getting dinged for duplicate content; or maybe the site had somehow been 'sandboxed' by Google after being dead for a while.
I started digging deeper into the site and looked at Google Webmaster Tools. I updated all the sitemaps and found that they were being blocked by the robots.txt file. Back in the site, when I tried to regenerate the sitemap, an error said the site was being restricted by robots.txt and that I had to adjust the privacy settings to correct it. I clicked the link for the privacy settings and found radio buttons for 'Allow search engines to index this site' and 'Ask search engines not to index this site'. The 'not' option was selected (not sure why, but it was). So, I simply changed it to 'allow' and clicked Save. But it wouldn't save; it kept flipping back to 'not', and I couldn't understand why.
I searched and found several forums discussing why this could happen, and most of them dealt with people having their robots.txt file in a folder other than the root. That wasn't my case. I looked at http://subsite.com/sitemap.xml and it was there. I looked at http://subsite.com/robots.txt and it was there, and everything looked OK. I couldn't figure out why my privacy settings wouldn't change.
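If you want to sanity-check a robots.txt without waiting on Google, Python's standard library can parse one for you. Here's a minimal sketch: the robots.txt bodies are pasted in as strings so it runs anywhere (in real life you'd fetch the live file, e.g. with urllib.request), and the subsite URL is just a placeholder.

```python
from urllib import robotparser

def is_site_blocked(robots_txt: str, url: str = "http://subsite.com/") -> bool:
    """Parse a robots.txt body and report whether a generic crawler
    ('*') is disallowed from fetching the given URL."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch("*", url)

# A site-wide block, like the one that was biting me:
print(is_site_blocked("User-agent: *\nDisallow: /"))  # True: everything blocked

# An empty Disallow line allows everything:
print(is_site_blocked("User-agent: *\nDisallow:"))    # False: nothing blocked
```

This is a quick way to confirm whether the file Google sees actually blocks the homepage, independent of whatever a plugin claims to be generating.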
I went into the database (via phpMyAdmin) and searched for 'disallow', changing every instance to 'allow', thinking this would catch any deep-seated robots.txt restriction. Still no luck.
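In hindsight, a text search for 'disallow' was never going to find the culprit: WordPress stores the privacy setting as a number, in the `blog_public` row of the `wp_options` table ('0' = discourage indexing, '1' = allow). Here's a rough sketch of what the lookup and fix look like at the database level, using an in-memory SQLite table as a stand-in for the real MySQL one (the `wp_` table prefix is WordPress's default and may differ on your install):

```python
import sqlite3

# In-memory stand-in for the MySQL wp_options table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE wp_options (option_name TEXT, option_value TEXT)")
db.execute("INSERT INTO wp_options VALUES ('blog_public', '0')")  # '0' = discourage indexing

# The setting is a numeric flag, not the word 'disallow' --
# which is why searching the database for 'disallow' finds nothing.
(value,) = db.execute(
    "SELECT option_value FROM wp_options WHERE option_name = 'blog_public'"
).fetchone()
print(value)  # '0' means search engines are being told to stay away

# Flipping it to '1' is the SQL equivalent of unchecking the box:
db.execute("UPDATE wp_options SET option_value = '1' WHERE option_name = 'blog_public'")
```

The two SQL statements are the same ones you'd run in phpMyAdmin against the live database.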
I went into the hosting account (GoDaddy) and checked the redirects to be sure that http://subsite.com and http://www.subsite.com pointed to the same place (they did).
I went back to Webmaster Tools and looked at what was coming up. Still blocked.
Back to the WordPress admin panel. Under Settings > General, the site was set to http://subsite.com. I went back to Google Webmaster Tools and re-ran the sitemap. Still blocked. I then clicked the link for the sitemap it was pulling and found that it was requesting http://www.subsite.com/sitemap.xml.
What I found is that since the site address was set to http://subsite.com (no www), the sitemap defaulted there, but GWT was looking for the www version. I went into Settings > General and changed the site address to http://www.subsite.com. I then re-ran the sitemap-building plugin and checked it in Webmaster Tools. STILL blocked by robots.txt. I disabled the robots.txt plugin, thinking something was glitching there.
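That www/non-www mismatch is easy to catch mechanically: just compare the host in your WordPress site address against the host of the sitemap URL Google is actually requesting. A small sketch (the helper name and URLs are mine, not anything WordPress or Google provides):

```python
from urllib.parse import urlparse

def same_host(site_url: str, sitemap_url: str) -> bool:
    """True when the WordPress site address and the sitemap URL being
    crawled share the same host -- www and all."""
    return urlparse(site_url).netloc == urlparse(sitemap_url).netloc

# The mismatch I was hitting: site set without www, GWT fetching with www.
print(same_host("http://subsite.com", "http://www.subsite.com/sitemap.xml"))      # False
print(same_host("http://www.subsite.com", "http://www.subsite.com/sitemap.xml"))  # True
```

To Google, `subsite.com` and `www.subsite.com` are different hosts, so a sitemap generated for one won't satisfy a crawl of the other.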
Then, as I was going through every settings page, I found it: on the 'Reading' settings page, a checkbox that says "Discourage search engines from indexing this site". It was checked. Apparently THIS is the checkbox behind the privacy setting I couldn't get changed.
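That one checkbox is what drives the robots.txt WordPress serves. Roughly, the behavior looks like this (a simplified sketch of my understanding; the exact output varies by WordPress version and any SEO plugins you run):

```python
def wordpress_robots_txt(blog_public: int) -> str:
    """Sketch of the robots.txt WordPress generates depending on the
    'Discourage search engines' checkbox (the blog_public option).
    Simplified; real output differs across WordPress versions."""
    if blog_public == 0:  # box checked: discourage indexing
        return "User-agent: *\nDisallow: /\n"
    return "User-agent: *\nDisallow:\n"  # box unchecked: allow crawling

print(wordpress_robots_txt(0))
```

So with the box checked, every crawler is told to stay out of the whole site, no matter what your sitemap plugin is doing.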
After unchecking that box, saving, waiting about three minutes, and resubmitting my sitemap through Google Webmaster Tools, the site is being indexed.
Wow … what an ordeal. Glad it’s fixed.
Do you have any tips on WordPress? Please share them here.