I took a look, and sure enough, the site’s robots.txt file was set up to block Google and the other search engines from crawling the entire site. Fortunately, the fix was easy: I changed the file to allow crawling.
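The exact before/after isn’t preserved in this excerpt, but a site-wide block almost always takes the form of the first file below (it’s the directive WordPress serves when “Discourage search engines” is checked), and the fix is an empty Disallow:

```
# Before: blocks every crawler from the entire site
User-agent: *
Disallow: /

# After: an empty Disallow permits crawling of everything
User-agent: *
Disallow:
```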
(You can also just remove the file.)
I might be going out on a limb here, but I’ve seen the robots.txt file cause more problems than it solves.
One of the big misconceptions about robots.txt disallow directives is that they are a foolproof way to keep pages out of the Google index. Not only is this untrue, but when blocked pages do get indexed, they are indexed with almost no information, adding a lot of low-quality, near-duplicate content to the index that can drag down your site’s SEO performance.
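If the goal is actually to keep a page out of Google’s index, the reliable tool is a noindex directive rather than a robots.txt disallow; the caveat is that Googlebot can only see this tag if it’s allowed to crawl the page in the first place:

```html
<!-- In the page's <head>; tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex">
```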
The robots.txt file has been around for years. In those early days, bandwidth was more precious and Googlebot often taxed servers, even crashing them, when it crawled a site. So using the disallow directive to keep Google from crawling pages often helped keep a site up. Those concerns are a distant memory today.
Then I wrote a script to vet the ping services many bloggers were using and found that several returned errors or didn’t respond at all. That was the first red flag. The second showed up when I found a warning from Matt Cutts that Google looked unfavorably on some of these services. Essentially, some of them are spam magnets, and you really don’t want your site associated with them.
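The vetting script itself is long gone, but a minimal sketch of the idea looks something like this (the blog name, URL, and timeout are illustrative; ping services speak XML-RPC via the standard weblogUpdates.ping method):

```python
import socket
import xmlrpc.client

socket.setdefaulttimeout(10)  # don't hang forever on a dead service

def check_ping_service(url, blog_name="Example Blog", blog_url="https://example.com/"):
    """Send a weblogUpdates.ping to one service and report whether it worked."""
    try:
        server = xmlrpc.client.ServerProxy(url)
        resp = server.weblogUpdates.ping(blog_name, blog_url)
        # A conforming service returns a struct with flerror=False on success.
        return (not resp.get("flerror", True), str(resp.get("message", "")))
    except Exception as exc:
        return (False, str(exc))
```

Run it over your ping list, and anything that errors out or times out is a candidate for the chopping block.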
So I cut my ping list down from 31 services to just 3 and published my post, which got a lot of attention thanks to my contrarian stance.
Recently I ran into a problem adding a Facebook pixel to a LeadPage (which we had mapped to WordPress using the LeadPages plugin). I could see both the Google Analytics code and the Facebook pixel code in LeadPages’ tracking-code dialog, but only the Google Analytics code was showing up in the page source. I solved the problem by reversing the order of the scripts, but at that point I decided there had to be a better way to manage all the tracking scripts you might need for a paid traffic campaign or funnel optimization.
There is: it’s called Google Tag Manager.
Once you have it installed, Google Tag Manager makes it easy to manage your scripts. Instead of updating your website for each new script, you just log into Google Tag Manager and add it as a new tag. The other benefit is performance: Google serves the container script from its own CDN, and it loads asynchronously, taking the burden off your server and keeping it from blocking your page render.
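For context, installing GTM is itself a one-time paste of Google’s container snippet (shown here with a placeholder GTM-XXXXXXX container ID; copy the exact version from your own GTM account). After that, every new tag is added in the GTM interface, not in your page source:

```html
<!-- Google Tag Manager: goes as high in the <head> as possible -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXXX');</script>
<!-- End Google Tag Manager -->
```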
I was inspired to put together this infographic when I read Trond Lyngbø’s Search Engine Land article:
As the introduction says: “Many business owners see SEO and content marketing as separate, but columnist Trond Lyngbø argues that solid keyword research can and should be used to inform content marketing strategy.” – Search Engine Land
I couldn’t agree more. Augmented by customer and market research, keyword research becomes a potent tool in your hands, giving you valuable insight not just into content marketing and SEO, as Lyngbø asserts, but also into multiple aspects of online marketing, including social and paid traffic.
To do a really thorough job with your keyword research, you should include less traditional tools such as #tagboard. My list of 22 Keyword Research Tools has plenty of interesting options for you to choose from.
Now that everyone has more or less jumped on the Responsive Web Design bandwagon (especially in the WordPress world), Google has come along with AMP (Accelerated Mobile Pages). What is AMP and do you need to care?
Responsive Web Design
Remember browsing the internet on a smartphone, oh, say, five years ago? It was rather painful: you had to pinch and zoom on practically every site just to read the text. Then webmasters realized this mobile thing wasn’t going away and started making their sites mobile friendly. There are a number of ways to make your site mobile friendly, but Responsive Web Design (often referred to simply as “Responsive” or “RWD”) became really popular as a solution.
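At its core, Responsive Web Design is one set of HTML served to everyone, restyled by CSS media queries to fit the screen. A minimal sketch (the class names and breakpoint are made up for illustration):

```css
/* Wide screens: content and sidebar sit side by side */
.content { width: 70%; float: left; }
.sidebar { width: 30%; float: left; }

/* Narrow screens (phones): stack them full width */
@media (max-width: 600px) {
  .content, .sidebar { width: 100%; float: none; }
}
```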
Even in an era of the semantic web, keyword research is still important. But it’s not just about the specific keywords: we need research to understand our market and competitive landscape, and to help brainstorm the topic of our next blog post. The 22 keyword research tools I cover below will not only help you determine keywords for your on-page SEO, but also help you assess your competition, and may even help you validate a new idea. Most of the tools listed below are free, but I’ve included a few that are not.
Although WordPress is my preferred platform, I’ll provide SEO and digital marketing consulting for any site built on any technology. I’ve looked at sites built in Adobe CQ, Magento, and Joomla, as well as many built on an in-house custom CMS (content management system).
Recently I’ve come across several sites built with Wix. Wix.com is one of several site builders that aim to make it really easy for the non-geek to build a website. Others include Squarespace and Weebly.
You can see the attraction: point and click, drag and drop, and presto, you have a website! But what about SEO?
I’ve now looked at Wix multiple times, and when it comes to SEO it’s really a case of “your mileage may vary.”
This week’s post is an answer to a reader’s question.
SuZen Marie asks: My brother has no interest in blogging on the website I built for him and I know this is not helping the site’s SEO. Is guest blogging or a featured column the way to go? Does “borrowed” content count to boost SEO ranking as well as original content?
You’re right to worry about the lack of content on your brother’s website. In today’s competitive landscape, excellent content is the way to win at SEO.
That being said, since the site provides a local service, local SEO tactics can get you pretty far. Ideally, though, you want to rank both in the “map” results and with a pure organic listing, so that you have multiple opportunities to get the click.
It all comes down to how competitive your niche is. In some niches, basic on-page optimization and enough citations (local SEO listings) might be enough. But those niches are getting rarer and rarer, especially if you are in a bigger city.
These moves depend heavily on the scenario you’re dealing with (for example, who has the domain registered), so I’m just going to detail what I did, and hopefully it will help others.