If you haven’t already noticed, Google Keyword Planner has severely limited marketers’ access to its keyword data. Unless you have a consistently active AdWords campaign, when you log into Keyword Planner today you’ll see wide ranges for average monthly search volume instead of actual numbers.
A range like 100-1K is useless for most keyword research. If you are like most SEOs or marketers, your exploration of your keyword space starts with Google Keyword Planner. So what is the alternative?
Do you remember when you learned how to do on-page SEO by updating meta tags? For me, it felt like getting the keys to the kingdom. By placing keywords into the meta tags on your page, you had a tool that would magically rank your page at the top of the Google search results, where it belonged. Right?
Well, not so fast: in today’s SEO, optimizing your meta tags may not have the impact you think.
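For context, the meta tags in question live in the page’s head. A minimal sketch (the values are placeholders, not from any real site):

```html
<head>
  <title>Blue Widgets | Example Store</title>
  <!-- The description can appear as your snippet in search results -->
  <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
  <!-- Google has ignored the keywords meta tag for years -->
  <meta name="keywords" content="blue widgets, buy widgets">
</head>
```

The title tag still matters; the keywords tag is the one that lost its magic long ago.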
I had the privilege of attending a Meetup featuring the one and only Joost de Valk, who, along with his wife and partner Marieke van de Rakt, gave a presentation on “Beyond SEO: Copywriting for Professionals with Yoast”. The talk covered Joost’s view of Holistic SEO and the increasing importance of quality content in SEO.
If you are not familiar with Joost de Valk, he is the creator of the very popular Yoast SEO WordPress plugin, which handles just about all your WordPress SEO needs. Even though I haven’t gotten around to migrating this site to Yoast SEO, I have used it extensively on many client sites as well as some of my other sites, and have watched its evolution over several years.
The latest version (3.3) of Yoast SEO includes a new feature that evaluates the readability of your page or post; Joost and Marieke covered the new readability analysis and gave us an inside look at how it came about.
I took a look, and sure enough, the site’s robots.txt file was set up to block Google and the other search engines from crawling the entire site. Fortunately, the fix was easy. I changed the file from this:
(You can also just remove the file.)
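The original before-and-after snippets aren’t reproduced above, but a robots.txt that blocks an entire site, and the open replacement, conventionally look like this:

```text
# The misconfiguration: blocks all crawlers from the entire site
User-agent: *
Disallow: /

# The fix: an empty Disallow value allows everything
# (deleting the file entirely has the same effect)
User-agent: *
Disallow:
```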
I might be going out on a limb here, but I’ve seen more problems caused by misuse of the robots.txt file than solved by it.
One of the big misconceptions about robots.txt disallow directives is that they are a foolproof way to keep pages out of the Google index. Not only is this untrue, but when the pages do get indexed, they are indexed with almost no information, adding a lot of low-quality, near-duplicate content to the index that can drag down your site’s SEO performance.
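By contrast, the reliable way to keep a page out of the index is a robots meta tag, and the page must stay crawlable so Google can actually see it:

```html
<!-- In the page's <head>; works only if robots.txt does NOT block the URL -->
<meta name="robots" content="noindex">
```

A Disallow directive blocks crawling, not indexing; a disallowed URL can still be indexed (with no snippet) if other sites link to it.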
The robots.txt file has been around for years. In those early days, bandwidth was more precious and Googlebot often taxed servers, even crashing them, when it crawled a site. So using the disallow directive to keep Google from crawling pages often helped keep a site up. Those concerns are a distant memory today.
Then I wrote a script to vet the services that many bloggers were using and found that several returned errors or didn’t respond at all. That was the first red flag. The second showed up when I learned of a Matt Cutts warning that Google looked unfavorably on some of the services in wide use. Essentially, some of them are spam magnets, and you really don’t want your site associated with them.
So I cut my ping list down from 31 services to just 3 and published my post, which got a lot of attention due to my contrarian stance.
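A vetting script along those lines might look like the sketch below. The `weblogUpdates.ping` XML-RPC call is the standard blog-ping protocol, but the helper names, the sample site values, and the response handling are my own illustration, not the author’s actual script:

```python
import xmlrpc.client

def ping_ok(response):
    """Classify a weblogUpdates.ping response.

    By convention the service returns a struct such as
    {"flerror": False, "message": "Thanks for the ping."};
    flerror=True, or a malformed response, means the ping failed.
    """
    return isinstance(response, dict) and response.get("flerror") is False

def vet_service(url, site_name="Example Blog", site_url="https://example.com/"):
    """Ping one update service and report whether it responds cleanly."""
    try:
        proxy = xmlrpc.client.ServerProxy(url)
        return ping_ok(proxy.weblogUpdates.ping(site_name, site_url))
    except Exception:
        # Connection errors, timeouts, and XML-RPC faults all count as failures.
        return False
```

Running `vet_service` over each URL in a ping list flags the dead or error-prone services for removal.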
I was inspired to put together this infographic when I read Trond Lyngbø’s Search Engine Land article:
As the introduction says: “Many business owners see SEO and content marketing as separate, but columnist Trond Lyngbø argues that solid keyword research can and should be used to inform content marketing strategy.” – Search Engine Land
I couldn’t agree more. Augmented by customer and market research, keyword research becomes a potent tool in your hands, giving you valuable insight not just into content marketing and SEO, as Lyngbø asserts, but also into multiple aspects of online marketing, including social and paid traffic.
To do a really thorough job with your keyword research, you should include less traditional keyword research tools such as #tagboard. My list of 22 Keyword Research Tools has plenty of interesting tools for you to choose from.
Even in the era of the semantic web, keyword research is still important. But it’s not just about the specific keywords: we need research to understand our market and competitive landscape, and we need research to help brainstorm the topic of our next blog post. The 22 keyword research tools I cover below will not only help you determine keywords for your on-page SEO, but also help you assess your competition and may even help you validate a new idea. Most of the tools listed below are free, but I’ve included a few that are not.
Although WordPress is my preferred platform, I’ll provide SEO and digital marketing consulting for any site, built on any technology. I’ve looked at sites built in Adobe CQ, Magento, and Joomla, as well as many sites built on in-house custom CMSs (content management systems).
Recently I’ve come across several sites built with Wix. Wix.com is one of several site builders that aim to make it really easy for the non-geek to build a website. Others include Squarespace and Weebly.
You can see the attraction: point and click, drag and drop, and presto, you have a website! But what about SEO?
I’ve now looked at Wix multiple times, and it’s really a case of “your mileage may vary” when it comes to SEO.
This week’s post is an answer to a reader’s question.
SuZen Marie asks: My brother has no interest in blogging on the website I built for him and I know this is not helping the site’s SEO. Is guest blogging or a featured column the way to go? Does “borrowed” content count to boost SEO ranking as well as original content?
You’re right to be concerned about the lack of fresh content on your brother’s website. In today’s competitive landscape, excellent content is the way to win at SEO.
That being said, since the site provides a local service, local SEO tactics can get you pretty far. Ideally, though, you want to rank in both the “map” results and the pure organic listings, so that you have multiple opportunities to get the click.
It all comes down to how competitive your niche is. With some niches, some basic on page optimization and enough citations (local SEO listings) might be enough. But those niches are getting rarer and rarer, especially if you are in a bigger city.
Recently an email from a top media site was shared with me:
“…we would really rather not edit the links in our author bios to “follows” from “nofollow.” With the number of contributed articles (and articles, period) that we publish every day, we are leaking SEO authority with every “follow” link we allow.”
Given that this is 2016, I had thought the notion that outbound linking “leaks” link juice from a site had died a deserved death many years ago. In fact, many SEOs believe that linking out to relevant, high-authority sites helps your site by establishing trust.
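For reference, the only markup difference between the two kinds of link is the rel attribute (the URLs here are placeholders):

```html
<!-- A "followed" link: passes link equity and signals editorial trust -->
<a href="https://example.com/">Author's site</a>

<!-- A nofollow link: asks search engines not to count it as an endorsement -->
<a href="https://example.com/" rel="nofollow">Author's site</a>
```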