Updated for 2017.
Recently an email from a top media site was shared with me:
“…we would really rather not edit the links in our author bios to “follows” from “nofollow.” With the number of contributed articles (and articles, period) that we publish every day, we are leaking SEO authority with every “follow” link we allow.”
When I saw this email I was surprised; I had thought that the notion that outbound linking “leaks” link juice from a site had died a deserved death many years ago.
To understand the reasoning behind this, it helps to know that each site is assigned a certain amount of SEO authority, which flows from the home page first through the links on the home page and then throughout the rest of the site. So it does make some logical sense that you lose SEO authority when you link out to an external site.
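The mental model behind that reasoning is essentially simplified PageRank: each page’s authority splits evenly among the pages it links to, internal or external. Here is a toy sketch of that intuition only — not Google’s actual algorithm; the link graph and damping factor are made up for illustration:

```python
# Toy sketch of the simplified PageRank intuition behind the "link juice" model.
# NOT Google's actual algorithm -- just the classic iterative formula
# applied to a tiny made-up link graph.

DAMPING = 0.85  # the standard damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Every link target must also appear as a key."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outbound in links.items():
            if not outbound:
                continue  # dangling page: passes no authority in this toy version
            share = DAMPING * rank[page] / len(outbound)  # authority splits across out-links
            for target in outbound:
                new_rank[target] += share
        rank = new_rank
    return rank

# Tiny example: "home" links to two internal pages; "page_a" also links out externally.
graph = {
    "home": ["page_a", "page_b"],
    "page_a": ["home", "external"],
    "page_b": ["home"],
    "external": [],
}
print(pagerank(graph))
```

In this toy model, a link from `page_a` to `external` does divert a share of `page_a`’s authority off-site — which is exactly the intuition the email’s author is relying on, and exactly the part most SEOs now consider an oversimplification of how Google actually works.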
However, most SEOs believe this is a myth.
An outbound link is a link that points outside of your domain to an external site. When a visitor clicks on an outbound link, they leave your site; this is why some websites open external links in a new browser window. Conversely, inbound links (often referred to as backlinks) are incoming links from other sites to yours.
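As a minimal illustration, here is what an outbound link that opens in a new window looks like in HTML (the URL and anchor text are placeholders):

```html
<!-- Outbound link: points to a domain other than your own. -->
<!-- target="_blank" opens it in a new tab, so the visitor keeps your page open. -->
<!-- rel="noopener" prevents the new page from accessing the opener window. -->
<a href="https://example.com/resource" target="_blank" rel="noopener">
  A helpful external resource
</a>
```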
You know the performance of your site matters. But every time you venture into site speed optimization, you feel like you’ve stumbled into a land of foreign geek speak.
Unfortunately, site speed is a complex and technical topic (“Configure Entity Tags,” anyone?). Some changes require a web developer to implement. With this post you’ll better understand where performance problems can crop up, so you can have a more productive conversation with your developer. We’ll also cover the low-hanging fruit you can tackle on your own, as well as the tools you’ll need to get started.
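To give a taste of what that geek speak translates to: the “Configure Entity Tags” recommendation usually means disabling ETags so browsers rely on your caching headers instead. A sketch of the directives typically used, assuming your host runs Apache with mod_headers enabled:

```apache
# Stop Apache from generating ETag headers for static files.
FileETag None

# Strip any ETag header set elsewhere in the stack (requires mod_headers).
Header unset ETag
```

This is exactly the kind of change worth confirming with your developer or host first, since the right fix depends on your server setup.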
Although WordPress is my preferred platform, I’ll provide SEO and digital marketing consulting on any site built with any technology. I’ve looked at sites built in Adobe CQ, Magento, and Joomla, as well as many sites built on an in-house custom CMS (content management system).
I’ve also looked at many WiX sites. WiX is a popular website builder that enables the non-geek to build very nice-looking websites. You can see the attraction: point and click, drag and drop, and presto, you have a website! But what about SEO?
This post has been extensively rewritten to bring it up to date – March 4, 2017
When I first wrote this post in 2015 (and even when I updated it in 2016) my answer to the question “Is WiX SEO friendly?” was a definite NO. WiX had significant problems when it came to SEO.
Today WiX has significantly improved and I’m no longer recommending against it.
However, there are a few remaining SEO problems with WiX that you should be aware of. And just because the platform has improved doesn’t mean your site is SEO optimized; you have to use WiX’s SEO features wisely to have the best chance of ranking well.
If you haven’t already noticed, Google Keyword Planner has severely limited marketers’ access to its keyword data. Unless you have a consistently active AdWords campaign, when you log into Keyword Planner today you’ll see wide ranges for average monthly search volume instead of an actual number.
A range like 100–1K is useless for most keyword research. If you are like most SEOs or marketers, your exploration of your keyword space starts with Google Keyword Planner. So what is the alternative?
Do you remember when you learned how to do on-page SEO by updating meta tags? For me, it felt like I’d been handed the keys to the kingdom. By placing keywords in the meta tags on your page, you had a tool that would magically rank your page at the top of the Google search results, where it belonged. Right?
Well, not so fast: in today’s SEO, optimizing your meta tags may not have the impact you think.
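For context, here’s a minimal sketch of the meta tags in question; the titles and values are hypothetical examples. The title tag still matters as a ranking signal, the meta description mainly influences click-through rather than rankings, and Google has ignored the keywords meta tag for many years:

```html
<head>
  <!-- The title tag is still a ranking signal and appears as the headline in results. -->
  <title>Do Outbound Links Hurt SEO? | Example Site</title>

  <!-- The meta description doesn't directly move rankings,
       but a compelling one improves click-through rate. -->
  <meta name="description" content="Does linking out leak SEO authority? A look at the evidence.">

  <!-- Google has ignored the keywords meta tag for years; stuffing it won't help. -->
  <meta name="keywords" content="seo, outbound links">
</head>
```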
I had the privilege of attending a Meetup featuring the one and only Joost de Valk, who, along with his wife and partner Marieke van de Rakt, gave a presentation on “Beyond SEO: Copywriting for Professionals with Yoast”. The talk covered Joost’s view of holistic SEO and the increasing importance of quality content in SEO.
If you are not familiar with Joost de Valk, he is the creator of the very popular Yoast SEO WordPress plugin, which handles just about all your WordPress SEO needs. Even though I haven’t gotten around to migrating this site to Yoast SEO, I have used it extensively on many client sites, as well as some of my other sites, and have watched its evolution over several years.
The latest version (3.3) of Yoast SEO has some new features that evaluate the readability of your page or post; Joost and Marieke covered the new readability analysis and gave us an inside look at how it came about.
I took a look, and sure enough, the site’s robots.txt file was set up to block Google and the other search engines from crawling the entire site. Fortunately, the fix was easy. I changed the file from this:
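(The site’s exact file isn’t reproduced here; what follows is the conventional form of each version.) A robots.txt that blocks every crawler from the whole site looks like this:

```
User-agent: *
Disallow: /
```

to this, which allows full crawling, since an empty Disallow matches nothing:

```
User-agent: *
Disallow:
```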
(You can also just remove the file.)
I might be going out on a limb here, but I’ve seen more problems caused by misuse of the robots.txt file than solved by it.
One of the big misconceptions about robots.txt disallow directives is that they are a foolproof way to keep pages out of the Google index. Not only is this untrue, but when such pages are indexed, they are indexed with almost no information, adding a lot of low-quality, near-duplicate content to the index, which can drag down the SEO performance of your site.
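If the goal really is to keep a page out of Google’s index, the usual tool is a meta robots noindex tag rather than a disallow directive. A sketch of the tag, with the important caveat that Google must be able to crawl the page to see it, so the page must not also be blocked in robots.txt:

```html
<!-- Place in the <head> of the page you want excluded from the index. -->
<!-- Caveat: crawlers must be able to fetch this page to see the tag,
     so do NOT also disallow it in robots.txt. -->
<meta name="robots" content="noindex">
```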
The robots.txt file has been around for years. In those early days, bandwidth was more precious and Googlebot often taxed servers, even crashing them, when it crawled a site. So using the disallow directive to keep Google from crawling pages often helped keep a site up. Those concerns are a distant memory today.
Then I wrote a script to vet the services many bloggers were using and found that several returned errors or didn’t respond at all. That was the first red flag. The second red flag showed up when I found a warning from Matt Cutts that Google looked unfavorably on some of the services in wide use. Essentially, some of them are spam magnets, and you really don’t want your site associated with them.
So I cut my ping list from 31 services down to just 3 and published my post, which got a lot of attention due to my contrarian stance.
I was inspired to put together this infographic when I read Trond Lyngbø’s Search Engine Land’s article:
As the introduction says: “Many business owners see SEO and content marketing as separate, but columnist Trond Lyngbø argues that solid keyword research can and should be used to inform content marketing strategy.” – SearchEngineLand
I couldn’t agree more. Augmented by customer and market research, keyword research becomes a potent tool in your hands, giving you valuable insight not just into content marketing and SEO, as Lyngbø asserts, but also into multiple aspects of online marketing, including social and paid traffic.
To do a really thorough job with your keyword research, you should include less traditional keyword research tools such as #tagboard. My list of 22 Keyword Research Tools has plenty of interesting tools for you to choose from.
Even in the era of the semantic web, keyword research is still important. But it’s not just about the specific keywords. We need research to understand our market and competitive landscape; we need research to help brainstorm the topic of our next blog post. The 22 keyword research tools I cover below will not only help you determine keywords for your on-page SEO, but also help you assess your competition, and may even help you validate a new idea. Most of the tools listed below are free, but I’ve included a few that are not.