On January 8, 2018, Google announced that the new Google Search Console is rolling out to all users. The new version has been in beta for a few months; soon it will be available to everyone.
One very exciting aspect of the new GSC is that keyword and ranking data will be available for up to 16 months, a big improvement over the limited 90 days of data we used to get.
Keep an eye out for an email from Google with the subject “Introducing the new Google Search Console” that will notify you when you have access.
I have a client that has access so I took a tour and have screenshots to share with you in this post.
Having a perfectly optimized site for SEO is a beautiful thing.
So why aren’t more visitors flocking to your site?
One possible overlooked reason is that you are just not getting the click.
There is a reason that realtors like to see a nice front door for the house they are selling. It sets the tone for the rest of the house and leaves a lasting first impression.
An ugly front door? Not a great first impression.
A freshly painted door that has a nice design? The potential buyer starts dreaming of living there even before opening the door.
Each of your site’s web pages has a front door. Called snippets, these are the listings in the search engine results pages.
If you are familiar with SEO, you know that, in most cases, the snippets are created from the title tag and the meta description tag.
You also probably know that it’s important to have your keywords in both of these tags. While the meta description tag doesn’t help you rank, the title tag is the most important element of your HTML for ranking.
Neither the title tag nor the meta description actually shows up on the page. They live in the “HEAD” section of your HTML along with other meta information that describes the page and its characteristics.
You can see your title tag in your browser tab just like in the screenshot below.
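For illustration, here is what these two tags look like inside a page’s HEAD (the title and description text below are made up):

```html
<head>
  <!-- The title tag: shown in the browser tab and usually used as the snippet headline -->
  <title>Handmade Oak Front Doors | Acme Doors</title>
  <!-- The meta description: often used as the snippet's descriptive text -->
  <meta name="description" content="Custom handmade oak front doors, built to order and delivered nationwide.">
</head>
```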
As an SEO consultant, I work every day with clients to help their sites get more organic traffic by ranking highly in the search engine results pages. I’m often surprised at how rarely my clients look at their site’s “front doors.”
It’s like dressing up the bathrooms with pretty towels but not doing anything about your dingy front door. If your prospective buyer drives by your house and crosses it off the list due to poor curb appeal, then those pretty towels haven’t helped you make your sale one bit.
Both Google and Bing have a handy search command you can use to get a list of your pages’ snippets.
Simply add “site:” in front of your domain name and you should see a list of your website’s pages. If you have a larger site you likely won’t get a full list but you’ll see a sample.
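For example, typed into the Google or Bing search box (replace example.com with your own domain):

```text
site:example.com
```

You can also add a keyword after the domain to narrow the sample to pages on that topic.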
The first thing to check is to make sure you have keywords in your snippets.
The second thing is to put yourself in the shoes of your ideal prospect and for each of your important pages ask yourself the question:
“Would I click on this?”
If the answer is “no” or “maybe” then you have some work to do.
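The first check, keywords in the title and meta description, is easy to automate. Below is a minimal sketch using only Python’s standard library; the sample page HTML and the keyword are illustrative, and a real audit would fetch each page’s HTML over HTTP first.

```python
from html.parser import HTMLParser

class SnippetParser(HTMLParser):
    """Extracts the <title> text and the meta description from an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def snippet_has_keyword(html, keyword):
    """True if the keyword appears in the page's title or meta description."""
    parser = SnippetParser()
    parser.feed(html)
    kw = keyword.lower()
    return kw in parser.title.lower() or kw in parser.description.lower()

page = """<html><head>
<title>Handmade Oak Front Doors | Acme Doors</title>
<meta name="description" content="Custom handmade oak front doors, built to order.">
</head><body>...</body></html>"""

print(snippet_has_keyword(page, "front doors"))  # True
```

Run this over each of your important pages, and any page that comes back False is a “front door” that needs work.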
And this is where SEO copywriting comes in.
Updated for 2017.
Recently an email from a top media site was shared with me:
“…we would really rather not edit the links in our author bios to “follows” from “nofollow.” With the number of contributed articles (and articles, period) that we publish every day, we are leaking SEO authority with every “follow” link we allow.”
When I saw this email I was surprised; I had thought that the notion that outbound linking “leaks” link juice from a site had died a deserved death many years ago.
To understand the reasoning behind this, it helps to know that each site is assigned a certain amount of SEO authority that flows from the home page, first through the links on the home page and then throughout the rest of the site. So it does make some logical sense that you lose SEO authority when you link out to an external site.
However most SEOs believe this is a myth.
An outbound link is a link that points outside of your domain to an external site. When a visitor clicks on an outbound link, they leave your site, which is why some websites open external links in a new browser window. Conversely, inbound links (often referred to as backlinks) are incoming links from other sites to yours.
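For reference, here is what the markup for each kind of outbound link looks like (the URL is a placeholder):

```html
<!-- A normal ("followed") outbound link: passes SEO authority -->
<a href="https://example.com/article">a useful resource</a>

<!-- A nofollow outbound link: asks search engines not to pass authority -->
<a href="https://example.com/article" rel="nofollow">a useful resource</a>

<!-- An outbound link that opens in a new browser window or tab -->
<a href="https://example.com/article" target="_blank" rel="noopener">a useful resource</a>
```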
You know the performance of your site matters. But every time you venture into site speed optimization, you feel like you’ve stumbled into a land of foreign geek speak.
Unfortunately, site speed is a complex and technical topic (“Configure Entity Tags,” anyone?). Some changes require a web developer to implement. With this post you’ll better understand where performance problems can crop up, so you can have a more productive conversation with your developer. We’ll also cover the low-hanging fruit you can tackle on your own, as well as the tools you’ll need to get started.
Although WordPress is my preferred platform, I’ll provide SEO and digital marketing consulting on any site built in any technology. I’ve looked at sites built in Adobe CQ, Magento, and Joomla, as well as many sites built on an in-house custom CMS (content management system).
I’ve also looked at many WiX sites. WiX is a popular website builder that enables the non-geek to build very nice-looking websites. You can see the attraction: point and click, drag and drop, and presto, you have a website! But what about SEO?
This post has been extensively rewritten to bring it up to date – March 4, 2017
When I first wrote this post in 2015 (and even when I updated it in 2016) my answer to the question “Is WiX SEO friendly?” was a definite NO. WiX had significant problems when it came to SEO.
Today WiX has significantly improved and I’m no longer recommending against it.
However, there are a few remaining SEO problems with WiX that you should be aware of. And just because the platform has improved doesn’t mean your site is optimized for SEO; you have to use WiX’s SEO features wisely to have the best chance of ranking well.
If you haven’t already noticed, Google Keyword Planner has severely limited marketers’ access to its keyword data. Unless you have a consistently active AdWords campaign, when you log into Keyword Planner today you’ll see wide ranges for average monthly search volume instead of an actual number.
A range like 100-1K is useless for most keyword research. If you are like most SEOs or marketers your exploration of your keyword space starts with Google Keyword Planner. So what is the alternative?
Do you remember when you learned how to do on page SEO optimization by updating meta tags? For me, it felt like I got the keys to the kingdom. By placing keywords into the meta tags on your page you had a tool that would magically rank your page at the top of the Google search results where it belonged. Right?
Well, not so fast: in today’s SEO, optimizing your meta tags may not have the impact you think.
I had the privilege of attending a Meetup featuring the one and only Joost de Valk, who, along with his wife and partner Marieke van de Rakt, gave a presentation on “Beyond SEO: Copywriting for Professionals with Yoast”. The talk covered Joost’s view of Holistic SEO and the increasing importance of Quality Content in SEO.
If you are not familiar with who Joost de Valk is, he is the creator of the very popular Yoast SEO WordPress plugin, which handles just about all your WordPress SEO needs. Even though I haven’t gotten around to migrating this site to Yoast SEO, I have used it extensively on many client sites as well as some of my other sites, and have watched its evolution over several years.
The latest version (3.3) of Yoast SEO has new features that evaluate the readability of your page or post; Joost and Marieke covered the new readability analysis feature and gave us an inside look at how it came about.
I took a look and sure enough the site’s robots.txt file was set up to block Google and the other search engines from crawling the entire site. Fortunately the fix was easy. I changed the file from this:
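A robots.txt that blocks an entire site from all crawlers, and the fixed version, typically look like this (an empty Disallow value allows everything; the exact file contents here are illustrative):

```text
# Before: blocks all crawlers from the entire site
User-agent: *
Disallow: /

# After: allows crawling of the whole site
User-agent: *
Disallow:
```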
(You can also just remove the file.)
I might be going out on a limb here, but I’ve seen more problems caused by misuse of the robots.txt file than solved.
One of the big misconceptions about robots.txt disallow directives is that they are a foolproof way to keep pages out of the Google index. Not only is this untrue, but when such pages are indexed, they are indexed with almost no information, adding a lot of low-quality, near-duplicate content to the index that can drag down your site’s SEO performance.
The robots.txt file has been around for years. In those early days, bandwidth was more precious and Googlebot often taxed servers, even crashing them, when it crawled a site. So using the disallow directive to keep Google from crawling pages often helped keep a site up. Those concerns are a distant memory today.
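If your actual goal is to keep a page out of Google’s index, the reliable mechanism is a robots meta tag in the page’s HEAD rather than a robots.txt disallow. Note that the page must remain crawlable (not disallowed in robots.txt) so Googlebot can actually see the tag:

```html
<!-- Tells search engines not to include this page in their index -->
<meta name="robots" content="noindex">
```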
Then I wrote a script to vet the ping services that many bloggers were using, and found that several returned errors or didn’t respond at all. That was the first red flag. The second red flag showed up when I found a warning from Matt Cutts that Google looked unfavorably on some of the services in wide use. Essentially, some of them are spam magnets, and you really don’t want your site associated with them.
So I cut my ping list down from 31 services to just 3 and published my post, which got a lot of attention due to my contrarian stance.