UPDATE: January 13, 2014. On December 31, 2013, Google updated GWMT search query data to be “less rounded”. If you visit GWMT you’ll see a line marking where the change took place.
UPDATE: September 30, 2013. Last week we found out that Google is extending SSL protection to users who are not signed in. This means that for ALL organic search traffic that Google sends to your site, the keywords will be removed and replaced with “not provided”. At this point in time, Bing continues to provide keywords to webmasters in the referrer string.
Starting in October 2011, Google began hiding the keywords used in secure searches from webmasters. This means that in Google Analytics (or in any other analytics tool), keywords from these searches show up as “not provided”. A secure search could mean the user is logged into a Google account, is using Firefox 14, or is just explicitly using secure search. A lot has been written about this controversial move, which was done ostensibly to protect privacy but did not extend to AdWords.
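To make the mechanics concrete, here is roughly what your analytics software sees in each case (the keyword “wordpress seo” is just an example):

Referrer from a regular search: http://www.google.com/search?q=wordpress+seo
Referrer from a secure search: https://www.google.com/

Without the q= parameter in the referrer there is no keyword to report, so analytics tools fall back to “not provided”.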
Although Google initially stated that the change would affect 10% of queries, there are many sites where the percentage is much higher. In WebEnso’s case, keywords for 62% of searches fall under “not provided”, and the percentage is still climbing. 62% is a huge percentage. It means that nearly two out of every three keywords used to find my site are hidden from me. So what to do?
Guest posting can broaden your blog’s reach and build quality backlinks to your site. But as with any popular technique, abuse is on the rise. I manage guest posting campaigns for clients and have written a few guest posts myself. And even though I don’t provide a contact form on this blog and I don’t solicit them, I get a number of emails pitching me on guest posts for WebEnso, which I usually don’t accept.
As an SEO consultant you develop certain filters, some might even say blinders. You look at websites differently than other people do. Some aspects of a website you ignore; others, like the user experience and the content, you pay close attention to. A good example of this is site search, the search functionality you find on many sites.
A routine technical SEO recommendation is to noindex any search pages that are crawlable on a site.
If you are not sure what I mean by search pages: go to the search box on this site, webenso.com, and type in “wordpress seo”. You’ll get a page that looks like the one below, with a URL that has a
?s=wordpress+seo
query string in it.
Either you don’t want Google to find those URLs at all, or the pages should have a meta robots tag on them set to noindex.
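If you have never seen the tag itself, here is the one-line version you would add to the head of each search results page (the follow part is optional; it tells the bots they may still follow the links on the page even though the page itself stays out of the index):

<meta name="robots" content="noindex,follow" />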
But I’m not here to dive into the details of noindex and technical SEO. My point is that as an SEO you disregard the search pages once they’re addressed and forget about them. This is what I mean by blinders. We SEOs are so focused on Google search, with an occasional journey into Bing, that we don’t always see the potential that other search engines have to disrupt the search industry.
When you create a video for your online marketing, you have two distinct strategies available to you:
When I talk about embedding video on your site, I’m not talking about simply copying and pasting the YouTube embed code into your page. Yes, that might help your visitors stick around longer, which indirectly helps your site’s SEO. However, unless you have a really powerful site, your page is almost never going to outrank the video you have uploaded to YouTube, especially if the video is well tagged there.
I’ve given a couple of presentations on WordPress SEO and recommended several plugins. Due to demand I’ve been providing my list as a PDF to attendees. Now here it is for my blog readers.
Most people outside the industry think that Technical SEO is all about on-page optimization, in other words, updating a page’s meta data and content to have the right keywords in it. However, there is another aspect of Technical SEO that has nothing to do with keywords and can be very powerful for larger dynamic websites. This discipline makes sure the site is search engine crawler friendly: the search bots can find all the pages, and there are not multiple URLs for a single page, also known as duplicate content. Duplicate content within a site is more common than you might think, and I have repeatedly seen significant traffic increases from eliminating it. This is a case study of how we addressed over-indexation of duplicate content by Google and increased traffic by more than 150% overall.
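As a concrete illustration of the duplicate URL problem: one standard fix (though not necessarily the only technique we used in this case study) is the canonical link element. The URL below is a made-up example; each variant of the page declares the one preferred URL in its head:

<link rel="canonical" href="http://www.example.com/products/blue-widget/" />

With that tag in place, if the same product page is also reachable at, say, /products/blue-widget/?sort=price, Google knows to consolidate indexing and ranking signals onto the single canonical URL.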
The site in question is a B2B eCommerce site that does a brisk business on weekdays but has much lower traffic during weekends and holidays. The chart below tracks its weekly non-branded organic search traffic for the last 9 months. Here “non-branded” means that we exclude organic traffic that arrives via company and domain name keywords; note, however, that we do include product keywords, which are a significant part of their traffic.
This post originally appeared on October 28, 2011; in January 2014 I made several changes to bring it up to date. On August 28, 2014, Google ended the Google Authorship program, citing low participation and concern over how it was cluttering the search experience, especially for mobile users. Despite this, you may still want to implement it if you are active on Google+: users who are logged into their Google account and follow you on Google+ will see the enhanced snippet. Otherwise it’s not worth the effort.
Ah, the old days. The early days, when all Google looked at was how many backlinks pointed to your page. These days, trust, credibility and authority signals are a big part of what Google looks for when deciding whether to rank your page in the top ten. In the summer of 2011, leveraging Google profiles, Google released another feature that helps it do just that. You can now tag your articles with
rel="author" HTML markup to link your pages to your Google profile, which makes you more “real” in Google’s eyes. The nice thing about this is that Google will then put a picture of you next to the snippet of your article in the search results, as it does for Danny Sullivan below.
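For reference, the markup itself is just an HTML link from your article to your Google+ profile carrying the rel="author" attribute. A minimal sketch (the profile URL is a placeholder, not a real profile):

<a href="https://plus.google.com/112345678901234567890" rel="author">Google+</a>

Google also accepted a two-step version: link the article to your author bio page with rel="author", and link the bio page to your Google+ profile with rel="me".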
Google’s desire to tie content to “real” people perhaps sheds some light on the 2011 mass removal of non-user Google+ accounts.
Recently I came across some client WordPress sites that had rel="prev" and rel="next" tags embedded on their archive pages. The sites were set up as a classic WordPress blog, with a reverse-chronological listing of post teasers on the home page and a “Next” button leading to the older posts.
It was these paginated pages that had the tags; for example, looking at the HTML source of page 2:
<link rel="prev" href="http://www.awordpresssite.com/page/1/" />
<link rel="next" href="http://www.awordpresssite.com/page/3/" />
BTW: I just made up the domain “awordpresssite” for the example above; beyond that, I don’t know anything about the parked domain that happens to live there.
The question is: is this the correct use of the rel="prev" and rel="next" tags? My answer: probably not.
HTTP (Hypertext Transfer Protocol) is the communication language (or in technical terms, the protocol) that underlies the web. When your browser contacts a web server, they speak HTTP to each other. It’s a little like the international road sign standards (which, sadly, are not in use in the US): even if you don’t speak the country’s language, you can understand the symbols on the road signs. HTTP is how mobile devices, different browsers like Internet Explorer and Chrome, and servers running on different operating systems can all understand each other.
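To give you a feel for the language, here is a stripped-down exchange (simplified; real requests and responses carry many more headers). The browser asks:

GET /index.html HTTP/1.1
Host: www.example.com

and the server answers with a status line, some headers, a blank line, and then the page itself:

HTTP/1.1 200 OK
Content-Type: text/html

<html>…</html>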
HTTPS is HTTP on top of another protocol called SSL (Secure Sockets Layer), so underneath this common language there is a layer that encrypts the messages that go back and forth on the web. This makes it much more difficult for a malicious third party intercepting these messages to use the data for nefarious purposes.
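This also ties back to the “not provided” story above. With plain HTTP, anyone able to intercept the traffic can read the full request, including any search keywords in the query string. With HTTPS the path, query string and headers are all encrypted; roughly speaking, an eavesdropper sees only which host you connected to:

Visible over HTTP: the full request, e.g. www.google.com/search?q=wordpress+seo
Visible over HTTPS: an encrypted connection to www.google.com, and nothing more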