Three years is an awfully long time in the internet age. Let’s take a trip in the internet way-back machine to see what SEO used to be. Back then we thought of SEO as a two-pronged discipline: on-page optimization and off-page optimization. Off-page optimization, or link building, has been replaced by inbound marketing, as I wrote recently. On-page optimization is still very valid, but its boundaries have gotten rather fuzzy.
It used to be all about keywords, keywords, keywords. Keywords in your title tag, keywords in your header tags, keyword density. Keywords are still really important, but so are engagement and user behavior, signals the search engine algorithms now pay attention to. It’s not just about getting the traffic; it’s about winning the hearts and minds of users once they are on your site.
You want to take that optimization and conversion mindset and apply it to the little welcome mat that Google builds for your page when it ranks for a search query. Known as a snippet, this is the listing that appears in Google’s search results pages, called SERPs for short.
WordPress is a great CMS and site builder, but one area where it falls a little short is the interstitial pages designed to help you navigate a site’s posts by category, author or even date. WordPress automatically generates archive pages for each of these grouping mechanisms, but from an SEO perspective these pages fall short of providing the unique content that search engines love.
Search engines such as Google want to see pages with content that can’t be found anywhere else on the site (or, in fact, anywhere else on the internet). But WordPress’s default archive pages just show a list of posts for the given category (or author). If the blogger has used the “more” tag, the post’s content is shown on the archive page up to that tag; otherwise the post is shown in its entirety. Either way, there is no content on the page that isn’t already on the individual post pages. Today we are going to look at ways to modify your WordPress category page to be more SEO friendly.
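One straightforward approach is to surface the category description (the text you can enter under Posts → Categories in the WordPress admin) at the top of the archive, so each category page carries at least a paragraph of unique text. Below is a minimal category.php sketch along those lines, not a drop-in replacement for your theme’s file:

<?php
// category.php (minimal sketch): print unique intro copy before the post list.
get_header();

// Category name as the page heading. single_cat_title() echoes by default.
echo '<h1>';
single_cat_title();
echo '</h1>';

// The category description, editable under Posts -> Categories, gives this
// archive page text that exists nowhere else on the site.
$intro = category_description();
if ( $intro ) {
    echo '<div class="category-intro">' . $intro . '</div>';
}

// Show excerpts rather than full posts to limit duplication with the post pages.
while ( have_posts() ) {
    the_post();
    echo '<h2><a href="' . esc_url( get_permalink() ) . '">' . esc_html( get_the_title() ) . '</a></h2>';
    the_excerpt();
}

get_footer();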
UPDATE: January 13, 2014. On December 31, 2013, Google updated GWMT (Google Webmaster Tools) search query data to be “less rounded”. If you visit GWMT you’ll see a line marking where the change took place.
UPDATE: September 30, 2013. Last week we found out that Google is extending SSL protection to users who are not signed in. This means that for ALL organic search traffic that Google sends to your site, the keywords will be removed and replaced with “not provided”. At this point in time, Bing continues to provide keywords to webmasters in the referrer string.
Starting in October 2011, Google began hiding from webmasters the keywords used in secure searches. This means that in Google Analytics (or any other analytics tool), keywords from these searches show up as “not provided”. A secure search could mean the user is logged into a Google account, is using Firefox 14 (which defaults to Google’s secure search), or is explicitly using secure search. A lot has been written about this controversial move, which was done ostensibly to protect privacy but did not extend to AdWords, where advertisers still see keyword data.
Although Google initially stated that the change would affect 10% of queries, there are many sites where the percentage is much higher. In WebEnso’s case, keywords for 62% of searches fall under “not provided”, and the percentage is still climbing. 62% is a huge percentage: nearly two out of three keywords used to find my site are hidden from me. So what to do?
Guest posting can broaden your blog’s reach and build quality backlinks to your site. But as with any popular technique, abuse is on the rise. I manage guest posting campaigns for clients and have written a few guest posts myself. And even though I don’t provide a contact form on this blog and don’t solicit them, I get a number of emails pitching guest posts for WebEnso, which I usually don’t accept.
As an SEO consultant you develop certain filters, some might even say blinders. You look at websites differently than other people do. Some aspects of a website you ignore; others, like the user experience and the content, you pay close attention to. A good example of this is site search, the search functionality you find on many sites.
A routine technical SEO recommendation is to noindex any search pages that are crawlable on a site.
If you are not sure what I mean by search pages: go to the search box on this site, webenso.com, and type in “wordpress seo”. You’ll get a page that looks like the one below, with a URL that has a ?s=wordpress+seo query string in it.
Either you don’t want Google to find those URLs at all, or the pages should have their meta robots tag set to noindex.
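If your site runs WordPress and the theme doesn’t already handle this, a small snippet along these lines in functions.php does the job (the function name here is just mine):

<?php
// Emit a noindex meta tag on internal search result pages (the ?s= URLs).
// "noindex,follow" keeps the page out of the index while still letting
// the crawler follow the links on it.
function webenso_noindex_search() {
    if ( is_search() ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
}
add_action( 'wp_head', 'webenso_noindex_search' );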
But I’m not here to dive into the details of noindex and technical SEO. My point is that as an SEO you disregard the search pages once they’re addressed and forget about them. This is what I mean by blinders. We SEOs are so focused on Google search, with an occasional journey into Bing search, that we don’t always see the potential that other search engines have to disrupt the search industry.
When you create a video for your online marketing you have two distinct strategies available to you: uploading it to a video sharing site such as YouTube, or embedding the video on your own site.
By embedding video on your site, I’m not talking about copying and pasting the YouTube video embed code onto your site. Yes, that might help your visitors stick around longer, which indirectly helps your site’s SEO. But unless you have a really powerful site, your page is almost never going to outrank the video you have uploaded to YouTube, especially if the video is well tagged on YouTube.
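To make the distinction concrete, here is roughly what each option looks like in your page’s HTML; VIDEO_ID and the file URL are placeholders, not specific recommendations:

<!-- Embedding YouTube's player: the video asset, and its ranking power, stay on YouTube. -->
<iframe width="560" height="315" src="https://www.youtube.com/embed/VIDEO_ID" allowfullscreen></iframe>

<!-- Self-hosting: the video file is served from your own domain. -->
<video width="560" controls>
  <source src="https://www.example.com/videos/my-video.mp4" type="video/mp4">
</video>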
Most people outside the industry think that Technical SEO is all about on-page optimization, in other words, updating a page’s meta data and content to have the right keywords in it. However, there is another aspect of Technical SEO that has nothing to do with keywords and can be very powerful for larger dynamic websites. This discipline makes the site search engine crawler friendly, ensuring that the search bots can find all the pages and that there are not multiple URLs for a single page, a problem known as duplicate content. Duplicate content within a site is more common than you might think, and I have repeatedly seen significant traffic increases from eliminating it. This is a case study of how we addressed over-indexation of duplicate content by Google and increased traffic by more than 150% overall.
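As background before the case study: the most common mechanism for collapsing multiple URLs into a single page is the rel=canonical link element, placed in the head of every duplicate URL and pointing at the preferred version. A minimal illustration with a made-up address (this is the standard technique, not necessarily the exact fix used in this case study):

<link rel="canonical" href="https://www.example.com/widgets/blue-widget/">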
The site in question is a B2B eCommerce site which does a brisk business on weekdays but has much lower traffic during weekends and holidays. The chart below tracks its weekly non-branded organic search traffic for the last 9 months. Here “non-branded” means that we exclude organic traffic that arrives via company and domain name keywords; note, however, that we do include product keywords, which are a significant part of their traffic.
This post originally appeared on October 28, 2011; in January 2014 I made several changes to bring it up to date. On August 28, 2014, Google ended the Google Authorship program, citing low participation and concern over how it was cluttering the search experience, especially for mobile users. Despite this, you may still want to implement it if you are active on Google+. Users who are logged into their Google account and follow you on Google+ will see the enhanced snippet. Otherwise it’s not worth the effort.
Ah, the old days. The early days, when all Google looked at was how many backlinks pointed to your page. These days trust, credibility and authority signals are a big part of what Google looks for when deciding whether to rank your page in the top ten. In the summer of 2011, leveraging Google profiles, Google released another feature that helps it do just that: you can now tag your articles with rel="author" HTML markup to link your pages to your Google profile, which makes you more “real” in Google’s eyes. The nice thing is that Google will then put a picture of you next to the snippet of your article in the search results, as Danny Sullivan’s appears below.
Google’s desire to tie content to “real” people perhaps sheds some light on the 2011 mass removal of non-user Google+ accounts.
It’s never boring in SEO land. Last week the SEOsphere was all abuzz with the October 16th release of the Google disavow links tool. This past summer I, like many other SEOs, have been quite busy doing “Penguin recoveries”. This name, although convenient, is actually misleading: most sites I’ve been helping aren’t recovering from Penguin itself, but rather from manual penalties imposed by Google. The difference? Penguin is an algorithmic change that focuses on the quality of the backlinks to your site and lowers your ranking if much of your backlink profile is low quality and spammy, while a manual penalty is an action taken by a human and usually accompanied by an “unnatural links warning” message in GWMT (Google Webmaster Tools). With Penguin, if you clean up the problem your site should recover in time; with a manual penalty you have to submit a reconsideration request to get your traffic back.
By indexation we mean the pages of your site that Google has crawled and put into its index. When you type a search term into Google, it uses the index as a repository from which to retrieve pages to show in the search engine results pages. Just because a search engine has crawled a page on your site doesn’t mean it puts that page into the index. For one thing, you might have told it not to. Or Google might have decided the page isn’t worthy of being indexed.
To see whether Google has indexed your site at all, type the following query into Google, replacing “foo.com” with your own domain name.
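site:foo.com

The site: operator restricts the results to pages Google has indexed from that domain.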
If you get a list of URLs you are in business; if you don’t, the simplest way to get your site indexed is to add a link back to it from a social media network such as Twitter or LinkedIn. Assuming your site is indexed, the next step is to click through the pages (via the links at the bottom of the search results page) to see exactly what Google has indexed. You might be surprised.