Most people outside the industry think that Technical SEO is all about on-page optimization; in other words, updating a page’s metadata and content to include the right keywords. However, there is another aspect of Technical SEO that has nothing to do with keywords and can be very powerful for larger dynamic websites. This discipline makes the site search-engine-crawler friendly: it ensures the search bots can find all the pages, and that there are not multiple URLs for a single page, a problem known as duplicate content. Duplicate content within a site is more common than you might think, and I have repeatedly seen significant traffic increases from eliminating it. This is a case study of how we addressed Google’s over-indexation of duplicate content and increased traffic by more than 150% overall.
The site in question is a B2B eCommerce site that does a brisk business on weekdays but sees much lower traffic on weekends and holidays. The chart below tracks its weekly non-branded organic search traffic for the last 9 months. Here “non-branded” means that we exclude organic traffic arriving via company and domain name keywords; note, however, that we do include product keywords, which are a significant part of their traffic.
This post originally appeared on October 28, 2011; in January 2014, I made several changes to bring it up to date. On August 28, 2014, Google ended the Google Authorship program, citing low participation and concern over how it was cluttering the search experience, especially for mobile users. Despite this, you may still want to implement it if you are active on Google+. Users who are logged into their Google account and follow you on Google+ will see the enhanced snippet. Otherwise it’s not worth the effort.
Ah, the old days. The early days, when all Google looked at was how many backlinks pointed to your page. These days, trust, credibility and authority signals are a big part of what Google looks for when deciding whether to rank your page in the top ten. In the summer of 2011, leveraging Google profiles, Google released another feature that helps it do just that. You can now tag your articles with
rel="author" HTML markup to link your pages to your Google profile – which makes you more “real” in Google’s eyes. The nice thing about this is that then Google will put a pic of you next to the snippet of your article that shows up in the search results as Danny Sullivan appears below.
Google’s desire to tie content to “real” people perhaps sheds some light on the 2011 mass removal of non-user Google+ accounts.
Recently I came across some client WordPress sites that had rel="prev" and rel="next" tags embedded on their archive pages. The sites were set up as a classic WordPress blog, with a reverse-chronological listing of post teasers on the home page and a “Next” button leading to the older posts.
It was these archive pages that had the tags; for example, looking at the HTML source on page 2:
<link rel="prev" href="http://www.awordpresssite.com/page/1/" />
<link rel="next" href="hhttp://www.awordpresssite.com/page/3/" />
BTW: I just randomly came up with the domain “awordpresssite” for the above example; I have no connection to, and know nothing about, that parked domain.
The question is: is this the correct use of the rel="prev" and rel="next" tags? My answer is: probably not.
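For contrast, a use the tags clearly fit is a single piece of content split across multiple pages, such as a three-part article (the URLs here are hypothetical). The HTML head of part 2 would contain:

<link rel="prev" href="http://www.example.com/long-article/part-1/" />
<link rel="next" href="http://www.example.com/long-article/part-3/" />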
HTTP (Hypertext Transfer Protocol) is the communication language (or, in technical terms, the protocol) that underlies the web. When your browser contacts a web server, they speak HTTP to each other. It’s a little like the international road sign standards (which are sadly not in use in the US): even though you don’t speak the country’s language, you can understand the symbols on the road signs. HTTP is how mobile devices, different browsers like Internet Explorer and Chrome, and servers running on different operating systems can all understand each other.
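To make this concrete, here is a minimal sketch of an HTTP conversation (the host and page are placeholders). The browser sends a request:

GET /index.html HTTP/1.1
Host: www.example.com

And the server replies with a status line, some headers, and the page itself:

HTTP/1.1 200 OK
Content-Type: text/html

<html>...</html>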
HTTPS is HTTP on top of another protocol called SSL (Secure Sockets Layer), so underneath this common language there is a layer that encrypts the messages going back and forth over the web. This makes it much more difficult for a malicious third party to intercept these messages and use the data for nefarious purposes.
It’s never boring in SEO land. Last week the SEOsphere was all abuzz with the October 16th release of the Google disavow links tool. This past summer I, like many other SEOs, was quite busy doing “Penguin recoveries”. That name, although convenient, is actually misleading: most sites I’ve been helping aren’t recovering from Penguin itself, but rather from manual penalties imposed by Google. The difference? Penguin is an algorithmic change that focuses on the quality of the backlinks to your site, and lowers your ranking if much of your backlink profile is low quality and spammy, while a manual penalty is an action taken by a human, usually accompanied by an “unnatural links warning” message in GWMT (Google Webmaster Tools). With Penguin, if you clean up the problem, your site should recover in time; with a manual penalty, you have to submit a reconsideration request to get your traffic back.
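For reference, the disavow tool takes a plain text file listing the links you want Google to ignore, either as whole domains or as individual URLs; a minimal sketch (the domains and URL here are hypothetical):

# spammy directory links we could not get removed
domain:spammy-directory.example.com
http://another-site.example.com/paid-links.html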
By indexation we mean the pages of your site that Google has crawled and put into its index. When you type a search term into Google, it uses the index as a repository from which to retrieve the pages it shows in the search engine results pages. Just because a search engine has crawled a page on your site doesn’t mean it puts that page into the index. For one thing, you might have told it not to. Or Google might have decided the page isn’t worthy of being indexed.
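For example, the standard way to tell search engines not to index a page is a robots meta tag in the page’s head:

<meta name="robots" content="noindex" />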
To see whether Google has indexed your site at all, type the following query into Google, replacing “foo.com” with your own domain name:

site:foo.com
If you get a list of URLs, you are in business; if you don’t, the simplest way to get your site indexed is to add a link back to your site from a social media network such as Twitter or LinkedIn. Assuming your site is indexed, the next step is to click through the result pages (using the links at the bottom of the search results page) to see what Google has indexed. You might be surprised.
It doesn’t help. At least not directly. Does that surprise you? Why then, you might be asking, does every SEO article you read about meta tags strongly recommend that you have a meta description tag on each page of your website? Because SEO shouldn’t be your only focus when improving your website. And it is true that meta description tags may help your site perform better in the search engine results, albeit indirectly.
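For reference, the tag sits in the page’s head, and its content is often used as the snippet shown under your title in the search results (the copy here is a placeholder):

<meta name="description" content="A one- to two-sentence summary of the page that entices searchers to click." />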
Google has ramped up the fight against online piracy by putting more teeth into how it addresses copyright issues and violations. Content stealing and scraping (including images and video) is all too common on the web, and up to now the main tool Google offered was the ability to file a DMCA takedown request. A DMCA takedown request, if deemed valid, could remove the offending web page from Google’s index, but it did not impact the rest of the site. That has now changed.
Hollywood has long criticized (and sued) Google for not doing enough to protect original content from being stolen and copied. Hollywood in particular hates YouTube. Of course, in this author’s humble opinion, Hollywood doesn’t help its case by clinging to the dated notion that one needs cable to get the latest shows. Perhaps if it made more content legitimately available on the web via Netflix, Amazon and Hulu, the piracy problem would improve. This cartoon from The Oatmeal sums it up perfectly. I too would love to watch the Game of Thrones series, and I’m willing to pay a one-time cost for it, but I am definitely not willing to resubscribe to premium cable just for the privilege. I was also rather appalled at the number of ads (on every page refresh!) NBC made you sit through after you had paid to stream the Olympics online.
That being said, copyright violation is a real problem for bloggers, as I have blogged about previously. I’ve also written an article on how to use images in your blog without violating copyright.
Now DMCA takedowns will have more impact: on August 10, 2012, Google announced on its blog that “valid copyright removal notices” will become a new signal in its rankings. What does this mean? As is typical for Google, the announcement is vague on the details, but if your website receives “enough” DMCA notices (the number is undisclosed), the entire site will start to rank lower in Google’s search results pages. So no longer is just one page affected; your entire site can be impacted by the notices. The idea, of course, is that the worst offenders will no longer rank as well as the original content.
As usual, the answer is not black and white. Article marketing certainly doesn’t work as well as an SEO tactic as it used to. Panda devalued most if not all article marketing directories, so they rank lower and for fewer keywords. Most SEO firms will tell you that guest posting is the new article marketing and not to bother with article marketing at all. But is article marketing completely dead? No. Depending on your competition and your keyword, it’s still an easy way to get some exact match anchor text links to your site, and for some narrow niches that can be enough. It’s certainly better than buying links.
But before you rush out, create a bunch of articles and upload them to every article directory you can find, you might want to read through a case study I recently did exposing article marketing’s dirty little secret. While recently studying the backlink profile of a client who had lost traffic due to Google’s Penguin update, I started noticing what people had done with articles from these directories. So I took it one step further and did a small case study on what happens to the articles once they are published.
With Google’s increased focus on link spam since the Penguin release, a periodic review and pruning of your site’s backlinks is now a must.
Your backlink profile may hold unpleasant surprises. All too often, website owners have no clue what kind of links a past SEO company may have built for them. I’ve found gaming sites and even porn lurking in the backlink profiles of websites whose owners would be horrified by the association. Even if you have never outsourced to a link building service or used link building software, the link building you did 5 years ago, which seemed fine at the time, might be a problem waiting to happen – even if you haven’t had an issue with Penguin yet.