Guest posting can broaden your blog’s reach and build quality backlinks to your site. But as with any popular technique, abuse is on the rise. I manage guest posting campaigns for clients and have written a few guest posts myself. And even though I don’t provide a contact form on this blog and I don’t solicit them, I get a number of emails pitching me guest posts for WebEnso, which I usually don’t accept.
As an SEO consultant you develop certain filters, some might even say blinders. You look at websites differently than other people do. Some aspects of a website you ignore; others, like the user experience and the content, you pay close attention to. A good example of this is site search, the search functionality you find on many sites.
A routine technical SEO recommendation is to noindex any search pages that are crawlable on a site.
If you are not sure what I mean by search pages: go to the search box on this site, webenso.com, and type in “wordpress seo”. You’ll get a page that looks like the one below, with a ?s=wordpress+seo query string in its URL.
Either you don’t want Google to find those URLs at all, or the pages should have their meta robots tag set to noindex.
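On a WordPress site like this one, that recommendation usually translates into a tag like the following in the search page’s head (a sketch; the exact markup depends on your theme or SEO plugin):

```html
<!-- In the <head> of the search results page, e.g. /?s=wordpress+seo -->
<!-- Tells search engines not to index this page, while still following its links -->
<meta name="robots" content="noindex, follow" />
```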
But I’m not here to dive into the details of noindex and technical SEO. My point is that as an SEO you disregard the search pages once they are addressed and forget about them. This is what I mean by blinders. We SEOs are so focused on Google search, with an occasional journey into Bing, that we don’t always see the potential that other search engines have to disrupt the search industry.
When you create a video for your online marketing you have two distinct strategies available to you:
By embedding video on your site, I’m not talking about copying and pasting the YouTube video embed code onto your site. Yes, that might help your visitors stick around longer, which indirectly helps your site’s SEO. However, unless you have a really powerful site, your page is almost never going to outrank the video you have uploaded to YouTube, especially if the video is well tagged there.
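For clarity, the copy-and-paste YouTube embed I’m referring to looks like this (VIDEO_ID is a placeholder, not a real video):

```html
<!-- Standard YouTube iframe embed; VIDEO_ID is a placeholder -->
<iframe width="560" height="315"
        src="https://www.youtube.com/embed/VIDEO_ID"
        frameborder="0" allowfullscreen></iframe>
```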
I’ve given a couple of presentations on WordPress SEO and recommended several plugins. Due to demand I’ve been providing my list as a PDF to attendees. Now here it is for my blog readers.
Most people outside the industry think that Technical SEO is all about on-page optimization, in other words, updating a page’s meta data and content to have the right keywords in it. However, there is another aspect of Technical SEO that has nothing to do with keywords and can be very powerful for larger dynamic websites. This discipline makes sure the site is friendly to search engine crawlers: the search bots can find all the pages, and there are not multiple URLs for a single page, also known as duplicate content. Duplicate content within a site is more common than you might think, and I have repeatedly seen significant traffic increases from eliminating it. This is a case study of how we addressed Google’s over-indexation of duplicate content and increased traffic by more than 150% overall.
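One common way to consolidate multiple URLs for a single page, sketched here with a made-up example URL, is the canonical link element: whichever URL variation a crawler lands on, the tag points back to the one preferred version.

```html
<!-- Served in the <head> of every variation of the page, e.g. both -->
<!-- /product?id=42&sort=price and /product?id=42&ref=email -->
<link rel="canonical" href="http://www.example.com/product?id=42" />
```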
The site in question is a B2B eCommerce site that does a brisk business on weekdays but has much lower traffic during weekends and holidays. The chart below tracks its weekly non-branded organic search traffic for the last 9 months. Here “non-branded” means that we exclude organic traffic arriving via company and domain name keywords; note, however, that we do include product keywords, which account for a significant part of their traffic.
This post originally appeared on October 28, 2011; in January 2014, I made several changes to bring it up to date. On August 28, 2014, Google ended the Google Authorship program, citing low participation and concern over how it was cluttering the search experience, especially for mobile users. Despite this, you may still want to implement it if you are active on Google+. Users who are logged into their Google account and follow you on Google+ will see the enhanced snippet. Otherwise it’s not worth the effort.
Ah, the old days. The early days when all Google looked at was how many backlinks pointed to your page. These days, trust, credibility and authority signals are a big part of what Google looks for when deciding whether to rank your page in the top ten. In the summer of 2011, leveraging Google profiles, Google released another feature that helps it do just that. You can now tag your articles with
rel="author" HTML markup to link your pages to your Google profile, which makes you more “real” in Google’s eyes. The nice thing about this is that Google will then put a picture of you next to the snippet of your article that shows up in the search results, as Danny Sullivan appears below.
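The markup itself is simply a link from the article to your Google profile; here is a sketch with a placeholder profile ID (1234567890 is not a real profile):

```html
<!-- In the article byline; 1234567890 is a placeholder Google profile ID -->
<a href="https://plus.google.com/1234567890" rel="author">About the Author</a>
```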
Google’s desire to tie content to “real” people perhaps sheds some light on the 2011 mass removal of non-user Google+ accounts.
Recently I came across some client WordPress sites that had rel="prev" and rel="next" tags embedded on their archive pages. The sites were set up as classic WordPress blogs, with a reverse-chronological listing of post teasers on the home page and a “Next” button leading to the older posts.
It was these pages that had the tags; for example, looking at the HTML source of page 2:
<link rel="prev" href="http://www.awordpresssite.com/page/1/" />
<link rel="next" href="http://www.awordpresssite.com/page/3/" />
BTW: I just randomly came up with the domain “awordpresssite” for the above example; I don’t know anything about the parked domain that happens to live there.
The question is: is this the correct use of the rel="prev" and rel="next" tags? My answer is: probably not.
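For contrast, the use case Google’s announcement emphasized was a single piece of content split across several URLs, such as a multi-part article. On part 2 of a hypothetical three-part article (example.com and the URL pattern are made up), the tags would look something like this:

```html
<!-- In the <head> of part 2 of a three-part article -->
<link rel="prev" href="http://www.example.com/long-article/part-1/" />
<link rel="next" href="http://www.example.com/long-article/part-3/" />
```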
HTTP (Hypertext Transfer Protocol) is the communication language (or, in technical terms, the protocol) that underlies the web. When your browser contacts a web server, they speak HTTP to each other. It’s a little like international road sign standards (which are sadly not in use in the US): even if you don’t speak the country’s language, you can understand the symbols on the road signs. HTTP is how mobile devices, different browsers like Internet Explorer and Chrome, and servers running on different operating systems can all understand each other.
HTTPS is HTTP on top of another protocol called SSL (Secure Sockets Layer), so underneath this common language there is a layer that encrypts the messages going back and forth over the web. This makes it much more difficult for a malicious third party intercepting those messages to use the data for nefarious purposes.
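To make that concrete, here is roughly what a minimal HTTP exchange looks like on the wire (headers trimmed for brevity; example.com is a placeholder). With HTTPS, this same conversation happens inside the encrypted SSL tunnel, so an eavesdropper sees only scrambled bytes:

```
GET /about/ HTTP/1.1
Host: www.example.com

HTTP/1.1 200 OK
Content-Type: text/html

<html>...</html>
```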
It’s never boring in SEO land. Last week the SEOsphere was all abuzz with the October 16th release of the Google disavow links tool. This past summer I, like many other SEOs, was quite busy doing “Penguin recoveries”. This name, although convenient, is actually misleading: most sites I’ve been helping aren’t recovering from Penguin itself, but rather from manual penalties imposed by Google. The difference? Penguin is an algorithmic change that focuses on the quality of the backlinks to your site and lowers your ranking if much of your backlink profile is low quality and spammy, while a manual penalty is an action taken by a human, usually accompanied by an “unnatural links warning” message in GWMT (Google Webmaster Tools). With Penguin, if you clean up the problem, your site should recover in time; with a manual penalty you have to submit a reconsideration request to get your traffic back.
What we mean by indexation is the pages of your site that Google has crawled and put into its index. When you type a search term into Google, it uses the index as a repository from which to retrieve pages to show in the search engine results pages. Just because a search engine has crawled a page on your site doesn’t mean it puts that page into the index. For one thing, you might have told it not to. Or Google might have decided the page isn’t worthy of being indexed.
To see whether Google has indexed your site at all, type the following query into Google, replacing “foo.com” below with your own domain name.
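The query itself uses Google’s standard site: operator (foo.com is the placeholder mentioned above):

```
site:foo.com
```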
If you get a list of URLs, you are in business; if you don’t, the simplest way to get your site indexed is to add a link back to your site from a social media network such as Twitter or LinkedIn. Assuming your site is indexed, the next step is to click through the pages (via the links at the bottom of the search results page) to see what Google has indexed. You might be surprised.