I had the great privilege to have coffee with Gary Illyes (@methode), a Google webmaster trends analyst dedicated to creating a better search experience for users. Well known in the SEO industry, Gary often speaks at search conferences and is frequently interviewed and quoted in the major SEO blogs.
Speaking of Casual Conversations…
One thing I do want to make clear is that the Q&A below is not an exact transcription. It was derived from the gist of our conversation and I've paraphrased, but I've done my best to accurately reflect his comments.
Now that is out of the way, let’s dive in.
Also keep in mind that only about the first 10,000 pixels of the page are rendered. It's not infinite.
Some of us have noticed that the cache: command just shows the HTML snapshot crawled and not the fully rendered page.
That's not strictly true; for some important pages, the cache: command will show the fully rendered page.
I’ve been really struggling to get my clients to fully implement hreflang tags for international SEO. Part of the challenge is that different teams own the different websites (or subsections of the website). Any thoughts?
I spent half a year interviewing site owners and SEOs on this topic with the goal of making it better. People find hreflangs confusing and the implementation resource-intensive. I haven't come up with a solution yet, but I have a couple of comments.
Remember that you can build an XML sitemap with hreflang annotations as an alternative to placing hreflang tags on the page. Just keep in mind that the interlinking requirement still applies, so if you have separate websites you need to place an hreflang XML sitemap on each.
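To make the sitemap alternative concrete, here is a minimal sketch of an hreflang sitemap in the standard sitemaps.org format, using `xhtml:link` elements. The domains and paths (`example.com`, `example.de`) are placeholders, not real sites; note that each URL's entry lists all of the alternates, including itself, which is how the interlinking requirement is satisfied:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en/widgets.html</loc>
    <!-- every alternate, including this page itself -->
    <xhtml:link rel="alternate" hreflang="en"
                href="https://example.com/en/widgets.html"/>
    <xhtml:link rel="alternate" hreflang="de"
                href="https://example.de/de/widgets.html"/>
  </url>
  <url>
    <loc>https://example.de/de/widgets.html</loc>
    <!-- the reciprocal entry on the German site's sitemap -->
    <xhtml:link rel="alternate" hreflang="en"
                href="https://example.com/en/widgets.html"/>
    <xhtml:link rel="alternate" hreflang="de"
                href="https://example.de/de/widgets.html"/>
  </url>
</urlset>
```

Because the two sites are separate, each would host its own copy of a sitemap like this covering the URLs it owns.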
If I have agreement from four out of the ten international site owners to implement the hreflangs, should I include the hreflangs referencing the other six sites even though there won’t be reciprocal links yet?
No, I would limit the tags to the four country/language tags of the sites that you are working with. You can expand the tags later.
Note: at this point Gary voiced his view that SEOs should sit with the engineering teams so they can influence engineering priorities. There are definitely days I agree!
There’s a lot of confusion about international SEO; inspired by my conversation with Gary, here’s an infographic that I put together:
My conversation with Gary continues in the next section.
When you use the URL parameters feature of Google Search Console and set a parameter to “No URLs”, does that affect the indexation of those URLs as well as the crawling?
It affects both.
What kind of projects are you working on now?
One thing we are taking a look at is adding additional rules to the robots.txt file. A common problem we run into is that Googlebot is blocked by robots.txt disallows from accessing important resource files, such as CSS and JS files. We are talking about the idea of a rule that allows crawling of dependent resources even when they are disallowed.
Would that be the default, i.e., allow crawling of dependencies even when blocked?
That would be great, but the legal team said no. So if we do it, it would have to be an explicit rule.
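In the meantime, sites can address the problem Gary describes with explicit Allow rules, which Googlebot already supports alongside wildcards. A minimal sketch (the `/assets/` path is a hypothetical example, not from the conversation):

```
User-agent: Googlebot
# Block the directory generally...
Disallow: /assets/
# ...but explicitly allow the resource files Googlebot
# needs to render the page.
Allow: /assets/*.css
Allow: /assets/*.js
```

Since the more specific rule wins in Google's robots.txt handling, the CSS and JS files remain crawlable even though the rest of `/assets/` is disallowed.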
What about a feature or new rule to address the caching of the robots.txt file?
Yes, the time-to-live for the robots.txt file is 23 hours. But you can use the robots.txt tester in Search Console today to force a refresh. I believe it is a separate button.
I rely heavily on the site: command and I hear it’s going away. Also why is it so inaccurate?
No, the site: command is not going away. SEO usage of the site: command is a small use case; it's mostly used by scrapers, so the inaccuracy is deliberate.
What else would people find interesting to know about Google search?
It can be difficult to strike a balance between the desire to innovate and offer new features and the reality of the very large community we serve. We know that it takes three years for 50% of the internet to adopt a change; that's a statistic we discuss often when evaluating changes.
Like with structured data.
This is just my vision, but in five years I think we might not need structured data because we will be able to understand the page that well. People focus on the rich-snippet benefit of structured data, but it also helps us understand the page better. When there is a price with a line through it and another price showing next to it, a visitor to the page understands that the product is on sale at a discount. The strikethrough doesn't mean anything to the algorithm; it needs the structured data markup to understand that there is a discounted price.
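Gary's sale-price example maps directly onto schema.org markup. A minimal JSON-LD sketch (the product name and prices are invented for illustration), showing how the discounted price is stated explicitly for the algorithm rather than implied by a strikethrough:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "79.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```

The visual cue (original price crossed out, sale price beside it) is for humans; the `Offer` block states the current price unambiguously for machines.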
Hope you enjoyed the above article. Thanks Gary!
Kathy Alice Brown is an SEO expert specializing in Technical SEO and Content. In her spare time she loves to get outside.