Stephen Kenwright – Director of Search, Branded3
I think, when we talk about machine learning – and we're talking about it more and more – we have to ask what these systems are actually learning from. It's not about a growing link graph, it's not about trying to understand content quality better, or factors like that. What Google is really trying to do with machine learning is process large amounts of data about what people are actually doing.
And they are getting better and better at that. When you think about where we are heading – a huge number of searches made by voice, and so on – it becomes very clear: there is one answer, and if you have to ask again, the first one clearly wasn't a good answer to that question.
Gerry White – SEO Consultant, Just Eat
And I would say one of the things that Google is desperately trying to do is make sure that the first answer you click on is the right answer. So they are looking at experiences like pogo sticking, where somebody clicks into a result and immediately jumps back again.
That doesn't really look right, generally, unless there is a reason for it – say, they've booked a flight and are just checking the luggage weight allowance, or looking up some other specific fact like that. So there are some good examples of why people would do that, but generally speaking, if somebody goes to a page and comes straight back out, that's not a good experience.
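As an illustration only – Google's actual click models are not public, and the threshold here is entirely hypothetical – a crude pogo-stick heuristic might flag a click where the searcher bounces back to the results page within a few seconds:

```python
# Illustrative sketch only: Google's real click-modelling is not public.
# A crude heuristic: a click counts as a "pogo stick" if the searcher
# returns to the results page within a (hypothetical) threshold.

POGO_THRESHOLD_SECONDS = 10  # assumed cut-off, for illustration

def is_pogo_stick(click_time: float, return_time: float) -> bool:
    """True if the user bounced back to the SERP within the threshold."""
    return (return_time - click_time) < POGO_THRESHOLD_SECONDS

# A user who bounces after 3 seconds looks like a bad match...
print(is_pogo_stick(0.0, 3.0))   # True
# ...while one who stays for a minute probably found what they wanted.
# (As the luggage-allowance example shows, a quick return isn't always
# bad -- a real system would need more context than this.)
print(is_pogo_stick(0.0, 60.0))  # False
```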
Dawn Anderson – Director, Move It Marketing
And it's been confirmed that the human quality raters do actually use that: they use it to differentiate between good and bad matches for a search query. Increasingly, when we look at things like UX (user experience), we can take things like speed into consideration. I know we are going to talk separately about speed, but somebody can have a positive experience via speed – or take the classic example now of interstitials.
We know that's a bad user experience, and we know that Google is increasingly taking negative experiences like that on board. An interstitial is now effectively a thumbs down, so it does affect rankings. I think it's going to be an increasingly important factor.
Barry Adams – Founder, Polemic Digital
The fact that Google is now actively rendering pages with a headless browser as it crawls and indexes means that it can see exactly how a page will appear to an end-user, and make quality evaluations based on that – where content sits, how content interacts, and things like that. What I find a bit worrying is that these signals are scalably exploitable.
We see that with some services offered online, where you can get a temporary boost in your rankings just by paying a lot of people to click on a specific search result that you identify. And the ranking boost lasts for as long as you pay people to perform those clicks.
So it will be interesting to see how Google counteracts possible abuse of those ranking factors – for example, artificially keeping people on a page long enough that, when the user eventually gets back to Google search, it looks like a new session rather than a refinement of the same query. That is something Google will probably keep struggling with.
It is probably also the reason why they haven't openly confirmed click-through rate from search results as a ranking factor, despite the fact that a lot of tests show it is at least a temporary one. So it will be an interesting thing to watch over the next couple of months and years.
I think they have confirmed it, actually. In interviews with people who have argued that click-through rate is a ranking factor – I think it was Andrey Lipattsev again, in a roundtable on crawling – they said that if something looks to be a popular page, in that it's getting lots of clicks, then they will go over there and have a look. Even if that popularity is inflated and only temporary, it does have an impact on what changes on Google's search engine side.
One of the things you mentioned was that Google is rendering pages. I think one of the things a lot of people don't do often enough is use the "Fetch & Render" tool. Within Webmaster Tools (now Search Console) you can fetch a page and see how it actually looks to Google, and it's surprising how often you do it and it just doesn't quite look right.
Often something very simple is interfering with the page. You can do this for both desktop and mobile, and it's worth doing for both – just to double-check that you're not going to be subject to the interstitial penalty, and that nothing unexpected is happening inside the page, perhaps because some CSS is being blocked. So it's definitely worth doing.
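One common cause of a broken Fetch & Render preview is a CSS or JavaScript file that robots.txt blocks Googlebot from fetching. A minimal sketch of checking for that, using Python's standard-library `urllib.robotparser` (the domain, paths, and robots.txt rules here are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a CSS directory -- a common
# reason a page renders incorrectly in Fetch & Render.
robots_txt = """\
User-agent: Googlebot
Disallow: /assets/css/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check each asset the page depends on against the rules.
assets = [
    "https://example.com/index.html",
    "https://example.com/assets/css/main.css",
    "https://example.com/assets/js/app.js",
]
for url in assets:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "BLOCKED")
```

In practice you would pull the real robots.txt from the live site and the asset list from the rendered page, but even this quick check catches the "blocked CSS" case the Fetch & Render tool surfaces visually.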
It seems like an odd thing to say now, but the dangerous part of UX is when people start talking about SEO as if it were just UX. A great UX doesn't beat a penalty; a great UX doesn't make up for having no links. UX is one of several ranking factors, and you can't neglect everything else – from architecture to linking and so on – just for the sake of user experience.
Yeah, you can't have a great UX and then never actually talk about what you do on the page. With the best will in the world, search engines can only pick up so much, so you do need relevant content and all the other factors that play a role in SEO.
I, for one – not being Google's biggest fan, to put it mildly – welcome the interstitials penalty, if you can call it that (it's really a filter, if anything), because interstitials annoy the crap out of me and I want them to die a swift death.