Google Developer Advocate Martin Splitt joins Search Engine Journal Founder Loren Baker in this live Q&A about Google’s Core Web Vitals, the delay of the Page Experience update to June, and the overall performance and speed improvements websites need to compete in Google and convert users.
Here is the entire transcript of the show (please excuse any transcription errors):
Loren Baker:
Hi, everybody. This is Loren Baker, Founder of Search Engine Journal. And with me today we have a special show all about core web vitals and the page experience update. With me today, I have none other than Martin Splitt of Google. Hey Martin, how’s it going?
Martin Splitt:
Hi Loren. Pretty good. How are you doing?
Loren Baker:
Pretty good, thank you. Thanks for staying up so late on a Thursday evening in Switzerland. It’s 11:00 AM here on the West Coast so I really do appreciate it. I’ll do the same for you in the future.
Martin Splitt:
Aw. Thank you very much, Loren.
Loren Baker:
So yeah, we have some folks hopping on but let’s first get started. Could you take a second just to introduce yourself, Martin? A little bit about what you do, what you focus on, et cetera, et cetera.
Martin Splitt:
Sure thing. Yeah. So my name is Martin Splitt, I am a Developer Advocate on the Search Relations team here at Google. I work with Gary Illyes and John Mueller and Lizzi Harvey and Cherry Prommawin and all the other lovely people. While our team is generally concerned with Google Search, I, most of the time, specialize in rendering, crawling, indexing, and specifically JavaScript, which usually influences the core web vitals, and that’s why I am very happy to talk about that topic as well. There might be super specific questions that I might not have an answer to, in which case I would refer you to our wonderful webmaster forum or the office hours or Q&A sessions that we do on YouTube every now and then.
Loren Baker:
Excellent. Well, it’s a pleasure to have you. Too bad we couldn’t have Gary and John but we’ll get all three of you at once, I think, maybe underwater or something. You’re also big into diving, you’re a diver as well, right?
Martin Splitt:
Yes, correct. Yeah, I do dive in warm water as well as cold water.
Loren Baker:
Okay, nice. Well, no pun intended but let’s dive into the core web vitals. Okay, all right.
Martin Splitt:
All right.
Loren Baker:
So just as a reminder for all of you that are currently watching live, if you have any questions for Martin, please feel free to ask them in the comments, either on Facebook or on YouTube where we’re streaming live right now. Martin, before we had you on, we made sure that we sent out a survey to a lot of the SEJ readers and community. We had a lot of questions come in. Some are very, very specific, some are more broad stroke, some have nothing to do with core web vitals, and some cover everything to do with core web vitals. But before we get started, a lot of the industry was expecting this update to come out this month, it was pushed back, so is there a specific date that we can expect the page experience update to be unleashed? And do you expect it…
Martin Splitt:
There…
Loren Baker:
Okay, go ahead.
Martin Splitt:
No, go ahead.
Loren Baker:
And do you expect it to happen all at once or over the course of a week or more on that?
Martin Splitt:
So there is no specific date that things will start happening. Currently, what’s been announced is mid-June, so it might be anytime in, well, what would count as mid-June. It will not be an off-on kind of situation, it will gradually roll out, it will gradually add things to the mix of signals and it will gradually start being effective. So not like a full-on switch from nothing to all of it, and there’s no date announced yet.
Loren Baker:
Great. So we’ll do what we can to prepare for mid-June.
Martin Splitt:
So I think the timeline is roughly starting mid-June and then it should be fully in effect at some point in August.
Loren Baker:
Okay good. We have the beginning of the summer to the end of the summer on that front. And you said things will be rolled out gradually, do you see any signal becoming more important in that rollout or prioritized?
Martin Splitt:
Not that I’m aware of.
Loren Baker:
Okay.
Martin Splitt:
What I do know is that at the beginning we will definitely roll out for mobile first and then eventually desktop will join the mix as well.
Loren Baker:
Which was confirmed I think earlier today, right? So mobile and desktop, for those of you that are possibly only focusing on one or the other, it’s time to focus on both, right? Which is interesting because I find, from an SEO perspective, a lot more companies seem to focus on desktop even if the bulk of their traffic is mobile. So thank you for bringing more awareness to the mobile experience and mobile usability as well.
Loren Baker:
All right. So I’m just going to dive into, not to use a dive thing again, but a lot of different questions on core web vitals and the updates, and then we can take it from there. So one question I see a lot of is, how relative are core web vitals to the space that someone competes in? So for example, if they’re traditionally competing against other sites which are slower than them, and have not updated their core web vitals, and they’ve updated them a bit but they’re scoring mediocre, maybe they’re scoring “needs improvement” across the board, maybe some good, and maybe some bad. But when they check out the bulk of their competition, their competition’s pages aren’t scoring very well at all. Is it still as important to prioritize all of these fixes if the folks in their competitive space are not? And how much of a difference will that make?
Martin Splitt:
This is really, really hard to answer because it obviously is one out of many signals and obviously, or it should be obvious, that relevancy and good content still matter more than speed, because that content delivered fast is still that content. So assuming all other things being equal, which they never are. All other things being equal, you might see that the core web vitals then have a tiebreaker effect where you would see a ranking improvement. Obviously that is practically never the case, so depending a little bit on your niche and on the specific circumstances of your page versus your competitors’ pages, you might see bigger effects or you might see smaller effects, depending again on the query, on the intention, on the location, on all the other factors that might be there. So I can’t say it’s not going to be a big shift, because for some people it will be a big shift, and I can’t say it’s going to be a big shift, because for some people it will be insignificantly small. So that’s something that remains to be observed.
Loren Baker:
Do you think that shift will grow over time though, even if it is small at the beginning?
Martin Splitt:
Maybe. Maybe it won’t. I expect it to be roughly similar to HTTPS, maybe a bit stronger, because HTTPS is now a component of the page experience signal once the rollout happens.
Loren Baker:
Interesting. All right, so HTTPS is also a component of the page experience update once that rollout happens as well. Next question that came through, and there are a lot of these. I’ll start with the non-Google component first, then we’ll go into the Google component. There are a lot of platforms out there, Shopify comes to mind, where typically a developer or a company will go for the base platform but then they’ll add on lots of bells, whistles, and layers that help with other forms of marketing, right? So for example, one of the top eCommerce D2C apps out there for managing and segmenting your email lists is very, very slow and slows down the entire rendering process, right? There are other apps out there that are utilized by eCommerce companies, such as review apps, such as chat buttons, that either slow down the page loading or also mess with the, and I always mispronounce this, the cumulative layout shift, right? So the page loads, suddenly the reviews load at the bottom, then the stars fill in at the top. Suddenly, the chat button appears after the load of the page. Maybe some of this is done from an early rendering experience, maybe some of it is delayed until the end after the main components of the site load, but it still is detrimental to the experience, right?
Martin Splitt:
Yeah.
Loren Baker:
Is the core web vitals update going to give people a break, so to speak, if they’re using a third-party app which is leading to their site having lower scores than it would if they had no apps on the page? Kind of a strange question. I mean, a lot of these…
Martin Splitt:
No, I understand where you’re coming from. Yeah, I understand where they’re coming from and I’ll probably answer another question following up on that one, which might be, what about using certain Google products like Ads or GTM or Analytics. The answer for all of these questions is pretty much the same. Think about what are we trying to do with the page experience signal? What we’re trying to do there is we try to quantify what makes the user have a good experience with a page. And it doesn’t matter what tools are being used, what libraries or frameworks are being used, if there’s JavaScript on the site, if there’s no JavaScript on the site, if there’s apps on the site, if it’s all first party on the site, if it’s using Google Analytics or Google Ads or Google Tag Manager, none of that matters: if it slows down the page, it’s detrimental to the experience of the user. It doesn’t matter where the reason is coming from, if it’s bad first-party code or bad third-party code, everything is possible to do with less impact on the core web vitals than it is probably done right now, out of not being aware of that being a problem, or a lack of care, or other technical reasons that need to be addressed at some point.
Martin Splitt:
As developers, we like to speak of that as technical debt. And so if they make things slower, that reflects in the core web vitals and that’s what matters in the end. Sure, I’m seeing comments saying, these apps might actually make the experience better for users, but do they? Because if it’s like, “Oh yeah, this app gives us a chat experience.” Yes, but a chat can be implemented in a way that does not make the page slower for everyone who does not want to interact with the chat, or even those who want to interact with the chat. It’s not a measure of should we have a chat on our page, yes or no. Yes, if it makes the experience better for the user, have a chat. Just don’t build it in a way that it actually makes the page worse. That’s the thing.
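As an illustration of what “built in a way that doesn’t make the page worse” can look like in practice, here is a minimal sketch that defers a hypothetical chat vendor’s script until the visitor actually opens the chat; the vendor URL and element ids are placeholders, not a real product’s API:

```js
// Minimal sketch: load a (hypothetical) third-party chat widget only when the
// user shows intent, so it doesn't compete with the main content for bandwidth
// and CPU during the initial page load. The script URL and ids are placeholders.
function loadChatWidget() {
  if (document.getElementById('chat-widget-script')) return; // only load once
  const script = document.createElement('script');
  script.id = 'chat-widget-script';
  script.src = 'https://chat.example.com/widget.js'; // hypothetical vendor script
  script.async = true;
  document.body.appendChild(script);
}

// A static, pre-rendered button sits in the HTML from the start (no layout shift);
// the heavy vendor script is only fetched on the first interaction.
document.getElementById('open-chat-button')
  .addEventListener('click', loadChatWidget, { once: true });
```

The same pattern works for most interaction-driven third-party features: keep a lightweight placeholder in the initial HTML and pull in the heavy script only when someone actually needs it.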
Martin Splitt:
And we can argue about whether the core web vitals are really completely modeling that. I would say they don’t, but it’s the best approximation that we have right now, and actually measuring performance and measuring experience for users is really, really hard, and we will see an evolving set of metrics as part of the core web vitals over time. But generally speaking, the idea is to give pages that are giving a good experience to users a boost. And I don’t think it’s a good experience if I am reading something about a product I’m potentially going to buy, and then whatever I’m reading is shifted down because there are some review stars popping in at the top. Does that mean you shouldn’t have review stars? No, have review stars, but make space for them so that when they pop in nothing else moves on the page. It’s not that it’s impossible to do this.
Martin Splitt:
I get this question a lot with cookie consent banners. So is a cookie consent banner that I have to have for legal reasons going to drag down my CLS? Probably yes, if it’s implemented in a way that is disruptive to the user it might actually cause cumulative layout shift. If it’s only causing a little bit of it, that’s not even a problem; we’re not saying zero is what you need to target, you need to target something that is reasonable, which I think is 0.1, which is based on the percentage of the effective viewport and the amount of shifting that happens. So there is a certain amount of shifting that can happen while still staying under the threshold of what core web vitals consider a good experience. But if you implement it, let’s call it lazily, and just go like, “Yeah, it’s going to be fine, yes, it’s going to move everything below once it pops in,” then that’s not a great way of implementing it and you might want to reconsider the way that you implement it.
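For anyone trying to see where that 0.1 CLS budget is going, here is a minimal sketch using the browser’s Layout Instability API (supported in Chromium-based browsers) that logs each shift and which elements caused it; reserving a fixed-size slot in CSS for banners and review widgets keeps them out of this budget entirely:

```js
// Minimal sketch: log individual layout shifts in the console so you can see
// how much a cookie banner or late-loading widget contributes to CLS.
// Requires a Chromium-based browser that supports 'layout-shift' entries.
let clsTotal = 0;

new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // Shifts caused by recent user input are excluded from CLS by definition.
    if (entry.hadRecentInput) continue;
    clsTotal += entry.value;
    console.log(
      'layout shift:', entry.value.toFixed(4),
      'running total:', clsTotal.toFixed(4),
      entry.sources?.map((source) => source.node) // elements that moved
    );
  }
}).observe({ type: 'layout-shift', buffered: true });
```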
Martin Splitt:
If you’re not implementing it because it’s coming from a third party, let them know, tell them, “Hey, by the way we noticed that your solution does this, we really like your solution but we really don’t like how it kind of treats our users, so would you consider fixing that?” And there are ways of doing it, it just needs to be done.
Loren Baker:
It’s a good point on many levels. One, you may think something is good for the user because you think, “Oh, having this review section helps people know that this product was reviewed well, therefore I should have it and the user wants it.” Question one, does the user really want it, right? Question two, if the user does want it, how do we implement it so it doesn’t move the rest of the page? Same thing with chat buttons. I’m seeing more and more, just as an average internet user, especially on a mobile device, I’m seeing more and more static chat buttons utilized and a move away from these dynamic chat buttons, where I’ll try to scroll down a product page and I’ll hit the chat multiple times because I have giant thumbs, right, or something along those lines, right? And you have to take that into account.
Martin Splitt:
There’s one shop that I really like but I don’t visit on mobile, and that has cost them sales actually, because they have a chat that pops over everything on mobile. On desktop it’s actually not very intrusive, it just pops in the corner, and I’m like, “Yeah, fine, whatever,” I ignore that. But if I’m trying to buy a product and I’m looking for the product and I go to that product page and then a huge chat takes everything away and I have to awkwardly scroll on the mobile phone and then tap it away and then actually… Not a great user experience, I’m sorry.
Loren Baker:
Agreed 100%, happens to me all the time actually. And then it’s funny too because a lot of the eCommerce companies that I talk to about chats, I’m like, “Do people use the chat button? Is this important to you? Does it convert?” “Oh, I don’t know. Not sure. I don’t know.” Well, that chat button is currently ruining the user experience.
Martin Splitt:
That’s something that generally is a problem, people are sometimes looking at the wrong things. I get that a lot when people are like, “Oh my God, we lost impressions, Google Search Console shows us that we lost impressions.” I’m like, “But did you lose clicks?” “I don’t know, I have no idea.” Well, if you got impressions in the search results and you didn’t get the clicks that you actually need to, I don’t know, convert, sell the product, sell the service, whatever, did you really lose much?
Loren Baker:
Yeah exactly. Next question that comes in, which you alluded to, does this also apply to Google Analytics, Google Ads, Google Tags, anything on the Google side that’s powered by Google, maybe not the same division of Google that you work within. Does this also affect the site negatively?
Martin Splitt:
Yeah. If it makes it slower, it makes it slower. It doesn’t matter where it comes from, if it’s from Google or not. And that’s something that people need to understand: Google Search, the search engine that you’re using and that you’re working with, is very well separated from the rest of Google, and that’s for the reason that it would be unfair to favor our products, right? I would not be okay with us saying, “Oh yeah, sure, we’ll give Analytics and Ads a pass,” while every other ad provider or any other analytics provider has to deal with the fact that they need to actually optimize for core web vitals. We don’t do that. Everyone gets the same playing field, and obviously sometimes people at Google try to be like, “Hey, Search, can you help us with this?” And we’re like, “Here’s the webmaster forum, here are the office hours, here’s the documentation. That’s what you get: public support channels for everyone, including Googlers.” That’s why I find it very risky when people are like, “Oh, we’ll be using X because it’s a Google thing.” Doesn’t matter. If it makes your website slower, it makes your website slower.
Loren Baker:
Right, and then also analytics is a front-end implementation so there are ways to change how it’s implemented on the site too. And it helps keep the rest of Google accountable, and you’re right, it would be a little bit unfair on that side to do so. Are subdomains evaluated independently or as part of the root domain for core web vitals scoring?
Martin Splitt:
I actually don’t know that specific detail, that’s something that you would have to ask elsewhere, that elsewhere probably being the webmaster forum, which is a good place to ask these questions.
Loren Baker:
Okay great. Another question that came in that’s a little bit similar is, are noindex pages being used to evaluate a site’s core web vitals as well as indexed pages? So pages that are blocked from indexing and/or disallowed from content updates.
Martin Splitt:
Right. I mean, in the end, a page gets a boost, if it’s not in the index it can’t get a boost in ranking, right? You have to be in the index to be ranking. So if you want to see a ranking improvement on something that is not indexed then nah.
Loren Baker:
So let me ask this a little bit differently with a real world example. So I have a site, lorensenergydrinks.com, right? And most of everything off of lorensenergydrinks.com is open to Google indexing. But then for some reason, I decided to do an ad funnel campaign to give away like a free energy drink and then people can subscribe and save. So I have a subfolder that I’ve set up, slash landing pages slash Instagram campaign. And I noindex that from the index because I don’t want that to appear, but that’s set up separately from how I would set up one of my main pages. That’s set up with bells and whistles and all these things and this and that, and let’s say most of my site passes core web vitals tests with flying colors. But then I have these ads over here, these ad pages, right, which I want to noindex, right? Now, can Google still hit them, and if those pages are totally, totally failing the test, will that affect the rest of my site?
Martin Splitt:
As far as I’m aware, we’re not mixing these things, and specifically the ranking, again, is per page. So for that page, we wouldn’t have any data because we don’t put it in the index, so we can’t store any core web vitals results for that specific thing or look it up from whatever data source we’re looking it up at. What I don’t know is if we are accumulating, and again, I do not know the answer to that. What I don’t know is if there is some sort of accumulation that we do in case we don’t have signals for something, but it’s not as if, “Oh, you have a page that doesn’t pass core web vitals, hence there will be no core web vitals boost applied to your entire site.” That’s not how that works.
Loren Baker:
Okay gotcha. Well, anyway it is a good excuse for people to get their ad pages in line.
Martin Splitt:
To address Brenda’s comment, I do think we do gather the information and this probably shows up in the core web vitals report, because the data is collected from real user metrics. So a user visiting that page does send back telemetry and we do see the data. What I’m not so sure about is how exactly it’s being used in ranking, because I’m not familiar with the specifics in ranking and I don’t want to, I do not usually answer ranking questions, which is why I’m going out on a limb here quite a bit, and that’s why I’m saying I’m not sure how exactly an accumulated value might be used in ranking. I don’t think we do that but it might as well be.
Loren Baker:
But those pages are picked up if someone accesses them via Chrome.
Martin Splitt:
Yeah, exactly. If people are visiting them the data comes back into the data collection. What doesn’t happen is this page will not get a ranking boost from core web vitals, because we might have the core web vitals data but it can’t rank if it’s not being indexed, and noindexing means it’s not being indexed. So that kind of doesn’t check out, right?
Loren Baker:
Yep. Okay, the next question that comes through is… Oh interesting. So we have a lot of comments from folks that have been improving their core web vitals score and then for the past few weeks they’ve been seeing some positive changes in ranking, right? Is this an indication that the page experience update may be testing right now, perhaps on the weekends, and/or slowly rolling out before it officially does? Or is it a coincidence?
Martin Splitt:
It’s neither. It’s not even a coincidence. Page speed has been a ranking factor before.
Loren Baker:
That’s true.
Martin Splitt:
So it has nothing to do with page experience in this case; it’s just that coincidentally, by making the site better, you accidentally got a ranking boost from something that is not page experience.
Loren Baker:
That makes a lot of sense actually, so kudos for getting your page sped up before the page experience update goes out. You may be seeing an improvement in ranking because of those changes that you’ve done but not necessarily because of the page experience update. Okay, that makes perfect sense. Next question. Why does Google PageSpeed Insights sometimes show a completely different result from Lighthouse performance reports on the same page? So if someone’s doing a page report on Google PageSpeed Insights compared to Lighthouse, or maybe compared to Lighthouse within the Chrome browser, why would they be seeing different testing results on that front?
Martin Splitt:
I would be very surprised if you would not be seeing different testing results. To be honest, I would be surprised if you don’t see different testing results with Lighthouse when you test over multiple days. That’s because, as I said, quantifying performance is actually really, really tricky and there are lots of factors. And then you have to understand where data comes from. So there are basically two gigantic buckets of data that you can look at. One is real user metrics, that’s the telemetry data reported back by Chromium browsers for users who have opted in to sending telemetry data back. That you can see in the Chrome UX Report; there you have the data that we are getting, in anonymized form, in terms of how fast the pages have been for actual users out in the field. And that obviously is already data that is very, very unstable, in the sense that if one day I have 100 visitors coming to my site on a fantastic broadband connection on a recent MacBook, they will probably see that even my website being terrible is probably going to be okay, because the network speed is fast, the computing power is available, and that kind of smooths this out for them. And then the next day it’s people on small or slow phones, on shaky mobile connections with high latency, and then everything will be looking a lot different from that.
Martin Splitt:
Obviously as data is collected, we are making sure that our sample size is large enough so that it’s not like 10 people today, 10 people tomorrow; that would give us completely unusable data. But if the sample size is large enough and the time frame that we’re looking at is not just one day but like a week or a month, then the data starts smoothing out and you can actually draw conclusions from the signal you’re getting there. You can’t really do that by looking at snapshots. And that’s field data, that’s what we are using in page experience. But what we are not using in page experience, at least not planned anytime soon, is lab data. Lab data is where you are running a program in some sort of form and then actually try to gather the data and get the data that would be sent as telemetry, and there are multiple tools like that. There’s WebPageTest, there’s PageSpeed Insights, there’s Lighthouse, there’s web.dev/test, there’s a plethora of other third-party tools that do these things. And the thing with Lighthouse, especially the Lighthouse that you might be running on your machine in Chrome, is that it does a simulation; it runs within Chrome so it is affected by things such as other things running on your computer.
Martin Splitt:
If there is something else that takes away CPU power because you are converting a video in the background or your computer is doing an update or something, or if you’re bittorrenting something to a friend or whatever, then that might saturate your network, so you might actually get a lot of jitter, so noise versus signal, from Lighthouse. And I know that when I run Lighthouse 10 times, I basically get nine different scores, and that’s expected; it’s not real user metrics, it is lab data. A lot of the things like LCP, that’s a heuristic, so it tries to figure out, statistically speaking, and get reasonably sure, what we think is the main content, what is the largest content, and when the largest contentful paint happens, and then that’s when we stop the clock.
Martin Splitt:
But sometimes things just take a little longer, sometimes your browser might take a little longer to actually spawn a process because your processor is busy with other things, and then things take longer. And if they take longer that means you might actually flip back and forth around the threshold, right? If it’s like, “Oh, we need to be done with this in two seconds,” and one time you are done in 1.8 seconds, yay, the next time it takes 2.2 seconds, oh. And then sometimes, because your computer might do some bananas heavy-lifting computing tasks in the background that you’re not even aware of, it might take five seconds, and then you get a very, very wide variety of data, and that’s just how lab data unfortunately is. Unless you have a controlled lab environment, where you’re like, “Okay, so we are requesting the website from a local server so that we can rule out any network weirdness and we are doing it on a computer that does pretty much nothing else than just that all over again,” then you get more or less the same scores. And even then, because it’s a heuristic, it might decide slightly differently what it considers to be the largest contentful paint, so, and same with FID, same with LCP, you might get slightly different values for these as well. So there’s always some noise in that signal.
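To see that run-to-run noise for yourself, here is a minimal sketch that runs Lighthouse programmatically five times against the same URL and prints the performance scores. It assumes the lighthouse and chrome-launcher npm packages (newer Lighthouse releases are ESM-only, so you may need import syntax instead of require), and the URL is a placeholder:

```js
// Minimal sketch: run Lighthouse several times against the same page to see
// how much the lab performance score fluctuates between runs on one machine.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  // Launch a headless Chrome instance that Lighthouse can drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const options = { port: chrome.port, onlyCategories: ['performance'] };

  const scores = [];
  for (let run = 0; run < 5; run++) {
    const result = await lighthouse('https://example.com', options); // placeholder URL
    scores.push(Math.round(result.lhr.categories.performance.score * 100));
  }

  console.log('Performance scores across 5 runs:', scores);
  await chrome.kill();
})();
```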
Martin Splitt:
And then with PageSpeed Insights, PageSpeed Insights is basically just running in the cloud somewhere; it is leveraging Lighthouse but it’s not running on your computer, it’s running somewhere else in a different environment. I don’t exactly know what this environment looks like because I haven’t really had any insights into that. I haven’t had any insights into PageSpeed Insights. So I’m assuming that it’s like some sort of shared server infrastructure, and you might see differences depending on how much it’s leveraged and how much available capacity it has at different points in time, so you might actually see fluctuations within PageSpeed Insights, but it’s definitely going to be different from your website being tested on your computer in Chrome’s Lighthouse. And that is, to begin with… I don’t know where PageSpeed Insights lives. Let’s say PageSpeed Insights lives in a data center in Virginia.
Martin Splitt:
So if my website is hosted here in Switzerland and I test it on my local machine, the network doesn’t really play a role because it takes milliseconds to go to the other end of Switzerland, to my server, and get the website back. It’s going to take a while to go over the ocean to the PageSpeed Insights server in Virginia and then actually have that communication happen, so it’s inherently going to be slower. And I think network is mostly… I do see that in the time to first byte being different; for the core web vitals that doesn’t matter so much, but still, the machine that it simulates, I think it simulates a Moto G4 phone, is going to have very different specs than a Moto G4 simulation on my MacBook. So we are going to see different scoring across the tools, and even within the same tools they will fluctuate.
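For collecting the field-data side yourself, here is a minimal sketch using Google’s open-source web-vitals package (the v2 API shown here exposed getCLS/getFID/getLCP; later versions rename these to onCLS/onINP/onLCP), with a hypothetical /analytics endpoint standing in for your own backend:

```js
// Minimal sketch of collecting field data (RUM) from real visitors with the
// `web-vitals` package and sending it to your own (placeholder) endpoint.
import { getCLS, getFID, getLCP } from 'web-vitals';

function sendToAnalytics(metric) {
  const body = JSON.stringify({
    name: metric.name,   // 'CLS', 'FID', or 'LCP'
    value: metric.value, // the measured value for this page view
    id: metric.id,       // unique id so repeated reports can be deduplicated
  });
  // sendBeacon survives the page unloading; fall back to fetch with keepalive.
  if (!(navigator.sendBeacon && navigator.sendBeacon('/analytics', body))) {
    fetch('/analytics', { method: 'POST', body, keepalive: true });
  }
}

getCLS(sendToAnalytics);
getFID(sendToAnalytics);
getLCP(sendToAnalytics);
```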
Loren Baker:
What’s the most accurate tool to utilize, as provided by Google, that has the most field data within it?
Martin Splitt:
I guess the best way of doing it is PageSpeed Insights right now, because at least you’re getting roughly the same instance and roughly the same configuration. And it also shows you field data from CrUX if that’s available, so you get lab data and field data in PageSpeed Insights, which I think is great.
Loren Baker:
Great. We have some really good questions coming in. I encourage any viewers right now, if you have any questions, to ask them. Before I get into the one that just popped up from Gabriel, I do have a question that came up during a webinar in the past with Keith Goode, who I also see is on, from IBM. So the question that was asked previously was, “Hey, I work on multiple different sites. One I’ve optimized and I see everything is passing and Search Console reported the changes almost instantly. Another site that I’m working on, everything is passing according to the tool sets that we utilize, same exact thing, but we’re not seeing any data yet in Search Console after three, four weeks.” What’s the difference…
Martin Splitt:
Not enough field data.
Loren Baker:
Not enough field data. Not enough visitors, not enough field data.
Martin Splitt:
And even… It can be enough, but if these visitors are not generating telemetry data then we still don’t have the telemetry data. And even if we have some data it might not be enough for us to… oh damn, the word slips me. Ah, confidently say this is the data that we think represents the actual signal, so we might decide to actually not have a signal for that if the data source is too flaky or if the data is too noisy.
Loren Baker:
So it may take time.
Martin Splitt:
It may take time.
Loren Baker:
No difference whether it’s more traffic, less traffic, it just takes time to put together.
Martin Splitt:
Yeah. I mean, more traffic is more likely to actually generate data quickly but it’s not a guarantee.
Loren Baker:
Okay. Gotcha. So don’t freak out if you fixed everything and you still don’t see the reporting there. If you feel confident, then once Google feels confident with all the data that they’re able to compile, it should update in time. Next question from Gabriel, “Hey Martin, does Google calculate core web vitals looking only at the last 28 days of RUM data?”
Martin Splitt:
I don’t know.
Loren Baker:
And does this range impact the rankings?
Martin Splitt:
I don’t know. That’s a really good question. I can try to follow up with the team to figure that one out but I don’t know at this point.
Loren Baker:
Excellent. Thank you for the question Gabriel. Hopefully we’ll have a follow-up soon. Okay, can you confirm or deny if visits from a Google search result to an AMP site will use the data from the cached page load to determine core web vitals metrics? If that is how it is factored, then won’t all AMP search visits get perfect LCP and FID scores?
Martin Splitt:
I don’t think it works like that.
Loren Baker:
Okay good. I think that would also depend on how the template was set up as well. Are there any CMS platforms that you think will be most impacted by this update and why?
Martin Splitt:
Don’t know.
Loren Baker:
Okay, it could possibly be CMS platforms that have a lot of additions and layers that are added to them as well on that front, but not sure. Is there going to be any kind of leniency for companies that are having a hard time getting their developers to implement these fixes on this front?
Martin Splitt:
We have announced it last year, we have pushed it back from May to June. At some point, it’s going to happen.
Loren Baker:
It’s a pretty, pretty early announcement so we’ve had a lot of time to prepare for that, right? So really good on that front. And again, is it mandatory? Like you said, it can be a tiebreaker, it’s a very small component in the overall algorithm. A lot of questions are coming in like, “Should we do this for Google?” But to your point earlier, you’re not just making these fixes for Google, you’re making these fixes for the user experience at the end of the day. So I’ll go ahead and answer this one. It is mandatory because you do not want to have people visiting your site who can’t load anything, who can’t click on anything, where everything’s moving around. Don’t just think of this as a technical SEO thing; go into the different usability tools out there that take videos of how people are utilizing the site and see how your users are experiencing your site at the end of the day.
Loren Baker:
Especially as you’re elevating this internally to your devs to make these changes, which are critical, and the Google team, to Martin’s point, has given us enough time to get ready for this. You’ve been able to get these fixes in, you’ve been able to build a case for it. At the same time, don’t just fix these issues if it’s showing up negatively on a Google score in webmaster tools. If you’re able to identify usability issues, chances are they’re going to haunt you further down the line from a ranking perspective as this becomes more important. But secondly, you might uncover something that’s keeping people from converting, that’s keeping people from sharing, that’s keeping people from experiencing all the content that they should be experiencing.
Loren Baker:
Can they scroll down? Do your jump links work? What’s happening when they’re trying to load a large infographic image on a small phone? All of that is a component of this really at the end of the day, so don’t just optimize your user experience for a score that Google gives you. There’s plenty of different services out there that will give you feedback from real users as they’re trying to scroll through your site. So sorry about that. Next question, Martin, what happened to your unicorn hair?
Martin Splitt:
Diving and cold temperatures have happened. Over the winter, it was really cold, I continued diving, and long hair and diving don’t go too well together when it’s cold. And it was not convenient so I just cut my hair short.
Loren Baker:
There you go. That explains quite a bit, that explains quite a bit on that side. Let me go through the rest of these questions that are coming in here. Okay, so this is interesting. I’m not sure if you can answer this or not but, if someone is in a situation where they’re using various different tools and add-ons and apps and plug-ins to be able to make their user experience “better” or upsell the user or whatever it is, and those tools aren’t making the changes at the end of the day and they can’t implement them differently, should they be looking at different solutions?
Martin Splitt:
I guess looking at different solutions is definitely a good idea. I mean, if you had, I don’t know, let’s say you own a car you drive a lot and you have something that somehow reduces your fuel consumption but it makes you crash into a wall every third day. I mean, the lower fuel consumption is amazing but there’s this annoying side effect that you crash your car every couple of days, maybe only every couple of months, maybe only every six months you crash once.
Loren Baker:
Tesla analogy?
Martin Splitt:
No, but okay, but that would be the case, I would say that the issue with it outweighs the benefit of it and there might be other ways of reducing your fuel usage that you might want to look into, like a different style of driving, a different kind of car. Similar here, if it gives you more stuff that potentially is great but then it has these implications you have to judge for your specific case if you’re okay with the implications that it has or if you’re like, “Nah, I’ll try to see if we have something else that does that without the problems.”
Loren Baker:
It’s a good chance to look through all of these legacy tools that people have utilized over the years and put together a nice SWOT analysis, right? Or what are the pros and cons? Such and such ups conversion rates or whatever by X percent but it takes longer to load… Yeah, thanks, are you laughing at that comment too? So Mr. Goode says that if you’re crashing your car every month or so it’s time to adopt mass transit. So put together that analysis, right? There are some tools out there that are, I guess, alternative players in their spaces that are really marketing the fact that they’re fast, they don’t mess up anything with the page loads, and they don’t move things around, et cetera, et cetera, et cetera, on that front.
Loren Baker:
So just look into that because it might just be a better user experience at the end of the day and it might improve the strengths that you’re seeing as well, right? So maybe it’s time to get a new car that’s not crashing all the time and gets better gas mileage or is a little bit more carbon neutral. Okay, the next question that has come in, which is kind of interesting: if my core web vitals scores are really good from a mobile experience perspective, and then two different scenarios, they’re either really good from a desktop experience perspective or they’re really bad from a desktop experience perspective. Will I then rank better in mobile-first or for mobile users if my core web vitals score is better on the mobile side than it is on the desktop side, or is there some kind of aggregate score looking at both experiences that is being utilized to weigh a site, because you don’t necessarily know how people are going to access from one device to another in the future?
Martin Splitt:
I am not aware of any aggregates at the moment, that doesn’t mean that there won’t be in the future. As far as I’m aware right now, mobile is being used for mobile and desktop is going to be used for desktop.
Loren Baker:
Okay. Mobile score, desktop score, no one really knows what the future holds. So make sure it’s both. Hi Crystal. So another question I’m just going to add on to that a little bit: if a site is seeing 80 to 90% mobile users, right, 10 to 20% desktop, say most of your B2C-oriented sites, shopping sites, things like that, should they really be worried about desktop at the end of the day, and if they don’t address their desktop experience will that negatively affect them on the mobile side?
Martin Splitt:
I don’t think so.
Loren Baker:
Or just not bother?
Martin Splitt:
I don’t think you need to worry about that too much then.
Loren Baker:
Okay. All right, good. Okay great. Does loading a page without images, ads, pixels, and then rendering them on user activity over time trick the Google page experience tool or measurement system? So let’s say lazy loading or back-end loading of things, but let’s not use the word trick. How does that affect the page experience?
Martin Splitt:
I mean, the point there would be, if you implemented it in a way to trick the system, it wouldn’t necessarily work, because especially CLS is calculated over the lifetime of the page, so you would still see shifts if they appear after user activity. The other thing is, if content only pops in after user activity, Googlebot probably wouldn’t see it, unless you use lazy loading, and again, if you try to trick, you’re easily inviting more trouble than it’s worth. And I’m also not sure how well that would work, and if it would work now, we would definitely have an incentive to figure that out in the future. If you are using techniques such as infinite scroll and lazy loading and you implement them correctly, it might actually have a positive effect on user experience and thus on the core web vitals as well.
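As a sketch of the “implemented correctly” case: the images keep explicit width and height attributes so the browser reserves their space (no layout shift), carry the real file in a data-src attribute, and only get fetched as they approach the viewport. Modern browsers can do much of this natively with loading="lazy"; the class and attribute names below are just illustrative conventions:

```js
// Minimal sketch of lazy loading images without hurting CLS.
// The <img> tags are assumed to look like:
//   <img data-src="/photos/product.jpg" width="800" height="600" alt="...">
const lazyImageObserver = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target;
    img.src = img.dataset.src; // start the real download
    observer.unobserve(img);   // each image only needs this once
  }
}, { rootMargin: '200px' });   // begin loading a bit before it scrolls into view

document.querySelectorAll('img[data-src]')
  .forEach((img) => lazyImageObserver.observe(img));
```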
Loren Baker:
Excellent, very cool. So lazy loading images correctly, upon the user action, scroll or whatever, can have a positive effect on the experience and core web vitals, right, because it just makes sense. It almost reminds me of the rules around pop-ups and ads: if they’re action-oriented as opposed to forced, it’s less of a negative impact. Okay, got a couple more and then we’ll head out. When it comes to browser caching, are there any sort of SEO guidelines or best practices to communicate with your dev team regarding the max age you would set for certain file types?
Martin Splitt:
We do have a little bit of a guideline for that. I think in general, try to cache as long as possible for pretty much everything. If you can use immutable assets then definitely, that is a fantastic way of cutting down the need for network interactions and that’s just generally a win for everyone. There is this additional layer where you can use service workers; that unfortunately does not bring much benefit in terms of Googlebot’s usage of your site, but it will still potentially help if you have lots of returning visitors, and with a service worker you can make more granular decisions on when to refresh the cache. But there’s no hard and fast rule or guideline in terms of what you want to do with your developers. My general advice is cache as long as it’s reasonable. And that might be shorter than you like and shorter than you think, depending a little bit on your site, the content, and the way that you work with the content.
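One minimal sketch of that “cache as long as reasonable, use immutable assets” advice, using only Node’s standard library: fingerprinted asset file names (assumed to change whenever the content changes, e.g. app.3f9c2b.js) get a year-long immutable cache, while everything else gets a short, revalidated one. Paths and port are placeholders:

```js
// Minimal sketch: serve files from ./public with Cache-Control headers that
// depend on whether the file name looks fingerprinted (content-hashed).
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
  const urlPath = req.url.split('?')[0]; // drop any query string
  const filePath = path.join(__dirname, 'public', path.normalize(urlPath));

  fs.readFile(filePath, (err, data) => {
    if (err) { res.writeHead(404); return res.end('Not found'); }

    if (/\.[0-9a-f]{6,}\.(js|css|png|jpg|webp|woff2)$/.test(filePath)) {
      // Fingerprinted assets: cache for a year and never revalidate.
      res.setHeader('Cache-Control', 'public, max-age=31536000, immutable');
    } else {
      // HTML and other frequently changing responses: short cache, revalidate.
      res.setHeader('Cache-Control', 'public, max-age=300, must-revalidate');
    }
    res.end(data);
  });
}).listen(8080);
```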
Loren Baker:
Excellent. Another interesting question that comes up is with Google Chrome DevTools and testing the page load, testing and documenting the page load experience within DevTools and pulling out CLS information specifically. Do you recommend that storage is cleared within Google Chrome or, like you had said earlier, closing any apps or tools that may be running in the background that affect overall usability, loading, timing, et cetera, et cetera, on that? And what’s the best way to prepare Google Chrome to be able to accurately pull these experience numbers if someone’s trying to, say, visualize on the timeline all of these issues with their development teams?
Martin Splitt:
I would probably try to basically either launch a completely separate Chrome instance, and probably use the incognito window of that instance so that I’m not having any extensions or stuff available there. Or just really use something like Puppeteer, which also launches an entirely independent Chrome instance, and I would do that on a machine where I have as few applications running as possible. Maybe I run a virtual machine somewhere in the cloud that does nothing else but run my Chrome instance and then give me the data back. Because, as far as I’m aware, I’m not sure how to get that through Puppeteer, but there must be a way of getting the profiles out, and then you can import the profile file into your browser and actually investigate it with the DevTools in your browser, so that’s definitely a possibility.
Loren Baker:
So that was Puppeteer?
Martin Splitt:
Puppeteer yeah.
Loren Baker:
Pptr.dev
Martin Splitt:
Yes.
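For reference, here is a minimal Puppeteer sketch along the lines Martin describes: it launches an isolated Chrome instance, records a performance trace, and writes a trace.json you can load into the DevTools Performance panel to inspect with your team. The URL is a placeholder:

```js
// Minimal sketch: capture a performance trace in a clean, extension-free Chrome
// instance via Puppeteer; the resulting trace.json can be imported into the
// Performance panel of Chrome DevTools for investigation.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch(); // headless, isolated profile
  const page = await browser.newPage();

  await page.tracing.start({ path: 'trace.json', screenshots: true });
  await page.goto('https://example.com', { waitUntil: 'networkidle0' }); // placeholder URL
  await page.tracing.stop();

  await browser.close();
  console.log('Trace written to trace.json; open it in DevTools > Performance.');
})();
```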
Loren Baker:
I’ll drop a link in here, in the StreamYard, right now: pptr.dev. Go and check that out afterwards. I had a lot of questions about Google being able to actually identify what the issues are. If Google can identify UX issues and core web vitals within Chrome on a page-by-page level, can we get that from a reporting perspective within Search Console?
Martin Splitt:
Good question. I guess that’s not always going to be easy, especially because we have to make sure that we are not leaking too much information, because that might make things less private than you want. But I don’t know if there’s anything planned to give you more insights into that. You can get a lot of insights about that already if you’re testing in your local DevTools, but I don’t think there’s anything planned on the roadmap for Search Console.
Loren Baker:
Excellent. So let’s see what day is today, it’s May 20th I believe. You had said we have… the clock is ticking, actually…
Martin Splitt:
The clock is ticking, tick tock.
Loren Baker:
… if we were on the original timeline, we’d probably all be freaking out right now. But we’ve been given an additional month, more or less, right? You would say we’re probably looking at a mid-June rollout, lasting until mid-to-late August, so a slow rollout, with some things changing over time. For those that are currently viewing or listening right now, what tips do you have if folks have not been able to get these fixes in, if they’re currently working on it? If it doesn’t look like they’re going to be able to get them in before mid-June, any tips or anything that you can add to this discussion that would give people maybe a better peace of mind and/or a kind of it’s-time-to-put-the-pedal-to-the-metal?
Martin Splitt:
So I think, first things first, don’t panic, don’t completely freak out. Because as I said, it’s a tiebreaker. For some it will be quite substantial, for some it will not be very substantial, and you don’t know which bucket you’ll be in, basically, because again it depends a lot on context and industry and niche, so I wouldn’t worry too much about it. I think generally making your website faster for users should be an important goal and it should not just be completely ignored, which is the situation in many companies today where they’re just like, “Yeah, whatever.” I think when you can get your company to shift from, “Ah, whatever,” to, “Oh yeah, we need to get that done but we can’t get it done until June,” that’s the milestone, that’s the improvement that you want to have. You want to have a commitment to make things better and you want to be the one who said, “Hey, this is going to be a factor in rankings so don’t be surprised if we are seeing some changes in ranking.”
Martin Splitt:
I wouldn’t oversell it as, “Oh my God, we need to stop everything else and we need to just focus on core web vitals right now,” because that might backfire in the end as well. So you want to take a reasonable approach to this, you want to be like, “Hey, be sensitive when we’re making new changes,” especially if you start new projects. So for instance, I recently created a new website for a side project of mine and I did that with Hugo, which is a static site generator, and I had to pick a theme. So I clicked on the first theme that I liked and I liked it a lot, and then I ran a few tests on it and I noticed that their sample site already has really, really bad performance.
Martin Splitt:
So I kept looking and I found a theme that I really liked that had really good performance out of the box. So then I chose that over the other because it gives me better performance out of the box. That’s the approach I would take for new projects: definitely look into core web vitals from the get-go. For projects that are already in maintenance mode or are basically already actively being deployed, I would look into making some sort of plan for the mid-term future, saying, over the next six months, eight months, 10 months, 12 months, we’ll actually work on the core web vitals and improve performance, not just from an SEO perspective but also literally for your users.
Loren Baker:
Yeah. Not to sound like a broken record, but for the users and for conversion, I mean for me, from a consulting perspective, that’s always really helped. I’m going to drop a link in here right now, it’s basically a Cloudflare study that looked at page performance, speed performance, and conversion rates, right? So it’s very easy for folks in SEO to stick within our little SEO bubble and think that this is only something that helps with Google ranking or it’s a tiebreaker at the end of the day or whatever. But the fact is that all of us within SEO, whether we like it or not, we’re in charge of one of many sales and lead generation tools, right? So if we can make the case to make this change, whether it’s by June, or like Martin said, a six-month plan, 12-month plan, whatever, don’t freak out, but let’s make this plan, we can improve things across the board, right? So one good way to sell this internally is not necessarily to say, “You need to make this because if you don’t we’re not going to increase rankings 100%.” But it’s to pile on all the other benefits, sorry.
Martin Splitt:
This keeps reminding me of this cartoon where there is a climate scientist on the stage saying this is how we can improve nature and ecology and the air quality and reduce pollution and reduce our reliance on non-renewable energies, and someone in the audience gets up and says, “But professor, what if we make all these improvements and the world wouldn’t have ended otherwise anyway?” It’s like, “Ah.” Yeah, why accidentally make the world a better place? It’s kind of a weird question, right? You’re making things better for your users; that’s never not going to pay off in some form.
Loren Baker:
Yeah, so there’s going to be benefits even outside of SEO.
Martin Splitt:
Yeah.
Loren Baker:
So once you make the case and get this implemented internally with the developers, and the PPC team shows up to a meeting and starts bragging about how conversions have increased, make sure that you plant that seed to let everyone know that there’s going to be better PPC conversions, if they’re utilizing the main site for the landing pages, there’s going to be better social media conversions, better email conversions, better direct traffic conversions, which mostly is a search anyway, and better conversions across the board, right? So definitely get that in from an internal selling perspective.
Martin Splitt:
Exactly.
Loren Baker:
Because promising a ranking change as a result of this is not necessarily going to be guaranteed, but promising the user a better user experience, and then the ability to actually convert better, especially if that chat button’s not taking up someone’s entire phone. Who really wants to chat with the company anyway when they’re making a purchase decision? I find that most of the time that just slows down the entire process. So when I walk into the Apple Store, I don’t want to have a conversation with somebody that’s working there, I just want to buy and get out of there. So anyway, Martin, it’s getting late where you are, thank you so much for jumping on. I hope you really enjoy your day of diving tomorrow, and this is going to be cold water diving, right?
Martin Splitt:
Yes, correct. I think the water is 11 degrees celsius.
Loren Baker:
All right, so watch out for the catfish and let us know if you find anything cool there on the bottom.
Martin Splitt:
Will do.
Loren Baker:
It’s been a pleasure everybody, we’re going to be following up probably in a couple of different SEJ posts with everything, so we’re about to sign off but thanks so much and looking forward to seeing all this roll out in mid-June.
Martin Splitt:
Awesome, looking forward to seeing all of your wonderful faces and smiles again soon and thanks a lot for having me Loren.
Loren Baker:
You’re welcome Martin, thanks for everybody for tuning in. This is Loren Baker and Martin Splitt with SEJ Show, signing off. Cheers.