Episode Transcript
[00:00:00] Speaker A: Hello, I'm Jeremy Rivera, host of Unscripted SEO. I'm here with Mark Williams-Cook. Give yourself a quick introduction and we'll get into the weeds really fast, I guarantee.
[00:00:11] Speaker B: So for those that don't know me, as I said, my name is Mark. I've been doing SEO for around about 20 years. I'm director of an SEO agency called Candour and I run an intent research tool called AlsoAsked. And I'm probably known most in the SEO community for my screaming into the void on LinkedIn with my unsolicited SEO tips, and the Core Updates newsletter.
[00:00:35] Speaker A: Fantastic. And if anybody's listening and isn't following him, isn't subscribed to the newsletter, shame on you. There's gold in them thar hills. Go and do it. So I have been in several conversations recently about relevance and site quality, and I was wondering what your perception was of how Google currently works. You know, we have different iterations: they break things, we try to optimize for it, and then they change path. We are shifting the meta. So in our current meta, is it siege tanks? Is it broodlings? Is it mass Marines? Obviously these are all StarCraft references. So what is the current meta when it comes to Google and relevance, and Google and search quality?
[00:01:24] Speaker B: You've watched my talk, haven't you?
[00:01:26] Speaker A: What? What? I wasn't listening to it on the drive over here, so I'm super fresh with exactly what you said. That would be cheating. It's unscripted, but I didn't write it down. There's no script. It's just happening in the moment.
[00:01:38] Speaker B: So, yeah, I mean, that's a huge question. Right. And to try and boil it down: the biggest change I think we've seen to the meta — defining that as the things that currently move the needle the most, or what Google's looking at the most — the obvious one, and it's a layman label, and by layman I mean me, is to call it brand. Because I don't think it is brand. I think brand as an actual concept, as marketers talk about it, is quite big and diverse. I know you had Mordy on the show, right, who's got a lot to say about brand. But it's certainly connected to that concept of how well known you are as a thing and how much authority you have. And I think that overrides a lot of what you might traditionally think of as relevance. Case in point: recently I saw the Vatican site was hacked, and it was ranking on the first page of Google for an injected page about CBD gummies.
[00:02:44] Speaker A: No.
[00:02:46] Speaker B: And right. So.
And you know, you see a lot of people posting about things like, you know, topical authority and stuff like this, and I just want someone to explain to me why the Vatican website is a topical authority on CBD gummies. Right? Okay. Because it is not. But it ranked. It got a pass. It got a pass and it ranked.
[00:03:08] Speaker A: So, right.
[00:03:10] Speaker B: There is a big chunk, I think, that comes down currently to kind of brand stuff.
And it's certainly a band-aid, I think, for the shortfalls that have been magnified in Google's algorithm because of gen AI.
[00:03:30] Speaker A: That makes sense. What do you think about this concept: that we as SEOs shift our perception from a page level of optimization to an entity approach, and consider all of the ways that the thing we're working with is mirrored online? It's like the shadow realm, or, you know, out through the looking glass. There is a representation online of what you do in the real world, who you are. And our effort should be to make that mirror as optimized as possible, ideally fully in sync with what's in the real world. But sometimes maybe we can do better. And by that I mean, you know, if we can create more signals of authority online, then that's going to reflect positively. So what do you think about that model of thinking about SEO as entity optimization? Because that would address how we're seeing the current meta develop, right? Because Google is considering, okay, this entity of the Papal State.
You know, it's a reflection of what is in the real world a very, very trusted source — in many cultures, the highest trusted source. So of course that reflects online in certain ways, because there are so many dioceses, so many churches that link to the Papal State site, to the Pope's personal website, his blog. So the fact that they found an exploit and put that there shows that Google does give weight and credence to those signals. So I don't think it totally invalidates the idea that there's some sort of relevance factor with most links; maybe it's a balance. You know, if you have a scale of relevance versus authority for your entity, and you have more authority, then relevance matters a little bit less. It's still a factor, but it gets outweighed by the others.
[00:05:30] Speaker B: Yeah, I think there are a couple of questions in there. So in terms of entities: I do think that, as a kind of model to work with, especially as we go into the future, it's important.
[00:05:41] Speaker A: Yes.
[00:05:42] Speaker B: Because we moved away from that a long time ago. You know, I think it was over a decade ago, when Google was talking about their Knowledge Graph, that they talked about moving from strings to things — that's how they put it. Right. My feeling of how it works is that the authority, if we want to call it that — and again, these are all kind of dangerous terms to play with, because they all have really specific meanings in different contexts — but if we define authority as the magical stuff that makes websites rank, right, at the moment I feel that's attached to sites, not to entities. I think there is some kind of grounding system, which is more about understanding the query and how things are connected together, and that is where the entity stuff comes into play: understanding relationships between things that maybe in text aren't obviously related. The great example is Google's Knowledge Graph — you know, when you search the name of an actor, you get the movies and films they're in. And that's a nice, clear demonstration of how entities can be used within search to be super helpful: help relevance, help say, okay, we need to look at this site because it's got this stuff on it, in whatever calculation we're doing. But when it comes down to the actual authority — there are patents and stuff talking about identifying, for instance, on podcasts like we're on now, fingerprinting people's voices and attaching them to real people, so, you know, they can work out who's saying things. And there's lots of talk, when it comes to E-E-A-T, about authors having author profiles. Right. My personal feeling and experience on that is that I don't think adding an author, just because it says it's me and I've written about SEO, will particularly help — because if it did, that's all black hat SEOs would be doing. Right.
They'd be taking famous people's names and just making them authors and being like, yeah, they've got the same name. And if we could prove that there was something — you know, that my voice is on your podcast and that somehow helped it for SEO — you could take this video and whack it into ElevenLabs and clone my voice very easily and use that. So I think we have to be careful with things that can be very easily exploited by webmasters. Search engines have been burned by that for years.
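The "strings to things" idea Mark references can be sketched as a tiny typed graph: entities connected by named relationships, rather than keyword matches. The entities and relations below are made-up illustrations, not Google's actual Knowledge Graph schema:

```python
# Tiny "strings to things" sketch: a typed entity graph instead of keyword
# matching. Entities and relations are invented for illustration.
graph = {
    ("Tom Hanks", "acted_in"): ["Forrest Gump", "Cast Away"],
    ("Forrest Gump", "directed_by"): ["Robert Zemeckis"],
    ("Cast Away", "directed_by"): ["Robert Zemeckis"],
}

def related(entity: str, relation: str) -> list[str]:
    """Follow one typed edge out of an entity node."""
    return graph.get((entity, relation), [])

# A query string like "tom hanks movies" resolves to the entity "Tom Hanks",
# and the edge supplies results that no literal text match would surface.
print(related("Tom Hanks", "acted_in"))  # ['Forrest Gump', 'Cast Away']
```

This is why the actor-to-films example works in search: the relationship lives in the graph, not in the words on any one page.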
[00:08:17] Speaker A: Right, right.
[00:08:17] Speaker B: And I think the way they're doing it — attaching authority to sites and to things that can, at least to some level, be authenticated (which breaks when they're hacked, obviously) — is a way to keep things under control. And again, that's because the output has got to be at least somewhat predictable, maybe recent core updates notwithstanding. And from what I've seen of Google, and when I've heard Google talk — obviously there's loads of super clever maths going on in the background, but the general systems have been, to me, surprisingly simple. For instance, I asked Gary Illyes a really direct question and got a direct response about last-modified on web pages. We talked about people tricking Google by saying, oh yeah, last modified yesterday, last modified yesterday, when they're not making significant changes. And Google said, hey, we check this, we check to see if you've made significant changes. And I said to Gary, is there like an analog scale of how much you trust websites on this signal, based on whether they're saying they're doing something and they're not? And he said, no, it's just binary. We either trust them or we do not. And it's the same with things like canonical tags. If you get them all wrong, it's not like Google will trust you a bit less. They just have a very simple thing of, well, you don't seem to know what you're doing, so we're just going to ignore all of your canonical tags and work it out ourselves. And it made sense to me. Because how many offshoots, algorithmically, do you get if at each stage you have an analog, you know, 1-to-100 scale? It just doesn't stay predictable, and it becomes huge. You're massively overcomplicating it.
So I think there are — and I saw this when we looked at the kind of Google endpoint stuff — just loads of different binary classifiers: is it this, or is it this? Which then determines which route this little ball drops down.
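The binary trust Mark describes can be sketched as simple yes/no gates rather than sliding scales. This is an illustration of the idea only — the signal names, fields, and thresholds here are invented, not real Google internals:

```python
from dataclasses import dataclass

@dataclass
class SiteSignals:
    # Invented example signals -- not real Google fields.
    claimed_modified: bool      # site says "content changed recently"
    change_observed: bool       # crawler actually saw a significant change
    bad_canonical_count: int    # obviously self-contradictory canonical tags

def trust_last_modified(s: SiteSignals) -> bool:
    # Binary gate: the header is either trusted or ignored wholesale.
    # There is no "trust it 60%" middle ground.
    return s.change_observed or not s.claimed_modified

def trust_canonicals(s: SiteSignals) -> bool:
    # Same idea: enough errors and *all* canonicals get ignored.
    return s.bad_canonical_count == 0

honest = SiteSignals(claimed_modified=True, change_observed=True, bad_canonical_count=0)
spoofer = SiteSignals(claimed_modified=True, change_observed=False, bad_canonical_count=4)

print(trust_last_modified(honest), trust_canonicals(honest))    # True True
print(trust_last_modified(spoofer), trust_canonicals(spoofer))  # False False
```

Each gate routes the "little ball" down one of two paths, which keeps the overall system testable and predictable in the way the conversation describes.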
[00:10:15] Speaker A: I think that's, I think that's fair.
Yeah, you're right. It would introduce so much complexity, because we have to think from an engineering perspective about what they could reasonably create as a system that they can test, and the outcomes of that. If you're trying to modulate and come back later — oh, we're going to adjust this — and it's on a scale, that's so many additional tests: okay, well, it's down to 75; well, now it's 50; now it's 25. It would definitely be a very hard system to test. And I get the feeling that Google, in certain ways, is about efficiency on the engineering side: trying to create systems that they can test and predict.
What is your takeaway from them having actually slapped down Forbes — at least publicly, to a certain extent? Do you think that's a confirmation of the site quality thing you're talking about — the papal site ranked for this — and the reason they're introducing this site reputation abuse policy is because it is abused? And are they recognizing publicly that it seems like certain people are getting away with it, and that they don't want to change it too much? Or is it a recognition and a signal that we're going to be shifting to another meta, where these top dogs aren't going to be crushing everything and ranking for CBD gummies?
[00:11:43] Speaker B: I think there's a very good reason that the site reputation abuse isn't algorithmic at the moment. Those were kind of the signals they were giving us at the start of the year, before — was it that they said they were going to roll out some different spam stuff, and then everyone assumed it had kind of gone live? And then they came back later and were like, oh no, no, no, we haven't done the site reputation abuse stuff yet. And I was like, what? You said it was going to happen. And then they're like, oh no, we'll make it very clear when it's going to happen.
And then they were like, yeah, we're rolling it out manually. And it's like, huh. You were definitely talking about algorithmic changes, and then it suddenly just became a policy change. The search liaison, Danny Sullivan, did say — I have to paraphrase here, just to give you what I believe the meaning of his words was — essentially that they couldn't find a good way to deploy it algorithmically that, I imagine, didn't cause accidental false positives. Because they're dealing with big, high-authority sites, it's a big deal if they accidentally, you know, crush a multimillion-pound business. Right. They've got to, again, predict what happens. So I don't think they found a way to do it reliably. So they've had to go on what I think, to be quite blunt, is a PR exercise, and a bit of a fear exercise: just walking around with this great big Google mallet and very publicly smashing some sites to bits, just to put the fear into people. So when someone goes, well, we could use this tactic — it's not going to get past the board or the senior management, because someone's like, well, those sites did it and they got crushed. Right. Because nobody wants to be the person who says do that. So that's my impression of it. I think they haven't managed to solve it in an algorithmic, scalable way. And if anyone from Google did listen to this, I don't mean that as a criticism, because it's an incredibly difficult problem to solve. Incredibly difficult. Because you're essentially trying to—
These sites by definition have all of the good components that Google is looking for.
[00:14:03] Speaker A: Yes.
[00:14:04] Speaker B: So you're looking for one very specific thing, and it's just one part of the website. So I don't know if you saw, Jeremy — we had a few of these websites that were hit, and then they migrated the part of the site that was hit to another URL structure, and it all started coming back.
[00:14:19] Speaker A: Right.
[00:14:20] Speaker B: So again, how it worked was obviously that they'd flagged these URL patterns. So not only did they have to successfully identify that a site is doing it, but they then need to cut around that very carefully — they need to surgically remove it. And that can be very challenging if the site, on purpose or not, has a URL or internal linking structure that doesn't make it obvious. You know, I think all those sites that tried to do that eventually rolled it back, because they realized that Google was going to just go fully, well, if you're going to play hardball, we'll just take you out of the SERPs and see how long you want to play those games for.
[00:15:00] Speaker A: Yeah, no, it's not the first time that Google has used FUD to change the industry more out of fear. I've seen it happen twice before. The first was with nofollow when it first came out, scaring the bejeebus out of all of the news organizations — to the point where, to this day, either news articles don't have any links, or if they do link, it's one word pointing to another internal article, and it's nofollowed. The second time I saw it was when Google got super mad about guest authors, and was like, don't be a serial guest author with this abusive profile that goes up everywhere, or we're going to come down with the wrath of God. Ann Smarty got nuked, and like two other people got nuked, and then everybody was like, guest posts are bad, for like four years. But there was no actual algorithmic proof of that coming out. So I think Google does have that PR Godzilla that's going to stomp on your house unless you behave, occasionally. And I think you're right. It is a more nuanced, complicated issue that probably, mathematically, is extremely difficult to suss out. Because if they're trusting them because of these factors — the scale and scope and size and brand — and then the brands misbehave once they get that big, it's not that the model is necessarily wrong; it's just that ultimately there aren't that many Forbeses to correct. So applying fear, when you can scare the bejeebus out of the five mega brands that are truly exploiting it, seems like a better option than changing the entire nature of the web.
[00:16:45] Speaker B: Yeah. And I mean, that's what they kind of did with the nofollow thing, right? Because I think they made their own job of analyzing the link graph harder. Because — what was it, 2019? several years ago now — they announced that nofollow is now going to be a hint, so we may actually count nofollow links. Because it had made it harder for them to rank sites: lots of authoritative sites that they would usually take link signals from were nofollowing links. So Google's own pool of "who should I trust" got harder, and they were like, we're just going to have to make our own call. You're absolutely right with the building-up-trust thing. I like to think sometimes about Google's algorithm, and how it trusts sites, in the way that humans interact with each other. It's no different from if you meet someone and they were a great friend to you, very trustworthy for years, and then they suddenly, you know, betrayed you or exploited you — you would maybe just be like, right, I'm not going to talk to this person anymore. You wouldn't change the way that you develop friendships. You wouldn't be like, well, those people did all the right things, so therefore I can't trust anyone like that — because, like you say, that breaks your own model. So sometimes that manual approach, if you want to call it that, does work. And we're guilty as SEOs sometimes, I think, of being a bit flippant about how complex the problems are that Google's trying to solve. Because we don't even see the 99% of spam that Google is already blocking — the web is just full of junk, right? So we see these relatively small examples that get through, and we're like, oh, they should be taking care of that — when they're Hodor-ing the rest of the spam on the web for us.
[00:18:34] Speaker A: There is that. And I don't know, it's part of the cat and mouse in the nature of humanity at this point: if there is opportunity, then portions of humanity will attempt to exploit it. And, you know, as a meta develops — StarCraft and siege tanks develop, and then one person figures out before a tournament that, hey, mutas are the perfect counter — and it's a huge exploit, and it shifts everything. It's kind of this gold—
We've done it since, you know, the gold rush. Everybody was poor, but oh, there's gold over here and it will make a man rich. So I think we as SEOs need to have a shovel-seller mentality. And what I mean by that is: when there is a gold rush, be the guy selling the shovels, not digging with them.
[00:19:28] Speaker B: So it'd be the Nvidia of the gen AI times then, I suppose.
[00:19:34] Speaker A: Kind of shifting a little bit to the buckets in queries — and I sent you a message about this. From the Google exploit, you said there are a number of buckets that Google sorts these queries into, and it delivers results in a specific way for each: Boolean, yes or no; short answer; and so on. What was your process to figure out what those buckets were, based off the data? What does that query handling mean to you? And then, potentially, what are the paths to think down as far as implications for strategy? Because I went down one path and you were like, no, no, no, not that.
[00:20:16] Speaker B: So in terms of figuring it out — there was nothing to figure out. They were literally labeled; there was a thing I figured out how to read, Jeremy.
[00:20:26] Speaker B: Yeah, so it was just called something like "RQ semantic class" — RQ being "refined query" — and for a query it would just say equals bool, or short answer, right? And so we had hundreds of thousands of queries, and when we collated them all, any that had that class were in one of those eight categories that I published. Okay.
So I think it's helpful to take one step back. The things that are surfaced at the front end of Google and passed around are, from what I understand, generally helpful metrics or parameters for when it's constructing the search result page that we see, or doing post-ranking. So that may be specific metrics that were calculated during ranking that are then surfaced to the front end — I think site quality is an example of that — or that query type. Obviously Google doesn't know the query until you type it in, so I think that may just be a front-end classifier, right? It may have that classification in the back end as well. But Google knows, at the time you do that query, what that classification is. And what I noticed was that the query type it fell into would determine things like which SERP features were shown.
So, to answer your question in terms of strategy: what do you do strategically? I don't know if it's massively exciting, to be very honest with you. I'm not going to try and blow it up into something it isn't. But something we're doing at the moment is putting that data into a classifier, so I can put in any query I like and it will predict which category — which class — it's in, which then lets me know the types of feature we're likely to see on that SERP.
So then I know what kind of content maybe I should be making or optimizing for. It gives me at least some guidance — it's something I would include on a content brief. So it's not pivotal, not "this is how to hack Google". But this is, to me, what optimization is. It's not tricks; it's the sum of many small bits of knowledge that guide you down the optimal path. The other thing I'll note is I had a conversation with Darren Shaw this week, maybe last week, because he was saying, I wonder which one of those categories local queries go into. And I pointed out to him that they can be any of them, because there are multiple different classifiers applied to every query. There are actually labels we found — I'm trying to remember exactly — like local.
"Local intent" and "explicit local intent" were both labels that could be applied to queries. So something could have local intent and be in any one of those — it could be a short answer or a bool. And what I believe is happening there is, for instance, a local intent search might be something like "SEO agency". Because when I search for "SEO agency", I get a Google map box — I think Google's search user data has shown that, for those queries, people generally want something within their catchment: their area, or their country, or their county, or their state, or whatever.
[00:24:06] Speaker A: Yes.
[00:24:07] Speaker B: If I did search for like SEO Agency Norwich or my city name.
[00:24:11] Speaker A: Yeah, right.
[00:24:12] Speaker B: Then that becomes explicit local query because I've explicitly said here is the place I want. And again, that slightly changes, I think, the direction and the weightings of what's going to appear in the serp.
[00:24:24] Speaker A: I think it is. Yeah, it is. It's not new. I mean, I would often look at keywords and try to place what type of SERP features tended to trigger with them. If you hadn't been doing that for the past 10 years, then you hadn't paid attention to the fact that search is no longer ten blue links, and that there are multiple, massively layered differences in the types of content that get surfaced. So you should go back to learning SEO, take those classes again, and realize, hey, we're multimodal now — multiple different types of content at the same time. So naturally, query intent — they've been signaling it for a while. They've been saying the intent behind the keyword matters. And we've also seen it in algorithm changes: what shows up for the same keyword. And when we see those intent shifts happen for a query — your example of "nice": if something happens in Nice, France, then suddenly all of the "nice" queries might have a different intent, and that's going to change what Google is going to surface. Or seasonality — the Kentucky Derby suddenly making toilet bowl hats more popular. So: you've spent a lot of time developing a tool to surface the People Also Ask, and People Also Ask is a function of featured snippets. What is your understanding of the dynamic interplay between AI Overviews and featured snippets? Are they independent systems? Are they interdependent systems? Are they internally competitive systems within Google, where AI Overviews seem to be evolving to replace featured snippets? Or do they, at least for the moment, provide enough alternate value that we should expect featured snippets and PAA to remain distinct and separate from AI Overviews?
[00:26:24] Speaker B: Okay, just to clarify, we mentioned a few different things there: AI Overviews; featured snippets, which are the kind of static bits at the top; and then PAAs, which are the People Also Ask question boxes. So PAAs are like my expert area, if you like, out of those three. I have opinions on the other two, so we'll start with those. I think AI Overviews are definitely here to stay; I don't think anyone would disagree with that. I saw in Google's Q3 earnings call last year that they had massively reduced — by like 90% — the cost of generating those. And that was my one big question: this must be expensive, cooking these things up, right? I have seen examples where Google is showing AI Overviews and featured snippets in the same SERP — I've seen that, and I've seen people kind of moaning about it, regular non-SEO users as well. And I think the honest answer is Google's still testing. I think Google's quite liberal now with some of their tests; they do quite big tests, right? Like, I think the whole Reddit-everywhere-in-SERPs thing is a test.
And I said last year at SERP Conf that I don't see it lasting through 2025. And I saw Lily Ray actually saying this week that it's started to go into free fall, which I think is super interesting.
That's the question. I mean, I would expect AI Overviews to eventually replace featured snippets, because I just think they're more dynamic: you can generate a more nuanced answer, and they can improve them quicker. It might even actually be cheaper for them — if the cost of generating the exact answer from the stuff they have in their model is less than keeping the featured snippets up to date through crawling and indexing, it actually might be a plus-EV revenue move for them. And that's what Google's always going to do, right? It's going to do the thing that makes it the most money. Yeah, PAAs interest me more than both of those things, specifically because PAAs were mentioned in the DOJ trial as a key part of the induction loop that helps Google understand queries. The same query can obviously mean different things to different people, right?
[00:28:40] Speaker A: Yes.
[00:28:41] Speaker B: And Google has no real intrinsic understanding of queries in a human sense. So them being able to sort of say, okay, you've googled "running shoes" — did you want to know about the fit, or which are best, or...? They then start to get this idea: well, if someone googles this fairly generic term, we know like 80% of them are kind of going down this line, 10% this line, 5% this line, 5% this line. And they can A/B test those PAA results all the time — they're being split tested; they're not static. So we have to do a lot of work at AlsoAsked behind the scenes to keep the results stable, because we do change monitoring, which helps us monitor intent shift. Right. But there's a lot of work that goes into getting consistent PAAs, which Google does. And not many people are aware — and this is a feature we'll have next month — you can actually specify a city with PAAs.
[00:29:37] Speaker A: Okay.
[00:29:38] Speaker B: Which is, I believe, something Google's been doing relatively recently — it's relatively new — because we started noticing it when one of our customers, Arnout Hellemans, lovely guy, was sending me examples. Yeah, he's a great guy. He was sending me examples where, when he was googling, the also-asked queries for food trucks were coming up as, say, "food trucks Denver". And I was like, that's interesting. And then, kind of five minutes later — which was actually weeks and weeks of research later — we found we can actually pass not only cities but specific longitude/latitude coordinates to Google, which will alter PAAs. So we've been through a lot of R&D, and hopefully next month we're going to be launching this feature where Lite users will be able to specify a city, monitor for changes in specific cities, and see customized results for that city. And then Pro users can literally say: this exact longitude and latitude. Because Google has a list of supported cities in every country.
[00:30:40] Speaker A: Okay.
[00:30:41] Speaker B: But obviously if you're on a mobile and stuff, it will sometimes just pass the long/lat. And then I had all these interesting questions, like: what if I do a Google search when I'm on a boat in international waters? You know, what's it going to pass? Because it's still going to give me a result. Right.
[00:30:56] Speaker A: You're like, hey board, I'd like to take a cruise for non-suspicious reasons. I'm testing something on the boat.
[00:31:03] Speaker B: I did a worldwide cruise. All tax deductible.
[00:31:06] Speaker A: Yes.
[00:31:07] Speaker B: Research. But yeah, going back to your question: I think PAAs are going to be around for ages, because they're part of this induction loop for Google.
[00:31:16] Speaker A: Okay.
[00:31:17] Speaker B: And I honestly think they're one of the best bits of keyword research we have available as SEOs. I've been using them for years. Literally, the reason AlsoAsked exists is because I did an SEO conference, and I just had some command line tool that we'd written to do this, and I had loads of people come up to me saying, can I have that tool? And I was like, yeah, sure, and I gave it to them. And then I'd get all these support requests, because it was just a Python command line thing, and people were like, oh, it says it needs this package. I was like, there's obviously a barrier to entry here — which is why we made a web version. But the cool thing about it is: if something is on the news, if I see something on TV and I google it, the People Also Ask results can change within hours.
Like, you know, when we have an election here, the results change almost straight away. And the example I gave on another webinar several months ago was when GPT-4o first came out, right? I think it had been released two or three days earlier. And I was showing them that on every major keyword tool, it said the search volume for GPT-4o was zero — which was obviously wrong, because it already had millions of users. And then I showed them AlsoAsked pulling the People Also Ask data: there were already questions like, how is GPT-4o better than 3? How much does GPT-4o cost? We already had all the things — the intents — people were searching for. And you'll know, in SEO, sometimes being the first to publish is a massive advantage. Right?
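The monitoring idea — snapshot the question set for a query, then diff snapshots over time to catch new intents early — can be sketched like this. It's a minimal illustration, not AlsoAsked's actual implementation; the query and questions are made up:

```python
import hashlib
import json
from datetime import date

def snapshot(query: str, questions: list[str]) -> dict:
    """Normalize and fingerprint the PAA questions seen for a query today."""
    normalized = sorted(q.strip().lower() for q in questions)
    return {
        "query": query,
        "date": date.today().isoformat(),
        "questions": normalized,
        # The digest makes "has anything changed?" a one-string comparison.
        "digest": hashlib.sha256(json.dumps(normalized).encode()).hexdigest(),
    }

def diff(old: dict, new: dict) -> dict:
    """Questions that appeared or disappeared between two snapshots."""
    before, after = set(old["questions"]), set(new["questions"])
    return {"added": sorted(after - before), "removed": sorted(before - after)}

day_1 = snapshot("gpt-4o", ["What is GPT-4o?"])
day_3 = snapshot("gpt-4o", ["What is GPT-4o?", "How much does GPT-4o cost?"])

assert day_1["digest"] != day_3["digest"]
print(diff(day_1, day_3)["added"])  # ['how much does gpt-4o cost?']
```

A newly "added" question is exactly the early intent signal described above: it surfaces days before keyword databases register any search volume.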
[00:32:56] Speaker A: Right.
[00:32:56] Speaker B: Because you're the only one firstly to rank, but then you get all the links. Right? Because when people are doing the research, they find you. So that's why we set up the monitoring feature, because I just feel it's so underused. Like, for every client we have like, tracking of search intent, like new queries that are popping up and we're getting them well ahead, well upstream of people that are using tools which are brilliant tools, have, you know, great usage, big platforms, but they have these huge databases which are very sort of slowly rolling through the keyword field. Whereas this is right on the kind of vanguard of new stuff that's coming in. And it's just fantastic. So that feature, I think is going to stay. It's there now in like 70, 80% of all searches anyway. So Google uses it, the AI stuff. I mean I've been using Perplexity a lot recently. Yes, absolutely. Adore it. So it was Miriam. Yes. Yeah. Who originally got me onto Perplexity, pushed me, gave me the nudge. And that's really given me pause for thought when I, when I think about AI overviews and how we get results. Because if I'm looking for a specific piece of information, it's just the user experience is so good. Right. You ask it a complicated question, it goes and does the searches for you, it writes the answer, it cites the sources so you can directly check you're not getting hallucinations and does all the legwork for you. And there's no ads. I mean, what's not to like?
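The monitoring idea Mark describes here, catching new questions hours after they appear rather than waiting for keyword databases to slowly roll through, boils down to diffing snapshots of the PAA question set over time. A minimal sketch of that idea; the function and variable names are my own illustration, not AlsoAsked's actual API:

```python
def diff_paa_snapshots(previous, current):
    """Compare two snapshots of People Also Ask questions for a query.

    Returns (appeared, disappeared). Questions that appeared since the
    last snapshot are the early intent signals worth publishing against
    before keyword databases catch up.
    """
    prev_set, curr_set = set(previous), set(current)
    appeared = sorted(curr_set - prev_set)
    disappeared = sorted(prev_set - curr_set)
    return appeared, disappeared
```

In the GPT-4o example Mark gives, a snapshot taken days after launch would already show questions like "how much does gpt-4o cost" appearing, while volume-based tools still reported zero.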
[00:34:21] Speaker A: That's true. And I'd had other conversations about the role of AI tools, talking to Michael Buckbee of Knowatoa, thinking about the impact on the upper portion of the funnel, and what our strategy as SEOs is when sections of that upper-funnel area are going away, when that lunch is going to be consumed.
So do you still keep putting out the buffet in hopes that what they eat is going to get regurgitated properly and you'll be cited as the source? Or do we keep feeding the beast in AI by doing all of this top-level stuff? And that's talking about both Bing and Google. Because I think the diminishment we saw in the past week, the announcement that Google's finally below 90% market share, is not because of Bing directly, but because Bing behind the scenes is now powering Copilot, is now powering GPT, is now powering Perplexity. And Siri, I think it was two weeks ago, you can access GPT through Siri now. So AI-powered search is going to impact search in general. I've seen people say that it's a complementary process, and that a lot of people end up doing an even more heavily branded search after they use those AI-powered tools. Do you agree with that? That the shift we're seeing is going to move people more towards brand, and that as an entity we need to do more in the middle and the bottom of the funnel, to be more complete about what we're creating, the signals and information we're communicating about our entity, how we compare, what our features are, versus writing thousands of articles that are: what is the color blue?
[00:36:18] Speaker B: Yeah. So I think you've touched, right at the end there, on what you'd call solved knowledge spaces.
So yes, for solved knowledge spaces, on what is the color blue?
A lasagna recipe. You know, there's only so many times as a species we should be producing this information. And personally, I will celebrate the day when we can just stop endlessly regurgitating the same kind of solved content again and again on the web. Like, you know, skyscrapering it up and adding grandmother's story before the recipe. Brilliant. Bring on the death of all of that stuff. Fantastic. In answer to the question about, we'll say, unsolved spaces, changing spaces, like say B2B software, the stuff with commercial interest as well: this is already spreading out into different tools. What do we need to do for that? I think there's two important things there. Actually, let's start with the second thing. The second thing is that there's going to be a relatively short window when any of that's applicable, and I'll tell you why in a minute. But while we're in that window, I don't feel what we do as marketers, as digital marketers, as SEOs, has radically changed at all.
All of the studies I've seen, the actual academic studies about, you know, GEO and generative search and appearing in it, basically tell you to do all the same stuff that SEO people already do. There's a few things about optimizing for generative search: you're more likely to appear in generative search if you're citing sources, for instance, or you have statistics. It doesn't matter if they're even correct, because obviously the way the gen AI is working is it's generating, and as well, they have their feedback loops of the thumbs up, thumbs down. And that's the insidious feedback loop of the gen AI, which is that it produces content that, by definition, looks legit.
[00:38:29] Speaker A: Yeah.
[00:38:30] Speaker B: So even sometimes you can give people incorrect information and they're like, thank you, that's brilliant. Because they haven't checked. You know, I did a video a couple of weeks ago on LinkedIn where I was asking ChatGPT to do an entity analysis for me on a URL, and I purposely gave it a URL that didn't exist and has never existed. And it happily took some keywords from the URL and just told me the page was reviewing gaming keyboards, and it listed prices, said I had affiliate links on there, it named stores. None of that was true.
[00:39:05] Speaker A: Right, Right.
[00:39:06] Speaker B: So I don't think what we're actually doing as SEOs needs to change so much, although the user behavior will change. Where I think we're going, which I'm more interested in, is I think we are going to have a fundamental shift away from people doing their own research. I.e., I want a pair of running shoes, for instance, and then I look at sizes, and then I maybe look at different brands, and I like these ones because they're made from 100% recycled materials and they're within my price range. All the things we go through, right? I think all of that will be replaced by agents who understand the user. So they know my price range, they know I would prefer environmentally friendly brands, and they know, because of the latent stuff in their LLM, what that consists of and what they should look for. And therefore I would just say, I want to buy some new running shoes. It will go and do the research for me, and it will select the brands to serve up to me, basically saying, here's a couple of options, here's how they meet your criteria, which one do you want? And it will go and do it. The reason I think that will happen is because it's less work for the user, it's less friction. And if you look back through history, that's basically how all technology weasels its way in. Right? It's less effort: you get the same or comparable end result for less effort, less brain power, less clicks. And that's why I think the people that will win this race will be the ones that own the hardware. So I think Google is obviously really well placed: Android devices, they have YouTube, they've got Google Discover. I think Apple is well placed. Microsoft is fairly well placed.
I don't think, on their own, people like OpenAI will compete, because the key to having those good agents is understanding the user. Most people's mobile phones aren't more than 2 meters away from them, basically their whole life now. You know, like 99% of the day your phone's within two feet of you, right?
[00:41:17] Speaker A: Yeah.
[00:41:17] Speaker B: And you're doing a lot of stuff on it. The amount of information that goes through that phone can help an AI agent do things for you. And I think about this whole trade-off; you know, there's a lot of questions at the moment around AI and privacy and theft and copyright. I think in general, people will care less about that information flow as time goes on, because it will become the norm. And the trade-off of the convenience of just being able to send out, you know, robots into the ether to do things for you while you do something else will be way too appealing. And because of the corporate hellscape we live in.
[00:42:02] Speaker A: Yeah.
[00:42:03] Speaker B: It might become an economic imperative, you know, to be as efficient as everyone else who can do those things. Like, imagine trying to get a job nowadays and manage your life without a smartphone or without Internet access. Right? It'd be almost impossible.
[00:42:17] Speaker A: Yeah. And I do think that you're right that attitudes toward privacy have shifted. You know, the Cambridge Analytica scandal in the 2000s was massive. It meant a lot to people at the time. And now we're daily.
[00:42:33] Speaker B: It seems quaint now.
[00:42:34] Speaker A: Yeah, it seems quaint. It's like, oh, that's adorable. I give away that much information about myself at breakfast. I just did.
I literally just went to a site and more information about me was given than everything that was contained within the Cambridge Analytica process that caused such a hullabaloo at the time. Terrifying on certain levels, existentially. Every single time I bring up the concept of AI to a guest, I think everybody experiences a bit of existential angst, and we all try to cope with it as we can. The way I do that is just like, okay, my job right now is to manage this part of the apocalypse and figure out how to live most comfortably right now. So functionally, AI Overviews are going to stay.
PAAs you think are going to stay. What about the interplay? I'm curious how you see the interplay between featured snippets and People Also Ask, because you're right, there is SERP feature functionality behind it. That is that query disambiguation process; it helps them twiddle that dial. But as far as I'm aware, at the moment PAAs are powered by featured snippets. Have you seen any PAAs that, when followed through, go to AI Overviews without featured snippets?
[00:44:04] Speaker B: So what you're talking about, when you open the PAA and you get.
[00:44:08] Speaker A: When you click through on the PAA and it goes to the query. Have you seen an example yet where the featured snippet doesn't exist, only an AI Overview exists?
[00:44:20] Speaker B: I don't know. That's a really good question. And I've probably actually got the data to answer that for you, because we actually collect all of that data. But how are you testing that? Because when you click through. So when you open the PAA, that's going to concertina out, right?
[00:44:39] Speaker A: Yeah, it should concertina.
[00:44:40] Speaker B: That's like a static thing. So are you asking if there's an AI Overview generated within that concertina?
[00:44:48] Speaker A: Well, I haven't seen that, but obviously two things to look at would be within the concertina, but more about where it leads. Because when you click through, it takes you to that expanded query. So I would say it's a hypothesis of mine that if Google were to run a test, or consider running a test, where AI Overviews in some form are thought to be equal to or better than featured snippets, then we should see an A/B test at some point where the featured snippet of the PAA leads to a query where there is no featured snippet anymore, just the AI Overview.
[00:45:29] Speaker B: Okay, then I can answer that then. Yes, they do do that, because I just did it. I Googled jam sandwiches.
[00:45:35] Speaker A: Okay.
[00:45:36] Speaker B: And the first PAA was, what goes with a jam sandwich? And it gave me my little snippet. And then doing that query, what goes with a jam sandwich, gives an AI Overview on its own. That's what you mean, right? I've understood you correctly there. Yes, there we go. There's the answer to that. So I think the PAAs will stay because of the query refinement, but I imagine we're going to transition to AI Overviews in the main SERP. And I think the reason we're going to transition to AI Overviews in the main SERP is there's more potential for Google to keep you on the SERP with AI Overviews. Right? And that's how they make money. The longer you're in their ecosystem, the more you're going to bounce around and make them money. Even if you go to a site and click on a Google Ad, they're getting a smaller cut than if you interact on their owned property. I don't know if you can replicate that, but yeah, I've got an AI Overview for the snippet that was generated in the PAA. Brilliant question. I'd never even thought about that. You've given me this whole myriad explosion of ideas that I want to test now. So, to get the whole tree of PAAs, our tool isn't re-Googling the question. You normally get four questions from the PAA; to get the subsequent ones, we're actually interacting with the SERP to do that, which is harder. The reason we do this is I found that if you interact with the PAA, the additional questions that are added in the drop-down are different than if you redo the query.
[00:47:20] Speaker A: Yes.
[00:47:21] Speaker B: And I believe this is because Google has built this kind of very complex web of where you are in your understanding. So say you're Googling, you know, the very basics about Linux, and then you click on a PAA that's, say, about a specific terminal query. The results that open up are still based on the fact that you probably don't understand what you're looking at. You're a beginner, you're a novice.
[00:47:49] Speaker A: Right, Right.
[00:47:50] Speaker B: Whereas if you immediately Google a specific bash thing, Google's like, okay, this is a more advanced query. And there are specific patents around this as well, about working out how advanced the user is in their searches. So that was one thing we worked really hard to do: to try and capture the intent map that reflects the beginning of the user journey, the initial fingerprint of, this is probably the entry point for this user. Because your on-ramp into the information is also a super important hint of your level of understanding.
[00:48:27] Speaker A: It does make one wonder what your model would look like if you intentionally did the opposite, with the intention of showing the knowledge journey, the implication of different levels of users as a spectrum, and what type of queries display. So then it's almost Tralfamadorian, like Cat's Cradle: you're seeing, from a baby to an old man, the different things that same search means to them. That might be its own separate tool, flipping that one aspect of how you're gathering that data. Because you're right: if you start in SEO, what's a keyword versus cosine relevance in a shard? Those are here and here in terms of depth of understanding of search.
And if you ask what's a shard and you're at this level, then it's not even in the same ballpark. But if you're creating content for SEOs, it helps to know what those iterations of knowledge are. I mean, the manual way to do that would probably be to check each one as you go and explore each PAA individually, because you are capturing the start point of that. So then if you took the next one and did that one individually for that set, I suppose you could get there.
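The manual walk Jeremy describes, expanding each PAA individually from a fixed entry point, is essentially a breadth-first traversal of the question tree. A rough sketch of the idea; `expand` is a stand-in of my own for whatever fetches the follow-up questions (in AlsoAsked's case, interacting with the live SERP rather than re-Googling, so each node keeps the entry point's context), stubbed here so the traversal logic can be shown offline:

```python
from collections import deque

def crawl_paa_tree(seed_question, expand, max_depth=2):
    """Breadth-first walk of a People Also Ask question tree.

    `expand(question)` returns the follow-up questions revealed when that
    PAA is opened. Returns a dict mapping each expanded question to its
    list of follow-ups; questions at max_depth are collected as children
    but not expanded further.
    """
    seen = {seed_question}
    queue = deque([(seed_question, 0)])
    tree = {}
    while queue:
        question, depth = queue.popleft()
        if depth >= max_depth:
            continue  # leaf: recorded as a child elsewhere, not expanded
        children = expand(question)
        tree[question] = children
        for child in children:
            if child not in seen:
                seen.add(child)
                queue.append((child, depth + 1))
    return tree
```

Swapping the stub for live SERP interaction is the hard part Mark alludes to; the traversal itself is simple.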
[00:49:55] Speaker B: You could get there. I mean, I had thought about this as well, because PAAs are basically infinite. You can just keep clicking and going. And I was like, wouldn't it be cool to build a map, just keep going and make it zoomable, and have this giant view where you could see all the clusters of where the questions are connected to each other, and you could basically see how every single topic on the web is connected. And then it melted my head when I thought about how you'd visualize it. Because I was like, it could be this 2D map. And then I was like, but no.
[00:50:34] Speaker A: It has to be three dimensional.
[00:50:36] Speaker B: Well, I was like, it needs to be more than three-dimensional. Because all these things are going to be connected closely, but they're not going to be close to other things that need to be close. So I was like, how do I make them both close and far away? I can't do that in three dimensions. It's just not possible. Maybe I could make it like, okay, these things are close, but only between 3:15pm and 3:25pm, but these things over here at 4:30 are closer. But I was like, that's not a very good visualization. It's probably not that helpful to the end user.
But no, you're right about that. I did think through it. The reason I didn't do it was, again, while it would be an incredibly interesting tangent, the function here is: if you've defined the initial question that your article is going to be about, you want to stay on that on-ramp. Because this is what's guiding you, right?
[00:51:34] Speaker A: Because you're creating a definitive single anchor point as your objective. I would say then, as a practice, if you're creating a lexicon or a corpus of knowledge as SEOs, maybe that's something we need to think about. If Google literally has patents to understand the knowledge journey depth level in an industry, then we should be thinking about that knowledge depth level too, and, as we come up with our content strategy, define out the experience level or level of expertise of who we're talking to as we map out the content. And knowing now that your tool is starting at that touch point, not trying to use the expanded queries; because you can go four deep, right? But knowing that, with your tool, you need to go grab that point at four. We can do it manually. So anybody listening: sign up for AlsoAsked, give this guy some money so he can keep developing this stuff. Because SaaS owners have it hard. Please just go and subscribe. I know you're using it free. Just give them a little bit more money. It's worth it. Because it also would help, in theory, with approaching AI Overview optimization. It's like for a dealership, right? There are people who know jack about cars and then there are people who know a lot about cars. And the content that you want to publish should try to serve those different people. Diving into the engine specs and towing capacity matters nothing to a low.
[00:53:17] Speaker B: I want a red car.
[00:53:18] Speaker A: Yeah, I want. What are all your red cars today?
[00:53:22] Speaker B: That's me.
[00:53:23] Speaker A: Yeah, I'm kind of a little bit in the middle. I can change my battery, so I'm there. And other people are like, oh, torque and this. But it doesn't suit everyone. I think being more complete in our content strategy is also going to help with that AI Overview problem of them eating up your lunch.
If we're the ones talking about everything from, hey, in 2025, what are the five top sedans to consider, to torque and engine specs in BMWs to compare on our lot, it's not just looking at keyword volume. It's looking at the user journey: hey, I need to know more about this car, I need to compare your trustworthiness, and I need to know what I need to know to buy it. There's also a horizontal piece: these journeys are adjacent to each other for different people. In that optimization process, you're also learning about their persona and trying to optimize more for one, because we make more money off the petrolheads; or they're a pain in the butt and have so many high demands, we don't want them and they don't end up converting. So it's actually forcing you to think about what happens at the end of the day, after they hit the lead form. They talk to the salesperson; are they actually buying? And can you optimize better for any of those different levels of technical knowledge? Are they dumb, are they smart? I'm not sure. What did it say in the patent?
[00:55:02] Speaker B: So I was gonna say, actually, I've got an experimental podcast called the SEO Patent Podcast. What it is is I feed it a PDF of a search patent and it uses NotebookLM to generate a discussion about it.
[00:55:16] Speaker A: Okay.
[00:55:16] Speaker B: And it's been so helpful. I was doing that for myself, to listen to patents on my drive to work, because it gives a really nice summary of what is involved in the patent and roughly how it works. So if I'm like, hang on a minute, this sounds really interesting, then it gives me the motivation to go and read it. Because personally I struggle: if I've got a 20-page patent, they're not the most exciting things to read, and 50% of the time it's just like, that actually wasn't useful to me in any way.
[00:55:46] Speaker A: Yeah.
[00:55:46] Speaker B: And I've had like it blow my mind how positive the feedback has been for like a generated podcast. People said it's really helpful because it does just like 10 or 15 minutes. And I started on the like site Quality Painter and like the one about trying to gauge the user's level of knowledge with the search datas in there as well. That's how I knew about that one.
So yeah, it's on Spotify, Apple. Search for the SEO Patent Podcast.
[00:56:12] Speaker A: I will add that to the show notes and be listening later. Because Bill Slawski is my spirit animal.
[00:56:19] Speaker B: He did a lot of good work. He did a lot of good work.
[00:56:22] Speaker A: A lot of good work. I think Olaf Kopp is doing a bit of that continued work, trying to be the patent guy in the space. But knowing you've got a podcast on it, absolutely. Have you checked out STORM by Stanford? Stanford University came out with an agentic model where you give it the subject matter and it uses multiple models to create a scientific paper based off of citable sources.
[00:56:50] Speaker B: Wow. No, I haven't seen that.
[00:56:52] Speaker A: Well, now I know what you're going to do after this.
[00:56:56] Speaker B: That's next on the list.
[00:56:59] Speaker A: Okay. So check that out. Because patents, you're right, and I don't think we actually talked about it; it was your other speech. Patents are interesting, because if something is getting done, then most likely it has been patented. But just because it has been patented doesn't necessarily mean that it is being used, or used entirely according to the original spec of what they thought it would do at the time they created it. Because there are plenty of patented machines out there that now do completely different things, or serve a different purpose, by doing something like the thing they were patented for.
[00:57:38] Speaker B: Exactly. I mean, we know that for a fact. Gary came on the Search Off the Record podcast and said, hey, Google still uses PageRank, but it's a completely modified version of what was in the patent. But I haven't seen an updated version of that. So it's very clear to me; even when I mentioned the site quality patent, I think that was filed in 2016. That's a long time ago in tech, right? So even if it's used, I would be shocked if it looks like how it looked there; it's probably something completely different now.
[00:58:16] Speaker A: Right. They just have it patented so they can use it in that function. And then it continues to evolve and integrate.
[00:58:23] Speaker B: And I guess it blocks competitors probably at the time, doesn't it? It's like, sorry, you can't do that. We're doing that. You know, it's ours.
[00:58:29] Speaker A: Yeah, no, it's definitely a competitive, monopolistic capability. All right, so this has been just a fantastic conversation. We've gone really far. I like to cap things off with your number one most actionable thing that somebody can do immediately after listening to this, in their SEO campaign. You know, maybe it's eat a jam sandwich, look up the PAA on how long does a jam sandwich last. But really super actionable, tactical, doable. We've been in the clouds, we've had our fear of the AI instilled in us. We're good. So: actionable.
[00:59:07] Speaker B: Can I give three?
[00:59:08] Speaker A: Yes, yes, yes.
[00:59:10] Speaker B: I'll do them for different kinds of levels, right. So, I've been saying this a lot for 2025: if you're in content, what I've been doing now is actually starting at video and working backwards, because there are tools now that can basically instantly transcribe your video. And that's a fantastic use of LLMs: to write an article when you're having to get information, especially from clients. Sometimes trying to get clients to write an article is like getting blood out of a stone, but you can have a video call with them and just ask them the questions. So you do the content brief and just chat to them about it, and they will happily talk your ear off. Transcribe it, get AI to put it into an article, and then go through whatever process you need: tone of voice, fact-check it. It's an amazing way to make really, really good content, and you're not falling foul of just regurgitating stuff. If you're beginning in SEO, I would say, I'm biased, but check out PAA stuff. Use AlsoAsked for free; if you haven't used it before, you've got three searches a day. If you have used PAA stuff before and you're a bit of an SEO, you've got Screaming Frog: I've got a guide on Search Engine Land on how you can use Screaming Frog with the AlsoAsked API and the ChatGPT API to crawl all of your content, pass the content to ChatGPT along with the list of the 25 to 50 questions from AlsoAsked, ask it to tell you which of those questions are not covered in your article, and then put those questions directly into your Screaming Frog crawl results. So you can essentially automate gap analysis at scale. You can say, I've got a thousand bits of content; where has intent shifted? What do I need to do? You hit crawl, go make your jam sandwich (definitely a tip, love a jam sandwich), come back. And then, yes, some of them will be off-topic, because you've got that split in PAAs. But try doing that manually: it'd take four days.
When you're done, come back and tell me you didn't want to do it this way. The amount of time we've saved just doing that. So these, I think, are really cool uses of AI that avoid the massive downfalls of hallucination and low information gain. They're my three things to go and try that are actionable.
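The gap-analysis workflow Mark outlines (pass each article plus the AlsoAsked question list to ChatGPT and ask which questions aren't covered) can be sketched roughly as follows. The LLM call is replaced here with a naive keyword-overlap check so the sketch runs offline; the real guide uses the ChatGPT API, and the threshold, stopword list, and function name are assumptions of mine, not from his Search Engine Land guide:

```python
import re

def find_question_gaps(article_text, questions, threshold=0.5):
    """Flag PAA questions that an article probably doesn't cover.

    Stand-in for the 'which of these questions are not covered?' LLM call:
    a question counts as a gap when fewer than `threshold` of its content
    words appear anywhere in the article. Crude, but shows the shape of
    the pipeline (article + question list in, uncovered questions out).
    """
    stop = {"the", "a", "an", "is", "are", "do", "does", "how", "what",
            "why", "when", "can", "you", "i", "to", "of", "in", "for", "with"}
    article_words = set(re.findall(r"[a-z0-9']+", article_text.lower()))
    gaps = []
    for q in questions:
        terms = [t for t in re.findall(r"[a-z0-9']+", q.lower()) if t not in stop]
        covered = sum(t in article_words for t in terms)
        if terms and covered / len(terms) < threshold:
            gaps.append(q)
    return gaps
```

In the full workflow, this check would run per URL from a Screaming Frog crawl, with the flagged questions written back into the crawl results.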
[01:01:28] Speaker A: Well, I've got a couple of things to do for clients, but actually I've got two hours to work on them today, so I'm going to do that and pull that up. Thanks so much for your time. Where can people find you? Give a last plug, whatever you want.
Follow him. Buy his stuff, guys.
[01:01:51] Speaker B: So I'm most active probably on LinkedIn, Mark Williams-Cook. I think I'm pretty much the only Mark Williams-Cook on the Internet, so I'm pretty easy to find. I'm on Bluesky now; I left X at the end of the year, so I'm there. Again, just search Mark Williams-Cook and things. Yeah, I'd love you to use AlsoAsked. We've got our own podcast, Search with Candour, run by my brilliant colleague Jack. And TheCoreUpdates.com is the newsletter: every Monday we send out five SEO tips, bullet-pointed, very quick, plus important SEO news and the podcast. So that's where you can find us.
[01:02:26] Speaker A: When is that tool, the query bucket, is that released yet?
[01:02:31] Speaker B: No, I'm trying to get it finished before Brighton SEO, which is in April; that's when I'm hoping to get it done. Because, to be very honest, I'm kind of neck deep in some more self-education, getting deep into BERT and fine-tuning models and doing some more machine learning stuff. I got my first machine learning qualification back in, like, 2016, I think, before it was cool, and it's completely out of date, because with all the transformer stuff now it's just irrelevant. So I'm trying to basically do all of that work myself. I've had some brilliant people basically just say, give it here, I'll do it for you. And I was kind of like, okay, but I want to go through this process myself, because I will learn a lot doing it.
My idea is hopefully for Brighton and some other conferences I'll be at this year, I'll be able to have that tool and make it publicly available.
[01:03:34] Speaker A: Yeah, fantastic.
[01:03:35] Speaker B: We'll put it in the newsletter when it's out.
[01:03:37] Speaker A: All right, so follow the newsletter folks. And I'll be looking for that. Thanks so much for your time, Mark.
[01:03:43] Speaker B: My pleasure. Thanks for having me on, Jeremy. Really appreciate it.