Adam Lasnik - Google's Search Engine Evangelist

Transcript of interview March 14, 2007 ©2007 Tech Talk Radio

Adam Lasnik, Google’s Search Evangelist and Lidija Davis,
Tech Talk Radio Australia

TTR: Today on Tech Talk Radio we’re visiting the Googleplex in Silicon Valley and having a chat with Adam Lasnik, Google’s Search Evangelist.  Adam’s role at Google revolves around being a webmaster liaison; he spends most of his time monitoring webmaster forums and blogs.  Adam’s most visible role is speaking with, and learning from, webmasters around the world, particularly at conferences. He has just celebrated one year with Google.

TTR: So, Adam, happy anniversary and welcome to Tech Talk Radio.

AL: Definitely a pleasure to be here.

TTR: Search engine optimization—wow—what a buzzword in Australia at the moment.

AL: SEO, certainly it’s quite the thing here in the States as well.  It is very much a part of search engine marketing, or SEM, so oftentimes people have looked at it as: How do we optimize our site for Google and for the other search engines?  But I think maybe that’s a bit too narrow.

What I think is a better approach is: How do we optimize our site for users?

That typically translates into better crawling, indexing, and ranking on Google, and, I would assume, the other engines as well.  So a lot of it has to do with accessibility, compelling content, and oftentimes a lot of really simple tweaks that are both good for the user and good for Google.

TTR:  OK, what are they?  What are the tweaks – we have to ask.

AL: One that first comes to mind is actually the simplest, but one of the most powerful, and that is to optimize both your title tags and also your meta description tags.

It’s been surprising to me, actually, and sometimes scary, to see even major companies having the same title and the same meta description on many, many pages.

And since this typically shows up in Google search results, that doesn’t tell the user very much.  They’re getting ready to click, and they look at it and say: “Huh, that’s not very descriptive; that’s not something that I want to explore.”

Refining that, by having a title that mentions the company name and describes what the page or that section is about, can really help both the user and your site’s presence in Google as well.

TTR: What should people be thinking about when they write the meta tags?

AL: You want to think of it, particularly for the meta description tag, as a brief and targeted description—not a series of keywords.  It can actually be a sentence or a phrase, so that it reads the way you would describe your page to someone else in just a brief moment of speaking.

If it looks spammy to a user, it’s probably going to look spammy to us.  So you want to avoid that.
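
(E.g. a sketch only, using a hypothetical company and page: a title and meta description written as a short, specific phrase rather than a keyword list might look like:
<title>Sunset Harbour Cruises, Sydney | Example Cruise Co</title>
<meta name="description" content="Ninety-minute sunset cruises departing Circular Quay daily, with online bookings and group rates.">)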

TTR: Because that seems to be the common question: how many keywords, what about the meta tag, what about the title, and how many keywords – what a question.

AL: (Laughs) Well, we get that a lot on many different aspects of search engine optimization and site optimization overall: What should be the keyword density? Exactly how long should my page be? And the answer really is: use common sense.

You don’t want to have a keyword density such that it just reads obnoxiously; you don’t want to have a page where you have to scroll to the point that your hand gets sore on the mouse.  But, similarly, having pages that are two sentences long where people have to click next, next, next…

We look at it from the perspective of: What would a user want to see, what is actually useful? We base our algorithms around that, so there really is no set number; no set ratio.

TTR: In terms of links, that’s something you haven’t mentioned yet, but people do ask about links and what should we do?  If you’re going to be putting something out there you want to make sure that it reflects directly on what you are talking about, because otherwise, what’s the point?  If you do have hundreds of links, what’s it going to do?  Tell me a little bit about links.

AL: Well, starting back, you actually touched on sort of a fun aspect that I think people don’t pay enough attention to, a far more important aspect; they’re thinking: Hey, let’s get as much traffic as we can; let me get as many hits, let me rank number one – without always thinking about the core subject here: What are we going to do with these folks when they come to the site? Is that page really relevant?

OK, we’ve link baited this huge section of people—congratulations—but if they hit your page and they go “Hey, this is pretty dumb,” and hit back, you’ve just used a lot of bandwidth, and you’ve perhaps lost some brand or site goodwill.

So the thing to think about with links is relevance.

You want to be getting links to your site that are relevant to the content you have, and you want to be linking to other pages from your site that are also relevant to what your users are probably looking for.

Now, if you have a personal site and you want to link to your uncle’s dentistry blog, and your site’s not about dentistry, who cares?  That’s great, that’s fine.  It’s not that we’re looking over every single link with a fine-tooth comb; we look at patterns, both incoming and outgoing.  If it makes sense for the user, if it’s not clearly spammy or obnoxious, then it’s probably fine.  It can definitely help the way we view, index, and rank your site.

People joke and say, “Gosh, if I see another top 10 ways to do such and such.”  It’s become a bit of a cliché, and a bit of a joke from that phrasing, but at the same time, I think the way we perceive link baiting is: if that content is indeed compelling, if it is relevant to where the links are coming from, then hey, more power to you.  You’re providing some good entertainment that is useful for the user, and that’s great.

TTR: And that gets rid of my next question, which is: If I’m a supplier, should I get my customers to link back to me?  You’d think, absolutely!

AL: Yeah, there is one important consideration in that context.  We see this all the time.  For instance, with blog software or forum software it says “powered by such and such” on the bottom, in the footer, and you know, that’s fine, we certainly don’t see that as a problem at all, as long as it’s visible and not hidden.

But there is one caveat there.  The site that’s being linked into: if they have 80,000 “powered by such and such” links to their site, they shouldn’t assume that’s 80,000 votes, because that link typically wasn’t placed there affirmatively by the other site.  We’ll see it as votes, but there is a point of diminishing returns.

TTR: Tell me about regional filters.  This is something way back when I was in Australia, one year ago—way back—I would use Google a lot, but we would get google.com.au.  Here, that’s not the case.  Are there any regional filters?

AL: Absolutely, I think they are more subtle than a lot of people think.  We’re definitely not completely rearranging, or including brand new content, or indexing completely different content by region.  Rather, we’re essentially massaging what people see, largely based on regional, cultural, and linguistic differences.  For instance, an often-used word across England, Australia, and the States is the word football, but we each think of something different when you say the word football.

There are also minor linguistic differences.  I remember when I first came to Europe and I said “Excuse me, where is the restroom?” and he looked at me like I was totally batty.  So what we try to do is take those differences into account and interpret what people are searching for based upon where they are.  How might they be looking for something that’s different from what someone in America might be looking for?  So we might slightly rerank some of the results, or include a couple of results here and there that would be different for someone in Australia than, say, in Japan or the United States.

TTR: So, you do the hard work for the webmaster?

AL: Yeah, exactly.  You know, you bring up the webmaster in this, and of course, I think some webmasters have been concerned: “Hey, I can’t predict my ranking when my client is seeing something different here,” or, “From my site the results are different in this country and that country.”  But we really see it as a win, not only for the user, but for the webmaster as well, because we’re spreading the love.

We’re creating a set of results that the person in each country is more likely to click on and see as relevant.  And that’s better for the webmaster, and it’s better for the user.

TTR: Is there anything businesses should be thinking about?  In Australia, you have to think we don’t have the large corporations, I mean we have some, but not as many as you would find right here, and they don’t have the money to spend on SEO.  What are the key considerations for them?

AL: The most important thing is to speak with an authentic voice and with language that reflects your primary audience.  So if you are primarily intending to reach fellow Australians, write with a tone, and with a spelling, that reflects what Australians would expect to see.  Spell harbour the right way.  Refer to different cultural aspects, and use language that people would expect.

If you’re trying to hit multiple audiences, that becomes a little more challenging, in which case I would pick a primary audience, and spell and write for them.

But if there are specific regional differences, for instance Americans need to know some tourist information that every Australian would already know, include separate pages for them.  Even if some of that content is overlapping, that’s fine, that’s not a problem.  We want to discourage, of course, absolute duplication, or identical content, particularly when it’s intended to deceive, and I’m sure that’s not something your listeners, and frankly the majority of webmasters, are trying to do when they take segments of content and repeat them.  We don’t penalize for that sort of thing.  We see it as essentially an attempt to communicate with different audiences, with slight nuances, and that’s fine.


TTR: What does blogging do for someone with a website?

AL: When it makes sense for that site’s audience, blogs can be fabulous.  It’s important to realize that just starting a blog for the sake of “it’s going to improve my ranking” is probably not going to be too effective in itself, because if you don’t blog because you feel passionate about a particular subject or about your products, it’s going to show, and people are not going to come back, and they’re not going to look at that favorably for your brand.

In the majority of cases, where you do feel passionate about something, a hobby, a product, a service, blogging about that is a way to share your authentic voice; it’s a way to have frequently refreshed content, and also, assuming you have comments turned on, to start a conversation with your visitors.  And they typically love that, where they can ask you questions, or give feedback and get a response directly from someone who represents that company or that organization; they are more likely to come back to see follow-ups, and they are more likely to share it with their friends.

So blogs inherently are best in the user context, and the side benefit is because there tends to be additional rich content, it’s probably going to be beneficial for the search engines as well.

The only warning on blogs is that, like any good writing and any good site management, or writing for sites outside of blogs, it takes time.  It takes effort to write content that’s interesting, compelling, funny, and useful, and also to manage comments, to not just leave those comments out there but to participate in the conversation.  So if a company is going to go down that route, it’s something you want to think about carefully, and make sure you have the resources and the dedication, like any aspect of your site, to really see it through in a quality way.

TTR: Are there any technical differences in terms of optimizing a blog and optimizing a website?

AL: There is one, actually.  There is one consideration that comes to mind.  In the case of blogs, sometimes duplicate content can be an unintentional issue, where you will show the exact same article at three, four, or five different URLs.  And that can both confuse Google and confuse your users, because when they want to bookmark a particular article and share it with their friends, they have the chance of bookmarking or linking to one of three or four different links.  When you refine your blog, some of those links may go bad after a time, and it can also diffuse the PageRank and other signals that enable us to see which pages are more relevant and important.  And we see that most frequently in blogs and also content management systems on the whole.  So it is important to make sure you have one URL, and this applies for all sites: one URL for each page of content.
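
(E.g. a sketch only, assuming an Apache server and hypothetical paths: one common way to collapse duplicate addresses is a permanent redirect from the extra URLs to the single preferred URL for an article:
# .htaccess: send a duplicate URL to the one preferred address
Redirect permanent /2007/03/my-post/index.php http://www.example.com.au/2007/03/my-post/)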

TTR:  Webmaster Central.  How much can you tell me before you have to kill me?

AL: (Laughs) No, it’s the opposite actually.  I would talk about Webmaster Central until you kicked me out of the room here because it’s one of the things that I think we really are most proud of, and excited about here.

It is the way we see if we can manage scalable communications, and the key word here is scalable.  Because there are a ton of us here in search quality and on the webmaster tools team, but compared to the number of webmasters around the world, my goodness, that’s a scary ratio.

So even though we’ve been building up these teams rapidly, even since I’ve been here, Webmaster Tools is a way that we can share information: for instance, through the blog that’s at Webmaster Central; through the help documents, which we now have in 18 languages, I think; we’re building up the blog, and actually, stay tuned, pretty soon there will be some new languages for the webmaster blog as well; and the webmaster group, where there are discussions.

But, of equal importance, we have all the tools, where people can see:  “Hmm, how does Googlebot see my site?”

There are ways that you can test your robots.txt file, to say, “Wait a minute, am I accidentally blocking out entire sections?”  So, from information, to things that you can tell Google to better crawl your site, to ways for you to interact with other webmasters, I think, and I’m biased, it’s been an outstanding resource so far, and it’s also a multinational team.  So I have a lot of faith you’ll continue to see some really great stuff out of that.

TTR: A lot of people hesitate to join community groups like that, thinking, “I don’t really have anything valid to say,” but it is interesting, and they should join, shouldn’t they?

AL: Yeah, because even if you are not a geek or a great techie type person, you can offer insights into other people’s sites.  Such as: “Well you know what, when I was trying to order something on your site, or I was trying to look at some of the full scale pictures, I couldn’t figure out how to do it.”  And then sometimes a site owner will think “Didn’t you see this link here?” and then “No, no, that’s at the bottom of the page” and people are able to help each other literally optimize sites.  Not technically, but from a navigational, from a user perspective. 

Or they will say, “You know, that just looks too markety, maybe you could make it such that it just explains your product better,” and that, I think, more than any technical search engine optimization stuff, is the key to really doing well in search engines: to have your sites be clean and more navigable.  And these groups offer a lot of good insights into that area.

TTR: Robots.txt, you mentioned.  How relevant is that to search engine optimization?

AL: It can really be a great help, in two different areas.  I should actually take a brief step back for those who aren’t familiar with the robots.txt file.  It’s just a tiny little text file that anyone can typically write and post in the root of their website, right at the top level.  And that particular file is what different bots from search engines, including Googlebot, look at to determine: What can we crawl on your site and what are we not allowed to crawl?

So for instance, you might not want to have us crawl your client discussion area, you might be comfortable having that open on the Web, but you don’t really want it indexed.  Or, you might have an area on your site that’s under construction, and you don’t want that indexed either.  So you would then put a statement: disallow and then such and such directory.  And then when our Googlebot comes, it sees that and says, “Oh, not going to go there.” 

And that also helps you in Google because, if there are pages that are empty, for instance, you have restaurant reviews on your site and a whole bunch of restaurants don’t have reviews yet, you’re probably not going to want those to show up in our index, because it’s just going to annoy people who click.  Take Aqua Dining reviews; I’m sure by now it’s got to have reviews, it’s a fun restaurant in Sydney I ate at, but at the time it didn’t.  People would type in “Aqua Dining reviews,” get there, see “Be the first to offer a review,” and think, “I don’t like this site, this is useless.”  So it helps your site, and it helps Google, to make sure the content we are putting in our engine is the content you want to have there.
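
(E.g. a minimal robots.txt sketch with hypothetical directory names, asking all crawlers to skip an under-construction area and a not-yet-reviewed section:
# applies to Googlebot and other well-behaved crawlers
User-agent: *
Disallow: /under-construction/
Disallow: /reviews/pending/)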

TTR:  You guys, MSN, and Yahoo! got together and made sitemaps.org.  What is it, and what does it do?

AL: That can be a really useful component.  It’s actually a file you place on your site, and you can use automatic generators to do the work for you, that tells all these different search engines which pages on your site you would like to have indexed.  And it optionally mentions what priority certain sections or pages of your site have: if you’re going to crawl this many pages, make sure you tackle these, these are the most important ones.  Or: we actually update these every week because it’s news, and these ones once a year.

Putting up a sitemap gives the crawler hints so it can better crawl and index individual sites.  And a great thing is that this is an augmentation of the normal crawling cycle, so it can’t hurt your site; it can only serve to bolster what we are able to see.

Another brief example: if you have a site that has a very big database, all the pages might not be interlinked, and so they would normally be hard for the crawler to find.  By doing a sitemap, we can then see every one of those real estate listings, or restaurant reviews, or product information pages.  It doesn’t mean we will index all of it, but it just helps us better understand what’s on your site.
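
(E.g. a minimal Sitemap 0.9 file with one hypothetical URL, showing the optional change-frequency and priority hints mentioned above:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com.au/news/</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>)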

TTR:  What about companies, small companies that have between 100 and 1000 pages, is that something they should be doing?

AL: Yes, I still think it’s valuable to look at.  I did a sitemap for one of my own sites, just a fun hobby site, and it took me less than an hour.  It took me about half an hour to figure out “how do I make this, where do I put it,” and once you do it, you can use these automatic generators to update it.  So, while it’s not a total set-it-and-forget-it, it’s one of those things that, once you do it, really is not a significant investment in time, and it certainly doesn’t cost you anything.

TTR: That’s the other part of SEO, I suppose; it’s consistently, continually doing things to your site, and making sure it’s updated, not doing it once and forgetting about it.

AL: Yeah, absolutely, especially because sometimes links go stale.  There is nothing more embarrassing as a company than saying “one of our most privileged customers is so and so,” and then forgetting that they moved their site to a new domain, so when your potential customers click on that, they get a 404, or site not found.  So maintain it, always be vigilant about the content you have on your site, and also make sure you know what people are saying about your site.  Staying on top of that conversation, and keeping a clean and consistently updated site, takes time, it takes effort, and it’s worth it.

TTR:  PageRank. That one gets me, I really don’t get it.

AL: Yeah, that’s one of the things webmasters really tend to focus on, and part of it is because there is that little green bar in our toolbar that is always there saying: you have this much PageRank.  And one of the key things I like to emphasize is that people really do tend to, I think, worry too much about PageRank.

When it was first unveiled, when it was first invented by Larry Page—hence the name—it was actually somewhat revolutionary in that, even though it may seem common sense now to use links into a site to determine relevance and importance, back then it really wasn’t being done.

Nowadays, the basic tenet of PageRank, which is how other sites link to you and what they link to you as, plays a role in how the search engines see the importance and relevance of your pages.  Right now, it’s just one of well over a hundred, maybe even as many as two hundred, factors that we take into account at the crawling and indexing stage.  So when people spend so much time saying “Oh, it increased a bit, it decreased a bit,” or even worse, “My competitor has this PageRank, how come they’re ranked that way?”, I think it ends up not being time and resources well spent, because it’s really more of a symptom than an indicator of pure quality or of what will happen.

It’s much better to try and look at your site holistically.  What kind of traffic are you getting from the search engines, and even more important than that, how is that traffic converting?  If people are simply going to your site, from Google, clicking on one page and then going somewhere else, well PageRank isn’t going to help you at all.

TTR: Number of words in a search query.  I remember trying to work out what’s the right word, years ago, which word would best reflect what I want.  Now, how many words can be put into a search query?

AL: That’s a great question.  I have to say, I’m not really certain on that.  I know it’s gotten longer over time.  We frequently see people, and I do this myself, type in a big phrase from lyrics they’ve heard on the radio, thinking, what is that song?  It used to be there was a much shorter limit, and now, I think for practical purposes, you can type in pretty long phrases.

TTR: How many words are people searching on now do you think?

AL: I think it has slightly increased over time, that’s my understanding.  With that said, our ability to understand what people are trying to get at has dramatically improved over the years.  The key tenet here, and this comes down from the very top, from Marissa, and from Larry and Sergey, and all the folks in search quality, is that we shouldn’t try to change the way people search; we should change the way that we are able to interpret what people mean.

And so over time we’ve gotten better at that, both by having really large data sets to see what people search on and what they click on, and by easing in bits of personalization.  So when we see what you tend to type in and what you tend to then go to, it gives us a better understanding of what you are looking for.  So you don’t have to type in very long or detailed queries; you can type in just a few words, and of course you can always refine that later.

TTR: Quality of words: not so much keywords, how important is it to have a decent style?

AL: For your users, I think it’s really critical that you at least have a style.

To be able to have a unique voice, ideally one that, given the context of course, is professional and appropriate for that audience.  So if you are writing on behalf of a Fortune 500 company, and you’re trying to attract large blue-chip consulting contracts, you want to write in a way where your grammar is correct and your spelling is accurate for your location.  But on the whole, Google cares less about the exact tone, or even the perfect spelling and grammar of a site, and cares more about whether the content is rich, compelling, and unique.

So, if you’re mostly just grabbing snippets from other places, we can tell that, and we’re going to think that in itself is not really a valuable destination.  But if you’re writing regularly, and you’re writing content that users find compelling, whatever the tone may be, then the specific word choice itself is significantly less critical.

I call these things, frankly, the smell test in a way.  You know, you look at it and think, “Hmm, kind of smells fishy,” or, “That’s totally what I expect to see from this type of company,” or, when I type in this word, that’s the type of content I hoped and expected to see.

TTR: One of the things that I’ve noticed, and often annoys me, is jargon.  I can’t even pronounce half of them: search arbitrage, long tail, and canonicalization.  Suddenly all these people are trying to work out what these words are.  Are they important?

AL: If it’s any consolation, hanging out with my med school friends is significantly worse. 

Every industry has its jargon, and I would hope in most cases this type of specialized vocabulary has come into place to save extra words.  Sometimes it’s just because people want to sound important or want to sell high-priced contracts; hopefully that’s not the majority.

Some of these concepts and words are important.  One of the ones we have heard and have used a lot is canonicalization, and it took me a while to be able to pronounce that, in fact.  What that actually looks at is that, for any given page, there might be more than one way to refer to it.  So a site may be at www dot whatever dot site dot au, or it could be without the www.

And that is historically because of the way servers were made; there could actually be different sites, different content, on the dub versus the non-dub.  It sounds crazy nowadays, but it’s true.  So we had to train the bot to view those as different sites.  The problem is, nowadays, when people’s sites respond to both, it can confuse both the user and the Googlebot.
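
(E.g. a sketch only, assuming Apache with mod_rewrite and a hypothetical domain: a 301 redirect that sends the non-www host to the www host, so users and Googlebot only ever see one version:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com\.au$ [NC]
RewriteRule ^(.*)$ http://www.example.com.au/$1 [R=301,L])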

But the concepts, even though they might be jargony, tend to still be pretty important.  I think, as sort of a key suggestion, people shouldn’t feel daunted and overwhelmed by this vocabulary; they should feel free and completely unembarrassed to go on our forum, or any forum, and say, “Hey, I’m new, I want to build a great site, help me out, what does this mean?”  And one other quick tip: one of the lesser-known things on our search engine is that you can type in the word “define colon,” then in quotes put a phrase or a word, and that will very often give you plain-English interpretations. (E.g. define:”word”)

TTR: Google operators: people don’t know that they exist half the time – it’s just bizarre.

AL: It’s an interesting tension for us because, on the one hand, coming back to the same thing we were discussing, we want to make searching so intuitive for the user that they don’t have to change or adapt the way they search.  But on the flip side, we know that there are a ton of power users who want to be able to drill down and specify, you know, “inurl”.

And here is one for your listeners that I think is useful and not terribly difficult to use: site colon.  So when you know, for instance, that on a particular American university site, Princeton dot edu, there was this great article you read, you could then type in site colon Princeton dot edu and then the name of the article, or a word from the article, and that will only show you pages on that site.
(E.g. site:Princeton.edu 42 or site:gov.au internet security)

We can do a better job of explaining how we work and what we do, and I think over time we have been doing a better job of it and there is still room for improvement.

TTR: How often does Google index?

AL: The answer isn’t as straightforward as you might like.  We are indexing much more rapidly now than we did before; however, it also depends on the site.  So when we see, for instance, a really prominent news site, or even a blog, that tends to refresh its content or add good new content regularly, we adjust the crawling rate for that particular site, so we will crawl it more often and then also tend to index those pages more often.

So there is really no set time.  It depends on what we see, and how often that particular site is updating.

One brief note on the same topic.  We just recently added a pretty neat tool in Webmaster Central where you can tell us to crawl even slower, or in some instances if we’ve been holding back you can tell us to crawl faster or crawl more.  And over time we’ll be giving more and more of this control and offering the opportunity for webmasters to tell us how they want us to crawl their site. 

But we’ve gotten a bunch of feedback: “Help, you’re pummeling my server and I haven’t ramped it up yet.”

TTR: I saw that and I was wondering why anyone would want you to index less frequently?

AL: You know, sometimes, for instance when I was in grad school, I ran a Web server on my stupid little computer that could only handle a few queries at a time.  It was just for fun.  It was a site for my friends, but at times the search engines were discovering it, and they could be pummeling it really fast, you know, going through a lot of pages quickly, and then my friends couldn’t view the pages.  Now, that’s an extreme example, but even if you’re a larger site with a really fast server and a ton of pages, you may not want to have us hit it as hard, for whatever reason, bandwidth concerns, etcetera.  Now, with that said, I think we’ve done a really stellar job of naturally not crawling too quickly.  But just in case, we want to give webmasters the opportunity to say, “Hey, hold back a little bit.”

TTR: And of course the final question – how many servers does Google have?

AL: That’s fun.  I don’t know how many of your listeners are fans of Douglas Adams, but I do have to admit, it’s one of my favorite answers.  I have given it before, but it still remains compelling.

The answer is 42.

TTR: Thank you, Adam, thank you so much for talking with me today.

AL: Definitely a pleasure.

Interview March 14, 2007.
Adam Lasnik, Search Evangelist, Google, talks with Lidija Davis, Tech Talk Radio Australia.

Google, Main Campus
1600 Amphitheatre Parkway
Mountain View, CA 94043


©2007 Tech Talk Radio - May not be reproduced without permission.
