Matt Cutts made an appearance today at the Domain Roundtable conference. Matt started things off with a few introductory comments, then spent most of the time answering questions from the audience and ones people had sent in ahead of time.
Here are the highlights of what he discussed:
His primary litmus test for whether something is acceptable: ask yourself, “What is the regular user looking for?”
-Does it add value for the customer?
-Will they be happy to find this site?
-Is it relevant?
-Is the content unique?
He talked about how there are lots of great reasons to buy domains, but not as many domainers want to actually design and build out sites around the domains. He gave gmhs.com and earthday.org as examples of parked pages that don’t really add value.
He mentioned Ajaxian as a site that has great content even though its domain isn’t generic/premium. It’s a multi-author blog about all things AJAX.
Someone asked about duplicate content/stolen content. Matt said Google keeps track of when/where they first find content, and they do a pretty good job of rewarding the original source of the content and not the thieves’ version of the content. There was an attorney in the audience who was asking about DMCA requests and Matt referred us to the DMCA process with Google, and admitted that this stuff is outside his area of expertise.
When somebody asked about moving a site to a new domain, he recommended reading the recent post about moving a site on the Google Webmaster blog. He said people often overlook the suggestion to test the redirect with a small part of your site first (a subdomain or directory); if that works smoothly and quickly, you should be fine doing the 301 redirect on the whole site.
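To make the “test a small section first” idea concrete, here is a minimal sketch of that kind of staged 301 redirect as a tiny WSGI handler. This is my own illustration, not anything Matt showed; the `/blog` prefix and `newdomain.example` domain are made-up placeholders.

```python
# Sketch: 301-redirect only one directory of the old site to the new domain,
# so you can watch how the move goes before redirecting everything.
# OLD_PREFIX and NEW_DOMAIN are hypothetical values for illustration.
OLD_PREFIX = "/blog"
NEW_DOMAIN = "https://www.newdomain.example"

def redirect_app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path.startswith(OLD_PREFIX):
        # Permanent (301) redirect that preserves the rest of the path
        start_response("301 Moved Permanently",
                       [("Location", NEW_DOMAIN + path)])
        return [b""]
    # Everything else stays on the old site for now
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"unchanged section of the old site"]
```

Once the test section’s pages show up on the new domain in the index, you would widen `OLD_PREFIX` to cover the whole site.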
The question came up about whether it matters which TLD (top-level domain) you’re using. For example, do .com domains carry more weight than .net, .us, .info, etc.? He said the TLD doesn’t matter–that’s the way Larry and Sergey originally designed the Google algorithm. The algorithm doesn’t care where the page is located; it’s all about the pagerank (links) of the particular page. At the end of his answer he did admit that they might have started to look at particularly cheap (and spammy) TLDs differently than other TLDs–or they might start considering TLD in the algorithm if they’re not already doing so.
Regarding interlinking between sites, he said it’s fine to interlink if the sites are related, but he said not to overdo it. When pressed, he said over 10 sites interlinking might be asking for trouble. He said it would also be ok to break out your network of sites and interlink sites within a certain category. The specific example was a network of local sites, and Matt said you could either have a single portal with links to all the geo-portals, or maybe interlink between all the various plumbing sites.
Matt mentioned that sites don’t automatically get pagerank just for existing. They need backlinks to get pagerank. Also, he said if you have a network of sites and add a bunch more sites, it’s like spreading the same amount of peanut butter across a bigger piece of bread. In that case, each site in the network gets a smaller share of the pagerank distribution.
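Both of those points (no backlinks means essentially no pagerank, and bigger networks dilute each site’s share) fall out of the public PageRank formula. Here is a toy power-iteration sketch of my own; the graph is made up and this is not Google’s actual code.

```python
# Toy PageRank via power iteration. links: {page: [pages it links to]}.
# Illustrates that a page with no backlinks only ever gets the small
# "teleport" baseline, while well-linked pages accumulate rank.
def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page gets the baseline (1 - d) / n "for existing"...
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # ...and a page's remaining rank is split among its outlinks,
                # so more pages sharing the same inbound links = thinner spread.
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page: distribute its rank evenly to all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Hypothetical 4-page graph: "orphan" exists but has no backlinks.
graph = {"hub": ["a", "b"], "a": ["hub"], "b": ["hub"], "orphan": []}
ranks = pagerank(graph)
```

Running this, `hub` (two backlinks) ends up with the most rank, `a` and `b` tie, and `orphan` gets only the baseline, which matches Matt’s point that sites don’t get pagerank just for existing.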
On expired domains, Matt said Google tries to reset pagerank/links for all expired domains to zero when they are registered by someone new. They don’t try to penalize the expired domain, but they also don’t want to give credit for the previous owner’s links.
He said keywords in the domain carry weight with users, and for this reason, Google also gives some weight to a keyword in the URL and/or domain name.
I don’t have the question in my notes, but something prompted Matt to mention Google Ad Manager, which I wasn’t familiar with (who can keep up with Google’s products?). It’s a free ad-serving solution for your site. You can serve AdSense ads through it, but you don’t have to, and you can use AdSense as backfill for any unsold inventory.
Matt suggested doing a site: search to check if a domain is indexed before buying the domain. He also talked briefly about webmaster tools and how to submit a reinclusion request if needed.
Matt was asked the best way to park your domains without ticking off Google. He replied that Google detects changes in content as it recrawls a site, so it’s fine to park a domain with a simple PPC parked page or whatever; when you start building out the site, Googlebot will notice and start indexing it as quickly as possible. He also made the obligatory recommendation to use nofollow on links on parked pages, “just to be safe,” and then explained what nofollow does and how it is used.
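For anyone who missed that explanation, nofollow is just a link attribute that tells search engines not to pass pagerank through the link. A parked-page link might look like this (the URL is a placeholder):

```html
<!-- Ordinary link: passes pagerank to the target -->
<a href="https://advertiser.example/offer">Cheap flights</a>

<!-- Nofollowed link: tells search engines not to pass pagerank -->
<a href="https://advertiser.example/offer" rel="nofollow">Cheap flights</a>
```

On a parked page full of PPC feed links, nofollowing them signals that you’re not trying to sell or pass pagerank through those links.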
He suggested reading and abiding by the webmaster quality guidelines.
He was asked about IP delivery, and he said that IP delivery itself is not bad, but cloaking is–serving up different content to Google than what everyone else sees. If you use IP delivery (for geotargeting content, for example), you should simply geotarget the content for Googlebot, too.
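The distinction Matt drew can be sketched in a few lines: content is chosen purely by the visitor’s IP-derived location, and Googlebot’s IP goes through the exact same rule; there is no “is this Googlebot?” branch. This is my own illustration; `country_of` is a stand-in for a real GeoIP lookup and all the IPs and content strings are made up.

```python
# Geotargeting via IP delivery, done the non-cloaking way:
# the decision depends only on the IP, never on the user agent.
GEO_CONTENT = {"US": "US storefront", "DE": "German storefront"}
DEFAULT = "International storefront"

def country_of(ip):
    # Hypothetical lookup table standing in for a real GeoIP database.
    return {"203.0.113.7": "US", "198.51.100.9": "DE"}.get(ip)

def page_for(ip, user_agent):
    # Note: user_agent is deliberately ignored. Branching on
    # "Googlebot" in the user agent to serve special content
    # is exactly the cloaking Matt warned against.
    return GEO_CONTENT.get(country_of(ip), DEFAULT)
```

A Googlebot crawl from a US IP sees the same US storefront any US visitor sees, which is what makes this IP delivery rather than cloaking.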