Notes...
Spam is a subset of SEO...not all SEO is bad, etc.
Nissan Motors' robots.txt blocks all spiders.
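For reference, a "block all spiders" robots.txt takes the generic form below (the actual Nissan file may differ; this is just the standard pattern), checked here with Python's stdlib parser:

```python
# Generic "block everything" robots.txt, verified with urllib.robotparser.
# The file content is illustrative, not Nissan's actual file.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Any well-behaved spider, Googlebot included, is refused every URL.
print(rp.can_fetch("Googlebot", "http://example.com/any-page"))  # False
```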
Testing fixes for 302s. They want to accept the destination URL except in roughly 0.5% of cases. Gives an SF Giants URL as an example.
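A minimal sketch of the behavior being described: following a 302 through to its destination URL, which is what a crawler would then index. This is not Google's crawler pipeline; the local server and paths are illustrative only.

```python
# Sketch: resolve a 302 redirect to its destination URL using the stdlib.
# A tiny local server stands in for the redirecting site.
import http.server
import threading
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":
            self.send_response(302)            # temporary redirect
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"destination content")

    def log_message(self, *args):
        pass  # silence request logging

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

resp = urllib.request.urlopen(
    f"http://127.0.0.1:{server.server_port}/old"
)  # urllib follows the 302 automatically
final_url = resp.geturl()  # the destination URL, the one to index
print(final_url.endswith("/new"))  # True
server.shutdown()
```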
Some things in the indexing process can be perceived as the sandbox...it does not apply to all sites.
Does not see Google buying DMOZ or dropping its reliance on it.
Google does not have the ability to hand-boost any sites. They do have the ability to penalize, by hand, sites they believe are spam or illegal.
AutoLink...references how it was received at Web 2.0. Thinks the launch could have been better. Would like to allow users to enter their own triggers.
Users and privacy...to take search to the next level you need some information about the users. Matt said he wouldn't work at a company that he felt violated users' privacy.
Matt has never worried much about hidden-table-row-type techniques to organize word order. With CSS, if you want to see how it influences a file, test it.
Toolbar does not influence how frequently stuff is crawled. It is too easy to spam, and the toolbar does not have equal distribution across regions. Many people assume certain things provide clean signals when they are not so clean.
As a webspam team member, Matt said he has no ability or intent to access Google Analytics data.
The litmus test for whether a site is spam is what value it adds to the web. User reviews, forums, community, etc. What makes a site unique?
Matt Cutts hates on paid links. He said they have manual and algorithmic approaches to paid links. Compares the future effectiveness of paid links to how reciprocal link spam largely died off with the Jagger update.
If you do something creative and useful, it is easy to get quality links that are hard for your competitors to recreate.
Not too long ago I interviewed Matt Cutts.