Your repeated references to robots.txt address the wrong problem. Banning Googlebot prevents your site from being crawled at all. Most site owners want their sites to be crawled, and others (those with personal or family sites, etc.) don't care one way or the other.
Rather, the problem arises when Google decides that a link posted by a forum member is paid spam. Most of them aren't; and if a site has a large number of incoming links from forums, chances are it's because its content is good or because it's a merchant that happens to sell stuff that those forum members need.
But because of Google's arbitrariness, rather than those sites being rewarded for their good content or for having the products that those people need -- either of which would argue for "relevance" by anyone's definition -- there's a good chance that someone posting a link to a site that he likes, for whatever reason, is harming the landing site by hurting its rankings, or even getting it delisted altogether. This is the complete opposite of the "relevancy" that Google claims to seek.
The solution I've settled on is just adding the 'rel="nofollow"' attribute to external links. It's a ****ty solution, really, that works against relevancy, because the vast majority of the links are not monetized. They were placed because the landing sites were relevant to some question, issue, or need on the part of the members. But it is what it is. I can't think of a better solution off the top of my head.
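For anyone wanting to do the same, here's a rough sketch of the idea in Python. It's not any particular forum package's code -- the function, the regex, and the "example-forum.com" domain are all made up for illustration -- but it shows the basic rewrite: tag off-site links with rel="nofollow" while leaving internal links alone. (A real forum would do this in its post-rendering pipeline, ideally with a proper HTML parser rather than a regex.)

```python
import re
from urllib.parse import urlparse

FORUM_HOST = "example-forum.com"  # hypothetical: your forum's own domain


def add_nofollow(html: str, host: str = FORUM_HOST) -> str:
    """Add rel="nofollow" to anchor tags whose href points off-site.

    Regex-based sketch only; assumes double-quoted href attributes and
    skips tags that already carry a rel attribute.
    """
    def fix(match: re.Match) -> str:
        tag, href = match.group(0), match.group(1)
        if urlparse(href).netloc in ("", host):
            return tag  # internal or relative link: leave untouched
        if "rel=" in tag:
            return tag  # already has a rel attribute; don't double up
        # Insert the attribute just before the closing '>'
        return tag[:-1] + ' rel="nofollow">'

    return re.sub(r'<a\s+href="([^"]+)"[^>]*>', fix, html)


# External link gets tagged; internal link is left as-is:
print(add_nofollow('<a href="https://other.com/p">x</a>'))
print(add_nofollow('<a href="/thread/42">x</a>'))
```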
An additional problem, which Bill alludes to, is that Google is no longer "just" a search engine company. Because they have their hands in so many different businesses, they're increasingly in competition with other companies whose sites they index. That creates a conflict of interest that may (hopefully, in my opinion) come back to bite them in the ass someday.
The Expedia situation will be interesting to watch, especially if Google goes into direct competition with them.
-Rich