Google: We’re working on bad search results

By Dow Jones Newswires-Wall Street Journal
Posted Jan. 21 at 3:51 p.m.

If you have been frustrated lately by search results on Google, you’re not alone, and Google knows it.

There’s been a drumbeat of criticism of Google’s search results coming out of Silicon Valley — and now the Internet giant has responded, saying it has heard “the feedback from the Web loud and clear” and believes it “can and should do better.”

In particular, the company is talking about stopping “content farms,” which provide low-quality, often unreliable and sometimes plagiarized information on a certain topic, just to get traffic from search.

Google has been making changes to its algorithm to keep low-quality sites from appearing high in searches, search guru and principal engineer Matt Cutts wrote in an official blog post Friday.

But he also writes that, despite Google’s efforts, “The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception.”

It’s unclear whether today’s post has anything to do with yesterday’s announcement that co-founder Larry Page will be replacing Eric Schmidt as chief executive. But the mea culpa highlights one of the big questions Google has been facing lately: whether its search quality has taken a hit.

Just this month, there have been several posts from prominent tech insiders lamenting the state of Google results.

“Google has become a jungle: A tropical paradise for spammers and marketers. Almost every search takes you to Web sites that want you to click on links that make them money, or to sponsored sites that make Google money. There’s no way to do a meaningful chronological search,” wrote University of California at Berkeley visiting scholar Vivek Wadhwa on TechCrunch.

Software developer Jeff Atwood has complained about content farms in particular. “Last year, something strange happened: The content syndicators began to regularly outrank us in Google for our own content,” he wrote.

Content farms, in general, publish thousands of Web pages a day in an effort to draw views from Google searches. A Wired article from 2009 described their goal this way: “To predict any question anyone might ask and generate an answer that will show up at the top of Google’s search results.”

Sometimes the goal is achieved through low-quality but original articles and videos. Sometimes the sites cut and paste or compile content written elsewhere and use “search-engine optimization” techniques to get their own pages to appear higher in results.

In fairness, content farms are a problem that all major search engines are facing — but Google gets the lion’s share of the attention because it has the lion’s share of the search market.

And Google makes the point that it has made significant progress against “search spam,” in which sites patently lie about what is on the page, inserting keywords to attract people to sites that don’t actually have appropriate content at all.

“A decade ago, the spam situation was so bad that search engines would regularly return off-topic webspam for many different searches. For the most part, Google has successfully beaten back that type of ‘pure webspam,’” Mr. Cutts writes.

Content farms, however, are a different story — much trickier, and when you get down to it, just as annoying for readers.

Google’s algorithm proved to be fairly adept at detecting blatant lies about what was on the page. But information from content farms really is pertinent to the search terms at hand — even if it’s not actually what the reader wants. It’s something that a human is easily able to recognize, but maybe computer intelligence isn’t quite there yet.

And if Google’s algorithm just favors “trusted” sites like major media companies, that could create problems for sites that are obscure but contain legitimate information.


  1. jack (me) Jan. 21 at 5:05 pm

    If Google were serious about keeping “low-quality, often unreliable and sometimes plagiarized information on a certain topic” out of its results, Wikipedia should not come up near the top of any search.

    In any event, I can be satisfied if “Halia + massage” turns up with relevant results.

  2. Ian Michael Gumby Jan. 22 at 3:53 a.m.

    Not an easy task.

    Google is in a catch-22 situation. If they tweak their search engine to give ‘trust’ weighted results first, then of course Google’s own sites would be at the top of the list.

    Also one would have to ask what constitutes a trusted site. Did a company pay to get its trust weight increased?

    It would create a situation where there would be claims of Google taking advantage of its search engine to promote its own sites over the competition.

    If Google were to be forced to split their search engine business from other parts of the company… well, can you say Microsoft?

  3. Bluecrane Jan. 22 at 7:21 a.m.

    Tech is great, and it made Google, but maybe a simple button assigned to the browser that allows users to report spam would cut this stuff down. Get enough of those buttons pressed and it’s time for a real human to look at the site. If it’s spam, trash the URL.
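For illustration, Bluecrane’s report-button idea amounts to counting reports per URL and escalating to a human reviewer past some cutoff. A minimal sketch, where the threshold and all names are hypothetical rather than anything Google has described:

```python
# Toy sketch of a crowd-sourced spam-report button.
# REVIEW_THRESHOLD is an assumed, hypothetical cutoff.
from collections import Counter

REVIEW_THRESHOLD = 100

spam_reports = Counter()

def report_spam(url: str) -> bool:
    """Record one button press; return True once the URL
    has enough reports to warrant human review."""
    spam_reports[url] += 1
    return spam_reports[url] >= REVIEW_THRESHOLD
```

The real difficulty, of course, is that such a button could itself be gamed, so the threshold and any reviewer workflow would need far more care than this sketch suggests.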

  4. GetaLife Jan. 22 at 8:24 a.m.

    Good idea Bluecrane.

  5. tcloud Jan. 22 at 9:09 a.m.

    How about this: enable users to strike or delete sites from showing up. If ABC site is a crummy content farm, click and prevent it from ever showing up again. Thus users could weed out sites they don’t ever want again. I’ve wanted this feature in a browser for years.
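tcloud’s strike-a-site feature could be sketched as a per-user blocklist of domains that filters every result list. All names here are hypothetical; this is only an illustration of the idea, not any actual search-engine or browser API:

```python
# Toy sketch of a per-user site blocklist.
from urllib.parse import urlparse

blocked_sites = set()

def block_site(url: str) -> None:
    """'Strike' a site: remember its domain so it never shows up again."""
    blocked_sites.add(urlparse(url).netloc)

def filter_results(results: list[str]) -> list[str]:
    """Drop any result URL whose domain the user has struck."""
    return [r for r in results if urlparse(r).netloc not in blocked_sites]
```

One blocked domain then suppresses every page from that site in future results, which is exactly the weeding-out the comment asks for.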

  6. jack (me) Jan. 22 at 10:33 a.m.

    Bluecrane, it took the Tribune 6 months to get new software to have any effect. Then the spam just migrated to the Sun-Times, which can’t figure out that a spam filter is needed.

    The “report abuse” button doesn’t work for its intended purpose and just drives click count for the site you are viewing.

  7. BABILA BETRAND Jan. 30 at 7:29 a.m.