If you don’t want your content to show up in search results and on Google News, it doesn’t have to. In fact, newspaper developers can block Google’s robots from crawling their Web sites and gobbling up their content just by inserting a couple of lines of code into their sites.
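A minimal sketch of what those lines look like: a robots.txt file placed at a site’s root. (The Googlebot-News user agent and the archive path below are illustrative; Google documents its crawler names in its webmaster help pages.)

```
# robots.txt — served from the site's root, e.g. example.com/robots.txt

# Block Google's main web crawler from the entire site
User-agent: Googlebot
Disallow: /

# Or block only Google News' crawler from a specific section
User-agent: Googlebot-News
Disallow: /archive/
```

A site can mix and match: one rule for all of Google’s crawlers, or separate rules for web search and Google News.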
“If a webmaster wants to stop us from crawling a specific page, he or she can do so by adding ‘<meta name="googlebot" content="noindex">’ to the page,” Mr. Cohen wrote on Google’s official public policy blog. “If at any point a web publisher feels as though we’re not delivering value to them and wants us to stop indexing their content, they’re able to do so quickly and effectively.”
Publishers can even specify that material Google indexes be removed after a certain date (say, when it gets archived and goes behind a paywall). They “can add a simple ‘unavailable after’ specification on a page, telling search engines to remove that page from their indexes after a certain date,” Mr. Cohen explained.
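The “unavailable after” rule Mr. Cohen describes is another googlebot meta tag. A sketch, using the date format from Google’s published examples (the date itself is illustrative):

```html
<!-- Ask Google to drop this page from its index after the given date,
     e.g. when the story is archived behind a paywall -->
<meta name="googlebot" content="unavailable_after: 25-Jun-2010 15:00:00 EST">
```

Until that date, the page is crawled and indexed normally; afterward, Google removes it from search results on its own, with no further action from the publisher.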
Mr. Cohen opened his post by quoting a declaration from a group of European newspaper and magazine publishers stating that they “no longer wish to be forced to give away property without having granted permission.” In his entry, Mr. Cohen wrote that newspapers have had the power to control their content “when it comes not only to what content they make available on the web, but also who can access it and at what price.”
“During the past few weeks, we’ve witnessed a growing movement to rein in, extract money from, or stop altogether the aggregators who are accused of eroding news sites’ revenues by quoting from or linking to traditional (and expensive) content,” wrote Bill Grueskin, dean of academic affairs at Columbia University Journalism School and former deputy managing editor for The Wall Street Journal, at PaidContent today. “Yes, it’s hard on the ego to watch another site get credit for your hard work, but is it really hurting the bottom line?” It’s the advertising model that’s broken, according to Mr. Grueskin. Aggregators are possibly “saviors” for drawing eyes to news sites, he wrote.
Google makes the same argument. “Today, more than 25,000 news organizations across the globe make their content available in Google News and other web search engines,” Mr. Cohen wrote. “They do so because they want their work to be found and read — Google delivers more than a billion consumer visits to newspaper web sites each month.”
He added that many publishers’ suggestions to change how Google handles their content “would fundamentally change — for the worse — the way the web works.” “The Internet has opened up enormous possibilities for education, learning, and commerce, so it’s important that search engines make it easy for those who want to share their content to do so — while also providing robust controls for those who want to limit access,” he wrote.
Publishers need to create more engaging content instead of trying to blame Google for their problems, according to Mr. Grueskin. “[A]ggregators are more a distraction from the real crisis than the cause of it,” he wrote.
Follow Gillian Reagan via RSS.