
Cultural change would be necessary where papers are evaluated for their own scientific merit

Sunday, 19 May, 14:05, aquareus.livejournal.com
via Марина Аствацатурян (Facebook)

Those who understand the issue have joined forces in the fight against the impact factor as a tool of unscrupulous ranking.


NATURE NEWS BLOG

Scientists join journal editors to fight impact-factor abuse

16 May 2013 | 19:00 BST | Posted by Richard Van Noorden | Category: Publishing

If enough eminent people stand together to condemn a controversial practice, will that make it stop?

That’s what more than 150 scientists and 75 science organizations are hoping for today, with a joint statement called the San Francisco Declaration on Research Assessment (DORA). It deplores the way some metrics — especially the notorious Journal Impact Factor (JIF) — are misused as quick and dirty assessments of scientists’ performance and the quality of their research papers.

“There is a pressing need to improve the ways in which the output of scientific research is evaluated,” DORA says.

Scientists routinely rant that funding agencies and institutions judge them by the impact factor of the journal they publish in — rather than by the work they actually do. The metric was introduced in 1963 to help libraries judge which journals to buy (it measures the number of citations the average paper in a journal has received over the past two years). But it bears little relation to the citations any one article is likely to receive, because only a few articles in a journal receive most of the citations. Focus on the JIF has changed scientists’ incentives, leading them to be rewarded for getting into high-impact publications rather than for doing good science.
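A small numerical sketch makes the point about skew concrete. The Python snippet below uses entirely made-up citation counts for a hypothetical journal: a JIF-style average is dominated by a few heavily cited papers, so it says little about what a typical article received.

```python
from statistics import median

# Hypothetical data: citations received this year by the 20 articles a small
# journal published over the previous two years (made-up numbers with the
# skew the article describes: a few heavily cited papers, many barely cited).
citations = [120, 85, 40, 12, 9, 7, 5, 4, 3, 3, 2, 2, 1, 1, 1, 0, 0, 0, 0, 0]

# JIF-style figure: total citations divided by the number of citable items.
jif_style_average = sum(citations) / len(citations)

# What a "typical" article in the same journal actually received.
typical_article = median(citations)

print(f"JIF-style average:  {jif_style_average:.1f}")  # 14.8
print(f"Median per article: {typical_article}")        # 2.5
```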

“We, the scientific community, are to blame — we created this mess, this perception that if you don’t publish in Cell, Nature or Science, you won’t get a job,” says Stefano Bertuzzi, executive director of the American Society for Cell Biology (ASCB), who coordinated DORA after talks at the ASCB’s annual meeting last year. “The time is right for the scientific community to take control of this issue,” he says. Science and eLife also ran editorials on the subject today.

It has all been said before, of course. Research assessment “rests too heavily on the inflated status of the impact factor”, a Nature editorial noted in 2005; or as structural biologist Stephen Curry of Imperial College London put it in a recent blog post: “I am sick of impact factors and so is science”.

Even the company that creates the impact factor, Thomson Reuters, has issued advice that it does not measure the quality of an individual article in a journal, but rather correlates to the journal’s reputation in its field. (In response to DORA, Thomson Reuters notes that it’s the abuse of the JIF that is the problem, not the metric itself.)

But Bertuzzi says: “The goal is to show that the community is tired of this. Hopefully this will be a cultural change.” It’s notable that those signing DORA are almost all from US or European institutions, even though the ASCB has a website where anyone can sign the declaration.

(Nature Publishing Group, which publishes this blog, has not signed DORA: Nature’s editor-in-chief, Philip Campbell, said that the group’s journals had published many editorials critical of excesses in the use of JIFs, “but the draft statement contained many specific elements, some of which were too sweeping for me or my colleagues to sign up to”.)

DORA makes 18 recommendations to funders, institutions, researchers, publishers and suppliers of metrics. Broadly, these involve phasing out journal-level metrics in favour of article-level ones, being transparent and straightforward about metric assessments, and judging by scientific content rather than publication metrics where possible.

The report does include a few contentious ideas: one, for example, suggests that organizations that supply metrics should “provide the data under a licence that allows unrestricted reuse, and provide computational access to the data”.

Thomson Reuters sells its Journal Citation Reports (JCR) as a paid subscription and doesn’t allow unrestricted reuse of data, although the company notes in response that many individual researchers use the data with the firm’s permission to analyse JCR metrics. “It would be optimal to have a system which the scientific community can use,” says Bertuzzi cautiously when asked about this.

And Bertuzzi acknowledges that journals have different levels of prestige, meaning an element of stereotypical judgement based on where you publish would arise even if the JIF were not misused. But scientists should be able to consider which journal suits the community they want to reach, rather than thinking “let’s start from the top [impact-factor] journal and work our way down,” he says. “The best of all possible outcomes would be a cultural change where papers are evaluated for their own scientific merit.”

http://blogs.nature.com/news/2013/05/scientists-join-journal-editors-to-fight-impact-factor-abuse.html
