Measuring the quality of research in different countries can be a complex, controversial exercise, but the widely used bibliometrics system shows that Irish scientists are increasingly influential, writes CONOR O'CARROLL
NO ONE LIKES their career to be summed up in a single number and yet this is often done in science, simply because it can be done. Science metrics dates back to the 1950s when the linguist Eugene Garfield founded the Institute for Scientific Information (ISI). The company indexed scientific publications and produced the Science Citation Index. In 1965 Garfield used the database to show that Nobel Prize winners published five times the average number of papers and were cited 30 to 50 times more than the average.
This established citations as the quantitative metric to measure the influence of a researcher. Since then, the ability to measure scientific output has been greatly enhanced by the availability of online databases such as the Web of Science, Scopus and Google Scholar.
Researchers’ discoveries are communicated mainly through publication in peer-reviewed journals. An article is submitted to a journal and vetted anonymously by experts in the field. Despite criticism, the anonymous nature of this process persists because it allows the experts to give frank opinions on the research. The experts carry out a thorough check of the methods used and the conclusions reached by the authors.
The simplest metric based on publications is the total number of articles published. This gives an overview of productivity. For example, since 1981 the number of papers in the EU has increased by 100 per cent. Over the same period the increase for Ireland has been above 300 per cent. Internationally, Ireland has more than doubled its percentage of world research papers in the same period (currently producing 0.44 per cent). Quantity does not always imply quality, and there is a vast range of journals worldwide. Some have low standards for publication, while others, including Nature, are every scientist's dream.
The impact of a publication is measured by the number of times that it is referenced or cited in publications by others. It is important to understand that, aggregated over all the articles a journal publishes, this measures the impact of the journal, not of individual researchers. Of course there is an effective quality measure of individuals too, as only high-quality papers, as assessed by peer review, find their way into high-impact journals such as Cell, Nature and Physical Review. At a national level the research impact of Ireland has exceeded the world average over the past 10 years. Irish universities are now in the top 1 per cent of research organisations in the world in 17 fields of science, from immunology to engineering.
In 2005 the physicist Jorge Hirsch introduced a number for assessing individual scientists, the h-index. According to his definition, an h-index of, say, 50 means that 50 of a researcher’s publications have each been cited at least 50 times. This has the advantage of capturing both productivity and citation impact. There are limitations to the figure: the older the researcher, the more likely he or she is to have a high index, and it can never decrease.
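For readers who want to see how the calculation works, here is a minimal sketch in Python; the citation counts are invented purely for illustration.

    def h_index(citations):
        # The h-index is the largest h such that the researcher has
        # h papers cited at least h times each.
        ranked = sorted(citations, reverse=True)  # most-cited papers first
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank   # this paper still clears the threshold
            else:
                break      # all remaining papers have fewer citations
        return h

    # Hypothetical citation counts for one researcher's papers.
    print(h_index([112, 84, 60, 51, 50, 47, 12, 3, 0]))  # prints 7

Sorting the papers by citations makes the definition concrete: the count stops at the first paper whose citations fall below its rank, and adding poorly cited papers cannot lower the result.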
There are limitations to the whole science of bibliometrics. It only measures what can be measured in a straightforward manner through publications. It undoubtedly favours those who publish in English. Nonetheless it can provide much useful information if used with caution. Some governments – for example, those in the UK and Australia – use it as a major factor in allocating research funding to universities. It has gained more prominence in recent years as it is one of the key factors in determining the international ranking of universities.
This type of measurement can tell us much about the general state of research in different countries. In particular it can reveal much about the value placed by governments on investing in research and development (R&D). Looking to countries across north Africa and the Near East, the entire region has fewer than 3 per cent of the world’s publications. Egypt’s share of the world’s research papers is little more than it was in 1995. This is not surprising, as investment in R&D (as a percentage of GDP) is low, ranging from 0.05 per cent in the Gulf States to 1.02 per cent in Tunisia. The exception in the region is Israel, with investment at 4.3 per cent. The EU average is 1.78 per cent, with Ireland at 1.43 per cent.
Measuring the quality of research is a highly complex and controversial exercise. A quote from Eugene Garfield in 1977 shows how the published article still dominates, and with it citation and impact analysis: “As for science itself, I believe the basic instrument of communication is the scientific paper. I don’t think anything will replace it for at least a decade. What’s more, scientists will continue to publish their papers in printed journals.”
Conor O’Carroll is research director at the Irish Universities Association: Conor.ocarroll@iua.ie