In early 2016 a new database was launched on the Web of Science platform: the Russian Science Citation Index (RSCI). The database is free to all Web of Science subscribers except those from the post-Soviet states. It includes papers from 652 selected Russian journals and is based on data from the national citation index, the Russian Index of Science Citation (RISC). RISC was launched in 2005 but remains scarcely known to the English-language audience. This paper describes the history, current structure, and user-facing features of RISC, focusing on the novel features that are crucial to bibliometrics and are unavailable in the international citation indexes.
eLIBRARY.ru, a dominant regional bibliometric database, currently indexes 5279 scholarly journals, 4755 of them Russian. It is a vast universe of supposedly academic literature, largely unknown to those who do not read Russian. In this paper I provide a brief overview of the leading players in this field: 10 megajournals which, taken together, annually publish far more articles than Russia's entire yearly output in the Web of Science (WoS) Core Collection. I use a set of metrics to capture this remarkable phenomenon. All data are sourced from eLIBRARY.ru and journal websites; none of these journals is indexed in WoS or Scopus. At the same time, one should realize that these ten are just the tip of the iceberg.
This work addresses the detection of key international and Russian economics journals in cross-citation networks. A list of international journals and information on their cross-citations were taken from the Web of Science (WoS) database, while information on Russian journals was taken from the Russian Science Citation Index (RSCI). We calculated classical centrality measures, which are used to detect key elements in networks, and proposed new indices based on short-range and long-range interactions. A distinctive feature of the proposed methods is that they consider the individual attributes of each journal and take into account only the most significant links between them. An analysis of 100 major international and 29 Russian economics journals was conducted. As a result, we detected journals with a large number of citations to important journals, as well as journals whose total citation counts are dominated by self-citation. The results can serve as guidance for researchers planning where to publish a new paper and as a measure of the importance of scientific journals.
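The abstract above does not give the authors' formulas, but the two building blocks it names are easy to illustrate. The sketch below, with an invented three-journal cross-citation matrix (journal names and counts are hypothetical, and the paper's own short-range/long-range indices are not reproduced), shows a classical centrality measure (weighted in-degree over cross-citations) and the self-citation share used to flag journals dominated by self-citation:

```python
# Toy cross-citation matrix: cites[i][j] is the number of citations
# from journal i to journal j. All names and counts are invented.
cites = {
    "J_A": {"J_A": 4,  "J_B": 10, "J_C": 2},
    "J_B": {"J_A": 8,  "J_B": 5,  "J_C": 6},
    "J_C": {"J_A": 1,  "J_B": 4,  "J_C": 30},
}

def weighted_in_degree(cites):
    """Citations received from OTHER journals (self-citations excluded),
    one simple classical centrality measure for key-journal detection."""
    return {
        j: sum(row.get(j, 0) for src, row in cites.items() if src != j)
        for j in cites
    }

def self_citation_share(cites):
    """Fraction of each journal's total received citations
    that come from the journal itself."""
    share = {}
    for j in cites:
        received = sum(row.get(j, 0) for row in cites.values())
        share[j] = cites[j].get(j, 0) / received if received else 0.0
    return share

in_deg = weighted_in_degree(cites)   # {"J_A": 9, "J_B": 14, "J_C": 8}
shares = self_citation_share(cites)  # J_C: 30/38, mostly cites itself
```

On this toy matrix, J_B is the most cited by others, while J_C's received citations are dominated by its own self-citations, which is exactly the pattern the abstract says the analysis flags.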
Metrics usage in higher education management has clearly become an issue of great importance. A recent high-profile policy report on this topic, commissioned by the Higher Education Funding Council for England, is aptly named The Metric Tide. It reiterates a number of basic principles, such as “don’t evaluate individuals using journal impact factors” and “peer review can’t be substituted by metrics,” and stresses that “those involved in research assessment and management should behave responsibly, considering and preempting negative consequences [of metrics usage] wherever possible” (Wilsdon 2015). One obvious consequence is the gaming of indicators, which comes in various forms and levels of severity. This paper deals with one particular technique centered on so-called “predatory” journals indexed in the Scopus database. It is part of broader research on the impact of metrics-based policy measures on various university systems. See the introductory article on “predatory” publishing by the foremost authority on this topic, Prof. Jeffrey Beall, p. 07.
What is happening to Russian mathematics in terms of metrics? Where do Russian mathematicians work, where do they publish, and how well are they cited?
We continue a series of notes on the scientometrics of the former Eastern Bloc states, begun in HERB №2 (see “25 years after the fall: indicators of postcommunist science” by Ivan Sterligov and Alfia Enikeeva). This essay compares publication output in broad subject fields for all ex-COMECON states, examining the complex dynamics of transition across a wide range of economies and cultures. The data presented highlight major differences between several subgroups of countries.