We welcome contributions on the following topics and related sub-topics, among others:
Roberto Camerani1, Nicola Grassano2, Patricia Laurens3, Daniele Rotolo1, Antoine Schoen3, Robert Tijssen4,5, Alfredo Yegros4
1SPRU — Science Policy Research Unit, University of Sussex, UK, 2European Commission, Joint Research Centre, Seville, Spain. 3LISIS Université Paris Est, France, 4Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands, 5DST-NRF Center of Excellence in Scientometrics and Science, Technology and Innovation Policy, Stellenbosch University, Stellenbosch, South Africa
The published output of corporate science, i.e. the contribution and involvement of firms in research publications in the open scientific literature, has attracted considerable research and policy attention in recent years. Scholars working in different fields, such as economics, management, bibliometrics, and innovation studies, have produced a large body of academic literature that examines, both theoretically and empirically, several aspects of corporate publishing. By providing evidence that business enterprises tend to disclose a substantial amount of their knowledge through scientific publications, this literature has somewhat overcome the dichotomy between corporate secrecy/IP protection and firms' knowledge disclosure behaviour.
Despite these academic studies, our understanding of the phenomenon of corporate publishing remains incomplete. We know little about which incentives lead firms to publish, and how these vary across industrial sectors. Existing research has mainly focussed on examining corporate publishing in a limited number of science-intensive sectors (e.g. pharmaceuticals and biotechnology). Also, despite the increasing availability of online publication repositories, an authoritative methodology for matching the names of firms with their publication data has not yet emerged. Moreover, information on funding acknowledgements currently collected by bibliographic databases has opened up a novel perspective for tracking and analysing the involvement of companies in research developments as reflected in scientific publications.
This thematic track aims to stimulate discussion on how to further develop a research agenda on this topic. The track will accept submissions on a variety of topics, including:
Jochen Gläser1, Carolina Cañibano2, Thomas Franssen3, Grit Laudel4, Jesper W. Schneider5
1Center for Technology and Society, TU Berlin, 2Institute for Innovation and Knowledge Management, Spanish National Research Council & Polytechnic University of Valencia, 3Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands, 4Department of Sociology, TU Berlin, 5Danish Centre for Studies in Research & Research Policy, Aarhus University
Science and technology policies are (trans)national-level or regional-level policy measures whose purpose is to bring about change in these ‘units of governance’. Studying this change for theoretical or evaluation purposes poses interesting methodological challenges of measuring change and causally attributing it to particular policies. A first set of challenges concerns the way in which we can measure effects of policy measures.
Challenges concerning the causal attribution of changes in research to policy measures emerge from the complex dynamics of the implementation of policies and by the multi-causal nature of social processes.
To advance our ability to measure changes in research and to attribute these changes to the governance of science, we need an open discussion of the challenges and of ways to overcome them. We therefore invite colleagues to submit papers describing how they address these challenges, papers proposing methodological solutions, and papers that report failures to causally ascribe changes in research to changes in governance. We can learn from all of these.
Sybille Hinze1, Jason Rollins2, Andrea Scharnhorst3, Jesper Schneider4, Theresa Velden5, Ludo Waltman6
1DZHW, Berlin, Germany, 2Clarivate Analytics, USA, 3Royal Netherlands Academy of Arts and Sciences, The Netherlands, 4Danish Centre for Studies in Research & Research Policy, Aarhus University, 5Technical University Berlin, Germany, 6Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands
A crisis in the reproducibility of published results has been hotly debated in fields like biomedicine and psychology. In the context of scientometric research and applied scientometric analysis, the ability to replicate the results produced by another team would equally seem to be a key requirement to instill confidence in the reliability of the results reported. Causes, prevalence and consequences of irreproducibility likely vary between scientific fields, and the discussion in scientometrics has only just begun (see the workshop report from ISSI 2017).
We call for contributions in the form of short papers (max. 3,000 words) or abstracts of provocations (max. 1,000 words) that highlight and analyze issues related to reproducibility in scientometric research and applied scientometric analysis. One topic of recent interest is the potential impact of the Initiative for Open Citations (I4OC), and we welcome contributions that compare the strengths and weaknesses of open and proprietary sources of citation data.
The first session of the special track is dedicated to the presentation and discussion of the submitted and accepted contributions. The second session of the special track seeks to collectively develop ideas for key actions to address reproducibility, including the development of a list of top ten key actions. The interaction format (break-out groups, open fish bowl conversation or similar) will be adapted to the level and range of interests suggested by the response to this call for papers.
Noortje Marres1, Ismael Rafols2, Sarah de Rijcke3
1Centre for Interdisciplinary Methodologies (CIM), Warwick University, 2Ingenio (CSIC-UPV), Universitat Politècnica de València, 3Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands
The scientific system is becoming increasingly multifaceted. Many research communities currently navigate complex borderlands between academic settings and their various stakeholders. Existing evaluation models and methods are problematic under these conditions: they do well only in decision-making contexts that favour bureaucratic accountability over substantive assessment, and general indicators over contextualized ones.
However, research evaluation is increasingly undertaken under conditions of high uncertainty and a lack of value consensus, e.g. when mapping interdisciplinary projects or complex mission-oriented work under contested notions of excellence or relevance. Indeed, in issues such as the assessment of societal impact, the participation of diverse stakeholders is crucial in order to appraise the multifarious sources of information, perspectives and criteria that may matter.
This special track will provide a forum to discuss evaluative inquiries, i.e. models and methods that explore engagement, flexibility, and contextualization as means to foster learning about research missions, contents, and indicators. We welcome conceptual and empirical contributions that:
Ingeborg Meijer1, Susanne Bührer, Erich Griessler, Ralf Lindner, Frederic Maier, Niels Mejlgaard, Viola Peter, Jack Stilgoe, Richard Woolley, Angela Wroblewski.
1Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands
The idea of Responsible Research and Innovation (RRI), as currently promoted by the European Commission, aims to bring science and society closer together, stimulating productive mutual exchanges for the benefit of both sides. Presented as open and transformative by the European Commission, the concept is contested and regarded as rather vague by some scholars. At the institutional level, the uptake of the RRI concept in local policies is largely unclear. In order to shed more light on the evolution and impacts of RRI, DG Research and Innovation commissioned a study led by the Technopolis Group (MoRRI – Monitoring the evolution and benefits of RRI) to investigate in detail the current status of the concepts underlying RRI, including public engagement, science education/science literacy, gender equality, open access, and ethics. The study has carried out a wide range of activities that warrant wider dissemination and debate, on their content as well as on how to move forward. It contributes conceptual work on RRI, provides an extensive exploration of existing metrics capturing RRI, and develops new indicators requiring primary data collection. This primary data collection includes two large-scale surveys among European researchers about their views on the relevance, benefits, barriers and hindrances of RRI in their daily research activities.
The MoRRI project works towards a further conceptual definition of what is (and what is not) RRI, in order to lay the foundations for a broad-based agreement on policy intervention; the keys and areas to be monitored; a selection of, and critical reflection on, indicators across all dimensions; and reflection on data collection. In this special track the MoRRI consortium aims to go one step further and also discuss the potentially different types of benefits of RRI (economic, scientific, democratic, social) and the framework conditions which promote or hinder their occurrence, experimentation at the organizational and researcher level, and further theoretical conceptualization of RRI and its dimensions.
Matteo Romanello1, Giovanni Colavizza2, Thomas Franssen3
1École Polytechnique Fédérale de Lausanne, Switzerland, 2Alan Turing Institute London, UK, 3Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands
Bibliometric methods offer a unique perspective that can be employed to study the social and intellectual structure of scientific disciplines, their historical development and the relations between disciplines or, more generally, knowledge areas. There is a long tradition of the use of bibliometric methods in the sociology of science (Crane, 1972; Cole, 1983; see also Gläser & Laudel, 2001) and a slowly emerging interest in bibliometrics in computational or algorithmic historiography (Garfield, Pudovkin & Istomin, 2003). Moreover, the field of science mapping (e.g. Leydesdorff & Rafols, 2009; Börner, 2010) has strong methodological techniques to offer to sociological and historical questions.
The emergence of new data sources, often as part of digital humanities initiatives, is introducing new generations of scholars, especially from the humanities and social sciences, to bibliometric methods. We aim to bring together scholars who use bibliometric data sources and methods in qualitative, quantitative and mixed-method research, to rekindle the study of the sciences, social sciences, arts and humanities from a sociological and/or historical perspective. Topics of interest include but are not limited to:
We encourage submissions from scholars in areas such as (non-exhaustive list): bibliometrics and scientometrics, digital humanities, computer science, sociology and computational social sciences, anthropology and ethnography, history (of all sciences, including HSS). Both empirical and theoretical or conceptual work is encouraged. A selection of substantially expanded submissions might be proposed for a special issue of Scientometrics.
Jacques Mairesse1, Fabiana Visentin2, Michele Pezzoni3
1CREST-ENSAE, France; UNU-MERIT, Maastricht University, Netherlands, 2École Polytechnique Fédérale de Lausanne, Switzerland, 3Université Côte d’Azur, CNRS, GREDEG, France
Novel ideas are expected to contribute substantially to scientific and technological progress and to fundamentally change existing paradigms. Scientific articles and patents are two of the principal means through which novel ideas diffuse. A scientific article requires a certain degree of novelty with respect to the state of the literature to be published in a journal, just as an invention requires a certain degree of novelty to be patented. However, some articles and inventions are more novel than others and introduce breakthrough ideas.
An emerging strand of literature attempts to recognize and trace “novelty” in science and technology by using large datasets of publications and patents. Despite the growing interest in identifying articles and patents embodying novelty and in assessing its impact over time, existing studies suffer from various limitations. In particular, several operational definitions of novelty have been proposed in the literature, with differing results. Moreover, the determinants of the emergence and generation of novelty, such as research funding, scientists’ and inventors’ personal characteristics, and institutional contexts, are still largely unexplored.
This special track aims to gather studies on scientific and technological novelty, its short- and long-run impact, and its determinants.
Cathelijn Waaijer1,4, Ad Scheepers2,4, Nynke Jo Smit3,4
1Administration and Central Services, Leiden University, 2Rotterdam School of Management, Erasmus University Rotterdam, Rotterdam, 3International Institute of Social Studies, Erasmus University Rotterdam, The Hague, 4All organizers are members of the Dutch Association for Institutional Research, an association that, among other goals, aims to support the development of institutional research in higher education
Universities are faced with changing methods of research evaluation. An example is the revised Standard Evaluation Protocol in the Netherlands, which has shifted from evaluating productivity quantitatively to a more qualitative approach that also explicitly and prominently takes societal relevance into account. The Netherlands is not alone in this shift: the United Kingdom, for example, has included ‘societal impact’ as a criterion in research evaluation.
This shift has profound implications for the role of institutional researchers in higher education. Institutional researchers support higher education leaders in effective decision-making and planning by collecting, analyzing and interpreting data from their higher education institutions.
With the proposed session we aim to promote a dialogue between designers of research indicators, scholars who study research evaluation, science policymakers, and institutional researchers. How can university policymakers and institutional researchers give the best advice to university leaders? Which types of qualitative or quantitative data and analyses can they provide to support better decisions? The session will combine paper presentations by institutional researchers with reactions from discussants who are academic researchers on the topic.
Thed van Leeuwen1, Clifford Tatum1, Paul Wouters1
1Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands
As the principles of Open Science increasingly shape research policy, new questions emerge about how to address Open Science in research evaluation. An expert committee convened by the European Commission recently tackled the apparent misalignment between the expectations of Open Science and the ways in which research is evaluated. The committee's recommendations were published in the report Evaluation of Research Careers Fully Acknowledging Open Science Practices (European Commission, 2017). This report provides a thorough account of the misalignment, noting in particular researchers' publishing practices that privilege journal prestige over openness, and the additional effort entailed in opening up other resources embedded in research practices. The authors' solution prescribes expanded criteria for research evaluation, for example the inclusion of the numerous administrative tasks necessary to enable increased openness. However, it remains unclear how to facilitate the evaluation of open science across heterogeneous research settings without simply increasing the bureaucratic overhead for both evaluators and evaluated.
For this theme, we invite papers that demonstrate novel approaches to evaluation of open science and/or conceptual frameworks addressing the misalignment between principles of open science and research evaluation. We welcome qualitative and quantitative contributions from any/all aspects of the science system. Topics of particular interest include:
Hugo Hollanders1, Lili Wang1
1UNU-MERIT, Maastricht University
The public sector plays a key role in fostering innovation in the private sector. This special track seeks papers that study the effect of formal and informal interactions between public sector organisations and private companies on the innovative behaviour and performance of those companies. Both qualitative and quantitative papers are welcome, with the latter drawing on a wide range of possible data sources, including e.g. bibliometric data, patent data and innovation survey data.
The role of the public sector can be multifaceted, from developing the initial high-risk stages of breakthrough technologies (the ‘Entrepreneurial’ state, Mazzucato (2013)), to providing funding to companies for their research and innovation activities, to scientific collaborations, and (in)formal technology and knowledge transfers.
The contribution of knowledge flows from academic research to industry is substantial, in particular in industries such as drugs, instruments, and information processing (Mansfield, 1991; Malo and Geuna, 2000). The interconnection between science and technological systems depends on regional settings (Acosta and Coronado, 2003), and to fully understand the mechanisms of science-technology linkages it is important to study both advanced and less advanced countries. The scientific contribution to technology development also has a sectoral dimension: the intensity of science-technology interrelations varies across sectors, and knowledge flows exhibit sector-specific characteristics (Meyer, 2000). Many studies have shown that public financial support contributes to higher innovation outputs, but there is still uncertainty about its precise impact on innovation inputs, outputs, processes, practices and capabilities (Edler et al., 2016). We invite paper submissions that explore these crucial questions.
Katrin Weller1, Astrid Orth2, Isabella Peters3
1GESIS Leibniz Institute for the Social Sciences, 2SUB Göttingen, 3ZBW Leibniz Information Centre for Economics
Social media platforms, both for general audiences (like Facebook, Twitter, Wikipedia) and for academic audiences (like Mendeley, Academia.edu), have become sources for measuring scholarly communication, i.e. altmetrics. They are also used by aggregators (e.g. Altmetric.com) to create new indicators for scholarly work and its impact. While altmetrics are rapidly becoming widespread, important issues still prevent the adoption of social media data as a serious source for bibliometric analysis, information retrieval and library management. We do not know much about:
This specialized session focuses on the ongoing challenges of using data from social media platforms and from third-party aggregators. These challenges include but are not limited to:
We invite submissions of original research studying the quality of social media data and data aggregators for measuring scholarly communication, as well as short (hands-on) introductions and best practices for using data from one or more specific social media platforms or aggregators. The research presentations will be followed by a panel or fishbowl discussion of lessons learned.
Eric Welch1, Julia Melkers2, Nicolas Robinson-Garcia2 and Eric van Holm1
1Centre for Science, Technology & Environmental Policy Studies (CSTEPS), Arizona State University, United States, 2School of Public Policy, Georgia Institute of Technology, United States
The increasing internationalization of the academic workforce continues in an era of global competition for talent (Altbach, 2005; Foote et al., 2008), ongoing national discussions on immigration (Hopkins, 2010; Wallace, 2014; Freeman, 2005), and the development of policies promoting mobility of scholars (Ackers, 2008).
Despite the high representation of foreign faculty, research and data gaps have left a severely limited understanding of their composition, preferences and behaviors. Outdated or significantly limited national-level data has hampered the advance of research, raising the need to move beyond a dichotomous definition of “foreignness” in which a foreign scholar is identified solely by place of birth, ignoring the complexities stemming from experience and context.
For this panel, we invite papers presenting new approaches to conceptualizing and investigating the international scientific workforce. Our intent is to better theorize the international character, experience and diversity of the academic workforce, and to better integrate new ideas into research on satisfaction, mobility, migration, mentoring and productivity.
The panel aims to advance and energize the field of research on foreign-born scientists by offering new insights and future directions for research on the international scientific workforce.
The full Call for papers is also available in PDF format.