Centre for Science and Technology Studies, 2333 AL Leiden, Zuid-Holland, +31 71 527 3909

Welcome to the STI 2018 Conference Website

The 2018 STI conference will be held 12-14 September 2018 in Leiden (the Netherlands) in collaboration with the European Network of Indicator Developers (ENID) and will be hosted by the Centre for Science and Technology Studies (CWTS) at Leiden University. This edition will have a special focus on “indicators in transition” as a driving force for more comprehensive, broader and socially oriented forms of Science, Technology and Innovation (STI) indicators and evaluations. The conference aims to promote reflection on the need for more comprehensive and contextualized STI indicators, and to offer an international platform for proposing and discussing such approaches in the study of Science, Technology and Innovation.

General topics

We welcome contributions on, but not limited to, the following topics and related sub-topics:

  • Altmetrics & social media
    • Theoretical foundations
    • Validation studies
    • Data sources
  • Careers in science
    • Gender and diversity
    • Careers outside academia
    • Early career researcher experience
  • Indicators of Science and Technology
    • Responsible use of indicators
    • Societal impact of research
    • Systemic and behavioural effects of indicators
  • Innovation
    • Gendered innovations
    • Public-private interactions (e.g. university-industry)
    • Industrial R&D dynamics
  • Open Science
    • Open access
    • Open data
    • Open science and the academic reward system
  • Research evaluation
    • Responsible research evaluation
    • Methods in research evaluation
    • Case studies
  • Research integrity
    • Policies promoting research integrity and their effects
    • Misconduct in scientific publishing
    • Studies of other types of misconduct in research

Platinum sponsors



2018 Eugene Garfield Award for Innovation in Citation Analysis


The winner will be announced at the STI 2018 conference. Apply now



Special tracks

1. A closer look into corporate science and publishing

Roberto Camerani1, Nicola Grassano2, Patricia Laurens3, Daniele Rotolo1, Antoine Schoen3, Robert Tijssen4,5, Alfredo Yegros4

1SPRU — Science Policy Research Unit, University of Sussex, UK, 2European Commission, Joint Research Centre, Seville, Spain. 3LISIS Université Paris Est, France, 4Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands, 5DST-NRF Center of Excellence in Scientometrics and Science, Technology and Innovation Policy, Stellenbosch University, Stellenbosch, South Africa

The published output of corporate science, i.e. the contribution and involvement of firms in research publications in the open scientific literature, has attracted considerable research and policy attention in recent years. Scholars working in different fields, such as economics, management, bibliometrics, and innovation studies, have produced a large body of academic literature examining several aspects of corporate publishing, both theoretically and empirically. By providing evidence that business enterprises tend to disclose a substantial amount of their knowledge through scientific publications, this literature has somewhat overcome the dichotomy between corporate secrecy/IP protection and firms' knowledge disclosure behaviour.

Despite these academic studies, our understanding of the phenomenon of corporate publishing remains incomplete. Little is known about which incentives lead firms to publish, and how these vary across industrial sectors. Existing research has mainly focussed on a limited number of science-intensive sectors (e.g. pharmaceuticals and biotechnology). Also, despite the increasing availability of online publication repositories, no authoritative methodology for matching the names of firms with their publication data has yet emerged. Meanwhile, the information on funding acknowledgements now collected by bibliographic databases has opened up a novel perspective for tracking and analysing the involvement of companies in research developments as reflected in scientific publications.

This thematic track aims to stimulate discussion on how to further develop a research agenda on this topic. The track will accept submissions on a variety of topics, including:

  • Incentives and obstacles for firms to publish
  • Cross-sectoral comparisons of corporate science and publication patterns
  • Corporate publishing outside science-intensive sectors
  • The role of open science
  • Open innovation through corporate publishing
  • Corporate funding in scientific publications
  • Policy relevance of statistical data on university-business co-publications
  • Novel methods and techniques to collect firm publication data

2. Challenges in establishing macro-level effects of macro-level interventions

Jochen Gläser1, Carolina Cañibano2, Thomas Franssen3, Grit Laudel4, Jesper W. Schneider5

1Center for Technology and Society, TU Berlin, 2Institute for Innovation and Knowledge Management, Spanish National Research Council & Polytechnic University of Valencia, 3Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands, 4Department of Sociology, TU Berlin, 5Danish Centre for Studies in Research & Research Policy, Aarhus University

Science and technology policies are (trans)national-level or regional-level policy measures whose purpose is to bring about change in these ‘units of governance’. Studying this change for theoretical or evaluation purposes poses interesting methodological challenges of measuring change and causally attributing it to particular policies. A first set of challenges concerns the way in which we can measure effects of policy measures.

  • Which changes in research can we measure as aggregates of individual-level change?
  • Which changes are emergent macro-level effects (like changes in the epistemic diversity of research) and must be measured with macro-level indicators?
  • What conclusions can be drawn from self-reported behavioural change in surveys and interviews? How can limitations be overcome by methodological innovations?

Challenges concerning the causal attribution of changes in research to policy measures emerge from the complex dynamics of the implementation of policies and by the multi-causal nature of social processes.

  • How do we pinpoint the time at which effects of policy change manifest themselves?
  • How can we causally attribute any change in the content and conduct of research to a specific change in science policy?
  • What causal mechanisms link macro-level causes to micro-level effects and transform the latter into macro-level effects?

To advance our ability to measure changes in research and to attribute these changes to the governance of science, we need an open discussion of the challenges and of ways to overcome them. We therefore invite colleagues to submit papers describing how they address these challenges, papers proposing methodological solutions, and papers reporting failures to causally ascribe changes in research to changes in governance. We can learn from all of these.

3. Reproducibility in scientometrics

Sybille Hinze1, Jonathan Adams2, Andrea Scharnhorst3, Jesper Schneider4, Theresa Velden5, Ludo Waltman6

1DZHW, Berlin, Germany, 2Clarivate Analytics, USA, 3Royal Netherlands Academy of Arts and Sciences, The Netherlands, 4Danish Centre for Studies in Research & Research Policy, Aarhus University, 5Technical University Berlin, Germany, 6Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands

A crisis in the reproducibility of published results has been hotly debated in fields like biomedicine and psychology. In the context of scientometric research and applied scientometric analysis, the ability to replicate the results produced by another team would equally seem to be a key requirement to instill confidence in the reliability of the results reported. Causes, prevalence and consequences of irreproducibility likely vary between scientific fields, and the discussion in scientometrics has only just begun (see the workshop report from ISSI 2017).


We call for contributions in the form of short papers (max. 3,000 words) or abstracts of provocations (max. 1,000 words) that highlight and analyze issues related to reproducibility in scientometric research and applied scientometric analysis. One topic of recent interest is the potential impact of the Initiative for Open Citations (I4OC), and we welcome contributions that compare the strengths and weaknesses of open sources of citation data and proprietary data sources.

The first session of the special track is dedicated to the presentation and discussion of the submitted and accepted contributions. The second session of the special track seeks to collectively develop ideas for key actions to address reproducibility, including the development of a list of top ten key actions. The interaction format (break-out groups, open fish bowl conversation or similar) will be adapted to the level and range of interests suggested by the response to this call for papers.

4. Research assessments as participatory explorations on content, missions, methods and indicators

Noortje Marres1, Ismael Rafols2, Sarah de Rijcke3

1Centre for Interdisciplinary Methodologies (CIM), Warwick University, 2Ingenio (CSIC-UPV), Universitat Politècnica de València, 3Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands

The scientific system is becoming increasingly multifaceted. Many research communities currently navigate complex borderlands between academic settings and their various stakeholders. Existing evaluation models and methods are problematic under these conditions: they are suited mainly to decision-making contexts that favour bureaucratic accountability over substantive assessment, and general indicators over contextualized indicators.

However, research evaluation is increasingly undertaken under conditions of high uncertainty and a lack of value consensus, e.g. when mapping interdisciplinary projects or complex mission-oriented work under contested notions of excellence or relevance. Indeed, in issues such as the assessment of societal impact, the participation of diverse stakeholders is crucial in order to appraise the multifarious sources of information, perspectives and criteria that may matter.

This special track will provide a forum to discuss evaluative inquiries, i.e. models and methods that explore engagement, flexibility, and contextualization as means to foster learning about research missions, contents and indicators. We welcome conceptual and empirical contributions that:

  • Explain the processes of indicator or method validation by experts
  • Develop evaluation methods that engage with specific contexts and practices in fundamental research
  • Explore participatory evaluation experiences in fields such as health, agriculture or sustainability, where societal needs are articulated or contested by users
  • Describe experiments with processual forms of evaluation aimed at facilitating learning
  • Present examples of indicator contextualisation or co-creation by stakeholders

5. Assessment of Responsible Research and Innovation (RRI) – beyond indicator development

Ingeborg Meijer1, Susanne Bührer, Erich Griessler, Ralf Lindner, Frederic Maier, Niels Mejlgaard, Viola Peter‎‎, Jack Stilgoe, Richard Woolley, Angela Wroblewski.

1Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands

The idea of Responsible Research and Innovation (RRI), as currently promoted by the European Commission, aims to bring science and society closer together, stimulating productive mutual exchanges for the benefit of both sides. While the European Commission presents RRI as open and transformative, some scholars contest the concept and consider it rather vague, and the uptake of RRI in local policies at the institutional level remains largely unclear. To shed more light on the evolution and impacts of RRI, DG Research and Innovation commissioned a study led by the Technopolis Group (MoRRI – Monitoring the evolution and benefits of RRI) to investigate in detail the current status of the underlying concepts of RRI, including public engagement, science education/science literacy, gender equality, open access and ethics. The study has carried out a wide range of activities that call for wider dissemination and debate, both on their content and on how to move forward. It contributes conceptual work on RRI, provides an extensive exploration of existing metrics capturing RRI, and develops new indicators requiring primary data collection, the latter extending to two large-scale surveys among European researchers about their views on the relevance, benefits, barriers and hindrances for RRI within their daily research activities.

The MoRRI project elaborates on further conceptually defining what is (and what is not) RRI, in order to lay the foundations for a broad-based agreement on policy intervention; the keys and areas to be monitored; and a selection and critical reflection of indicators across all dimensions, together with reflection on data collection. In this special track the MoRRI consortium aims to go one step further and also discuss the potentially different types of benefits of RRI (economic, scientific, democratic, social) and the framework conditions that promote or hinder their occurrence, experimentation at the organizational and researcher level, and further theoretical conceptualization of RRI and its dimensions.

6. Studies in the sociology and history of the sciences, social sciences, arts and humanities

Matteo Romanello1, Giovanni Colavizza2, Thomas Franssen3

1École Polytechnique Fédérale de Lausanne, Switzerland, 2Alan Turing Institute London, UK, 3Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands

Bibliometric methods offer a unique perspective that can be employed to study the social and intellectual structure of scientific disciplines, their historical development and the relations between disciplines or, more generally, knowledge areas. There is a long tradition of the use of bibliometric methods in the sociology of science (Crane, 1972; Cole, 1983, see also Gläser & Laudel, 2001) and a slowly emerging interest in bibliometrics in computational or algorithmic historiography (Garfield, Pudovkin & Istomin, 2003). Moreover, the field of science mapping (e.g. Leydesdorff & Rafols, 2009, Börner, 2010) has strong methodological techniques to offer to sociological and historical questions.

The emergence of new data sources, often as part of digital humanities initiatives, is introducing new generations of scholars, especially from the humanities and social sciences, to bibliometric methods. We aim to bring together scholars that use bibliometric data sources and methods in qualitative, quantitative and mixed-method research, to rekindle the study of the sciences, social sciences, arts and humanities from a sociological and/or historical perspective. Topics of interest include but are not limited to:

  • the social and intellectual structure of disciplines or knowledge areas
  • the bibliometrics-driven history (“computational/algorithmic historiography”) of disciplines or knowledge areas in any period of time
  • cultural, social, informational and intellectual practices within disciplines or knowledge areas
  • knowledge exchanges: interdisciplinarity and flows of ideas and persons, or the lack thereof
  • processes of knowledge accumulation, also in comparison and over time

We encourage submissions from scholars in areas such as (non-exhaustive list): bibliometrics and scientometrics, digital humanities, computer science, sociology and computational social sciences, anthropology and ethnography, history (of all sciences, including HSS). Both empirical and theoretical or conceptual work is encouraged. A selection of substantially expanded submissions might be proposed for a special issue of Scientometrics.

7. Scientific and technological novelty: impact and determinants

Jacques Mairesse1, Fabiana Visentin2, Michele Pezzoni3

1CREST-ENSAE, France; UNU-MERIT, Maastricht University, Netherlands, 2École Polytechnique Fédérale de Lausanne, Switzerland, 3Université Côte d’Azur, CNRS, GREDEG, France

Novel ideas are expected to contribute substantially to scientific and technological progress and to fundamentally change existing paradigms. Scientific articles and patents are two of the principal means through which novel ideas diffuse. Just as a scientific article requires a certain degree of novelty with respect to the state of the literature to be published in a journal, an invention requires a certain degree of novelty to be patented. However, some articles and inventions are more novel than others and introduce breakthrough ideas.

An emerging strand of literature attempts to recognize and trace “novelty” in science and technology by using large datasets of publications and patents. Despite the growing interest in identifying articles and patents embodying novelty and assessing its impact over time, existing studies are affected by various limitations. Chief among them, several operational definitions of novelty are proposed in the extant literature, with differing results. Moreover, the determinants of the emergence and generation of novelty, such as research funding, scientists' and inventors' personal characteristics, and institutional contexts, are still largely unexplored.

This special track aims at gathering studies on scientific and technological novelty, its short and long run impact, and its determinants.

8. Determining and steering research quality in practice: the institutional research perspective

Cathelijn Waaijer1,4, Ad Scheepers2,4, Nynke Jo Smit3,4

1Administration and Central Services, Leiden University, 2Rotterdam School of Management, Erasmus University Rotterdam, Rotterdam, 3International Institute of Social Studies, Erasmus University Rotterdam, The Hague, 4All organizers are members of the Dutch Association for Institutional Research, an association that, among other goals, aims to support the development of institutional research in higher education

Universities are faced with changing methods of research evaluation. An example is the revised Standard Evaluation Protocol in the Netherlands, which has shifted from evaluating productivity quantitatively to a more qualitative approach that also explicitly and prominently takes into account societal relevance. The Netherlands is not alone in its shift in focus of research evaluation. For example, the United Kingdom has included ‘societal impact’ as a criterion that should be evaluated in research evaluation.

This shift has profound implications on the role of institutional researchers in higher education. Institutional researchers support higher education leaders in effective decision making and planning. They do this through collecting, analyzing and interpreting data from their higher education institutions.

With the proposed session we aim to promote a dialogue between designers of research indicators, scholars who study research evaluation, science policymakers, and institutional researchers. How can university policymakers and institutional researchers give the best advice to university leaders? Which types of qualitative or quantitative data and analyses can they provide to support better decisions? The session will combine paper presentations by institutional researchers with reactions from discussants who are academic researchers on the topic.


9. Evaluation of Open Scholarship

Thed van Leeuwen1, Clifford Tatum1, Paul Wouters1

1Centre for Science and Technology Studies (CWTS), Leiden University, the Netherlands

As the principles of Open Science increasingly intervene in research policy, new questions emerge about how to address Open Science in research evaluation events. An expert committee empowered by the European Commission recently tackled the apparent misalignment between expectations of Open Science and the ways in which research is evaluated. The committee's recommendations were published in the report titled, Evaluation of Research Careers Fully Acknowledging Open Science Practices (European Commission 2017).  This report provides a thorough account of the misalignment, noting in particular researchers' publishing practices that privilege journal prestige ahead of openness and the additional effort entailed in opening up other resources embedded in research practices. The authors' solution prescribes expanded criteria for research evaluation; for example, the inclusion of numerous administrative tasks necessary to enable increased openness. However, it remains unclear how to facilitate evaluation of open science across heterogeneous research settings, without simply increasing the bureaucratic overhead for both evaluators and evaluated.

For this theme, we invite papers that demonstrate novel approaches to evaluation of open science and/or conceptual frameworks addressing the misalignment between principles of open science and research evaluation. We welcome qualitative and quantitative contributions from any/all aspects of the science system. Topics of particular interest include:

  • Infrastructures: design and/or development of new infrastructure to monitor the implementation of open science.
  • Policies: national or institutional policies on the evaluation of open science.
  • Data: novel approaches to making data open and/or assessment of open data.
  • Indicators: new indicators (qualitative or quantitative) that aim to reduce the misalignment between open science principles and research practice.
  • Dynamics: related to the above, we also welcome contributions that focus on long-term developments within the science system and the effects of Open Science/Open Access on the science system at a global scale.

10. Public-private interactions in business innovation

Hugo Hollanders1, Lili Wang1

1UNU-MERIT, Maastricht University

The public sector plays a key role in fostering innovation in the private sector. This special track looks for papers that study the effects of formal and informal interactions between public sector organisations and private companies on the innovative behaviour and performance of companies. Both qualitative and quantitative papers are welcome, with the latter drawing on a wide range of possible data sources, including e.g. bibliometric data, patent data and innovation survey data.

The role of the public sector can be multifaceted, from developing the initial high-risk stages of breakthrough technologies (the ‘Entrepreneurial’ state, Mazzucato (2013)), to providing funding to companies for their research and innovation activities, to scientific collaborations, and (in)formal technology and knowledge transfers.

The contribution of knowledge flows from academic research to industry is substantial, in particular in industries like drugs, instruments, and information processing (Mansfield, 1991; Malo and Geuna, 2000). The interconnection between science and technological systems depends on regional settings (Acosta and Coronado, 2003), and to fully understand the mechanisms of science-technology linkages it is of great importance to include studies on both advanced and less advanced countries. The scientific contribution to technology development also has a sectoral dimension: the intensity of science-technology interrelation varies across sectors, and knowledge flows show sector-specific characteristics (Meyer, 2000). Many studies have shown that public financial support contributes to higher innovation outputs, but there is still uncertainty about the precise impact on innovation inputs, outputs, processes, practices and capabilities (Edler et al., 2016). We invite paper submissions that explore these crucial questions.

11. Challenges of social media data for bibliometrics

Katrin Weller1, Astrid Orth2, Isabella Peters3

1GESIS Leibniz Institute for the Social Sciences, 2SUB Göttingen, 3ZBW Leibniz Information Centre for Economics

Social media platforms, both for general audiences (like Facebook, Twitter, Wikipedia) and for academic audiences (like Mendeley, Academia.edu), have become sources for measuring scholarly communication, i.e. altmetrics. They are also used by aggregators (e.g. Altmetric.com) to create new indicators for scholarly work and its impact. While altmetrics are rapidly becoming widespread, there are still important issues that prevent adopting social media data as serious sources for bibliometric analysis, information retrieval and library management. We do not know much about:

  • how the data are produced and how reliable and comparable data generators are,
  • the behaviour of social media users and whether we mistake platform affordances for intentional interaction with scholarly products,
  • how to solve technical problems (e.g., full-text search for scholarly papers).

This specialized session focuses on the ongoing challenges for using data from social media platforms and from third party aggregators. These challenges include but are not limited to:

  • platform-based restrictions in accessing social media data and potential biases resulting from limited access (also: biases introduced by data aggregators)
  • black boxes, hidden algorithms, and lack of information about collection processes and context affecting the use of social media data and data aggregators
  • lack of understanding of user activities (e.g. motivations to retweet, like, share) and user groups in academic contexts
  • incomplete data (e.g. due to missing DOIs or identifiers not tracked by aggregators) and noise (e.g. due to bot activity)

We invite submissions of original research studying the quality of social media data and data aggregators for measuring scholarly communication, as well as short (hands-on) introductions and best practices for using data from one or more specific social media platforms or aggregators. The research presentations will be followed by a panel or fishbowl discussion of lessons learned.


12. Rethinking the research agenda on the internationalization of the scientific workforce

Eric Welch1, Julia Melkers2, Nicolas Robinson-Garcia2 and Eric van Holm1

1Centre for Science, Technology & Environmental Policy Studies (CSTEPS), Arizona State University, United States, 2School of Public Policy, Georgia Institute of Technology, United States

The increasing internationalization of the academic workforce continues in an era of global competition for talent (Altbach, 2005; Foote et al., 2008), ongoing national discussions on immigration (Hopkins, 2010; Wallace, 2014; Freeman, 2005), and the development of policies promoting mobility of scholars (Ackers, 2008).

Despite the high representation of foreign faculty, research and data gaps have resulted in a severely limited understanding of their composition, preferences and behaviors. Outdated or significantly limited national-level data have hampered the advance of research, raising the need for the field to move beyond a dichotomous definition of “foreignness” in which a foreign scholar is denoted by place of birth, ignoring the complexities stemming from their experience and context.

For this panel, we invite papers presenting new approaches to conceptualizing and investigating the international scientific workforce. Our intent is to better theorize about the international character, experience and diversity of the academic workforce, and to better integrate these new ideas into research on satisfaction, mobility, migration, mentoring and productivity.

The panel aims to advance and energize the field of research on foreign-born scientists by offering new insights and future directions for research on the international scientific workforce.

The full Call for papers is also available in PDF format.

Important dates

  • Call for papers and posters: February 12, 2018
  • Deadline for submissions: April 1, 2018
  • Extended deadline for submissions: April 13, 2018
  • Notification of acceptance: June 15, 2018
  • Early-bird registration: March 29 – July 15, 2018
  • Conference: September 12 - 14, 2018