PapersFlow Research Brief

Social Sciences · Decision Sciences

Academic Publishing and Open Access
Research Guide

What is Academic Publishing and Open Access?

Academic publishing and open access is the system by which scholarly findings are evaluated, disseminated, and indexed, together with the policy and infrastructure choices that determine whether research outputs are freely available to read and reuse or restricted behind paywalls.

This topic spans how research is produced and evaluated (e.g., peer review, conflicts of interest, and research assessment) and how it is discovered through bibliographic databases and citation systems. Database coverage and citation counting are central because they shape what scholarship is visible and how it is evaluated, as examined in "The journal coverage of Web of Science and Scopus: a comparative analysis" (2015) and "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories" (2018). In the provided corpus, the cluster contains 137,079 works (5-year growth rate: N/A).

Topic Hierarchy

Social Sciences → Decision Sciences → Information Systems and Management → Academic Publishing and Open Access
137.1K papers · 5-year growth: N/A · 56.1K total citations


Why It Matters

Open access and publishing practices affect who can read research, how quickly findings circulate, and how institutions evaluate researchers, which in turn influences funding, hiring, and public trust. For example, research evaluation practices are often tied to journal-based metrics, yet Seglen (1997) in "Why the impact factor of journals should not be used for evaluating research" argued against using journal impact factor for evaluating research, directly affecting how universities and funders design assessment criteria.

Discovery infrastructure also has concrete operational consequences: Mongeon and Paul‐Hus (2015) in "The journal coverage of Web of Science and Scopus: a comparative analysis" and Martín‐Martín et al. (2018) in "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories" show why an institution’s choice of indexing/citation source can change what is counted as “impact,” which matters for library collection decisions, departmental benchmarking, and national research assessments.

Integrity and governance issues likewise have real-world stakes: Bekelman et al. (2003) in "Scope and Impact of Financial Conflicts of Interest in Biomedical Research" documented the prevalence and influence of financial ties, informing journal disclosure policies and institutional conflict-of-interest management; Ioannidis (2016) in "The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta‐analyses" highlighted how publication incentives can yield misleading syntheses that can misdirect clinical and policy decisions.

Reading Guide

Where to Start

Start with Nosek et al. (2015), "Promoting an open research culture", because it provides a policy-oriented entry point that connects journal practices to transparency, openness, and reproducibility—core concerns that motivate open access and open scholarship reforms.

Key Papers Explained

Seglen (1997), "Why the impact factor of journals should not be used for evaluating research", motivates skepticism about journal-level proxies and sets up the need for better evaluation frameworks. Aksnes et al. (2019), "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories", then supplies conceptual tools for interpreting citations and indicators more carefully.

Mongeon and Paul‐Hus (2015), "The journal coverage of Web of Science and Scopus: a comparative analysis", and Martín‐Martín et al. (2018), "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories", show how infrastructure choices (coverage and citation sources) shape what gets measured and found.

Ioannidis (2016), "The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta‐analyses", and Bekelman et al. (2003), "Scope and Impact of Financial Conflicts of Interest in Biomedical Research", connect publishing incentives and governance to research integrity risks that open practices aim to address.

Paper Timeline

  • 1997 · Why the impact factor of journals should not be used for evaluating research · 2.4K cites
  • 2003 · Scope and Impact of Financial Conflicts of Interest in Biomedical Research · 1.8K cites
  • 2013 · A Comparison between Two Main Ac… · 1.8K cites
  • 2015 · The journal coverage of Web of Science and Scopus: a comparative analysis · 4.1K cites (most cited)
  • 2015 · Promoting an open research culture · 2.6K cites
  • 2016 · The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta‐analyses · 1.4K cites
  • 2018 · Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories · 1.7K cites

Papers ordered chronologically; the most-cited paper is marked.

Advanced Directions

Fortunato et al. (2018), "Science of science", point toward using large-scale data about publications, citations, and careers to test how policy changes (including openness and evaluation reforms) alter scientific behavior. Zhu and Liu (2020), "A tale of two databases: the use of Web of Science and Scopus in academic papers", suggest ongoing methodological work on how scholars operationalize database choice and reporting practices in bibliometric research. Across these directions, the technical frontier is less about a single metric and more about building robust, transparent measurement pipelines that are explicit about database coverage, indicator choice, and incentive effects.
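The "transparent measurement pipeline" idea can be made concrete with a minimal sketch. The code below is illustrative only: the CSV exports, column names, and DOIs are invented, and a real pipeline would pull records from each database's export tools or API. It shows the core step such comparisons share: matching records across two citation sources by DOI, then separating coverage gaps (records only one source indexes) from count discrepancies (records both index but count differently).

```python
# Minimal sketch: compare citation records from two hypothetical database
# exports, reporting coverage overlap by DOI and per-source citation counts.
import csv
import io

# Hypothetical export snippets (invented data for illustration).
WOS_EXPORT = """doi,citations
10.1000/a,120
10.1000/b,45
"""
SCOPUS_EXPORT = """doi,citations
10.1000/a,150
10.1000/c,30
"""

def load_counts(text):
    """Parse a CSV export into a {doi: citation_count} mapping."""
    return {row["doi"]: int(row["citations"])
            for row in csv.DictReader(io.StringIO(text))}

def compare(source_a, source_b):
    """Split records into coverage gaps and shared records with both counts."""
    a, b = load_counts(source_a), load_counts(source_b)
    return {
        "only_a": sorted(a.keys() - b.keys()),   # indexed only in source A
        "only_b": sorted(b.keys() - a.keys()),   # indexed only in source B
        "shared": {doi: (a[doi], b[doi])         # counts can still disagree
                   for doi in sorted(a.keys() & b.keys())},
    }

report = compare(WOS_EXPORT, SCOPUS_EXPORT)
```

Reporting both the coverage gaps and the shared-record discrepancies, rather than a single merged count, is what makes the pipeline explicit about database coverage in the sense discussed above.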


Frequently Asked Questions

What is open access in the context of academic publishing?

In this corpus, open access is discussed as part of broader scholarly communication systems that determine who can read and reuse research outputs and how those outputs are evaluated. "Promoting an open research culture" (2015) frames openness as a set of journal and community practices that support transparency and reproducibility.

How should researchers avoid misusing journal impact factor when evaluating work?

Seglen (1997) in "Why the impact factor of journals should not be used for evaluating research" argued that journal impact factor is not a valid proxy for the quality of individual articles. Aksnes et al. (2019) in "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories" explain that citation indicators require careful interpretation and do not directly equate to research quality.

Which database should I use for literature review and citation analysis: Web of Science, Scopus, or Google Scholar?

Mongeon and Paul‐Hus (2015) in "The journal coverage of Web of Science and Scopus: a comparative analysis" show that Web of Science and Scopus differ in journal coverage, which affects what you retrieve and what gets counted. Martín‐Martín et al. (2018) in "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories" demonstrate that citation counts can vary systematically across these sources, so database choice should match the purpose (systematic retrieval vs. broad discovery vs. evaluative bibliometrics).

How do conflicts of interest and incentives affect what gets published?

Bekelman et al. (2003) in "Scope and Impact of Financial Conflicts of Interest in Biomedical Research" reported that financial relationships are widespread and can influence biomedical research, motivating disclosure and governance requirements. Ioannidis (2016) in "The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta‐analyses" argued that incentives can drive redundant or misleading evidence syntheses, which can distort the apparent state of evidence.

How can journals and communities promote transparency and reproducibility?

Nosek et al. (2015) in "Promoting an open research culture" proposed that journal author guidelines can be used to promote transparency, openness, and reproducibility. Fortunato et al. (2018) in "Science of science" position these reforms within a broader evidence base about how scientific systems behave and how incentives shape outputs.

What is the current state of this research area in the provided dataset?

The provided topic cluster contains 137,079 works, indicating a large and mature literature base. The 5-year growth rate is listed as N/A in the provided data, so no trend direction can be inferred from the dataset metadata alone.

Open Research Questions

  • Which combinations of journal policies most effectively increase transparency and reproducibility without creating new barriers to participation, as implied by the policy focus in "Promoting an open research culture" (2015)?
  • How can evaluative systems incorporate citation indicators while avoiding the known pitfalls of journal-level metrics highlighted by "Why the impact factor of journals should not be used for evaluating research" (1997) and the conceptual cautions in "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories" (2019)?
  • How should bibliometric studies correct for differences in database coverage and indexing practices documented in "The journal coverage of Web of Science and Scopus: a comparative analysis" (2015) and "A tale of two databases: the use of Web of Science and Scopus in academic papers" (2020)?
  • What governance mechanisms best mitigate the influence of financial conflicts of interest on research agendas and reporting, given the concerns synthesized in "Scope and Impact of Financial Conflicts of Interest in Biomedical Research" (2003)?
  • How can the research ecosystem reduce redundant or misleading evidence syntheses while preserving rapid knowledge aggregation, as criticized in "The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta‐analyses" (2016)?

Research Academic Publishing and Open Access with AI

PapersFlow provides specialized AI tools for Decision Sciences researchers working on this topic.

See how researchers in Economics & Business use PapersFlow

Field-specific workflows, example queries, and use cases.

Economics & Business Guide

Start Researching Academic Publishing and Open Access with AI

Search 474M+ papers, run AI-powered literature reviews, and write with integrated citations — all in one workspace.

See how PapersFlow works for Decision Sciences researchers