Published 2025-12-20 | Version v1.0
Working Paper | Open Access | Published

Survivor Governance

Authority Concentration under AI-Driven State Contraction

Description

This working paper introduces Survivor Governance as a political form emerging from AI-driven functional contraction of the state. It argues that AI can expand governmental capacity while shrinking the human administrative apparatus, concentrating authority among the human officials who remain institutionally indispensable. The paper analyzes the mechanisms of state contraction, survivor group formation, authority consolidation, the Shark Effect, democratic and social externalities, and institutional safeguards against survivor lock-in.

Abstract

Artificial intelligence (AI) is increasingly integrated into governmental operations, enhancing administrative efficiency, analytical capacity, and policy execution. At the same time, it induces a structural contraction of the human apparatus of the state by automating large portions of routine and procedural work. This dual transformation, capacity expansion alongside organizational shrinkage, creates a fundamental asymmetry: while governmental functions can be automated, political authority cannot. Consequently, power does not disappear with institutional downsizing; it concentrates within a narrowing circle of human officials who remain institutionally indispensable. This article introduces Survivor Governance as a political form emerging from this transformation. Survivor Governance denotes neither technocracy nor rule by AI experts. Rather, it describes a mode of governance in which authority accumulates by default among those who survive AI-driven functional contraction of government, regardless of whether they are optimally suited for governing in an AI-enabled state. By focusing on the internal transformation of government rather than on democratic decline per se, the article provides a structural explanation for emerging patterns of political closure, institutional conservatism, and declining administrative mobility under AI-enabled governance.

Files

  • Survivor Governance.pdf (application/pdf): full-text PDF of the publication

Keywords

  • Survivor Governance
  • AI-driven state contraction
  • AI governance
  • State transformation
  • Authority concentration
  • Functional contraction
  • Administrative automation
  • Institutional survivorship
  • Political closure
  • Institutional conservatism
  • Government automation
  • Human authority
  • Public administration
  • Democratic accountability
  • AI-enabled governance

Subjects

  • AI Governance
  • Political Science
  • Public Administration
  • State Transformation
  • Institutional Design
  • Governance Theory

Recommended citation

Wu, Shaoyuan. (2025). Survivor Governance: Authority Concentration under AI-Driven State Contraction. Global AI Governance and Policy Research Center, EPINOVA LLC. https://doi.org/10.5281/zenodo.18090197 (Crossref DOI to be assigned after membership approval).

APA citation

Wu, S. (2025). Survivor governance: Authority concentration under AI-driven state contraction. Global AI Governance and Policy Research Center, EPINOVA LLC. https://doi.org/10.5281/zenodo.18090197 (Crossref DOI to be assigned after membership approval).

Alternate identifiers

  • ORCID put-code: 201017623 (ORCID Public API record identifier from early metadata)
  • DOI: 10.5281/zenodo.18090197 (Zenodo/DataCite DOI from early metadata)
  • File name: Survivor Governance.pdf (source PDF file name)
  • Publication date: 2025-12-20 (date shown on the PDF title page and in the early metadata record)

Related works

No related works listed.
