There’s all in, and then there’s all in. Albania’s Prime Minister Edi Rama has proposed replacing government ministers with AI systems like ChatGPT to combat corruption and increase transparency. The suggestion, made at a July press conference, envisions voters potentially electing AI algorithms to the council of ministers, making Albania “the first to have an entire government with AI ministers and a prime minister.”
What they’re saying: Albanian officials believe AI governance could eliminate human failings in government administration.
• “One day, we might even have a ministry run entirely by AI,” Rama said. “That way, there would be no nepotism or conflicts of interest.”
• Ben Blushi, Albania’s former minister of local government and decentralization, argued that “societies will be better run by AI than by us because it won’t make mistakes, doesn’t need a salary, cannot be corrupted, and doesn’t stop working.”
Why this matters: Albania’s AI governance proposal reflects broader frustrations with traditional government corruption, particularly in post-communist Balkan nations struggling with institutional reform.
The Albanian connection: The country has a notable link to AI development through Albanian-American entrepreneur Mira Murati, who worked at OpenAI from 2018 to 2024, most recently as its chief technology officer, before founding her own startup, Thinking Machines Lab, which raised a $2 billion seed round.
Historical context: Albania faces deep-rooted corruption challenges stemming from its difficult transition from a centralized economy to free market reforms in the 1990s.
• The transition included implementing a 15 percent flat corporate tax rate and transferring public utilities to public-private partnerships.
• These rapid reforms created opportunities for organized crime and corruption to flourish over the past thirty years.
Reality check: While the AI government proposal remains largely theoretical posturing, experts suggest such a system is unlikely to address Albania’s underlying structural issues—though it probably couldn’t make the situation much worse.