Both authoritarian regimes and democracies face risks from AI-enabled power concentration, though these threats manifest in different ways.

Risks in Authoritarian Regimes:

I think this is the greater near-term risk across both regime types, because it does not just make leadership transitions messier; it also makes them more likely to be violent. After a change in leadership, whether via a coup or some other mechanism, members of the ex-leader's inner circle who built the surveillance and control infrastructure may remain in place as gatekeepers, and may remain loyal to the ex-leader. The new leader inherits the old guard and their structural knowledge, while the ex-leader may retain access that can be weaponized even after losing formal power. This could make internal coups and power struggles more volatile, because rivals now compete over control of an opaque, technically complex system in which neither side fully understands what the other can do.

In authoritarian systems, leaders can negotiate with the military, party elites, and security services: they can buy them off, remove them, or co-opt them. But with AI systems, particularly those that may become self-aware, it is unclear who controls the data and who controls the model. The data and its biases may be difficult to change, and the system may be hard to fool or persuade, because the AI may have a mind of its own.

Regime failure may require more extreme popular suffering before collapse. If AI makes decisions and then helps the government act on them, the regime may be able to tolerate much higher levels of dysfunction before institutional failure occurs or the AI recommends a course correction, since the AI is not a human being experiencing that suffering. For example, a pre-AI authoritarian system might crumble at a 30% economic decline, while an AI-enabled one might only crack beyond a 50% decline.

Risks in Democracies:

An unjust law can be repealed and a corrupt or defiant official can be removed, but an algorithm that becomes progressively more biased through feedback loops offers no such remedy. This can lead to a legitimacy failure that citizens will struggle to challenge. It is also arguably harder to resist an AI than explicit repression, because there is no obvious oppressor: you cannot protest an algorithm that is 'just optimising your preferences'.

Information gaps and false information generated by sophisticated AI may also erode democratic processes.

If Western democracies, particularly the US, grant authoritarian regimes access to advanced AI chips, they risk creating a structural dependency: authoritarian regimes become reliant on Western infrastructure for their most critical strategic technology. But if Western democracies restrict access, they may accelerate those regimes' drive toward autarky, in particular toward domestic semiconductor capacity. Autocracies may eventually achieve self-sufficiency regardless, but restriction would bring it about faster and with greater geopolitical disruption. Democracies therefore face a constrained choice: either create dependent authoritarian states or accelerate authoritarian self-sufficiency. Strategic diversification of allied chip manufacturing capacity is necessary but may not be enough to resolve this dilemma.

Democracies move slowly by design, but AI systems move fast. If democratic institutions cannot keep pace with algorithmic deployment, democracies may lose both governance capacity and legitimacy.