China has long been known for monitoring its citizens on a scale unmatched anywhere else in the world, but until recently AI was not deeply embedded in these systems. That is now changing. A new report covered by CNN reveals that the technologies used by Chinese authorities include facial recognition, location-data analysis, social-media monitoring, and the integration of government records.
All of this is connected to AI systems that generate alerts about “high-risk” individuals — even before they take any action. According to the report’s authors, government documents indicate that the algorithms are designed to detect “signs of disloyalty” and supply local authorities with lists of citizens who require surveillance.
The system places particular emphasis on the internet. AI decides in real time which posts are deleted or hidden and identifies users who regularly consume politically sensitive content. According to data cited by CNN, algorithms can detect things like contact with foreign individuals, participation in religious gatherings, or even neutral posts containing “keywords” that the system interprets as potential indicators of opposition activity.
Experts warn that algorithmic profiles of citizens are being integrated with data from public institutions, schools, hospitals, and transportation systems — creating one of the most advanced surveillance ecosystems in the world. Citizens often do not know why they have been flagged, and there are no practical mechanisms to appeal decisions made by AI systems, leading to even deeper self-censorship and reinforcing an atmosphere of fear.
These are not experimental or pilot systems; many people report encountering Chinese surveillance in everyday situations. For example, publicly displayed “walls of shame” show photos of citizens who crossed the street against a red light. Camera footage is presented in visible locations to send a clear message: anonymity does not exist, and “Big Brother is always watching.”
This system may become a model for other authoritarian states such as Russia or Belarus. Predictive algorithms would allow governments not only to react to dissent but to preemptively neutralize behavior that falls outside desired norms.

