The traditional criterion for state legitimacy was simple: if a state and its government could hold and govern territory, it was legitimate, at least in the eyes of other governments. The form of government and its behavior did not enter into this definition. Stalin's USSR, Mussolini's Italy, and Hitler's Germany held territory and ruled as surely as did the governments of Britain, France, and the United States, and in each other's official eyes one state was as legitimate as the next.
This outlook began to change in 1945. Fascist behavior in general, and Nazi behavior in particular, just before and during World War II was so shocking that many postwar governments became convinced that state legitimacy required well-defined codes of national behavior enshrined in international law.
Therefore, soon after the war, human rights became a recognized standard by which to judge states and their governments. This new standard, implicit in the Nuremberg trials, was soon articulated in documents such as the Universal Declaration of Human Rights, adopted by the United Nations in 1948. It was reinforced by the worldwide process of decolonization, which focused the international community on issues of human rights, particularly as they touched on the practice of racism and apartheid.
Most importantly, this process led growing segments of civil society to support human rights law as a standard by which to judge state legitimacy. In one notable case, worldwide civil-society pressure was applied to apartheid South Africa throughout the 1970s and 1980s with sufficient force to help change not only the nature of that country's government but also its national culture, and therefore the character of the state itself. By 1994, South Africa was no longer an apartheid state.