What type of government has the U.S. promoted since the end of World War I?

Since the end of World War I, the U.S. has promoted democracy and a republican form of government.