Discipline: Political Science

A term referring to the views of North American conservatives in the second half of the 20th century.

Conservatism in the United States grew radical and assertive from the 1960s onwards, combining free-market (liberal) economics and a suspicion of the state with authoritarian moral attitudes and a bellicose foreign policy.

Such views constituted part of the new right.

Kenneth Hoover and Raymond Plant, Conservative Capitalism in Britain and the United States (London, 1989)
