Liberalism

The term "liberalism" has meant many things over the centuries, but since the New Deal, it has been identified with the belief that government is able, and morally required, to improve the economic and social well-being of Americans, particularly those who are disadvantaged. The New Deal itself was transitory, but liberal programs have included Social Security, Medicare, and a range of labor, civil rights, and environmental legislation.

At its height, after Lyndon Johnson's victory in the election of 1964, it was still possible to speak of a liberal Republican, although the term meant something quite different from the Liberal Republicans of the years after the Civil War. However, the failure of the Great Society programs to deliver on many of their promises, and the disillusionment with the power of government that followed the Vietnam War, led to a gradual loss of influence for liberals and the concurrent rise of the Conservative Movement.

In the 21st century, the term has fallen so far out of favor that the expression "liberal Republican" is an oxymoron, and those who might be expected to identify with liberalism have often opted for the earlier word "progressive."