Liberalism

The term "liberalism" has meant many things over the centuries, but since the New Deal, it has been identified with the belief that government is able, and morally required, to improve the economic and social well-being of Americans, particularly those who are disadvantaged. The New Deal itself was transitory, but liberal programs have included Social Security, Medicare, and a range of labor, civil rights, and environmental legislation.

At its height after Lyndon Johnson's victory in the election of 1964, liberalism was so dominant that it was possible to speak of a liberal Republican, though the term meant something quite different from the Liberal Republicans of the post-Civil War era. However, the failure of the Great Society programs to deliver on many of their promises, and the disillusionment with the power of government that followed the Vietnam War, led to a gradual loss of influence for liberals and the concurrent rise of the conservative movement.

In the 21st century, the term has fallen so far out of favor that the expression "liberal Republican" is an oxymoron, and those who might once have identified with liberalism have opted for the earlier word "progressive."
