Good question. I am having a hard time getting an answer.
The Declaration of Independence, the Constitution of the United States of America, the Bill of Rights: these were all written by liberals. Yet, 225 years later, 'liberal' seems to have taken on a demonic connotation. Well, fuck that, I am an American, and that makes me a liberal. The liberal paradigm has shifted quite a bit since the Founding Fathers did their thing, and this is good: we must either evolve or die. Over the years we have seen a liberal society pursue manifest destiny, abolish slavery, face down the robber barons, bust the trusts, attempt to regulate the securities markets, provide for the social welfare of its elderly, provide minimal medical care for its citizens, and try to prevent the poisoning of the public. The American people have forgotten their roots and now seem to be actively supporting the rollback of these liberal evolutionary steps. It is time to stop this.