When I was younger, the words "Liberal" and "Socialist" referred generally to ideas and beliefs that were all about everybody being treated fairly and equally in society.
Nowadays, however, if you read anything about American politics, you get the sense that these words have taken on a new, negative connotation.
"Liberals and Socialists" seem to be being marked out as the sort of people who you wouldn't want to leave your kids with and it seems to be fairly accepted (by the neo-Cons of the right at least) that Liberalism and Socialism are the work of the devil.
Is it simply that they're all thick and can't distinguish between Communism and Socialism?
I get that, to a point, but when did the idea that Liberalism is evil become so readily accepted?
I always thought that Liberalism (as in liberal democracy, in a broad, non-party-specific way) was a good thing.