Dems' fighting words
Since when did “liberal” become an insult? And why haven’t the Democrats turned “conservative” into an insult in return? Sure, the obvious answer is that the Republicans have controlled the White House for 24 of the past 36 years, and they’ve controlled Congress since 1994. But why is it somehow shameful to be a “liberal” while we should all aspire to be “conservative”? Why isn’t it the reverse? Or better yet, why don’t we all aspire to be libertarian?