If we’re talking about etymology, the term “left-wing” did indeed originate from French politics and designated political movements by where members sat in the French National Assembly. So, not originally American as you stated. But the etymology aspect is not really all that interesting.
In terms of ideology, placing Americans at the origin of everything is revisionism. Left-wing ideology is largely descended from the thought of European philosophers of the Age of Enlightenment.
I can imagine influence going both ways, but to claim that Americans invented it all is a bit of a stretch.
It seems revisionist to throw out the associations that history accepts just because they place America at the center and we don’t like that. To deny the impact the American Revolution had on the French, and then on their own Revolution, is revisionist.
Those seated on the left, the ones pushing to overthrow the monarchy, were doing so with a decidedly American Liberalism view, NOT the “leftist” views we have today. They were not actually for the commons any more than the Democrats of today are.
I’m responding in a chain of comments where American liberalism and the word “liberal” have been skewed into pejorative terms over the years.
I’m talking about etymology because the thread is about it.
I did acknowledge that there was influence (and that it goes both ways). But saying one stems from the other is still wrong, and I rest my case: the US is not the center of the world. There was a world before it, and there will be one after its collapse, whether you like it or not.
American Liberalism is the origin of “the left”.
Can I get some of what you’re smoking please?
You know the United States are not the center of the universe, right?
Christ.
The terms were coined during the French Revolution based on where folks sat.
The left was aligned with American liberal values, American Liberalism having emerged at the time through the American Revolution.
The current push to change its meaning notwithstanding.