The question posed in this blog's title might seem hardly worth asking.
France was on the winning side of World War I and played a major role in eviscerating the losing Central Powers.
But would it be reasonable to say that France won World War I?
I say “yes” because France is the heir to the French Revolution, in which the citizens rose up to slaughter the ruling class and to institute murder on a mass scale. That destroyed French culture, much as Mao's Cultural Revolution destroyed China's.
The French Revolution did not bring democracy or commercial values, and it did not succeed in eliminating the hereditary ruling class.
The values that the French Revolution promoted, equality and nationalism, are the very elements that created the modern, moribund Europe. Europe is a socialist world where
- most people don't work hard,
- governments are diaper-changing nanny states that dictate minuscule elements of everyday life, and
- the idea of meritocracy, or even of making cultural contributions, is gone.
Meritocracy and cultural contribution are as far gone in Europe today as they were in the Soviet Union.
When East Germany rejoined West Germany, the consequence was the same as the French winning World War I. All of Germany became less productive, more socialist, and culturally and technically barren.
In this sense, France has spawned for the world the concept of the lackadaisical, irresponsible, self-righteous citizen. France has spread those values to its neighbors because France really won the First World War.