"Germany had lost a war to America, but was not occupied by American
troops after WW1."

You must have had some strange history lessons where you are in the world.
Britain, France, Russia, and Australia, to name a few, fought valiantly to win
the First World War.  I believe America was involved at some point toward the
end, but to claim they won it is offensive and inaccurate.  Try examining
the words "World War".

--------------------

More obfuscation, Mr Barker.  The USA did not win WWI ALONE, and Denys
did not say that.  There was a reason the USA was involved 'at the end':
the USA's involvement was the REASON the war ended soon after.  Both sides
had pounded each other to bits and were exhausted.  The addition of the
USA's fresh strength, and the morale boost it gave the Allied Powers, was
all that was needed to defeat Germany.
