Why did America fight Germany when Germany never attacked America?

Published: 20 July 2022
on the channel: Review Central


If you are an American asking this, I feel a deep sadness at that. If you are not, here are a few reasons why America fought Nazi Germany during WWII, and why I volunteered to serve my country.

Nazi Germany declared war on the United States after Japan attacked Pearl Harbor and declared war.
Nazi Germany sank U.S. freighters carrying much-needed food and medical supplies to our closest ally, England, a year after it had declared war on England, and before America had attacked any Germans.
Nazi Germany invaded Poland, Czechoslovakia, Holland, and Austria, stealing their national wealth and property, killing their men, women, and children, using many survivors as slave labor, and carrying out the mass murder of Jews.
My grandfather volunteered to fight against Nazi Germany and became a sergeant and squad leader. When some of his squad were caught in a barbed-wire obstacle while under enemy fire, he attempted to free them; he was shot, captured by a Nazi SS platoon, and placed in the Bergen-Belsen prisoner-of-war camp, adjacent to a Jewish concentration camp. He kept a diary during seven months of captivity, under penalty of death if caught, and escaped during a bombing raid. He later became a doctor and had six children. Reading his diary had a profound effect on me.
I volunteered to serve in the Army in honor of my grandfather, became a sergeant and squad leader in an Airborne Infantry regiment, and fought against an Iraqi Army that had invaded a peaceful Kuwait, killing unarmed men, women, and children. Iraq threatened to invade Saudi Arabia, fired ballistic missiles into that country and into Israel, killing innocent men, women, and children and threatening global stability, and attacked Americans stationed in Saudi Arabia and Israel.


This was not an act of American imperialism, and there are a variety of reasons why.

Firstly, remember that Germany declared war on the United States (and, for that matter, much of the world), and Japan struck at the United States to try to prevent its involvement in the war, hoping to hit so hard that America would not want to hit back.

Secondly, the US enacted a series of neutrality laws in the late 1930s to limit American involvement in the war, almost going as far as to require that any declaration of war be decided by a national referendum (the proposed Ludlow Amendment).

Thirdly, remember that during this period the United States was suffering through the Great Depression. While war is sometimes good for business, when a good portion of Americans are starving in the streets, the last thing they want is a massive war.

Considering this, and numerous other factors, it's a bit unfair to label this as American imperialism. While the war certainly benefited the US in the long run, that wasn't the deciding reason for entering it.

I hope this helps.


German saboteurs reached the east coast of the United States by submarine (Operation Pastorius). They were arrested before they could commit any sabotage, thanks to a Coast Guardsman who spotted one of the landings and to one of the saboteurs, George Dasch, who turned himself in to the FBI.

German nuclear scientists were working toward a bomb that could have allowed Hitler to dominate the world. Their program ended only when Allied forces closed in on their research facilities and the scientists had to decide whether to go west to the British, French, and American armies, or east to the Red Army.

I’d say we made the right decision to go to war.

