History
When did the United States declare war against Germany?
On December 8, 1941, one day after the attack on Pearl Harbor, the United States declared war on Japan. Germany then declared war on the United States, which in turn led the United States to declare war on Germany on December 11, 1941.