Originally Posted by Corkobo
No foreign army has ever set foot on American soil. ...
The British marched on Washington in August 1814 and burned the White House, the Capitol, and other public buildings. They were also there, obviously, during the War of Independence. Whoever wrote the blurb needs to hit the history books (or Wikipedia).