
The United States Bought Florida from England





"The United States bought Florida from England" is False.
The United States bought Florida from Spain in 1819.





