What America Taught the Nazis
July 21, 2018

THEATLANTIC.COM
What America Taught the Nazis
In the 1930s, the Germans were fascinated by the global leader in legal racism—the United States.