What America Taught the Nazis
January 29, 2018

https://t.co/qqKEViwi5c
"In the 1930s, the Germans were fascinated by the global leader in legal racism—the United States." (theatlantic.com)