White American culture is the culture of White Americans in the United States. The United States Census Bureau defines White people as those "having origins in any of the original peoples of Europe."