American slavery wasn't just a white man's business—new research shows how white women profited, too
As the United States confronts the realities and legacy of slavery, Americans are challenging long-held myths about the country's history. One enduring myth is that slavery was a largely male endeavor: that the buying, selling, trading and profiting from enslavement were, for the most part, carried out by white men alone.