Do you think the recent increase in public attention to the history of slavery and racism in America is good or bad for society? Why? (Make sure you provide a brief explanation for your response.)
Have race relations in the United States gotten better, gotten worse, or stayed the same during your lifetime? Why do you think this, and what (if anything) should be done to address the issue?