A Center Dedicated to Monitoring AI Technology Hopes to Detect Hate Speech

The Leadership Conference on Civil & Human Rights (LCCHR), a longstanding and influential civil rights coalition, is spearheading an effort to examine artificial intelligence's impact on civil rights, including its role in propagating racism and spreading bigotry. The initiative comes amid growing concern that AI could exacerbate racism and antisemitism in the United States by amplifying biases present in human-generated online content.

The center's goals include closely monitoring AI-related legislation and regulations, assessing their implications for civil and human rights, and publishing research papers and policy positions. It will also engage actively in civic discourse concerning generative AI.

An advisory group of experts and civil rights organizations will guide the center's work. Leading AI companies such as OpenAI, Google, and Microsoft have taken steps to curb extreme racism, sexism, and homophobia in AI-generated content, but concerns remain that others may develop powerful AI systems without such constraints.

AI models such as OpenAI's ChatGPT and Google's Bard learn from vast amounts of human-generated text, much of it drawn from the internet, and in doing so absorb the biases inherent in both digital platforms and society at large. As a result, the potential perils of AI's rapid advancement are under heightened scrutiny.

The LCCHR also aims to showcase instances where AI can deepen understanding of civil rights issues: museums addressing racial violence and antisemitism are using AI, holograms, and virtual reality to confront difficult chapters of history. The initiative will operate within the LCCHR structure and benefits from the advice of Alondra Nelson, former acting director of the White House Office of Science and Technology Policy during President Biden's administration.

Link: Axios
