LIEB BLOG

Legal Analysts


Tuesday, July 16, 2024

Guidance on AI Discrimination & Emerging Data in Insurance by NYS DFS

Welcome to the age of AI Discrimination Regs. Do you have an auditing program in place? 


On July 11th, 2024, the New York State Department of Financial Services (DFS) released Circular Letter No. 2024-7, setting out its expectations for insurers in NYS regarding the use of Emerging Consumer Data & Information Sources (ECDIS) & Artificial Intelligence Systems (AIS) in underwriting and pricing insurance policies.


The goal of these guidelines is to ensure that all insurers adopt & manage ECDIS, AIS, & other predictive models responsibly, because these models carry potential systemic biases & reliability issues that could lead to unfair discrimination or adverse effects on vulnerable communities.


Keys:

  • Insurers must ensure that ECDIS & AIS comply with all relevant federal / state laws & regulations.
  • Use of these models must not result in unfair discrimination, meaning that data sources & models neither rely on protected classes nor produce unfairly discriminatory outcomes.
  • Use of ECDIS & AIS must be supported by generally accepted actuarial standards, demonstrating a clear, statistically significant relationship between the variables used & risk.
  • Insurers must regularly test & document their methodologies to ensure compliance with anti-discrimination laws & to maintain transparency.
  • Effective governance frameworks should be established, with senior management & board oversight to manage the risks associated with these technologies.
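The testing & documentation expectations above could be operationalized in many ways; one common starting point is an adverse impact ratio check (the "four-fifths rule" used in employment-discrimination analysis). The sketch below is a hypothetical illustration only, assuming group labels, approval counts, and the 0.8 threshold that are not specified anywhere in the DFS circular:

```python
# Hypothetical audit sketch: compare underwriting approval rates across
# demographic groups and flag any group whose rate falls below 80% of the
# best-performing group's rate. All names and thresholds are illustrative
# assumptions, not values mandated by DFS Circular Letter No. 2024-7.

def adverse_impact_ratios(approvals: dict[str, tuple[int, int]]) -> dict[str, float]:
    """approvals maps group -> (approved, total applications).
    Returns each group's approval rate divided by the highest group's rate."""
    rates = {g: approved / total for g, (approved, total) in approvals.items()}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

def flag_disparities(approvals: dict[str, tuple[int, int]], threshold: float = 0.8) -> list[str]:
    """Return the groups whose ratio falls below the threshold, for further review."""
    return [g for g, r in adverse_impact_ratios(approvals).items() if r < threshold]

outcomes = {"group_a": (90, 100), "group_b": (60, 100)}
print(flag_disparities(outcomes))  # group_b's ratio is ~0.67, below 0.8
```

A flagged ratio is not itself proof of unfair discrimination; it is a documented trigger for the kind of further actuarial & legal review the circular contemplates.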


DFS notes that transparency is crucial with ECDIS and AIS. Insurers must disclose to consumers whether these technologies are used in underwriting or pricing decisions & provide the specific data that influenced these decisions. 


When it comes to third-party vendors, insurers are responsible for understanding & ensuring compliance of any third-party tools, ECDIS, or AIS used. This includes having contracts that allow for audits & cooperation with regulatory inquiries.


If you'd like to read DFS's Circular Letter No. 2024-7, click here.




Monday, October 30, 2023

AI Discrimination Being Regulated by President Biden's New Executive Order

On October 30th, 2023, President Biden issued an Executive Order (EO) addressing discrimination caused by artificial intelligence (AI), amongst other topics. 


The White House announced this EO seeking to prevent AI from creating or deepening discrimination, bias, and other harms in justice, healthcare, and housing.


Now, agencies will be empowered to combat algorithmic discrimination, while enforcing existing authorities to protect anti-discrimination rights and safety. 


In summary, the Executive Order: 

  • Calls for clear guidelines to keep AI algorithms from being used to exacerbate discrimination by landlords, Federal benefits programs, and Federal contractors.
  • Tackles algorithmic discrimination through training, technical assistance, and coordination with the Department of Justice and Federal civil rights offices on best practices for investigating and prosecuting AI civil rights violations.
  • Ensures fairness throughout the criminal justice system by developing best practices for the use of AI in sentencing, parole and probation, pretrial release and detention, risk assessments, surveillance, crime forecasting and predictive policing, and forensic analysis.

The Biden Administration Blueprint for an AI Bill of Rights sets out steps those using AI can take to ensure fairness and equality. The steps include: 

  • Regularly checking for and addressing any biases in the design and use of AI systems.
  • Using diverse and representative data to avoid discrimination or unfair impacts.
  • Ensuring accessibility for people with disabilities during the design and development of AI systems.
  • Conducting tests to identify and address any disparities before and after the AI system is in use.
  • Providing clear organizational oversight to ensure fairness.
  • Conducting independent evaluations and sharing easy-to-understand reports, including test results and how any issues are being addressed, to ensure these protective measures are in place.
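The disparity-testing step described above could be approximated by a simple demographic parity check, run both before and after deployment. This is a minimal sketch under illustrative assumptions; the group names, sample data, and 5-point tolerance are hypothetical and not drawn from the Blueprint itself:

```python
# Hypothetical sketch of "conducting tests to identify disparities":
# compare favorable-outcome rates across groups and report the gap.
# Data, group labels, and the tolerance value are illustrative assumptions.

def parity_gap(outcomes: dict[str, list[int]]) -> float:
    """outcomes maps group -> list of 0/1 decisions (1 = favorable).
    Returns the spread between the highest and lowest group rates."""
    rates = [sum(decisions) / len(decisions) for decisions in outcomes.values()]
    return max(rates) - min(rates)

def passes_parity(outcomes: dict[str, list[int]], tolerance: float = 0.05) -> bool:
    """True when no group's favorable-outcome rate trails another's by more
    than the tolerance; a failure signals a disparity to investigate."""
    return parity_gap(outcomes) <= tolerance

data = {"group_x": [1, 1, 1, 0], "group_y": [1, 1, 0, 0]}
print(round(parity_gap(data), 2))  # 0.25 - a gap that would warrant review
```

Running such a check on the same populations before and after deployment, and publishing the results in plain language, would track the Blueprint's call for ongoing testing and easy-to-understand reporting.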


If companies fail to comply and misuse AI in ways that deepen discrimination and bias, this Executive Order may become the basis for discrimination lawsuits arising from that misuse.


To learn more about the Executive Order, click here. To read the Biden Administration Blueprint for an AI Bill of Rights, click here.