
Assemblywoman Rebecca Bauer-Kahan. (Kevin Sanders for California Globe)

AB 331: AI Tools

Automated decision tools

By Chris Micheli, March 23, 2023 10:38 am

Assembly Bill 331 by Assemblywoman Rebecca Bauer-Kahan (D-Orinda) was amended to address the use of automated decision tools. AB 331 would add Chapter 25 to Division 8 of the Business and Professions Code.

First, the bill would define the following terms: “algorithmic discrimination,” “automated decision tool,” “consequential decision,” “deployer,” “developer,” “impact assessment,” “sex,” and “significant update.”

Second, on or before January 1, 2025, and annually thereafter, a deployer of an automated decision tool would be required to perform an impact assessment for any automated decision tool the deployer uses. The assessment would be required to include eight items. This would not apply to a deployer with fewer than 25 employees.

Third, on or before January 1, 2025, and annually thereafter, a developer of an automated decision tool would be required to complete and document an assessment of any automated decision tool that it designs, codes, or produces. The assessment would be required to include six items.

Fourth, a deployer would be required, at or before the time an automated decision tool is used to make a consequential decision, to notify any natural person who is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision.

Fifth, if a consequential decision is made solely based on the output of an automated decision tool, a deployer would be required, if technically feasible, to accommodate a natural person’s request to not be subject to the automated decision tool and to be subject to an alternative selection process or accommodation.

Sixth, after such a request, a deployer would be allowed to reasonably request, collect, and process information from a natural person for the purposes of identifying the person and the associated consequential decision. If the person does not provide that information, the deployer would not be obligated to provide an alternative selection process or accommodation.

Seventh, a developer would be required to provide a deployer with a statement regarding the intended uses of the automated decision tool and documentation regarding three specified items. Trade secrets would not be required to be disclosed.

Eighth, a deployer or developer would be required to establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool. The safeguards required must be appropriate to five specified activities. In addition, the governance program would be designed to address seven specified items. This would not apply to a deployer with fewer than 25 employees.

Ninth, a deployer or developer would be required to make publicly available, in a readily accessible manner, a clear artificial intelligence policy that provides a summary of the types of automated decision tools currently in use or made available to others by the deployer or developer, as well as how the deployer or developer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of the automated decision tools it currently uses or makes available to others.

Tenth, a deployer would be prohibited from using an automated decision tool in a manner that contributes to algorithmic discrimination.

Eleventh, within 60 days of completing an impact assessment, a deployer or a developer would be required to provide the impact assessment to the Civil Rights Department. A deployer or developer who violates this section would be liable for an administrative fine of not more than $10,000 in an administrative enforcement action brought by the Civil Rights Department.

Twelfth, on and after January 1, 2026, a person may bring a civil action against a deployer or developer for a violation of this law. A court may award compensatory damages, injunctive relief, declaratory relief, and reasonable attorneys’ fees.

Thirteenth, a person, before commencing an action for injunctive relief, would be required to provide 45 days’ written notice to a deployer or developer of the alleged violations of this law. If the developer or deployer demonstrates to the court within 45 days of receiving the written notice that it has cured a noticed violation and provides the person who gave the notice an express written statement that the violation has been cured and that no further violations would occur, a claim for injunctive relief would not be maintained for the noticed violation.

Fourteenth, a city or county would be prohibited from adopting, maintaining, enforcing, or continuing in effect any law, regulation, rule, requirement, or standard related to the performance of an impact assessment or governance program, or the equivalent thereof, of an automated decision tool.

