
Algorithmic Accountability Act for AI Product Managers: Section 5

Section 5 specifies the content of the summary report to be submitted about an automated decision system. This text follows my notes on Sections 1 and 2, Section 3, and Section 4 of the Algorithmic Accountability Act (2022 and 2023).

This is the fourth in a series of texts in which I provide a critical reading of the Algorithmic Accountability Act (2022 and 2023). I have held various product management positions in the past, for products and services in which software was a significant component; in almost all cases, those products supported non-trivial decisions, and my notes are inevitably biased by that experience. If you have any questions, suggestions, or want to connect, email me at ivan@ivanjureta.com.

Part 4 of Peter Paul Rubens – The Fall of Phaeton (National Gallery of Art) https://en.wikipedia.org/wiki/The_Fall_of_Phaeton_(Rubens)
Algorithmic Accountability Act (2022 and 2023): Implications for product management
SEC. 5. REQUIREMENTS FOR SUMMARY REPORTS TO THE COMMISSION.
The summary report that a covered entity is required to submit under subparagraph (D) or (E) of section 3(b)(1) for any automated decision system or augmented critical decision process shall, to the extent possible—
(1) contain information from the impact assessment of such system or process, as applicable, including—
(A) the name, website, and point of contact for the covered entity;
Section 5 provides requirements on the summary report that the covered entity needs to submit to the Commission.

The summary report needs to be assembled from information developed and maintained as part of the impact assessments.
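
To keep the structure of Section 5 in view while reading the items below, here is a minimal sketch, in Python, of a report skeleton whose keys mirror the Act's subparagraphs (A) through (K). The key names and placeholder values are my own assumptions; neither the Act nor the Commission prescribes this format.

```python
# A minimal sketch of a summary report skeleton, keyed to the Act's enumeration.
# Keys mirror subparagraphs (A) through (K); all values are placeholders, not
# content the Act prescribes.
summary_report = {
    "A_entity": {"name": "...", "website": "...", "point_of_contact": "..."},
    "B_critical_decision": "...",       # decision and its category per section 2(8)
    "C_intended_purpose": "...",
    "D_stakeholders_consulted": [],     # with any legal agreements
    "E_testing_and_evaluation": {},     # metrics, results, differential performance
    "F_guard_rails": [],                # publicly stated limitations on use
    "G_data_documentation": [],         # sourcing, rationale, alternatives explored
    "H_transparency_measures": {},      # decision recipients, contest/appeal mechanisms
    "I_negative_impacts": [],           # with remediation or mitigation steps
    "J_infeasible_requirements": [],    # with the rationale for non-compliance
    "K_improvement_resources": [],      # shared at the covered entity's discretion
}
```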
(B) a detailed description of the specific critical decision that the augmented critical decision process is intended to make, including the category of critical decision as described in subparagraphs (A) through (I) of section 2(8);

Describe the decision that the automated decision system is designed to support. That is, describe the purpose of that (part of the) product you are managing.
(C) the covered entity’s intended purpose for the automated decision system or augmented critical decision process;

Describe how the (part of the) product is intended to support the decision. Is it intended to make a recommendation? Does it provide an output that allows the comparison of decision options? Does it provide information that is assumed to be helpful to the user? Or does it do something else with respect to the decision?
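
These distinctions can be made explicit in product documentation, for example as a small taxonomy. The sketch below is my own illustration; the Act does not define such a classification.

```python
from enum import Enum

class DecisionSupportMode(Enum):
    """Hypothetical taxonomy of how a product can support a critical decision."""
    RECOMMENDATION = "makes a recommendation"
    OPTION_COMPARISON = "produces output that allows comparison of decision options"
    INFORMATION = "provides information assumed to be helpful to the user"
    OTHER = "supports the decision in some other way"

# Example: declare the intended purpose of the (part of the) product.
intended_purpose = DecisionSupportMode.RECOMMENDATION
```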
(D) an identification of any stakeholders consulted by the covered entity as required by section 3(b)(1)(G) and documentation of the existence and nature of any legal agreements between the stakeholders and the covered entity;

All consulted internal and external stakeholders need to be identified. Section 3(b)(1)(G) refers to internal stakeholders (e.g., employees) and external stakeholders “(such as representatives of and advocates for impacted groups, civil society and advocates, and technology experts) as frequently as necessary”.

It is necessary to describe the legal relationships between the covered entity and these stakeholders.
(E) documentation of the testing and evaluation of the automated decision system or augmented critical decision process, including—
(i) the methods and technical and business metrics used to assess the performance of such system or process and a description of what metrics are deemed successful performance;
(ii) the results of any assessment of the performance of such system or process and a comparison of the results of any assessment under test and deployed conditions; and
(iii) an evaluation of any differential performance of such system or process assessed during the impact assessment;
Define how performance is assessed by specifying the metrics used to measure it and what values count as successful performance; report metric values both prior to deployment and for the deployed automated decision system, and explain any differences between the two.
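
As a sketch of what that reporting could look like, assuming a hypothetical metric name, threshold, and values (the Act prescribes none of these):

```python
from dataclasses import dataclass

@dataclass
class MetricComparison:
    """One performance metric, measured under test and deployed conditions."""
    name: str                 # e.g., "precision"; the choice of metric is the entity's
    success_threshold: float  # what is deemed successful performance, per (E)(i)
    test_value: float         # measured before deployment, per (E)(ii)
    deployed_value: float     # measured on the deployed system, per (E)(ii)

    def gap(self) -> float:
        """Difference between deployed and test performance, to be explained in the report."""
        return self.deployed_value - self.test_value

    def is_successful(self) -> bool:
        return self.deployed_value >= self.success_threshold

# Hypothetical values, for illustration only.
precision = MetricComparison("precision", 0.90, test_value=0.93, deployed_value=0.88)
print(f"{precision.name}: gap={precision.gap():+.2f}, successful={precision.is_successful()}")
```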
(F) any publicly stated guard rail for or limitation on certain uses or applications of the automated decision system or augmented critical decision process, including whether such uses or applications ought to be prohibited or otherwise limited through any terms of use, licensing agreement, or other legal agreement between entities;

Provide information on the conditions in which the automated decision system should not be used, and on how users are informed that it should not be used when those conditions are met.
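
How such a guard rail is enforced is a product decision. A minimal sketch, assuming prohibited uses are declared by the caller and checked at runtime (the listed uses are illustrative, not taken from the Act):

```python
# Publicly stated guard rail enforced at runtime. The prohibited uses listed
# here are illustrative assumptions, not taken from the Act.
PROHIBITED_USES = {"employment screening", "tenant screening"}

def check_guard_rails(declared_use: str) -> None:
    """Refuse to serve the system for uses prohibited by its terms of use."""
    if declared_use in PROHIBITED_USES:
        raise PermissionError(
            f"Use '{declared_use}' is prohibited by this system's terms of use."
        )

check_guard_rails("credit scoring")      # passes in this sketch
# check_guard_rails("tenant screening")  # would raise PermissionError
```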
(G) documentation about the data or other input information used to develop, test, maintain, or update the automated decision system or augmented critical decision process including—
(i) how and when the covered entity sourced such data or other input information; and
(ii) why such data or other input information was used and what alternatives were explored;
Provide information about the training data and the rationale for using it to train the system. This leads to many questions. For example, should information on rights to use the data be provided as well? How much information about the data should be provided? How does the Commission plan to relate the descriptions of data across summary reports, in order to make it easier to identify entities that use the same training data?
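
A minimal sketch of such documentation, assuming a datasheet-style record per dataset; the field names, including the usage-rights field that the Act does not require, are my own:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetRecord:
    """Documentation for one dataset used to develop, test, maintain, or update the system."""
    name: str
    source: str                 # how the data was sourced, per (G)(i)
    acquired_on: date           # when the data was sourced, per (G)(i)
    rationale: str              # why this data was used, per (G)(ii)
    alternatives_explored: list = field(default_factory=list)  # per (G)(ii)
    usage_rights: str = "unspecified"  # not required by the Act; one of the open questions above

# Hypothetical entry, for illustration only.
record = DatasetRecord(
    name="loan_applications_2019_2021",
    source="internal CRM export",
    acquired_on=date(2022, 1, 15),
    rationale="Covers the population that the credit decision applies to.",
    alternatives_explored=["public credit bureau sample"],
)
```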
(H) documentation of whether and how the covered entity implements any transparency or explainability measures, including—
(i) which categories of third-party decision recipients receive a copy of or have access to the results of any decision or judgment that results from such system or process; and
(ii) any mechanism by which a consumer may contest, correct, or appeal a decision or opt out of such system or process, including the corresponding website for such mechanism, where applicable;
Define what explainability means in the context of the product you manage. Explain if and how explanations are communicated to users. Explain how users can provide feedback, whether and how that feedback leads to action, and what that action is.
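
A minimal sketch of a decision record carrying the transparency measures that (H) asks about; all field names and the URL are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DecisionRecord:
    """One decision output, with the transparency measures described in (H)."""
    decision_id: str
    outcome: str                # e.g., "declined"
    explanation: str            # human-readable explanation given to the recipient
    recipient_categories: list  # third-party decision recipients with access, per (H)(i)
    appeal_url: str             # mechanism to contest, correct, or appeal, per (H)(ii)

# Hypothetical record; the URL is a placeholder.
record = DecisionRecord(
    decision_id="d-001",
    outcome="declined",
    explanation="Income below the threshold used by the scoring model.",
    recipient_categories=["credit bureau"],
    appeal_url="https://example.com/appeals",
)
```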
(I) any likely material negative impact on consumers identified by the covered entity and a description of the steps taken to remediate or mitigate such impact;

Include the risk assessment that is developed as part of the impact assessment.
(J) a list of any impact assessment requirements that were attempted but were not possible to comply with because they were infeasible, as well as the corresponding rationale for not being able to comply with such requirements; and

Explain what potential impacts could not be assessed, and for which no risk assessment was developed.
(K) any additional capabilities, tools, standards, datasets, security protocols, improvements to stakeholder engagement, or other resources identified by an impact assessment as necessary or beneficial to improve the performance of impact assessment or the development and deployment of any automated decision system or augmented critical decision process that the covered entity determines appropriate to share with the Commission;

Provide the roadmap of what will be done over the future of the product to mitigate risks identified through the impact assessment.
(2) include, in addition to the information required under paragraph (1), any relevant additional information from section 4(a) the covered entity wishes to share with the Commission;

Any information from the impact assessment (see Section 4) can be included, but is optional.
(3) follow any format or structure requirements specified by the Commission; and

Obvious.
(4) include additional criteria that are essential for the purpose of consumer protection, as determined by the Commission.

There will be new criteria and requirements, as determined by the Commission, for the summary report. Follow what the Commission communicates about this.

The above covers only Section 5 of the Act. Sections 1 and 2 are covered here, Section 3 is covered here, and Section 4 here. Texts covering other Sections are coming soon.
