
Algorithmic Accountability Act for AI Product Managers: Sections 6 through 11

Sections 6 through 11 of the Algorithmic Accountability Act (2022 and 2023) have fewer practical implications for product management than the earlier sections. They tie the Act, if passed, into the Federal Trade Commission Act, and they introduce requirements that the FTC must meet when implementing the Act. This text follows my notes on Sections 1 and 2, Section 3, Section 4, and Section 5 of the Algorithmic Accountability Act (2022 and 2023).

This is the fifth and last in a series of texts in which I provide a critical reading of the Algorithmic Accountability Act (2022 and 2023). I have held various product management positions in the past, for products and services in which software was a significant component; in almost all cases, these products supported non-trivial decisions, and my notes are inevitably biased by that experience. If you have any questions, suggestions, or want to connect, email me at ivan@ivanjureta.com.

Peter Paul Rubens - The Fall of Phaeton (National Gallery of Art) https://en.wikipedia.org/wiki/The_Fall_of_Phaeton_(Rubens)
Algorithmic Accountability Act (2022 and 2023): Implications for product management
SEC. 6. REPORTING; PUBLICLY ACCESSIBLE REPOSITORY.
(a) Annual Report.—Not later than 1 year after the effective date described in section 3(b)(3), and annually thereafter, the Commission shall publish publicly on the website of the Commission a report describing and summarizing the information from the summary reports submitted under subparagraph (D), (E), or (F) of section 3(b)(1) that—
(1) is accessible and machine readable in accordance with the 21st Century Integrated Digital Experience Act (44 U.S.C. 3501 note); and
(2) describes broad trends, aggregated statistics, and anonymized lessons learned about performing impact assessments of automated decision systems or augmented critical decision processes, for the purposes of updating guidance related to impact assessments and summary reporting, oversight, and making recommendations to other regulatory agencies.
The Commission will publish an annual report that aggregates and interprets, in anonymized form, the summary reports it receives.
(b) Publicly Accessible Repository. The Commission will create and maintain a repository of data generated from all the summary reports it receives. The subsections of (b) are straightforward; they state requirements that the Commission’s repository needs to meet.
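To make Section 6 a bit more concrete for product teams: below is a minimal, hypothetical sketch of what one anonymized, machine-readable record derived from summary reports might look like in such a repository. The Act does not prescribe a schema, so every field name and value here is an illustrative assumption, sketched in Python only to show the kind of aggregated, anonymized, machine-readable output Section 6 points at.

```python
import json

# Hypothetical sketch of an anonymized, machine-readable record of aggregated
# summary-report statistics, of the kind Section 6 implies the Commission
# would publish. Field names and values are illustrative only; the Act does
# not define a schema.
example_record = {
    "reporting_year": 2026,
    "sector": "consumer_credit",                 # broad trend dimension
    "summary_reports_received": 120,             # aggregated statistic
    "impact_assessments_with_material_changes": 34,
    "most_common_critical_decision": "lending eligibility",
    "anonymized_lessons_learned": [
        "consultation with impacted communities started too late",
        "documentation of training data provenance was incomplete",
    ],
}

# Machine readability, in the spirit of the 21st Century IDEA reference,
# is sketched here simply as structured JSON output.
print(json.dumps(example_record, indent=2))
```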
SEC. 7. GUIDANCE AND TECHNICAL ASSISTANCE; OTHER REQUIREMENTS.
(a) Guidance And Technical Assistance From The Commission.—
(1) IN GENERAL.—The Commission shall publish guidance on how to meet the requirements of sections 4 and 5, including resources such as documentation templates and guides for meaningful consultation, that is developed by the Commission after consultation with the Director of the National Institute of Standards and Technology, the Director of the National Artificial Intelligence Initiative, the Director of the Office of Science and Technology Policy, and other relevant stakeholders, including standards bodies, private industry, academia, technology experts, and advocates for civil rights, consumers, and impacted communities.
The Commission will provide covered entities with guidance on meeting the requirements of Sections 4 and 5, including the preparation of summary reports, together with tools such as documentation templates and guides for meaningful consultation.
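Until the Commission publishes those templates, a product team might keep its own lightweight documentation skeleton. The sketch below is a hypothetical structure, not the FTC’s template; the field names are assumptions loosely mirroring the impact-assessment topics discussed in the earlier notes on Sections 4 and 5.

```python
from dataclasses import dataclass, field

# Hypothetical documentation skeleton a product team might maintain while
# the Commission's official templates (Section 7(a)(1)) do not yet exist.
# Field names are assumptions, not prescribed by the Act or the FTC.
@dataclass
class ImpactAssessmentDoc:
    system_name: str
    critical_decision_supported: str
    stakeholders_consulted: list = field(default_factory=list)
    known_limitations: list = field(default_factory=list)
    mitigations: list = field(default_factory=list)
    open_questions: list = field(default_factory=list)

doc = ImpactAssessmentDoc(
    system_name="loan-pre-screening-model",
    critical_decision_supported="consumer lending eligibility",
    stakeholders_consulted=["consumer advocates", "internal legal"],
    known_limitations=["limited data on thin-file applicants"],
)
print(doc)
```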
(2) ASSISTANCE IN DETERMINING COVERED ENTITY STATUS.—In addition to the guidance required under paragraph (1), the Commission shall—
(A) issue guidance and training materials to assist persons, partnerships, and corporations in evaluating whether they are a covered entity; and
(B) regularly update such guidance and training materials in accordance with any feedback or questions from covered entities, experts, or other relevant stakeholders.
The Commission will provide guidance that helps determine whether an entity meets the criteria to be considered a covered entity.
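As a rough illustration of what such a self-check could look like once the guidance exists, here is a hypothetical screening helper. The thresholds are placeholders, not the figures from Section 2 of the Act, and a positive result should only mean “ask legal”, not “we are covered”.

```python
# Hypothetical self-check helper, sketched only to show how a team might
# encode covered-entity criteria once the Commission's guidance is published.
# The thresholds below are placeholders, NOT the figures from the Act; they
# must be replaced with the definitions in Section 2 and the FTC's guidance.
GROSS_RECEIPTS_THRESHOLD = 50_000_000   # placeholder value
CONSUMER_DATA_THRESHOLD = 1_000_000     # placeholder value

def likely_covered_entity(avg_annual_gross_receipts: float,
                          consumers_with_identifying_data: int,
                          deploys_augmented_critical_decision_process: bool) -> bool:
    """Rough, non-authoritative screen; True means 'consult counsel'."""
    size_test = (avg_annual_gross_receipts > GROSS_RECEIPTS_THRESHOLD
                 or consumers_with_identifying_data > CONSUMER_DATA_THRESHOLD)
    return size_test and deploys_augmented_critical_decision_process

print(likely_covered_entity(80_000_000, 200_000, True))   # -> True
```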
(b) Other Requirements.—
(1) PUBLICATION.—Nothing in this Act shall be construed to limit a covered entity from publicizing any documentation of the impact assessment maintained under section 3(b)(1)(B), including information beyond what is required to be submitted in a summary report under subparagraph (D) or (E) of section 3(b)(1), unless such publication would violate the privacy of any consumer.
Each covered entity decides on its own whether, and how, to publish information resulting from its impact assessments, whether that information is in its summary report(s) or goes beyond them, as long as publication does not violate the privacy of any consumer.
(2) PERIODIC REVIEW OF REGULATIONS.—The Commission shall review the regulations promulgated under section 3(b) not less than once every 5 years and update such regulations as appropriate.
(3) REVIEW BY NIST AND OSTP.—The Commission shall make available, in a private and secure manner, to the Director of the National Institute of Standards and Technology, the Director of the Office of Science and Technology Policy, and the head of any Federal agency with relevant regulatory jurisdiction over an augmented critical decision process any summary report submitted under subparagraph (D), (E), or (F) of section 3(b)(1) for review in order to develop future standards or regulations.
The Commission will have its own continuous-improvement processes: it must review its regulations at least once every 5 years, and it must share the summary reports it receives with NIST, OSTP, and relevant Federal agencies to inform future standards or regulations.
SEC. 8. RESOURCES AND AUTHORITIES.
Section 8 establishes the specific governance entities and roles required to implement the Act. While it is useful to know about these, there is no specific impact on a covered entity’s product management practices.
SEC. 9. ENFORCEMENT.
Section 9 ensures that the Act is treated as part of the FTC Act (see the Federal Trade Commission Act overview). Violations of the Act are treated as unfair or deceptive practices, unfair in the sense that they can cause injury that consumers cannot reasonably avoid. See A Brief Overview of the Federal Trade Commission’s Investigative, Law Enforcement, and Rulemaking Authority.

There is no special interpretation of Section 9 from the product management perspective.
SEC. 10. COORDINATION.
In carrying out this Act, the Commission shall coordinate with any appropriate Federal agency or State regulator to promote consistent regulatory treatment of automated decision systems and augmented critical decision processes.
There are no implications for how to do AI product management.
SEC. 11. NO PREEMPTION.
Nothing in this Act may be construed to preempt any State, tribal, city, or local law, regulation, or ordinance.
Same as above: no implications for how to do AI product management.

The above completes the series of notes on the implications of the Algorithmic Accountability Act (2022 and 2023) for AI product management. The intent was not to provide a legal interpretation, but a reading from the perspective of what needs to be considered when doing product management of products that are, or include, automated decision systems. Sections 1 and 2 are covered here, Section 3 here, Section 4 here, and Section 5 here.
