
Algorithmic Accountability Act for AI Product Managers: Sections 1 and 2

The Algorithmic Accountability Act (2022 and 2023) applies to far more settings than what is, in early 2024, commonly considered Artificial Intelligence. It applies to all kinds of software products and, more generally, to products and services that rely in any way on algorithms to support decision making. Any product manager whose products rely on any kind of algorithm, however implemented, therefore needs to understand the details of the Act.

This is the first in a series of texts providing my critical reading of the Algorithmic Accountability Act of 2022. I have held various product management positions in the past, for products and services in which software was a significant component, and in almost all cases those products supported non-trivial decisions; my notes are inevitably biased by that experience. If you have any questions or suggestions, or want to connect, email me at ivan@ivanjureta.com.

Part of Peter Paul Rubens – The Fall of Phaeton (National Gallery of Art) https://en.wikipedia.org/wiki/The_Fall_of_Phaeton_(Rubens)
Algorithmic Accountability Act (2022 and 2023): Implications for product management
SECTION 1. SHORT TITLE.
This Act may be cited as the “Algorithmic Accountability Act of 2022”.
It is encouraging that the name of the Act refers not to Artificial Intelligence but to algorithms in general: there can be algorithms that affect critical decisions and are implemented via automated decision systems (that is, systems which meet the definition in the Act), yet would not be considered AI.
SEC. 2. DEFINITIONS.
In this Act:
(1) AUGMENTED CRITICAL DECISION PROCESS.—The term “augmented critical decision process” means a process, procedure, or other activity that employs an automated decision system to make a critical decision.
Ensure that processes and procedures are well documented, so that it is easier to establish, as part of audits, whether a process in fact employs an automated decision system and whether it is intended to influence decision making that meets the criteria for critical decisions. Process documentation needs to define explicitly the properties of the decisions that the algorithms are intended to support.
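As a minimal sketch of what such documentation could look like in machine-readable form, here is a hypothetical record structure in Python; the field names and the example entry are my own assumptions, not anything prescribed by the Act.

```python
from dataclasses import dataclass

# Hypothetical record structure for documenting an augmented critical decision
# process; nothing here is prescribed by the Act.
@dataclass
class DecisionProcessRecord:
    process_name: str            # e.g., "Tenant screening"
    decision_description: str    # what is actually being decided
    decision_properties: list    # properties that may make the decision "critical"
    automated_system: str        # the system, software, or model employed
    system_role: str             # how its output influences the decision
    owner: str                   # who maintains this record

# Example entry, purely illustrative.
record = DecisionProcessRecord(
    process_name="Tenant screening",
    decision_description="Recommend approval or rejection of a rental application",
    decision_properties=["affects access to housing", "legal effect on the consumer"],
    automated_system="applicant-risk-score-v3",
    system_role="Score shown to a human reviewer who makes the final call",
    owner="product management",
)
print(record.process_name, "uses", record.automated_system)
```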
(2) AUTOMATED DECISION SYSTEM.—The term “automated decision system” means any system, software, or process (including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques and excluding passive computing infrastructure) that uses computation, the result of which serves as a basis for a decision or judgment.
Firstly, note that the definition covers “software”, “system”, and “process”, which broadens applicability beyond software alone. To illustrate, say there is a statistical model that supports the decision to allow a person through a gate. It may be implemented as software that produces a recommendation shown on a user interface in a desktop web browser; it may be implemented on specialized hardware whose only indicator is a green or red light; or it may be implemented as someone calculating a few parameters with a calculator and looking up the formula for the model (a minimal sketch of such a model follows after the next paragraph). These distinctions ultimately do not matter for the Act: even in the last case, where the only software involved is the code developed to train the model, the definition still applies.
Secondly, note the parenthetical “(including…)”, which indicates that an Automated Decision System does not need to qualify as Artificial Intelligence at all for the Act to apply. If you know the somewhat older and academic term “decision support system”, it looks like the Act applies to all such systems, provided that the supported decision is, according to the Act, a critical decision (see below, and the text here).
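To make the gate example above concrete, here is a toy scoring model in Python; the coefficients, feature names, and threshold are invented for illustration. The point is that the same formula counts as the computation behind the decision whether it runs behind a web UI, inside dedicated hardware, or is evaluated by hand.

```python
import math

# Toy model for the gate example; coefficients, features, and threshold are
# invented. The same formula could sit behind a web UI, run on dedicated
# hardware driving a red/green light, or be evaluated by hand.
COEFFICIENTS = {"badge_valid": 2.5, "hours_since_shift_start": -0.3, "prior_denials": -1.2}
INTERCEPT = -0.5
THRESHOLD = 0.5

def admit_probability(features):
    """Logistic score: estimated probability that the person should pass."""
    z = INTERCEPT + sum(COEFFICIENTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def recommend_admission(features):
    """The result of computation that 'serves as a basis for a decision'."""
    return admit_probability(features) >= THRESHOLD

print(recommend_admission({"badge_valid": 1.0, "hours_since_shift_start": 2.0, "prior_denials": 0.0}))
```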
(3) BIOMETRICS.—The term “biometrics” means any information that represents a biological, physiological, or behavioral attribute or feature of a consumer.
It is necessary to keep records of biometric data, both data you collect directly (e.g., your software collects height, weight, eye color, etc.) and data collected by third parties, whether you use the third-party biometric data as-is (they provide you, for example, the customer’s eye color and you use it to derive something about the customer) or you use some estimate a third party makes on the basis of that biometric data. The important point is that traceability ultimately needs to reach all the way back to the point of collection (the party doing the collecting), and you need to understand and document the steps from collection through to the output of your Automated Decision System. This is often hard or expensive to do, but even when it cannot be done, it is important to document why it was not possible, as we’ll see in a later Section of the Act.
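One way to keep such records is a simple lineage register. The sketch below is a hypothetical structure in Python; the field names, the third-party name, and the example entry are assumptions, not requirements from the Act.

```python
from dataclasses import dataclass

# Hypothetical lineage entry for one biometric attribute; field names are
# illustrative, not taken from the Act.
@dataclass
class BiometricLineage:
    attribute: str          # e.g., "eye color"
    collected_by: str       # your product, or the named third party
    collection_method: str  # sensor, user input, inferred by a model, etc.
    received_as: str        # "raw value" or "third-party estimate / derived score"
    used_in: str            # which Automated Decision System output depends on it
    gaps: str               # why full traceability is not possible, if it is not

register = [
    BiometricLineage(
        attribute="eye color",
        collected_by="AcmeID Inc. (third party)",          # invented name
        collection_method="passport scan, OCR-extracted",
        received_as="raw value delivered via API",
        used_in="identity-match score feeding the gate recommendation",
        gaps="upstream scan device and operator unknown; documented as a limitation",
    ),
]
for entry in register:
    print(entry.attribute, "<-", entry.collected_by)
```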
(4) CHAIR.—The term “Chair” means the Chair of the Commission.
(5) COMMISSION.—The term “Commission” means the Federal Trade Commission.
Not much to say here, other than that the Federal Trade Commission will be the authority you will need to interact with, if the Act applies to your products.
(6) CONSUMER.—The term “consumer” means an individual.
The term “individual” looks open to interpretation to me; it is not used in the Act other than in the definition of Consumer. One narrow interpretation is that it refers exclusively to “natural persons”, as in the US Internal Revenue Code, but it might also extend to legal entities (e.g., a corporation). In other words, it would be too easy to dismiss the Act on the grounds that it applies only to products intended for natural persons. If you are managing a business-to-business product that supports decision making in other organizations, it is likely that the Act applies as well; in any case, if you decide that the Act does not apply, it is best to maintain documentation that justifies that decision.
(7) COVERED ENTITY.—

(A) IN GENERAL.—The term “covered entity” means any person, partnership, or corporation over which the Commission has jurisdiction under section 5(a)(2) of the Federal Trade Commission Act (15 U.S.C. 45(a)(2))—

(i) that deploys any augmented critical decision process; and
(I) had greater than $50,000,000 in average annual gross receipts or is deemed to have greater than $250,000,000 in equity value for the 3-taxable-year period (or for the period during which the person, partnership, or corporation has been in existence, if such period is less than 3 years) preceding the most recent fiscal year, as determined in accordance with paragraphs (2) and (3) of section 448(c) of the Internal Revenue Code of 1986;
(II) possesses, manages, modifies, handles, analyzes, controls, or otherwise uses identifying information about more than 1,000,000 consumers, households, or consumer devices for the purpose of developing or deploying any automated decision system or augmented critical decision process; or
(III) is substantially owned, operated, or controlled by a person, partnership, or corporation that meets the requirements under subclause (I) or (II);

(ii) that—
(I) had greater than $5,000,000 in average annual gross receipts or is deemed to have greater than $25,000,000 in equity value for the 3-taxable-year period (or for the period during which the person, partnership, or corporation has been in existence, if such period is less than 3 years) preceding the most recent fiscal year, as determined in accordance with paragraphs (2) and (3) of section 448(c) of the Internal Revenue Code of 1986; and
(II) deploys any automated decision system that is developed for implementation or use, or that the person, partnership, or corporation reasonably expects to be implemented or used, in an augmented critical decision process by any person, partnership, or corporation if such person, partnership, or corporation meets the requirements described in clause (i); or

(iii) that met the criteria described in clause (i) or (ii) within the previous 3 years.

(B) INFLATION ADJUSTMENT.—For purposes of applying this paragraph in any fiscal year after the first fiscal year that begins on or after the date of enactment of this Act, each of the dollar amounts specified in subparagraph (A) shall be increased by the percentage increase (if any) in the consumer price index for all urban consumers (U.S. city average) from such first fiscal year that begins after such date of enactment to the fiscal year involved.
The Covered Entity definition provides the conditions that you (your organization) need to meet for the Act to apply. The conditions under point 7.A.i are straightforward in most cases, although the revenue conditions may take some effort to assess, depending on where revenue is generated and booked, and on the ownership structure.

Condition 7.A.i.II seems arbitrary, but there is not much that can be done about that. A very practical implication is that you need to document your definition of a Customer, or User, of your product, and document the methodology used to count them, including the frequency at which you count them and the situations or cases in which your count may be incorrect. Say your product can be registered by a “family”: what exactly does that mean? How do you define a family? Why do you choose a certain limit on the number of people in a family? What checks do you have in place to determine the number of people that are part of a “family” from your product’s perspective? A relatively safe approach is to consult legal definitions of “household” and “consumer device” and see how they translate into what you are counting as product users. There may be some additional complexity in counting users when your pricing is based on, for example, usage rather than users.
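A sketch of what a documented counting methodology could look like, under stated assumptions (one registered account equals one consumer, household identifiers are assigned at registration, device identifiers are stable and deduplicated). None of these rules come from the Act; they are exactly the kind of definitions you need to write down yourself.

```python
# Sketch of a counting methodology for the 1,000,000 threshold in 7.A.i.II.
# The grouping rules (what counts as a consumer, household, or device) are
# assumptions that you would need to define and document yourself.

accounts = [
    {"account_id": "a1", "household_id": "h1", "device_ids": {"d1", "d2"}},
    {"account_id": "a2", "household_id": "h1", "device_ids": {"d2", "d3"}},
    {"account_id": "a3", "household_id": "h2", "device_ids": {"d4"}},
]

def count_consumers(accounts):
    # Assumption: one registered account corresponds to one consumer.
    return len({a["account_id"] for a in accounts})

def count_households(accounts):
    # Assumption: household identifiers are assigned at registration,
    # using a documented definition of "family" / "household".
    return len({a["household_id"] for a in accounts})

def count_devices(accounts):
    # Assumption: device identifiers are stable and deduplicated across accounts.
    return len(set().union(*(a["device_ids"] for a in accounts)))

print(count_consumers(accounts), count_households(accounts), count_devices(accounts))
# -> 3 2 4
```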

Condition 7.A.ii.I also seems arbitrary, and the threshold is quite low: it will in many cases capture startups close to, or shortly after, a Series A round that (as in Condition 7.A.ii.II) use someone else’s Automated Decision System in their own product. This makes it imperative to dedicate financial resources to AI governance and compliance very early in the startup / product lifecycle.

Condition 7.A.ii.II refers to cases where a company A uses, in its own product, a system, software, or process made by a company B; the Act then applies to company B, whose system A is using. In short, if your product has as a component another company’s product to which the Act applies, or if your product is itself such a component in another company’s product, the Act likely applies to your product as well.

7.B simply means that all dollar amounts mentioned in the definition of Covered Entity are adjusted annually for inflation.
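A rough sketch of the size tests in 7.A.i and 7.A.ii, with the 7.B inflation adjustment applied as a simple CPI-U ratio. The statutory dollar amounts are taken from the text above; the CPI figures, the treatment of the three-taxable-year averaging, and the boolean inputs are simplifications and assumptions, not legal advice.

```python
# Sketch of the size tests in 7.A.i and 7.A.ii with the 7.B inflation
# adjustment applied as a CPI-U ratio. CPI values and inputs are invented;
# averaging of gross receipts over three taxable years is assumed to have
# been done already. This is an illustration, not legal advice.

def adjusted(amount, cpi_base, cpi_now):
    """7.B: increase a dollar amount by the percentage increase in CPI-U, if any."""
    return amount * (cpi_now / cpi_base) if cpi_now > cpi_base else amount

def meets_clause_i(deploys_acdp, avg_receipts, equity_value, identified_consumers,
                   cpi_base, cpi_now):
    size_or_scale = (
        avg_receipts > adjusted(50_000_000, cpi_base, cpi_now)
        or equity_value > adjusted(250_000_000, cpi_base, cpi_now)
        or identified_consumers > 1_000_000  # a count, not a dollar amount: no adjustment
    )
    return deploys_acdp and size_or_scale

def meets_clause_ii(avg_receipts, equity_value, deploys_ads_used_by_clause_i_entity,
                    cpi_base, cpi_now):
    size = (
        avg_receipts > adjusted(5_000_000, cpi_base, cpi_now)
        or equity_value > adjusted(25_000_000, cpi_base, cpi_now)
    )
    return size and deploys_ads_used_by_clause_i_entity

print(meets_clause_i(True, 60_000_000, 0, 0, cpi_base=300.0, cpi_now=318.0))   # True
print(meets_clause_ii(6_000_000, 0, True, cpi_base=300.0, cpi_now=318.0))      # True
```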
(8) CRITICAL DECISION.
I discussed the Critical Decision concept in a different text, here.
(9) DEPLOY.—The term “deploy” means to implement, use, or make available for sale, license, or other commercial relationship.
The term “deploy” appears in the Act in places where it was necessary to clarify that a system, software, or process is available for use by Consumers, contrasting this with prototypes used internally for quality control and assurance purposes.
(10) DEVELOP.—The term “develop” means to design, code, produce, customize, or otherwise create or modify.
The term is straightforward. It appears in later Sections in relation to, for example, requirements that the Commission (FTC) may develop in the future with respect to impact assessment and to reporting on the stages of development of a system, software, or process; that is, there may be requirements on how to explain and report changes to how you design, develop, release, and monitor your product. Irrespective of whether such requirements exist now, this highlights the need for a well-documented product lifecycle.
(11) IDENTIFYING INFORMATION.—The term “identifying information” means any information, regardless of how the information is collected, inferred, predicted, or obtained that identifies or represents a consumer, household, or consumer device through data elements or attributes, such as name, postal address, telephone number, biometrics, email address, internet protocol address, social security number, or any other identifying number, identifier, or code.
I noted above, in relation to the term “Biometrics”, that you need to maintain documentation about what information about Consumers is collected, used, and produced by your product. While the definition of “Identifying Information” gives examples, in practice this is a much, much longer list of collected or inferred properties used to distinguish between product users.
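A hypothetical identifying-information inventory, sketched in Python; the attributes, sources, and categories are illustrative and would, in practice, be far more numerous.

```python
# Hypothetical identifying-information inventory; in practice the list of
# collected or inferred attributes is far longer than these examples.
identifying_information = [
    {"attribute": "email address",       "source": "collected at sign-up",  "identifies": "consumer"},
    {"attribute": "IP address",          "source": "collected per request", "identifies": "consumer device"},
    {"attribute": "household size band", "source": "inferred by a model",   "identifies": "household"},
]

def attributes_identifying(kind):
    """List the attributes that identify a consumer, household, or consumer device."""
    return [row["attribute"] for row in identifying_information if row["identifies"] == kind]

print(attributes_identifying("consumer device"))  # -> ['IP address']
```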
(12) IMPACT ASSESSMENT.—The term “impact assessment” means the ongoing study and evaluation of an automated decision system or augmented critical decision process and its impact on consumers.
Alongside “Critical Decision” and “Automated Decision System”, “Impact Assessment” is the most important term in the Act. A significant part of the practicalities of complying with the Act is about how to perform and report Impact Assessments. Roughly speaking, an Impact Assessment is a record of your assessment of how your product works for (some) Customers (some) of the time. More on this in the next parts of this series.
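Purely as an illustration of “ongoing study and evaluation”, here is a hypothetical shape for an impact-assessment log entry; the Act and the Commission determine what an assessment must actually cover, so treat these fields as placeholders rather than a compliant template.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical shape of an ongoing impact-assessment log entry; the Act and
# the Commission determine what an assessment must actually cover.
@dataclass
class ImpactAssessmentEntry:
    assessed_on: date
    system: str                # which automated decision system or process
    consumer_segment: str      # which consumers the finding applies to
    observed_impact: str       # what was measured or evaluated
    mitigation: str            # what was changed, or why nothing was
    next_review: date          # "ongoing" means this never stays empty

log = [
    ImpactAssessmentEntry(
        assessed_on=date(2024, 1, 15),
        system="applicant-risk-score-v3",
        consumer_segment="first-time applicants",
        observed_impact="higher false-rejection rate than for returning applicants",
        mitigation="threshold recalibrated; human review added for this segment",
        next_review=date(2024, 4, 15),
    ),
]
print(log[0].system, "next review:", log[0].next_review)
```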
(13) PASSIVE COMPUTING INFRASTRUCTURE.—The term “passive computing infrastructure” means any intermediary technology that does not influence or determine the outcome of a decision, including—
(A) web hosting;
(B) domain registration;
(C) networking;
(D) caching;
(E) data storage; or
(F) cybersecurity.
The term “Passive Computing Infrastructure” limits the applicability of the Act by excluding capabilities often used alongside, or as enablers of, an Automated Decision System. For a product manager, it remains important to know what in their product qualifies as Passive Computing Infrastructure, and to be able to address any questions about whether these components are in fact “passive” relative to the supported decision process.
(14) STATE.—The term “State” means each of the 50 States, the District of Columbia, and any territory or possession of the United States.
Nothing interesting here.
(15) SUMMARY REPORT.—The term “summary report” means documentation of a subset of information required to be addressed by the impact assessment as described in this Act or determined appropriate by the Commission.
The Summary Report can be thought of as the information package to develop and deliver to the FTC. This in turn raises questions about who creates and maintains it on the product management or quality teams, how frequently it is produced, the terminology it needs to use to be readable by the FTC, and many more that I will mention in other parts of this series and that are called out in other Sections of the Act.
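A sketch of how a Summary Report could be derived as a documented subset of an impact-assessment record; which fields the FTC will actually require is determined elsewhere in the Act and by future Commission rules, so the field names and selection below are purely illustrative.

```python
# Sketch of deriving a Summary Report as a documented subset of an
# impact-assessment record; the field selection is purely illustrative.
impact_assessment = {
    "system": "applicant-risk-score-v3",
    "period": "2024-Q1",
    "consumers_affected": 120_000,
    "findings": ["higher false-rejection rate for first-time applicants"],
    "mitigations": ["threshold recalibrated", "human review added"],
    "internal_notes": "raw experiment logs, reviewer names, draft analyses",
}

SUMMARY_FIELDS = ["system", "period", "consumers_affected", "findings", "mitigations"]

def summary_report(assessment):
    """Keep only the agreed subset; internal working material stays out."""
    return {key: assessment[key] for key in SUMMARY_FIELDS}

print(summary_report(impact_assessment))
```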
(16) THIRD-PARTY DECISION RECIPIENT.—The term “third-party decision recipient” means any person, partnership, or corporation (beyond the consumer and the covered entity) that receives a copy of or has access to the results of any decision or judgment that results from a covered entity’s deployment of an automated decision system or augmented critical decision process.
The term “third-party decision recipient” refers to all entities that may be using the data produced by the Automated Decision System. There are several important implications here.
Firstly, you need to maintain documentation about all third parties who have been given access, how they access the data, how frequently, how they use the data in their own Automated Decision Systems, and so on (a minimal register sketch follows after this list). Some of this may not be mandated by the Act, but it is simply good practice; often, its absence raises reputational risk in case data is misused by entities consuming it downstream.
Secondly, there needs to be a well-defined governance, legal, and economic framework for these entities to use the data, which means that the product manager needs to think through the incentives as well as potential negative uses.
Thirdly, the inclusion of “third-party decision recipients” raises the question of whether any outputs of their own systems are inputs to your product, how this may introduce bias, and how your product detects and handles such cases.
Fourthly, note that by now, and only on the basis of Section 2 of the Act, the product manager needs to clearly distinguish four (groups of) stakeholders: the Commission (FTC), Consumers, Decision Recipients, and any entity making / providing their own Automated Decision Systems as components of the product. Change and communications management practices need to be defined for each of these groups, among many other things that stakeholder management involves.
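As the minimal register sketch promised above, here is a hypothetical structure in Python; the fields mirror the documentation suggested in the first point rather than any requirement spelled out in the Act, and the names and references in the example are invented.

```python
from dataclasses import dataclass

# Hypothetical register of third-party decision recipients; names and fields
# are illustrative, not requirements from the Act.
@dataclass
class DecisionRecipient:
    name: str
    receives: str              # which decision results or scores
    access_channel: str        # API, file export, dashboard, etc.
    frequency: str
    declared_use: str          # how they say they use it, incl. in their own systems
    contract_reference: str    # the governance / legal basis for the access

register = [
    DecisionRecipient(
        name="Example Re-Insurer Ltd.",              # invented name
        receives="final eligibility decision and risk band",
        access_channel="nightly SFTP export",
        frequency="daily",
        declared_use="input to their own pricing model",
        contract_reference="DPA-2023-017",           # invented reference
    ),
]
for recipient in register:
    print(recipient.name, "-", recipient.declared_use)
```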

The above covers only Sections 1 and 2 of the Act. Texts covering other Sections are coming soon.
