Relating Data, Recommendations, Boredom, and the ROI of AI

What do #data, #recommendations, and #boredom have to do with #AI/#MachineLearning #ROI? Surprisingly, a lot. #AIeconomics #AIdesign pic.twitter.com/ryX3QMtyWe
— ivanjureta (@ivanjureta) March 5, 2018
As currently drafted (2024), the Algorithmic Accountability Act does not require that the algorithms and training data used in an AI System be made available for audit. (See my notes on the Act, starting with the one here.) Instead, an auditor learns about the AI System from documented impact assessments, which involve descriptions…
In the creator economy, the creative individual sells content. The more attention the content captures, the more valuable it is. The incentive for the creator is status and payment for consumption of their content. Distribution channels are Internet platforms, where content is delivered as intended by the author; the platform does not transform it (other…
The Algorithmic Accountability Act of 2022, here, is a very interesting text if you need to design or govern a process for the design of software that involves some form of AI. The Act has no concept of AI; instead, it defines an Automated Decision System, as follows. Section 2 (2): “The term “automated decision system”…
There is no single definition of the term “evidence”, and trying to make one isn’t the purpose of this text. But there are ways of telling when something might be evidence, and of knowing when it clearly isn’t. Such knowledge helps you develop a taste, so to speak, in evidence. Isn’t that valuable, given how frequently you may be giving evidence to support your ideas, and how frequently others do the same to you?
Section 4 provides requirements that shape how to do the impact assessment of an automated decision system on consumers/users. This text follows my notes on Sections 1 and 2, and on Section 3, of the Algorithmic Accountability Act (2022 and 2023). When (if?) the Act becomes law, it will apply across all kinds of software products,…