This morning, the White House Office of Science and Technology Policy released its long-awaited “Blueprint for an AI Bill of Rights” (“AI Bill of Rights”), which, when implemented, would apply to automated systems that have the potential to meaningfully affect the American public’s rights, opportunities, or access to critical resources or services. Its protections are designed to apply broadly to all automated systems that “have the potential” to significantly affect individuals or communities, from civil rights and civil liberties (including privacy) to equal opportunities in healthcare, education, and employment, as well as access to resources and services.
The AI Bill of Rights contains five broad categories of practices designed to “guide the design, use, and deployment of automated systems to protect the rights of the American public in the age of artificial intelligence.” They are:
- Safe and Effective Systems: Individuals “should be protected from unsafe or ineffective systems.” In addition, “[a]utomated systems should be developed with consultation from diverse communities, stakeholders, and domain experts to identify concerns, risks, and potential impacts of the system.”
- Algorithmic Discrimination Protections: Individuals “should not face discrimination by algorithms and systems should be used and designed in an equitable way.” To accomplish this, the AI Bill of Rights calls upon designers, developers, and deployers of automated systems to “take proactive and continuous measures to protect individuals and communities from algorithmic discrimination and to use and design systems in an equitable way.”
- Data Privacy: Individuals “should be protected from abusive data practices via built-in protections” and “have agency over how [personal] data is used.” This includes, among other things, communities being protected “from violations of privacy through design choices that ensure such protections are included by default, including ensuring that data collection conforms to reasonable expectations and that only data strictly necessary for the specific context is collected.”
- Notice and Explanation: Individuals “should know that an automated system is being used and understand how and why it contributes to outcomes that impact [them].” This includes, among other things, that “[d]esigners, developers, and deployers of automated systems should provide generally accessible plain language documentation including clear descriptions of the overall system functioning and the role automation plays, notice that such systems are in use, the individual or organization responsible for the system, and explanations of outcomes that are clear, timely, and accessible.”
- Human Alternatives, Consideration, and Fallback: Individuals “should be able to opt out, where appropriate, and have access to a person who can quickly consider and remedy problems [they] encounter.” This concept is explained as individuals “should be able to opt out from automated systems in favor of a human alternative, where appropriate.” The meaning of appropriateness is circumstance-dependent; as the AI Bill of Rights explains, it “should be determined based on reasonable expectations in a given context and with a focus on ensuring broad accessibility and protecting the public from especially harmful impacts.”
This effort is intended to further the ongoing discussion regarding privacy among federal government stakeholders and the public, but its impact on the private sector may well be limited because it contemplates voluntary action rather than mandated outcomes. The document is described as being “intended to support the development of policies and practices that protect civil rights and promote democratic values in the building, deployment, and governance of automated systems.” However, it is “non-binding and does not constitute U.S. government policy.” Additionally, it “is not intended to, and does not, create any legal right, benefit, or defense, substantive or procedural, enforceable at law or in equity by any party against the United States, its departments, agencies, or entities, its officers, employees, or agents, or any other person.”