While the US does not have a specific AI-focused law, a host of regulators have been sharing their views on AI. Noticeable traction on the topic began in 2020, and with the explosion of ChatGPT in 2023, commentary (and scrutiny) has been picking up steam.
Unsurprisingly, the FTC is in the mix, sharing its thoughts through various blog posts and investigations. Its posts have focused on specific aspects of AI – use of AI, claims about AI, voice cloning, and now, companies that develop AI. In its latest guidance, the FTC reminds companies that develop AI models of their obligations around privacy commitments made to their users and customers. The takeaways and underlying points in this post are not new. The FTC has long reminded companies (and brought enforcement actions against them) that statements about how information will be collected, used, shared, and protected must be upheld. That said, this guidance puts those concepts into the context of a growing category of companies – model-as-a-service companies.
Model-as-a-service companies develop and host AI models that are made available to other businesses and users via an API. According to the FTC, these companies should be cautious about commitments around data use found not only in privacy policies, but also in terms of service and other promotional materials. The FTC also cautions that material omissions may be just as problematic – e.g., omitting information that would affect whether customers buy a particular product.
Putting it into practice. Companies developing any kind of machine learning, natural language processing, or other AI solutions and capabilities for business customers or individual users should take a close look at their statements around data use. Are any claims about how information is being used overstated? Is anything material missing from product disclosures and terms that would impact whether a person buys the service? These assessments are key for ongoing compliance. The FTC will continue to scrutinize these types of statements and may require deletion of the developed model and algorithm if a company runs afoul of the law.