In “no duh” news, FTC now prohibits A.I. from posing as a government agency or business

26 February 2024 - The Federal Trade Commission has published a new rule that prohibits the use of artificial intelligence to pose as a government agency or business, a welcome action that deters scammers who prey upon consumers using the quickly developing technology.

A letter co-led by the Pennsylvania Office of Attorney General — and signed by 48 other Attorneys General — aided in the rulemaking process. The letter, in part, asked the FTC for stronger protections concerning impersonation scams, especially scam communications appearing to be from government agencies and businesses.

Additionally, the FTC announced it is seeking to expand the rule to include protections against scammers who use artificial intelligence to impersonate any individual, not just government agencies and businesses.

“We commend federal leaders for seeing the value of our argument and message. Shame on the scammers who deceive and manipulate Pennsylvanians by posing as government officials and agencies,” Attorney General Michelle Henry said. “I am very pleased that the FTC acted upon the advice of our bipartisan coalition working to protect consumers. Consumers deserve transparency when they receive a phone call and this rule will ensure they get it.”

Artificial intelligence allows scammers to clone the voices of public figures, or of anyone who has videos posted publicly online.

The new rule allows the FTC to seek monetary relief in federal court from scammers who use government seals or business logos, spoof government or business email addresses and websites, or falsely imply that they are affiliated with a government agency or business.

For more information on the FTC's new rule and the proposed addition to it, see the FTC's press release.