The FCC announced on Tuesday a plan to help consumers identify and block AI-generated robocalls. The plan, if passed, could affect a key part of real estate agents' lead generation strategies.
The Federal Communications Commission has plans to tighten the reins on artificial intelligence-generated robocalls.
FCC Chairwoman Jessica Rosenworcel announced her plan on Tuesday, which would require callers to disclose their use of AI-generated robocalls when obtaining prior express consent from consumers. Even with prior express consent, callers would be required to make another disclosure on every AI-generated call they make, a measure Rosenworcel said would help consumers "identify and avoid" calls that "contain an enhanced risk of fraud and other scams."
The plan also calls for developing technology that helps consumers identify and block unwanted AI-generated calls, and for protecting "positive uses" of AI-generated calls for consumers with disabilities.
Rosenworcel said her proposal builds on several recent actions the FCC has taken to regulate robocalls, including the passage of a declaratory ruling that said voice cloning technology is illegal and a $6 million fine levied against a New Hampshire man who made voice-cloned robocalls to sway 2024 primary voting.
The plan will undergo a three-part voting process, starting at the FCC's August Open Meeting. If commission members approve it, it will face public comment and a final vote before implementation.
Though the plan doesn't mention any specific industry, it addresses a critical component of many real estate agents' lead generation plans and emerging tech that uses AI to automate cold calls.
Last year, Texas-based franchisor Keller Williams settled a $40 million class-action lawsuit over unsolicited, pre-recorded telemarketing calls its agents made to consumers without their consent. The lawsuit leaned on the 1991 Telephone Consumer Protection Act (TCPA), which Rosenworcel cited several times in her announcement on Tuesday.
"Bad actors are already using AI technology in robocalls to mislead consumers and misinform the public," she said in a written statement. "That's why we want to put in place rules that empower consumers to avoid this junk and make informed decisions."
In an email to Inman, marketing expert Katie Lance said Rosenworcel's proposal is a "significant development" that agents and brokers shouldn't ignore.
"For agents who rely on AI to streamline their marketing tasks, this move underscores the importance of ethical and compliant AI usage," she said. "AI has revolutionized our industry by enabling more personalized and efficient communication with clients; however, it's crucial for agents to understand the boundaries of these tools to ensure they are not infringing on consumer privacy or regulatory standards."
Lance said AI must be used responsibly, and that now is the time for agents to review which AI tools they're using and adjust how they're using them.
"For agents, this means being vigilant about the sources and methods of their AI tools, ensuring they comply with all relevant regulations, and focusing on building genuine connections with clients," she said. "AI should augment our efforts, not replace the personal touch that is so vital in real estate."