Deepfaked Fiduciary: The Real AI Threat Nobody’s Talking About

Adviser Neil Plein explains how fiduciaries can prepare to minimize the risk to their practices from increasingly sophisticated scams driven by artificial intelligence.

The rise of artificial intelligence-driven “deepfake” technology has introduced a dangerous new frontier for those overseeing retirement plans. Deepfakes are AI-generated impersonations—video, audio or text—so realistic they can fool professionals well trained in cybersecurity best practices. For 401(k) plan fiduciaries who authorize distributions, the threat is especially acute, because under the Employee Retirement Income Security Act, fiduciaries are personally liable for losses caused by their actions.

The use of tactics like deepfakes in cyberattacks surged by 118% last year, underscoring how quickly this risk is accelerating. The stakes are already clear: in 2019, a deepfake voice scam duped a U.K. company’s CEO into wiring $243,000, and in 2024, an employee at a Hong Kong firm was tricked via deepfake video into transferring $25 million.

These scams—impersonating executives in real time—show how far the technology has come and should sound alarm bells for anyone handling financial transactions. 

Why 401(k) Fiduciaries Are at Risk 

Many people who manage aspects of their company’s retirement plan with discretion don’t realize that, by doing so, they become fiduciaries under ERISA. If you have discretion over plan management or assets (for example, giving final approval for a participant’s distribution or loan), you are a fiduciary and can be held personally liable.

Even those aware of their fiduciary role often have a false sense of security when relying on recordkeepers or third-party administrators. Most service providers operate in a ministerial role—they follow instructions and administer the plan rather than make discretionary decisions. Thus, if an internal staff member approves a fraudulent distribution request, the liability falls on the plan sponsor, not the vendor. Some providers even offer “fiduciary warranty” programs that sound reassuring but generally do not transfer liability to the provider. In short, you can outsource tasks, but not the risk.

The human element remains a dangerous gap in cybersecurity. Providers might guarantee to restore accounts if their systems are hacked, but deepfake schemes target the fiduciary outside those systems. If a fraudster convinces a plan administrator to approve a bogus transaction—an “off-system” breach—the recordkeeper’s protections likely will not apply, leaving the plan sponsor on the hook for the loss.

Jason Roberts, an ERISA attorney and a partner in the Fiduciary Law Center, cautions, “While we may not have seen a fiduciary breach via deepfake yet, it’s only a matter of time. … Plan sponsors are generally left exposed with respect to this sort of fraud.” 

Best Practices to Combat Deepfake Fraud 

To protect plan assets and limit personal liability, fiduciaries should take these proactive steps: 

Clarify Roles and Responsibilities: Ensure everyone involved knows who the plan’s fiduciaries are and what that means. Document who has authority to approve transactions and educate committee members and staff on their duties.

Tighten Controls and Verification: Close weak spots in your processes. For instance, if possible, eliminate paper distribution or loan forms in favor of secure online workflows. Require a secondary confirmation (such as a callback or video check) for any fund transfers, instead of relying on a single email or form. Be extra wary of out-of-pattern requests, such as a large distribution request shortly after an address change, and verify requests like this through additional channels.

Educate and Train Continuously: Regularly train plan fiduciaries (and even participants) to spot social engineering and deepfake tricks. Emphasize healthy skepticism when something seems “off”—for example, an odd tone or timing in the voice of a caller claiming to be an executive or participant, or the absence of natural breathing sounds from the caller—and encourage additional verification.

Review Insurance Coverage for Gaps: Don’t assume your insurance will cover a deepfake-induced loss. Check your fidelity bond, cyber policy and fiduciary liability insurance for any “social engineering” fraud exclusions. Many insurers exclude these scams unless you add special coverage. If coverage is lacking, address it proactively with your insurer(s). 

Coordinate With Providers and Plan Ahead: Treat security as a responsibility shared with your recordkeeper and TPA. Ask vendors about their fraud prevention practices and use the Department of Labor’s cybersecurity guidelines as a checklist when evaluating providers. Have an incident response plan that includes your IT or security team so that any breach can be addressed with a swift, coordinated response. 

Deepfake-enabled fraud is moving from theory to reality, so fiduciaries must treat this risk seriously. The good news is that with a forward-thinking approach—rethinking assumptions, plugging process gaps and staying skeptical—plan sponsors can stay a step ahead of even the most sophisticated scams. In the age of digital deception, the biggest risk isn’t just falling for a fake—it’s assuming you’re protected when you’re not. 

Neil Plein is the lead consultant at Aldrich Wealth, with 17 years of experience advising retirement plans nationwide.

This feature is intended to provide general information only; it does not constitute legal or tax advice and cannot be used as a substitute for legal or tax advice. Any opinions of the author do not necessarily reflect the stance of ISS STOXX or its affiliates.

