Journal of Financial Planning: May 2025
Chris Heye, Ph.D., is founder and CEO of Whealthcare Planning and Whealthcare Solutions.
Large language models (LLMs) and other AI-powered technologies are dramatically changing the way financial professionals deliver their services. Most of the impact now is being felt in operations, investment management, and marketing. Generative AI is helping to make back-office functions more productive, portfolio management more efficient, and client outreach efforts more effective.
AI-driven improvements in the productivity of financial planning services will continue to proliferate for the foreseeable future. But the rapid advancement of LLM capabilities will spill over into other planning-adjacent domains with equal or perhaps greater effect.
Many tech analysts believe that in the next few years, we will witness an explosion of smart AI agents that perform tasks, offer recommendations, and interact with users in ways that closely mimic human intelligence. These agents will analyze data, learn from patterns, and respond to specific user questions in real time. Some of them will operate in the background, but others will engage with users directly.
We know these agents now as chatbots or virtual assistants. Most of us have used Alexa or Siri, queried customer service chatbots, or asked our car infotainment system for driving directions. The responses we receive from these agents are becoming increasingly human-like, in many cases passing the “Turing Test.” That is, it is getting harder and harder to tell for certain whether we are interacting with a machine or a real human.
So, what does the rise of AI agents mean for the financial planning industry?
Testing AI’s Limits on Empathy
One development worth monitoring is the emergence of AI-powered capabilities designed to improve the quality of client decision-making and the productivity of client conversations; in short, AI agents that serve as 24/7 behavioral finance coaches.
Thanks to the research of Tversky, Kahneman, Thaler, Shiller, and others, we know that few people make financial decisions based on a thorough, rational, and objective evaluation of all available data. The field has come so far that anyone who suggests individuals actually make decisions the way classical microeconomics textbooks describe is no longer taken seriously.
Most financial professionals understand that their clients make financial choices based on “rule of thumb” heuristics and are influenced by personal biases, and that these choices may or may not produce optimal or even satisfactory outcomes. The fact that the CFP Board has recently introduced behavioral training requirements into its curriculum reflects a growing acknowledgement that for most clients, financial decision-making cannot be cleanly separated from emotions, personal experiences, or family histories. For most clients, “money” is not money, but rather some combination of self-esteem, security, power, and anxiety.
AI-powered conversational agents like ChatGPT already have the ability to collect inputs from users, understand the overall context of a chat session, recall earlier conversations, and deliver highly personalized responses. This technology is improving rapidly, with agents generating fewer hallucinations and offering an increasingly human-like user experience.
Building on these capabilities, it is only a matter of time until we have AI-powered agents with the ability to generate highly personalized, scenario-based, expert-informed recommendations. AI agents will improve financial literacy by providing targeted educational materials and administering short quizzes or assessments, all within a gamified environment that encourages frequent interaction. They will offer nudges that steer users toward better financial decisions without restricting freedom of choice. Personalized nudges that tap into behavioral tendencies like procrastination, overconfidence, or loss aversion will help users follow through on their financial goals. Along the way these agents will achieve a greater understanding of who the user is: their preferences, goals, personal histories, and more. (Spoiler alert: chatbots are already doing some of this now.)
Importantly, AI agents hold out the promise of being able to raise self-awareness around counterproductive personality traits, including financially unhealthy behaviors, in a supportive and non-confrontational way. These agents can be programmed to withhold judgment and exercise endless patience. They will have instantaneous access to a virtually infinite reservoir of educational resources, academic research, and best practices. They will be able to tailor their advice to individual users and gradually come to understand each user's habits and predispositions. In other words, AI agents will become expert and empathetic financial therapists.
By helping clients recognize and overcome biases, access financial education resources and tools, and raise levels of self-awareness, AI-powered behavioral coaches will raise the quality of client financial decision-making. But, properly designed, these agents will also indirectly improve client financial decisions by enhancing the quality and frequency of conversations with financial professionals.
Imagining a Client Use Case
The subject matter of adviser–client conversations is often complex and emotionally charged. Many clients feel overwhelmed, uneducated, or anxious about their finances and do not know how to initiate the conversations that they really want to have with their adviser. The inability to effectively communicate can lead to suboptimal financial decision-making, incomplete or inadequate plans, and client dissatisfaction.
A well-designed AI agent can inform and prepare clients for meetings with their advisers and help them figure out the right questions to ask.
Let’s envision a hypothetical scenario in the not-too-distant future. Mary is in her early 60s and concerned about how she will afford her retirement. She does not know much about annuities, but she has read a few articles recently suggesting that they are expensive and complicated. On the other hand, she likes the idea of protected income in retirement, especially for her spouse who has a family history of dementia. Even though she is the primary family income earner and de facto CFO, she is anxious about money in general. She often procrastinates before making important financial decisions, frequently to her detriment.
Mary has a meeting with her adviser in a few days and wants to talk about retirement plans. She starts preparing by asking an AI agent to explain annuities. The agent teaches her about the different types of annuities and how they are structured. She mentions she has recently read that annuities are expensive, and the agent advises that there are reasonable cases both for and against buying an annuity and warns against confirmation bias. The agent suggests that purchasing an annuity might help simplify her finances and reduce the number of (anxiety-ridden) decisions she needs to make. The agent helps her figure out that regular, automatic payments might compensate for her tendency to postpone important financial decisions. Finally, it points out that if she buys an annuity now, she will not need to sell off as much of her savings in the future to generate income.
But even after a productive session with the AI agent, she has questions about the suitability of an annuity for her. She is still worried that annuities might be too expensive, and she is unclear about how one would fit with the rest of her retirement portfolio. She wants to hear from her adviser what type of annuity is best for her. She is also unsure how much retirement income she really needs. And which assets should she sell now if she decides to buy an annuity?
When Mary meets with her adviser, she knows what questions to ask. She has even learned a little more about herself, and how she makes decisions, in the process. During the conversation with her adviser, Mary expresses how much anxiety she has around finances and what it would mean to her to put more decisions on autopilot. She explains how the thought of drawing down her investment principal in the future pains her, even though retirement is exactly what she has been saving for.
Mary continues to share her concerns, and by extension, key personality traits, with her adviser. She tells a personal family story that reveals the source of some of her anxiety about money. In the end, whether or not Mary decides to purchase the annuity almost becomes secondary to the personal self-discovery process she is going through, and to the door she has opened for her adviser. This more informed—and intimate—conversation is a great way for the adviser to get to know Mary better. Her adviser can apply insights into Mary’s personality gleaned from this conversation in other planning sessions. The AI-powered behavioral finance coaching not only helps Mary make better financial decisions, but also strengthens her relationship with her adviser, enabling the adviser to provide more personalized recommendations.
Hands-on Training for Planners
The value of these conversations points to an additional way AI agents might enhance client decision-making and the overall client experience. Many financial professionals are still uncomfortable having emotion-laden conversations with their clients. Unfortunately, few professionals have received much training in how to successfully navigate client emotions and biases. Making matters worse, difficult discussions about personal finances can quickly become entangled with equally sensitive concerns around the financial consequences of health and health events. This is especially true when dealing with older adult clients.
But AI agents can help here as well. There is now an extensive body of research showing how to successfully navigate these types of sensitive conversations. We have recently learned a lot about how the brain works, what makes people defensive, and why stories resonate more than facts. These findings can be incorporated into the knowledge base of an AI agent and delivered as specific recommendations to financial professionals.
These AI agents can also act as sounding boards for professionals while providing detailed feedback in a supportive and non-judgmental environment. Advisers will be able to practice conversations on a variety of topics (money anxiety, impulsive decision-making, divorce, health events, grief and loss, and more) before participating in a live client meeting. They could rehearse with the agent on their own or in groups. Just before the meeting starts, they can ask the agent to provide possible conversation starter scripts.
Using the example of Mary above, an AI agent might recommend that the adviser spend more time listening to Mary’s concerns and less time offering technical advice. It might encourage the adviser to get Mary to tell a personal story. It might provide guidance that helps the adviser understand that for Mary, money means anxiety and security. The agent could recommend a conversation starter script designed to lower the psychological defenses that make it hard for Mary to express her feelings about money. After the meeting, the adviser can feed the transcript to the agent and ask it to make suggestions about how discussions with Mary might be further improved in the future.
In summary, AI agents are likely to be an important part of the future of financial planning. Do they constitute a threat to financial professionals? Hard to know for sure, but, in principle, they should not. The type of agent I have been describing is not designed to replace advisers but rather to educate clients and enhance client–adviser relationships. For the foreseeable future, most clients will want to interact with a human adviser, and virtual behavioral coaches are not likely to change that, even as they become more human-like.
Over time, AI-enabled agents will become just another tool in the adviser’s technology toolkit, like AI-powered marketing content generators, meeting summarizers, or portfolio optimization algorithms. The real threat to financial professionals is not adopting these AI technologies, it is failing to adopt them quickly enough. By fully embracing AI, financial professionals will enhance their value, deepen client relationships, and lead the next era of advice.