AI in Financial Planning: A Practitioner's Guide to Compliance

There was life before the internet, life before cell phones and texting, and now there's life before AI and life after AI. For me, this technology sits alongside those other fundamental shifts that changed not just what I do, but how I think and approach problems on a daily basis.

I'm not a tech early adopter or fanboy by any stretch. I use Android phones, I have zero interest in VR headsets or cryptocurrency, and I'm generally skeptical of the latest shiny technology trends. But with AI, I'm all in because the practical impact on my daily work is undeniable.

I use these tools extensively in both my personal and professional work because they have fundamentally altered how I process information, organize complex thoughts, and tackle the kind of cognitive challenges that come with managing both financial planning and therapeutic relationships.

Living with ADHD and dyslexia, I've found these platforms essential for cutting through the cognitive friction that used to slow me down, and I pay for multiple services because they earn their place in my daily workflow. But here's what matters most in my professional world: I operate under strict regulatory frameworks that govern how I can and cannot use any technology that touches client information.

The Compliance Reality

As a financial advisor, I'm bound by SEC regulations and state licensing requirements. As a licensed therapist, I operate under HIPAA and state health board standards. This dual responsibility fundamentally shapes how I approach any new technology, especially one that processes information as extensively as these platforms do.

My approach is straightforward: I will not input personally identifiable information into any system unless I'm absolutely certain it meets strict privacy standards. In most cases, this means preprocessing client data in Excel to remove all identifiers before using these tools for analysis or summary work. Some platforms like Schwab and Advyzon are already building in privacy protections by offering the ability to download anonymized information, which shows the industry is recognizing this need.
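To make that scrub step concrete, here is a minimal sketch of the kind of preprocessing I'm describing - a hypothetical Python version of what I actually do in Excel, with made-up file and column names rather than any custodian's real export format.

```python
# Hypothetical example: scrub identifying columns from a custodian CSV export
# before any of it goes near an AI tool. Column and file names are invented
# for illustration; a real export will look different.
import pandas as pd

IDENTIFIER_COLUMNS = ["client_name", "account_number", "ssn", "email", "address"]

def scrub(path_in: str, path_out: str) -> None:
    df = pd.read_csv(path_in)
    # Drop any identifier column that happens to be present in this export.
    present = [col for col in IDENTIFIER_COLUMNS if col in df.columns]
    df = df.drop(columns=present)
    # Replace each remaining household reference with an anonymous label so
    # positions can still be grouped without revealing who owns them.
    if "household_id" in df.columns:
        codes = {hid: f"Household {i + 1}" for i, hid in enumerate(df["household_id"].unique())}
        df["household_id"] = df["household_id"].map(codes)
    df.to_csv(path_out, index=False)

if __name__ == "__main__":
    scrub("raw_export.csv", "scrubbed_for_ai.csv")
```

The specifics matter less than the habit: identifiers come out first, and only the anonymized file is ever used for analysis or summary work.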

When you're entrusted with people's financial lives and emotional well-being, there's simply no room for casual experimentation with client information. The regulatory frameworks exist for good reason, and the consequences of violations can end careers and harm the people who trust us with their most sensitive information.

So how do you actually use these tools safely within such strict boundaries? It started small. I began testing AI platforms with completely anonymized scenarios - hypothetical client situations with all identifying details removed. I experimented with public information analysis, like comparing fund prospectuses or researching general financial concepts. Only after months of understanding how these systems worked, where they failed, and what safeguards I needed did I begin incorporating them into my actual workflow - always with client data fully scrubbed and never with anything that could identify a real person.

Real-World Applications

At this year's Financial Therapy Association conference, I co-led a session where we live-tested these tools in a therapeutic setting. My co-presenter acted as the client, and instead of responding to my questions aloud, he typed his answers into ChatGPT, which reflected his responses in real time.

What happened surprised us both. He shared that typing into the system allowed him to express something more honest than he would have spoken aloud. The anonymity and psychological distance created a kind of safety that unlocked deeper emotional processing. The session became memorable not because the technology replaced therapy, but because it enhanced the therapeutic process in an unexpected way.

In my financial planning work, I use these tools differently but just as practically. When clients ask specific questions about complex benefit documents that can run 60-100 pages, these platforms help me cut through the information quickly, sometimes saving hours of manual review. Recently, I used Claude to compare closed-end funds, and it not only retrieved the data but also presented it in a way that saved me from juggling multiple websites, Excel models, and manual calculations.

But here's what cannot be overstated: errors and hallucinations are very real, and you absolutely cannot get lazy with verification. I verify everything because I often know what the answer should be before I ask, so when the output produces something wildly incorrect, I catch it immediately. Just last month, I was using AI to help analyze a client's portfolio performance, and it told me a position was up 80% for the year. That seemed high but not impossible - until I double-checked the statements and found the position was up 800%. A misplaced decimal that could have dramatically affected our planning conversation if I hadn't verified.

This is a tool to augment and improve your work, not replace your professional judgment. With 20+ years in this business, I don't think that will ever be a problem for me, but I worry about younger advisors or therapists who might be tempted to rely too heavily on these systems without developing the experience base to know when something is wrong.

What This Means for You

My clients are already using AI in their personal lives - researching investment decisions, drafting difficult family conversations, and exploring complex life challenges. What they want to know is simple: how does this technology affect the advice and service they receive from me?

Here's how that shows up in your experience. When you ask me to review a complicated benefits package or compare insurance options, I can cut through 60-100 pages of documentation in minutes instead of hours. This means faster, more comprehensive answers to your questions. When we're working through family financial dynamics or estate planning conversations, these tools help me organize complex information and identify patterns that might take longer to surface otherwise.

But let me be clear about what hasn't changed. Your personal information never goes into these systems. I preprocess all client data to remove any identifying information before using AI tools for analysis. The technology helps me work more efficiently, but it doesn't make decisions for you or replace the professional judgment that comes from 20+ years of experience.

What I find most interesting is how many of my clients are using these platforms themselves. One client is building a tool to help people navigate difficult conversations in toxic relationships, which mirrors much of what we do together when working through family financial conflicts. Others are using AI to draft emails to aging parents about money matters or to explore their own beliefs about wealth and security.

This creates an opportunity for us to work together more effectively. Instead of discouraging your use of these tools, I can help you get better results from them. That means teaching you about effective prompting - how to ask specific, detailed questions rather than vague ones. It means explaining concepts like hallucinations (when AI generates false information confidently) and bias (how these systems can reflect prejudices in their training data). And it means helping you understand the differences between the various platforms and their strengths - some are better for research, others for creative tasks, and still others for data analysis.

Most importantly, I help you recognize both the advantages and limitations of these tools. Yes, they can help you explore complex financial scenarios or draft difficult family conversations, but they can't replace your personal judgment about what's right for your situation. Together, we can address both the fears (will this replace human advisors?) and hopes (can this help me make better decisions?) while keeping you in control of your financial planning process.

Your privacy and my professional judgment remain non-negotiable. The technology is simply another tool that helps me serve you better.

My Bottom Line

I'm going to keep using these tools because the practical benefits in my daily work are too significant to ignore. The ability to quickly process complex information, organize thoughts more effectively, and reduce the friction that comes with ADHD and dyslexia has fundamentally changed how I operate as both a financial advisor and therapist. And it goes beyond client work. These platforms help me manage business operations, from organizing complex projects to planning presentations, and they've become invaluable in my personal life for organizing family workflows, planning house projects, tracking pickleball strategy, managing plant care schedules - you name it. I have over 20 different folders of saved conversations for various projects and planning needs.

But I won't hand over my professional judgment to any algorithm, no matter how sophisticated it becomes. My 20+ years of experience, my licenses, and most importantly, the trust my clients place in me aren't negotiable. These platforms can assist, reflect, and organize information, but they cannot be responsible for the decisions that affect people's financial security and emotional well-being.

Whether I'm building a comprehensive financial plan or supporting a client through a difficult life transition, these tools remain exactly that - tools in a larger professional toolkit. I know how to use them ethically, legally, and strategically because I approach them with both curiosity about their capabilities and strict adherence to the compliance frameworks that govern my work.

The technology will continue to evolve rapidly, and I'll continue to evaluate new applications as they emerge. But the fundamental principle remains unchanged: these systems augment professional expertise, they don't replace it. That distinction matters more than any efficiency gain or technological advancement, and it's what separates responsible adoption from reckless experimentation with other people's lives.

If you're curious about how modern technology can enhance your financial planning experience while maintaining the highest standards of privacy and professional judgment, I'd welcome a conversation about your goals and concerns. You can learn more about our approach here or reach out directly to explore how we might work together.

Jonathan Kolmetz is a Licensed Professional Counselor, Financial Advisor, and President of Oaks Wealth Management. He holds an MBA, a Master’s in Clinical Mental Health Counseling, and is passionate about helping families rewrite their money stories.