
Before using AI investing tools, watch out for these costly pitfalls

Published on November 29, 2023 | 7 min read


Image: ChatGPT logo displayed on a smartphone. (SOPA Images/Getty Images)

“How has Microsoft stock performed since Satya Nadella took over as CEO?”

That’s the kind of question that generative artificial intelligence (AI) can excel at, say experts. AI can save human investors significant time in looking up the facts – “When did Nadella take the helm?” and “How has the stock moved since then?” – and then making the calculations. AI can then produce a report in natural language that’s easily accessible to readers.
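For instance, here is a minimal sketch of the lookup-and-math work described above, written in Python. It assumes the freely available yfinance package for price data and uses Feb. 4, 2014, the date Nadella became CEO; a generative AI tool would do this retrieval and arithmetic behind the scenes and wrap the result in plain English.

```python
# A rough sketch of the fact lookup and arithmetic described above,
# assuming the third-party yfinance package for price data.
import yfinance as yf

# Satya Nadella became Microsoft's CEO on Feb. 4, 2014.
prices = yf.download("MSFT", start="2014-02-04", auto_adjust=True)["Close"].squeeze()

start_price = float(prices.iloc[0])
latest_price = float(prices.iloc[-1])
total_return = (latest_price / start_price - 1) * 100

print(f"MSFT since 2014-02-04: {total_return:.1f}% total price return")
```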

Investors may already be open to AI, despite some potential drawbacks. Of investors who use social media and online platforms for advice, 85 percent have used generative AI advice when adjusting their portfolios, according to a TD Wealth survey of 500 high-net-worth and mass affluent individuals. But a report from the CFP Board suggests that just 31 percent of investors feel comfortable implementing generative AI advice without first verifying it elsewhere.

This type of AI – what some call “language AI” – has the potential, for example, to increase the productivity of those who need to communicate, such as financial advisors. It’s a relatively new entrant to the AI world, with the advent of large language models such as ChatGPT.

“It’s important to understand that AI is a very wide term,” says Daniel Satchkov, co-founder and president at RiXtrema, a software platform for financial advisors and brokers. “When you talk about AI, you need to understand what you’re talking about.”

Language AI differs significantly from “numbers AI,” a short-hand term used by Satchkov. Numbers AI has already been looking for patterns and executing trades for years, he says.

These two types of AI could continue to change how financial advisors and investors of all sizes conduct their business, increasing their efficiency and helping them become more agile.

Here’s how AI is shaking up investing

Generative AI has received tons of fawning press over the last year, but some experts see the adulation as way overblown, given its abilities, at least so far. Still, it can be good at specific tasks that it’s been highly trained for, such as producing text, though not without drawbacks. Those tasks include producing reports for investors and parsing sentiment on social media.

“Humans are good at judgment, while machines are good at triaging extraordinary amounts of data,” says R.J. Assaly, chief product officer at Toggle AI, an AI tool for investors. “AI can watch all these disparate data points, look back for what’s been anomalous and look back through history at how things have responded.”
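To make that idea concrete, here is a toy sketch of anomaly “triage” in Python: flag daily price moves that fall far outside a stock’s recent norm using a rolling z-score. It uses made-up data and is not how Toggle AI or any vendor’s system actually works.

```python
# A simplified illustration of "triaging" data for anomalies: flag daily
# moves that are far outside a stock's recent norm using a rolling z-score.
# Toy example with simulated prices, not any vendor's actual method.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
prices = pd.Series(100 * np.cumprod(1 + rng.normal(0, 0.01, 500)))

returns = prices.pct_change()
rolling_mean = returns.rolling(60).mean()
rolling_std = returns.rolling(60).std()
z_scores = (returns - rolling_mean) / rolling_std

# Anything more than 3 standard deviations from its 60-day norm gets flagged.
anomalies = z_scores[z_scores.abs() > 3]
print(f"Flagged {len(anomalies)} anomalous daily moves out of {returns.count()}")
```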

Assaly compares the jump in knowledge to moving from the Yellow Pages to an iPhone – in effect, receiving custom-made knowledge in response to a smartly posed question.

As in the example above, generative AI apps such as ChatGPT can be effective at pulling data and then synthesizing it into a readable report. An investor or advisor can ask a direct and precisely phrased question and get an answer based on data that’s been fed into the model.

The future is using large language models to connect to certain data sets, says Satchkov. “It’s like an interface between people and data to understand the data and make decisions quicker.”
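A bare-bones sketch of that interface idea might look like the following: hand a model a small, structured slice of portfolio data along with a precise question. It assumes the OpenAI Python client and an API key in the environment; the model name is just an example, and the portfolio figures are made up for illustration.

```python
# A rough sketch of using a large language model as an "interface" to a
# data set: supply a small slice of data plus a precise question.
# Assumes the OpenAI Python client and an API key in the environment.
from openai import OpenAI

client = OpenAI()

# Hypothetical portfolio data for illustration only.
portfolio_data = """
Ticker, Weight, YTD return
MSFT, 25%, +18.2%
AAPL, 20%, +11.4%
JNJ, 15%, -3.1%
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[
        {"role": "system", "content": "You summarize portfolio data for an advisor."},
        {"role": "user", "content": f"Using only this data:\n{portfolio_data}\n"
                                    "Which holding has dragged on performance this year, and by how much?"},
    ],
)
print(response.choices[0].message.content)
```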

AI can be an amazing tool for advisors and others who need to respond to clients on a timely basis but simply don’t have time to dig into every client inquiry personally. It can be a huge value-add for advisors who need to explain what’s going on in the market, or to summarize a client’s portfolio in a report and explain why it’s been performing as it has. And even casual users can access ChatGPT or other language AI to uncover relationships in the data and draft a report.

Assaly expects that this kind of language AI may eventually be able to anticipate users’ questions and even pose questions that they might not have thought of yet.

Finally, in broad strokes, what generative AI can do for investors is turn data into information in response to a specific request. Humans must still decide how and whether to use that information.

Trading with AI

In contrast to language AI, numbers AI can discern relationships between numbers and then execute on the data – and it’s been in the game for years. Such predictive analytics can help traders understand where the market is likely to move based on patterns and who’s buying.

Predictive AI finds the “footprints” of big players making investing moves or any trading pattern, says Ryan Pannell, CEO and global chair of asset manager Kaiju Worldwide. “It’s extremely good at quantitative and technical trading,” he says.
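As a deliberately simplified illustration of pattern-finding on numbers, the sketch below fits a basic model on the previous five days of returns to predict whether the next day is up. It uses random data and scikit-learn, and it bears no resemblance to the proprietary systems Pannell describes; it only shows the shape of the problem.

```python
# A toy version of pattern-finding "numbers AI": predict a day's direction
# from the prior five days of returns. Real systems are vastly more complex
# and closely guarded; this only illustrates the basic idea.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
returns = rng.normal(0, 0.01, 1000)  # stand-in for real daily returns

# Features: the previous 5 days of returns. Label: did the current day go up?
X = np.array([returns[i - 5:i] for i in range(5, len(returns))])
y = (returns[5:] > 0).astype(int)

model = LogisticRegression().fit(X[:800], y[:800])
accuracy = model.score(X[800:], y[800:])
print(f"Out-of-sample accuracy on random data: {accuracy:.2f} (about 0.5, as expected)")
```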

Newer AI models can up the game with improved functionality, beating already-sophisticated models by using tech that can “learn” and adjust, says Pannell.

Compared to the “old” quantitative strategy – Pannell laughs when he calls quant strategies “old” – AI can change and update trading strategies by “reweighting the series of investing criteria that it invests on.” Such strategies can then shift as the market shifts.

High-net-worth investors have had access to numbers AI for years. Their deep pockets and capacity to absorb higher costs in exchange for potentially higher returns mean AI has stayed in that tax bracket. However, regular investors may gain access to AI through ETFs over time.

For example, Pannell’s Kaiju has issued the BTD Capital Fund (DIP), an AI-powered fund that uses AI to identify and trade short-term moves in the market using a “buy the dip” strategy.
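To see what a “buy the dip” rule means at its most naive, consider the toy sketch below: buy after a sharp one-day drop, hold for a few days and measure the result. This is emphatically not how DIP or any AI-driven fund trades; the threshold and holding period are arbitrary, and the return series is simulated.

```python
# A deliberately naive "buy the dip" rule, only to illustrate the concept.
# NOT how DIP or any AI-driven fund actually trades.
import numpy as np

rng = np.random.default_rng(2)
daily_returns = rng.normal(0.0003, 0.012, 2000)  # simulated index returns

DIP_THRESHOLD = -0.02  # a "dip" here means a one-day drop of 2% or more
HOLD_DAYS = 3

trade_returns = []
for day, r in enumerate(daily_returns[:-HOLD_DAYS]):
    if r <= DIP_THRESHOLD:
        # Hold for the next few days and record the cumulative return.
        hold = daily_returns[day + 1: day + 1 + HOLD_DAYS]
        trade_returns.append(np.prod(1 + hold) - 1)

print(f"{len(trade_returns)} dip trades, average return {np.mean(trade_returns):.3%}")
```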

Given the traits of numbers AI, does language AI pose a threat? No, says Satchkov.

“I don’t think ChatGPT will replace numbers AI,” he says. “Numbers AI just works too well and is well-established, and it’s closely guarded as a trade secret.”

Drawbacks of AI

AI has received a lot of full-throated but uncritical praise since the debut of ChatGPT, and investors looking to use similar AI tech should be critical of what they’re getting when they do. Some providers may talk a big game about how AI powers their products, piggybacking off the popularity of the term “AI,” but actually use it more as a marketing gimmick than a reality.

Beyond this aspect, investors and others using AI should pay attention to the following issues.

Accuracy and honesty

One of the biggest problems with generative AI is that you can never be too sure that you’re getting something that’s accurate. For a while, ChatGPT was limited to data from before September 2021, meaning that if something happened after that date, you might not get an accurate answer. While that’s no longer the case, users must still consider that the AI may not have all the facts.

“AI is only as good as the data that it consumes,” says Pannell, noting that AI is scraping publicly available sources that may not always be high quality.

This fact means that “garbage in” equals “garbage out,” an old expression among computer programmers. But sometimes AI can fabricate nonsense even without bad input.

“You can get answers that are bizarre about people who don’t exist,” says Pannell.

AI may not always be great about showing you where its data came from, so you may be left wondering how truthful a given fact actually is and forced to track down the original source yourself.

Questions must be asked correctly

To elicit the right responses, you must ask the right questions, say the experts. Otherwise, the AI may end up providing a response you’re not looking for. Worse, it may provide only the response you want to hear rather than a comprehensive answer.

Satchkov says that users must think through and formulate questions clearly, asking for exactly what they want, or the answer could be misleading. Questions should be designed, he says, to eliminate ambiguity. Ask “What are interest rates?” and you may wind up with a definition of rates when you really wanted to know today’s rate or how rates have moved over some time period.
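As a quick illustration of the difference, compare a vague prompt with a precise one on that interest-rate example. The precise wording below is just one hypothetical way to pin the question down; the point is that it names the instrument, the dates and the format of the answer.

```python
# The same underlying question, asked two ways. The vague version invites a
# definition; the precise version pins down instrument, dates and format.
vague_prompt = "What are interest rates?"

precise_prompt = (
    "What was the U.S. federal funds target rate range as of November 2023, "
    "and how much has it changed since January 2022? Answer in two sentences "
    "and cite the data source."
)
```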

“Generative AI looks for the answer that you want to hear,” says Pannell.

This weakness means that the AI will try to deliver only a response you want to see. It will even go so far as to make up sources – what experts call “hallucinations” – in order to give users what it expects they want. That’s why asking a good question is so vital to the process.

The models always want to mimic the person asking the question, says Assaly. “You need to have a healthy skepticism of potential ambiguity in a question or response,” he says.

Unsuitability of general models such as ChatGPT

“ChatGPT tries to be a jack of all trades – a humongous model of all things to all people,” says Satchkov. That may be fine for some information, he suggests, but investors need a fine-tuned AI model that is specialized for the task.

Such a large model lacks key data sources that are relevant to investors and traders, and it’s slow, he says. In contrast, specialized models can deliver higher-quality and faster answers.

A report writer, not a decision-maker

Large language models such as ChatGPT are so far designed to analyze relationships among data and write reports in natural language that provide insight into the data or answer a specific question. They’re not decision-makers tasked with sifting the data and then making decisions.

“You’re still the decision maker,” says Assaly. “You still have to make that end decision.”

(Here are five ways to research stocks like the pros.)

AI must have the right guardrails

All that said, a numbers AI model can be tasked with making trades based on the data patterns it’s spotting. But even those actions must be tightly scoped and set up with guardrails that keep the AI in check so it makes sensible decisions. Otherwise, it risks losing significant money on silly conclusions.
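What such guardrails might look like, in the simplest terms, is hard-coded limits the model cannot override, checked before any AI-suggested trade goes out. The thresholds and function below are hypothetical and for illustration only.

```python
# A sketch of trading guardrails: hard limits checked before an AI-suggested
# order is executed. Thresholds and names here are hypothetical.
MAX_POSITION_PCT = 0.05    # no single position above 5% of the portfolio
MAX_DAILY_LOSS_PCT = 0.02  # stop trading after a 2% daily drawdown

def passes_guardrails(order_value: float, portfolio_value: float,
                      daily_pnl: float) -> bool:
    """Return True only if the proposed trade stays inside hard limits."""
    if order_value > MAX_POSITION_PCT * portfolio_value:
        return False  # position too large
    if daily_pnl < -MAX_DAILY_LOSS_PCT * portfolio_value:
        return False  # daily loss limit already hit
    return True

# Example: a $40,000 order against a $500,000 portfolio that is down $3,000 today.
print(passes_guardrails(order_value=40_000, portfolio_value=500_000, daily_pnl=-3_000))
```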

“AI cannot understand context or nuance,” says Pannell.

And that’s where humans must not only keep a hand on the wheel, to extend the metaphor, but also watch for non-obvious potholes along the way.

“Humans do still have a role working cooperatively with this technology,” says Pannell.

Bottom line

AI has the power to rapidly change various elements of investing, from identifying relationships in data and summarizing them into readable reports to making trading decisions in a well-controlled and tightly scoped environment. But investors looking to wade into AI for help with investing should be aware of its limitations and understand where it does not work effectively.