Some may be surprised to learn that the free versions of commercial AI programs are unable to plot the relative performance of a list of stock symbols over a specified time frame. For example, Claude, ChatGPT, and Gemini all fail at the following prompt, “Generate a chart year to date comparing the relative performance of MSTR, QQQ, COIN, and IBIT,” either plotting the wrong chart or not plotting anything at all.
Claude actually plots an overlay chart, but it’s wrong:

Although MSTR is plotted correctly, QQQ and COIN should show negative performance. The relative performance of COIN versus QQQ is also wrong: COIN is down 26% YTD, while QQQ is down only 2%.
Gemini and ChatGPT also failed at the task. ChatGPT plotted each chart separately but never overlaid them to compare relative performance. Gemini produced only a downloadable CSV file, which I would call a failure: a CSV is not what comes to mind when I think of the ‘all-in-one solution’ AI is often marketed as. Having to process an external file in Python or some other program is just extra work, negating the claimed productivity gains of AI.
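To show what that extra work actually looks like, here is a minimal pandas sketch of the post-processing a CSV of closing prices would require before the series can be overlaid on one axis. The tickers are real, but the price rows are synthetic stand-ins, with endpoints chosen only to mirror the YTD figures cited above:

```python
import pandas as pd

# Stand-in for a downloaded CSV of daily closing prices
# (in practice: prices = pd.read_csv("prices.csv")).
# These numbers are made up for illustration.
prices = pd.DataFrame({
    "MSTR": [300.0, 330.0, 360.0],
    "QQQ":  [500.0, 495.0, 490.0],
    "COIN": [250.0, 200.0, 185.0],
})

# Relative performance: percent change from the first (year-start)
# close, so every series starts at 0% and can be compared directly.
relative = (prices / prices.iloc[0] - 1.0) * 100.0
print(relative.round(1))
```

From there the user still has to feed `relative` into a charting library to get the overlay the prompt asked for in the first place, which is exactly the step the AI tools were supposed to handle.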
The funny thing is, I tried this prompt on the leading AI programs last year, and they failed then too. A year later, nothing has changed.
By comparison, the low-tech, non-AI website Barchart.com (founded in 1995) handled the task correctly on the first try. I simply typed the tickers into the textbox, clicked ‘compare’, and then downloaded the image file to my computer, with no CSV to wrangle. Here is what it looks like, correct:

The fact that commercial AI programs sometimes fail at seemingly simple tasks, or have obvious usability blind spots, pours cold water on the hype about AI making jobs obsolete. Financial analysts, whose jobs involve generating charts and other metrics, need not fear losing their roles when a 1990s-era website can outperform cutting-edge AI software.