The simplest way to describe ChatGPT is as a contextual regurgitation machine. As a generative AI tool, it uses machine learning, and that learning is based on massive amounts of data about what has already been done by humans. It also likes to make up its own facts, which is the very reason JP Morgan won’t allow its use internally.
Innovation today is often conflated with the idea that it must involve technology or a product. That is just one aspect of innovation in an organisation. There are also profit models, customer experience, operational processes and more. Innovations succeed and work best when these elements are combined in unique, often counterintuitive ways. ChatGPT is not a tool that combines and reconfigures elements to deliver innovative ideas.
Tools like ChatGPT can be used in the early stages of developing innovations, particularly in research. They can help surface past failures and summarise findings. But the outputs need to be carefully scrutinised to ensure they haven’t made up any facts; if they have, the innovation built on them could fail. Innovating carries enough risk for the bottom line to begin with. It doesn’t need another risk point thrown into the mix.
I’ve used ChatGPT and a couple of other tools in the research phase, and I’ve found it terrible when it comes to financial analysis. It was helpful for competitor summaries, but only in a limited way. Since company press releases and annual reports are generally about promoting the value of a business, ChatGPT is not good at showing where mistakes were made, which is key when working on an innovation for a business.
So if you are considering using ChatGPT in an innovation team or project, be sure to understand its limitations, and if you’re looking at facts, double and triple check them first. It can’t be relied on for assumptions, especially if you’re combining innovation elements that will impact multiple parts of the organisation.
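If your team does pull ChatGPT into the research phase programmatically, one way to make that fact-checking easier is to force the model to tie every claim back to the source text you gave it, so a human can verify each one. The sketch below assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable; the model name, prompt wording and excerpt are illustrative assumptions, not something described in this post, and the output still needs the manual verification discussed above.

```python
# Minimal sketch: ask the model to summarise a competitor document while
# quoting the supporting sentence for each claim, making human review easier.
# Assumes the openai Python SDK is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

# Hypothetical excerpt; in practice this would be text from an annual report
# or press release you are analysing.
report_excerpt = "...text pasted from a competitor's annual report..."

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name; use whatever your team has access to
    messages=[
        {
            "role": "system",
            "content": (
                "Summarise the excerpt for a competitor analysis. "
                "For each claim, quote the exact sentence from the excerpt "
                "that supports it, and write 'not stated' rather than guessing "
                "when the information is missing."
            ),
        },
        {"role": "user", "content": report_excerpt},
    ],
)

# The output is a starting point only: every figure and claim still needs to be
# checked against the original filings before it feeds an innovation decision.
print(response.choices[0].message.content)
```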
The bottom line is that ChatGPT and similar tools can be used in analysis, but with caution. Innovative ideas, for now, are still driven by human brains.