GPT-wrapper or GPT as a wrapper?
What to look for while building or using AI products in 2024 and beyond
The rise of GPT Wrappers: AI Products 1.0
One of the most common questions I get from investors is whether TheWordsmith.ai is a GPT wrapper. When AI products started flooding the business world in early 2023, most product creators and founders used GPT, or another LLM, as the product's core and did prompt engineering around it to deliver a product that addressed specific use cases. Take, for example, AI content generators. Most are architected with GPT, or another LLM, at their core, with a layer of prompts about copywriting, blog creation, the role of the assistant, and plenty more to orchestrate the LLM's behavior and deliver marketing content. The LLM and its training data formed the foundation of what drove the content creation.
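To make the "wrapper" pattern concrete, here is a minimal, hypothetical sketch of what such a product often looks like under the hood: a static, hand-tuned system prompt layered around a single call to the LLM. The prompt text, model name, and product details are illustrative assumptions, not taken from any specific product.

```python
# A minimal "GPT wrapper": all of the product's value lives in the prompt text.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A static, hand-tuned prompt layer that defines the copywriting use case.
SYSTEM_PROMPT = (
    "You are an expert marketing copywriter. "
    "Write concise, engaging blog and social media content. "
    "Always include a clear call to action."
)

def generate_content(user_request: str) -> str:
    """Wrap the user's request in the prompt layer and return the LLM's text."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_request},
        ],
    )
    return response.choices[0].message.content

print(generate_content("Write a LinkedIn post announcing our new analytics dashboard."))
```

Everything that differentiates a product built this way sits in that system prompt, which is exactly why the approach is easy to copy.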
Prompt engineering, however, is getting commoditized. There are marketplaces where you can buy elaborate prompts for any use case you can think of, and a million and one posts on LinkedIn and other social media explaining how to get great results from ChatGPT with clever prompt engineering.
This leads to another common question I get, this time from potential customers of TheWordsmith: couldn't I generate all the content I need by just using ChatGPT? The short answer may be "yes", but it comes with several caveats and reservations. Here are a few:
Are you able to provide complete context to ChatGPT about the Who, What, Where, and Why for the content you are trying to generate?
Are you able to maintain a consistent tone of voice every time you ask for a piece of content?
What about training data cut-off dates? Are you limited by how recent ChatGPT's knowledge of the world is?
How are you ensuring that ChatGPT isn’t giving you generic content that it also gives other users trying to generate something similar for their product/brand?
GPT as a wrapper: AI Products 2.0
After spending the last several months building and iterating on an AI product, I've been able to go deep into the capabilities and limitations of LLMs.
LLMs are excellent at providing general assistance across a variety of use cases. You can ask about pretty much anything and they will have a response for you in a matter of seconds. The training data ensures that the models have a relatively deep understanding of a broad range of topics. So whether you need recommendations for a vacation or want to learn more about black holes, you will get usable information from an LLM, or from ChatGPT, which has become the de facto proxy for AI assistants built on top of LLMs.
On the other hand, LLMs, as we all know by now, are probabilistic models that predict the most likely next word in the response they are generating. Suffice it to say that a similar, undifferentiated prompt will likely generate an equally undifferentiated response.
LLMs are also static when it comes to their training data, whereas the world we live in is anything but static.
LLMs as a delivery mechanism, not the engine
Moving beyond GPT/LLM wrappers, the fresh perspective that creators and users of AI products alike need to apply is how far beyond the LLM the product goes. While early AI products used the LLM as the core engine, today's AI products need to go significantly beyond that.
Using the LLM as one of the components of the AI product, where it serves as a delivery mechanism for the output, rather than the core of the product itself, is essential.
At TheWordsmith.ai, our core principle for product development has been "depth of context and uniqueness of the brand", and we are constantly expanding the product's capabilities by bringing in first- and third-party data to deepen the context the LLM has to work with to deliver the desired output to the user.
Here are a few things that the LLM works off of under the hood in TheWordsmith (a simplified sketch of this pattern follows the list):
The Company, Product, and Brand DNA - TheWordsmith captures deep context about a user's company, product, and brand to ensure the assistance it provides stays grounded in that context.
Access to real-time news and Twitter feeds for more current context on the topic at hand. If you were at South by Southwest last week and needed assistance with social media content from an AI tool, you would have wanted TheWordsmith with you, since it knew what was happening at SXSW at that very moment.
Very soon, TheWordsmith will have access to consumer research data that will provide deep insights into the attitudes, behaviors, media consumption habits, and much more about a company’s ideal customer profile.
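Here is a simplified, hypothetical sketch of the "GPT as a wrapper" pattern described above: the product gathers brand context and fresh external data first, and the LLM is only the final step that turns that assembled context into polished output. Function names like fetch_brand_profile and fetch_recent_news are placeholders for whatever proprietary data sources a product actually integrates, and the returned strings are stand-ins for real data.

```python
# "GPT as a wrapper": the product's own data pipeline does the heavy lifting,
# and the LLM is the last step that turns assembled context into polished copy.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def fetch_brand_profile(company_id: str) -> str:
    """Hypothetical: load the company, product, and brand 'DNA' from the product's own store."""
    return "Brand voice: confident, plain-spoken. Audience: B2B marketers."

def fetch_recent_news(topic: str) -> str:
    """Hypothetical: pull current news articles and social posts about the topic."""
    return "OpenAI announced Sora, a text-to-video model, earlier this week."

def generate_post(company_id: str, topic: str, request: str) -> str:
    """Assemble deep, current context first; the LLM only delivers the output."""
    context = (
        f"Company and brand context:\n{fetch_brand_profile(company_id)}\n\n"
        f"What is happening right now on this topic:\n{fetch_recent_news(topic)}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": "Write on-brand marketing content using only the supplied context."},
            {"role": "user", "content": f"{context}\n\nTask: {request}"},
        ],
    )
    return response.choices[0].message.content
```

The difference from the first sketch is where the differentiation lives: in the proprietary, per-request context the product assembles, not in a static prompt any competitor could copy.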
Here's a simple example of how real-time access to news and Twitter affects the output of AI tools. We asked both ChatGPT and TheWordsmith for social post ideas for marketers on the topic "Sora". Because ChatGPT's training cut-off predates OpenAI's announcement of Sora, its response only refers to the protagonist of the video game Kingdom Hearts, released in the early 2000s. TheWordsmith, on the other hand, clearly understands the relevance of OpenAI's recent Sora announcement and its implications for marketing.
So, as you build or use an AI-powered product, use this lens to evaluate its capabilities: "Is it a GPT wrapper, or is GPT a wrapper that delivers the output and user experience, built on something that goes substantially beyond the LLM?"
Like, share, or comment if you find this content helpful. Subscribe to receive this newsletter in your inbox.