Using the ChatGPT API for Business (Wanna Buy a Ginsu Knife?)

I made a small Flask web site that reads the latest news feed from CNN and uses the ChatGPT API to generate a (hopefully funny) sales pitch for a product.

Most people have tried out the web interface to ChatGPT and been impressed by its capabilities. But how might you use it in your business? There’s an API for ChatGPT, so you can build it into any application you want. I tested it out by building a small Python Flask web site.

My idea was that you might want to use ChatGPT to generate topical sales pitches for your products. So I grabbed a news feed from CNN and picked a few classic humorous products, like the Ginsu Knife and the Fishin’ Magician, and tried to get ChatGPT to help me generate web site content. I used the news stories to build a prompt for ChatGPT, and asked it to summarize the stories and segue to a sales pitch for one of the products.
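Here’s a rough sketch of what that feed-reading and prompt-building step looks like. This is simplified and illustrative rather than the exact code in the repo; the feed URL, the use of the feedparser package, and the prompt wording are just stand-ins.

```python
# Illustrative sketch of reading the CNN feed and building a ChatGPT prompt.
# Assumes the feedparser package is installed; the feed URL and prompt
# wording here are placeholders, not necessarily what the real app uses.
import feedparser

CNN_FEED = "http://rss.cnn.com/rss/cnn_topstories.rss"  # illustrative feed URL

def build_prompt(product_name, max_stories=3):
    """Read the latest CNN headlines and build a prompt that summarizes
    them and segues into a sales pitch for the given product."""
    feed = feedparser.parse(CNN_FEED)
    stories = []
    for entry in feed.entries[:max_stories]:
        # Each feed entry has a title and (usually) a short summary.
        stories.append(f"{entry.title}: {entry.get('summary', '')}")

    return (
        "Summarize the following news stories in a few sentences, then "
        f"segue into a humorous sales pitch for the {product_name}.\n\n"
        + "\n".join(stories)
    )
```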

You can see my code on GitHub. I started with a Flask blogging tutorial on the Digital Ocean web site and modified it as needed. The app consists of just one Python file, which reads the news feed and makes the calls to the ChatGPT API, plus some HTML files and a CSS file (these were mostly unchanged from the tutorial app).
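The core of that Python file boils down to something like the sketch below. It’s simplified (the real app renders the tutorial’s HTML templates rather than returning raw HTML), and it assumes the pre-1.0 `openai` Python package with the Davinci completion endpoint, which is what I was calling.

```python
# Simplified sketch of the Flask app calling the ChatGPT API.
# Assumes the pre-1.0 openai package; the real app renders HTML templates
# from the tutorial instead of returning an HTML string directly.
import os

import openai
from flask import Flask

app = Flask(__name__)
openai.api_key = os.environ["OPENAI_API_KEY"]

@app.route("/pitch/<product_name>")
def pitch(product_name):
    prompt = build_prompt(product_name)  # from the earlier sketch
    response = openai.Completion.create(
        model="text-davinci-003",   # the "Davinci" model
        prompt=prompt,
        max_tokens=256,
        temperature=0.8,
    )
    text = response.choices[0].text.strip()
    return f"<h1>{product_name}</h1><p>{text}</p>"
```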

You can get an idea of the results below.

Sometimes the content ChatGPT generated was hilarious, and other times it was a bit generic. At one point I got this one, which was pretty funny:

This news story discusses how President Donald Trump was indicted on two charges related to hush money payments made to adult film star Stormy Daniels. Trump is accused of breaking campaign finance laws by paying off Daniels in order to silence her about an alleged affair. Why should you buy the K-tel Fishin’ Magician? Because with this device, you can cast your line and catch a big fish like President Trump did with Stormy Daniels – without the legal consequences! Don’t get caught with your pants down – get the K-tel Fishin’ Magician and reel in the big one!

Of course, this is just a fun example that I created, but as you can see, I’m using ChatGPT to generate topical content in a fully automated way, which would (hopefully) be useful to a business that needs to tailor a sales pitch on the fly. You could imagine feeding in a person’s LinkedIn profile, or their company’s profile, to obtain even more tailored text.

In the course of trying this out, I learned a few lessons:

  1. The API is quite expensive. OpenAI gives you a trial with $18 in credits, but they only give you a couple of days to use them. So I found myself signing in through other accounts so I could keep playing with it. I had hoped to deploy this on a web server so others could try it, but it looks like that could cost a fortune. I’m hoping Google Bard will cost less, and perhaps I can switch to it.
  2. You pay by the token. OpenAI says that “you can think of tokens as pieces of words, where 1,000 tokens is about 750 words.” I was using the “Davinci” model, which costs 2 cents per 1,000 tokens. They have more expensive models, like GPT-4, which costs up to 12 cents per 1,000 tokens. Since I was summarizing a lot of news stories, I found that just testing my site blew through a lot of tokens (a quick cost estimate follows this list).
  3. I was surprised that the API didn’t behave like the chatbot web site OpenAI provides. It appears you need to keep reminding ChatGPT what it said previously. I was trying to get ChatGPT to generate a headline for a previously generated news summary, but it kept generating headlines that seemed completely unrelated to the previously generated text. So I changed my code to take that text and feed it back to ChatGPT, telling it to consider that input and generate a short, headline-style summary (see the sketch after this list). That worked much better. The only problem is that it consumes more tokens, because you pay for both your prompts (your input text) and the output text. So if you’re using the API to carry on a conversation, it gets more and more expensive the longer the conversation goes.
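To make the cost point concrete, here’s the kind of back-of-the-envelope math involved. The token counts are illustrative guesses, not measurements from the actual app.

```python
# Rough cost estimate for one generation at Davinci's $0.02 per 1,000 tokens.
# (Token counts here are illustrative, not measured from the real app.)
PRICE_PER_1K_TOKENS = 0.02   # text-davinci-003
prompt_tokens = 1500         # a few news stories, roughly 1,100 words
completion_tokens = 300      # the summary plus the sales pitch

total_tokens = prompt_tokens + completion_tokens
cost = total_tokens / 1000 * PRICE_PER_1K_TOKENS
print(f"~${cost:.3f} per generation")  # ~$0.036, so repeated test runs add up fast
```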
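And here’s a sketch of the two-step call from point 3: since the completion API keeps no conversation state of its own, the previously generated summary gets fed back in as part of a new prompt. Again, this assumes the pre-1.0 `openai` package, and the prompt wording is just illustrative.

```python
# Sketch of the follow-up call: feed the generated summary back to the API
# to get a headline for it. Assumes the pre-1.0 openai package.
import openai

def generate_headline(summary_text):
    prompt = (
        "Consider the following text and write a short, headline-style "
        "summary of it:\n\n" + summary_text
    )
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=30,
        temperature=0.7,
    )
    # Note: the tokens in summary_text are paid for again here, which is why
    # carrying context forward this way gets more expensive as it grows.
    return response.choices[0].text.strip()
```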