Overview
This tutorial outlines how to build a web application that generates business name and tagline ideas using OpenAI.
Here’s a quick demo video:
The user nominates their business's industry and a concept, and the app generates a set of business name ideas using OpenAI's GPT-3 model text-davinci-002.
If you aren’t already familiar with OpenAI, it is a paid service that allows you to access AI models (including GPT-3) using a REST API.
Starter App
To help get up and running quickly, we have created a starter Laravel application that implements the above functionality including user interface elements.
The starter app is built using Laravel, Livewire and Tailwind CSS.
The starter app shows hard-coded results at the beginning, and you integrate it with OpenAI’s API to return live results.
This tutorial will guide you through how to integrate OpenAI into this starter app; however, the basic steps can also be followed with your own app if you prefer.
The tutorial uses the free Tectalic OpenAI API client for PHP to make the development process faster and minimise manual coding.
You can also find the final implementation here if you’d like to jump straight to the solution.
Requirements
To follow along, you will need the following:
- An OpenAI account. If you don’t already have one, go to openai.com and create an account.
- An already configured PHP development environment, including:
- PHP 8.0+
- composer
- git
- node and NPM
- A PHP-capable IDE, such as PhpStorm.
- Familiarity with your terminal/shell.
- Familiarity with Laravel will help but is not essential.
Build the User Interface
Clone the skeleton app from GitHub using the following terminal commands:
git clone https://github.com/tectalichq/the-ai-name-generator.git
cd the-ai-name-generator
At this point, you will have the Laravel skeleton app installed, with version 0.1 checked out.
Now we will build the JavaScript and CSS assets that are required:
npm install
npm run build
We are now ready to get started with integrating OpenAI into the application – read on for details.
Integrate the OpenAI API
Install the Tectalic OpenAI API Client
Install the Tectalic OpenAI API Client, following the installation instructions:
composer require tectalic/openai
The above command adds the package into our project.
Configure the OpenAI API Client
This step allows us to set up and easily access our OpenAI API Client from anywhere in our Laravel application.
Open your app/Providers/AppServiceProvider.php file, and add the following code block to the existing register() method:
<?php
public function register()
{
    $this->app->singleton(\Tectalic\OpenAi\Client::class, function ($app) {
        if (\Tectalic\OpenAi\Manager::isGlobal()) {
            // Tectalic OpenAI REST API Client already built.
            return \Tectalic\OpenAi\Manager::access();
        }
        /**
         * Build the Tectalic OpenAI REST API Client globally.
         */
        $auth = new \Tectalic\OpenAi\Authentication(config('services.openai.token'));
        $httpClient = new \GuzzleHttp\Client();
        return \Tectalic\OpenAi\Manager::build($httpClient, $auth);
    });
}
The above code does several things:
- It adds the OpenAI API Client (Tectalic\OpenAi\Client) to the Laravel Service Container.
- It configures the OpenAI API Client’s Authentication token based on a value in Laravel’s configuration (more on that in the following step).
- It configures the OpenAI API Client to use the already installed Guzzle HTTP Client for performing HTTP requests. Any other PSR-18 compatible HTTP client (such as the Symfony HTTP Client) could also be used.
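With this binding in place, the client can be resolved from anywhere in the application. As a quick sketch (using standard Laravel mechanisms; the method name below is just an illustration):

```php
// Resolve the client explicitly via the Service Container helper...
$client = app(\Tectalic\OpenAi\Client::class);

// ...or type-hint it in a method and let Laravel inject it automatically.
public function someMethod(\Tectalic\OpenAi\Client $client): void
{
    // Use $client here to call the OpenAI API.
}
```

We will use the injection approach later when wiring up the form submission handler.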
Authenticating the OpenAI API Client
Next, we will add a Laravel configuration setting to our app so that the OpenAI API Client uses the specified API Token.
To do so, open your config/services.php file, and add the following three lines to the existing array:
<?php
return [
'openai' => [
'token' => env('OPENAI_TOKEN'),
],
];
Next, open the .env.example file, and add the following line to it:
OPENAI_TOKEN=
Now copy .env.example to .env:
cp .env.example .env
Now go to your OpenAI API Keys page and create a new secret key.
Paste your actual secret key into your .env file, so that the OPENAI_TOKEN value is replaced with your unique OpenAI API key.
For example:
OPENAI_TOKEN=sk-1234
(where sk-1234 is your actual OpenAI secret API Key).
This means that any OpenAI API calls will be authenticated using your own OpenAI secret API Key from your .env file.
Query the OpenAI API During the Form Submission Process
Logic and UI are separated
In the skeleton app, the App\Http\Livewire\NameGenerator Livewire component implements the form and UI, while the generateNames() method in the app/Http/Livewire/NameGenerator.php file holds the logic.
Prepare the request
In this example, we will be using the OpenAI create completions endpoint, which allows us to send a prompt to OpenAI, and OpenAI will respond with one or more completions that match our context.
For more details on this OpenAI API endpoint, please consult the OpenAI Documentation.
Open the app/Http/Livewire/NameGenerator.php file, and find the generateNames() method around line 300.
This method is executed whenever the user submits the form, so we will now modify this method so that it queries the OpenAI API instead of returning hard-coded results.
First of all, this method needs access to our OpenAI API Client, so we will modify the method signature as follows:
297 /**
298 * When the form is submitted.
299 */
300 public function generateNames(): void
300 public function generateNames(\Tectalic\OpenAi\Client $client): void
301 {
This change will instruct Laravel to use its Service Container to automatically inject an OpenAI API Client instance into this method when the form is submitted, allowing us to refer to it using $client.
Next we will build an OpenAI prompt, based on the industry and concept chosen by the user in the form:
300 public function generateNames(\Tectalic\OpenAi\Client $client): void
301 {
302 $validated = $this->validate();
303
304 $this->clearValidation();
305
306 $prompt = sprintf(
307 'Give me a business name and tagline for a company in the %s industry with a %s concept',
308 $validated['industry'],
309 $validated['concept'],
310 );
If you’ve never designed an OpenAI prompt before, read OpenAI’s prompt design documentation.
Building the Request
Write the code that builds the actual API request that will be sent to the OpenAI create completions endpoint with the prompt from the previous step:
You can see from the above video that we use our OpenAI API Client instance ($client) to access the completions API handler, then call the create completion API method.
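For reference, the request-building code (matching the final implementation shown later in this tutorial) looks like this:

```php
$request = $client->completions()->create(
    new \Tectalic\OpenAi\Models\Completions\CreateRequest([
        'model' => 'text-davinci-002',
        'prompt' => $prompt,
        'max_tokens' => 2048,
        'n' => 5, // 5 completions
    ])
);
```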
The Tectalic API Client, when combined with an IDE such as PhpStorm, will show you code completion information as you type, making it simpler to understand which API Handlers and Methods are supported by the API Client.
This create completion API method expects one argument: a Tectalic\OpenAi\Models\Completions\CreateRequest class instance. This PHP class contains all the properties that you can send to a create completion API request.
The structured nature of the request parameters makes it simpler to understand the structure and information required to send the request using only your IDE – you don’t need to spend time reading and understanding the create completion documentation.
In our particular case, we are using the following properties for the request:
| Property | Description |
| --- | --- |
| model | The text-davinci-002 OpenAI model. |
| prompt | The already assembled prompt ($prompt). |
| max_tokens | We raise the default completion token limit from 16 to 2048, which ensures that OpenAI can return a sufficiently long response. See here if you'd like to learn more about OpenAI tokens. |
| n | We set this to 5 so that we get 5 completions (choices) back from OpenAI. |
If you’d like more detail on the create completion API request parameters, please see here.
Sending the Request
Now that our request is assembled, it’s time to actually send the request to the OpenAI API endpoint:
As you can see, sending the request is as simple as calling the toModel() method on our $request instance.
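In code, this is a single call (as in the final implementation shown later):

```php
/** @var \Tectalic\OpenAi\Models\Completions\CreateResponse $result */
$result = $request->toModel();
```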
Parsing the Response
The toModel() function used in the previous step actually does two things:
- It sends the API request to OpenAI.
- It parses the raw API response from OpenAI, and returns it in a structured way – in this case a Tectalic\OpenAi\Models\Completions\CreateResponse class instance.
Behind the scenes, the method is also converting the request to a JSON formatted request, sending it to the correct OpenAI endpoint, and then parsing the raw JSON formatted response, ensuring that it matches the expected format.
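For illustration, a raw create completion response from OpenAI has roughly the following JSON shape (the field values here are invented):

```json
{
  "id": "cmpl-abc123",
  "object": "text_completion",
  "created": 1660000000,
  "model": "text-davinci-002",
  "choices": [
    {
      "text": "\n\nGreenBite: fresh food, zero waste.",
      "index": 0,
      "logprobs": null,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 20,
    "completion_tokens": 12,
    "total_tokens": 32
  }
}
```

The toModel() call maps this JSON onto the CreateResponse object, so you work with typed properties rather than raw arrays.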
This structured response object makes it simpler to understand the structure and information in the OpenAI response – without needing to spend time reading the create completion response documentation.
It also enables static analysis tools for PHP, such as PHPStan or Psalm, which helps detect errors such as typos without the code being executed.
Choosing the Relevant Data From the Response
Now that we have sent the API request to OpenAI and retrieved the response, it is time to choose the relevant data from the OpenAI response and display it to our user:
The structured response objects mean you can use IDE code completion suggestions to understand what information is returned from OpenAI.
From the OpenAI response, we are only interested in the text property from the choices objects.
We use Laravel’s Array Map helper to iterate over the OpenAI response and return a simple array of result text(s).
These choices are saved to the names class property on our Livewire Component, which Livewire then automatically displays to our user in the Results section of the user interface.
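The mapping step (as in the final implementation shown later) looks like this:

```php
// Keep only the generated text from each completion choice.
$this->names = \Illuminate\Support\Arr::map(
    $result->choices,
    fn (\Tectalic\OpenAi\Models\Completions\CreateResponseChoicesItem $item) => $item->text
);
```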
Error Handling
If you were to run your app at this point, everything should work as expected. Phew!
However, we’ve only considered and implemented the successful (happy) path – we haven’t yet handled the unsuccessful (unhappy) path(s) for our API integration, including scenarios such as:
- OpenAI’s API is temporarily inaccessible or unavailable.
- A communication issue between your application and OpenAI’s API occurs.
- OpenAI’s API returns an error code such as a 400 Bad Request (in the case that the request is invalid).
The following video demonstrates a simple method to handle these error scenarios gracefully:
This error handling is relatively straightforward to implement because the Tectalic OpenAI API Client automatically throws an Exception when errors occur. Specifically, a Tectalic\OpenAi\Exception\ClientException is thrown. Consult the error handling documentation for details.
In this case, we add a try/catch block to our code, and then return a friendly error message to our user if an error does occur.
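Putting it together, the error handling (as in the final implementation shown later) wraps the toModel() call:

```php
try {
    /** @var \Tectalic\OpenAi\Models\Completions\CreateResponse $result */
    $result = $request->toModel();
    // Transform and store the results here.
} catch (\Tectalic\OpenAi\Exception\ClientException $e) {
    // Error querying OpenAI.
    // Clear any existing results and display an error message.
    $this->reset(['names']);
    $this->addError('results', __('Results are temporarily unavailable. Please try again later.'));
}
```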
The Final Implementation
In case you need it, below is the final implementation, showing for each file the lines that were added or changed.
You can also find the implementation in the develop branch on GitHub.
composer.json:
13 "laravel/tinker": "^2.7",
14 "livewire/livewire": "^2.10"
14 "livewire/livewire": "^2.10",
15 "tectalic/openai": "^1.0.0"
16 },
.env.example
59
60 # OpenAI API Token
61 OPENAI_TOKEN=
.env
59
60 # OpenAI API Token
61 OPENAI_TOKEN=sk-1234
(where sk-1234 is your actual OpenAI secret API Key).
config/services.php:
24 'openai' => [
25 'token' => env('OPENAI_TOKEN'),
26 ],
app/Providers/AppServiceProvider.php:
18 public function register()
19 {
20 $this->app->singleton(\Tectalic\OpenAi\Client::class, function ($app) {
21 if (\Tectalic\OpenAi\Manager::isGlobal()) {
22 // Tectalic OpenAI REST API Client already built.
23 return \Tectalic\OpenAi\Manager::access();
24 }
25 /**
26 * Build the Tectalic OpenAI REST API Client globally.
27 */
29 $auth = new \Tectalic\OpenAi\Authentication(config('services.openai.token'));
30 $httpClient = new \GuzzleHttp\Client();
31 return \Tectalic\OpenAi\Manager::build($httpClient, $auth);
32 });
33 }
app/Http/Livewire/NameGenerator.php:
297 /**
298 * When the form is submitted.
299 */
300 public function generateNames(): void
300 public function generateNames(\Tectalic\OpenAi\Client $client): void
301 {
302 $validated = $this->validate();
303
304 $this->clearValidation();
305
306 $prompt = sprintf(
307 'Give me a business name and tagline for a company in the %s industry with a %s concept',
308 $validated['industry'],
309 $validated['concept'],
310 );
311
312 $request = $client->completions()->create(
313 new \Tectalic\OpenAi\Models\Completions\CreateRequest([
314 'model' => 'text-davinci-002',
315 'prompt' => $prompt,
316 'max_tokens' => 2048,
317 'n' => 5 // 5 completions
318 ])
319 );
320
321 try {
322 /** @var \Tectalic\OpenAi\Models\Completions\CreateResponse $result */
323 $result = $request->toModel();
324 // Transform the result, as we only need to use the text from each completion choice.
325 $this->names = \Illuminate\Support\Arr::map($result->choices, function (\Tectalic\OpenAi\Models\Completions\CreateResponseChoicesItem $item) {
326 return $item->text;
327 });
328 } catch (\Tectalic\OpenAi\Exception\ClientException $e) {
329 // Error querying OpenAI.
330 // Clear any existing results and display an error message.
331 $this->reset(['names']);
332 $this->addError('results', __('Results are temporarily unavailable. Please try again later.'));
333 }
334 }
Run Your App
At this point, we will run our app and give it a try in our web browser:
Congratulations – we’re done!
The app should now be fully functional. Take a few minutes to experiment with various industries and concepts, and entertain yourself with the results.