Controlling the length of Completions

The main way to control the length of your completion is the maximum tokens setting (the max_tokens parameter in the API). In the Playground, this setting is called “Response Length.” Requests can use up to 2,049 tokens, shared between the prompt and the completion.
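As a minimal sketch of what this looks like outside the Playground, the snippet below sets max_tokens on a completion request using the legacy openai Python library (v0.x); the engine name and prompt are placeholders, not values from this article.

```python
# Minimal sketch (legacy openai-python v0.x): max_tokens caps the completion length,
# just like "Response Length" in the Playground. Engine and prompt are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

response = openai.Completion.create(
    engine="davinci",                 # assumed engine; any completions-capable model works
    prompt="A list of science fiction books:",
    max_tokens=200,                   # the "Response Length" setting
)
print(response["choices"][0]["text"])
```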

Let's compare the Response Length of the science fiction book list maker and classification example prompts.

Science Fiction Book List Maker

Classification

The book list maker has a relatively high Response Length value of 200, as the task is to produce a list of up to 10 books. On the other hand, the classification example has a Response Length value of just 6, as the task is to determine the category of a company in just a few words. Given “FedEx” as an input, the completion is just “Logistics, Transportation.”
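To see how a short Response Length plays out in practice, here is a hedged sketch of the classification case. The prompt text is an approximation of the preset, not its exact wording; the small max_tokens value keeps the answer to a few words.

```python
# Sketch: a small max_tokens (6) keeps the completion to a short category label.
# The prompt below is an approximation of the classification preset, not its exact text.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

response = openai.Completion.create(
    engine="davinci",                 # assumed engine
    prompt="The following is a list of companies and the categories they fall into:\n\n"
           "Facebook: Social media, Technology\n"
           "Uber: Transportation, Technology\n"
           "FedEx:",
    max_tokens=6,                     # "Response Length" of 6
    temperature=0,
)
print(response["choices"][0]["text"].strip())   # e.g. "Logistics, Transportation"
```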

In addition to setting the Response Length value, there are a few other strategies you can use to control the length of output:

Give Instructions

Provide instructions that specify the desired output length, such as a specific number of items in a list. This works especially well with the Instruct series of models. Going back to the science fiction book list maker example, you could explicitly instruct the model to generate a list of ten books. The prompt would then be “A list of ten science fiction books” instead of “Science fiction books.”
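As a sketch, the change amounts to nothing more than rewording the prompt; the engine name below is an assumed Instruct-series model, not one named in this article.

```python
# Sketch: the instruction itself specifies the desired count.
# "davinci-instruct-beta" is an assumed Instruct-series engine name.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

response = openai.Completion.create(
    engine="davinci-instruct-beta",
    prompt="A list of ten science fiction books:",   # instead of "Science fiction books"
    max_tokens=200,
)
print(response["choices"][0]["text"])
```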

Add examples of a specific length

The API is good at recognizing patterns and will take the length of your examples into account when generating text. By providing one or more examples of the desired output length, you give the model the context it needs about how long the completion should be. In the prompt sketched below, we modified the science fiction book list maker to create a list of five books in any given genre. We provided an example with five list items, in addition to giving explicit instructions to create a list of five books.
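The original screenshot of that prompt is not reproduced here, so the sketch below approximates its structure: an instruction asking for five books, one worked example containing exactly five numbered items, a “###” separator, and the opening “1.” of the list the model should complete. The genre and titles are illustrative.

```python
# Approximate reconstruction of the five-book list-maker prompt described above.
# The genre and titles are illustrative; what matters is the worked example with
# exactly five numbered items, the "###" separator, and the "1." that starts the
# list the model should complete.
few_shot_prompt = """Create a list of five books in the given genre.

Genre: Fantasy
1. The Hobbit
2. A Wizard of Earthsea
3. The Name of the Wind
4. Mistborn
5. The Fifth Season
###
Genre: Science fiction
1."""
```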

Strategic Stop Sequences

In the example above, the stop sequences are “###” and “6.”. If the API attempts to generate a sixth list item, it will run into the “6.” stop sequence and stop there. In a similar way, a period can be used as a stop sequence to generate a single sentence. You can learn more in the Stop Sequences article.
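Continuing the sketch above, the same stop sequences can be passed through the API’s stop parameter, so generation halts before a sixth item or a new example block begins.

```python
# Sketch (continuing from few_shot_prompt above): the stop parameter mirrors the
# Playground's stop sequences. Generation halts before "###" or "6.", so at most
# five list items are produced even though max_tokens would allow more.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

response = openai.Completion.create(
    engine="davinci",              # assumed engine
    prompt=few_shot_prompt,        # the prompt defined in the previous sketch
    max_tokens=200,
    stop=["###", "6."],
)
print("1." + response["choices"][0]["text"])
```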

Note: There is not currently a way to set a minimum number of tokens.
