Artificial Intelligence 5 min read

Introducing the GPT‑3.5 Turbo API: Features, Pricing, and Sample Node.js Integration

The article announces the GPT‑3.5 Turbo API, highlights its ten‑fold lower price of $0.002 per 1k tokens, explains the new chat‑completion format with messages, and provides a practical Node.js example along with various application ideas.

Rare Earth Juejin Tech Community

Before work I checked OpenAI news and discovered the launch of the GPT‑3.5 API, specifically the gpt-3.5-turbo model, which uses the same underlying model as the ChatGPT interface and includes improvements.

The ChatGPT model family we are releasing today, gpt-3.5-turbo, is the same model used in the ChatGPT product.

The gpt-3.5-turbo model is priced at $0.002 per 1,000 tokens, making it ten times cheaper than the previous Davinci models.

It is priced at $0.002 per 1k tokens, which is 10x cheaper than our existing GPT‑3.5 models.

Integrating the API is straightforward; you can switch from the text-davinci-003 model to gpt-3.5-turbo with minimal changes.

It’s also our best model for many non‑chat use cases—we’ve seen early testers migrate from text-davinci-003 to gpt-3.5-turbo with only a small amount of adjustment needed to their prompts.
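In practice, the migration mostly means wrapping the old prompt string in a messages array. The sketch below compares the two request bodies; field names follow the OpenAI REST API, and the prompt text is illustrative:

```javascript
// Old-style completion request body (text-davinci-003):
const legacyBody = {
    model: "text-davinci-003",
    prompt: "Translate 'hello' into French.",
};

// Equivalent chat-style body for gpt-3.5-turbo:
const chatBody = {
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Translate 'hello' into French." }],
};

// The prompt text itself is unchanged; only the envelope differs.
console.log(chatBody.messages[0].content === legacyBody.prompt); // true
```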

Based on the gpt-3.5-turbo model, you can build AI chat scenarios as well as many other applications, for example:

Draft emails or other written documents.

Generate Python or other code snippets.

Answer questions related to documentation.

Create an intelligent AI customer service.

Enable natural‑language processing in your applications.

Serve as a pseudo‑expert in a specific domain.

Provide dialogue for NPCs in games.

The new completion interface uses a messages array where each entry contains a role and content, offering a more semantic way to convey conversation context.
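For example, a short conversation with a system instruction looks like this (the role names "system", "user", and "assistant" follow the chat completion format; the content strings are illustrative):

```javascript
// Each entry in the messages array carries a role and content.
const messages = [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What does gpt-3.5-turbo cost?" },
    { role: "assistant", content: "$0.002 per 1,000 tokens." },
];

// The order of the array is the order of the conversation:
console.log(messages.map((m) => m.role).join(" -> "));
// system -> user -> assistant
```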

Authentication is performed via a Bearer token included in the Authorization header.

Authorization: Bearer ${your api token}
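Putting the header and body together, a raw request to the chat completion endpoint can be assembled as below. The endpoint URL follows OpenAI's REST API; the actual network call is commented out so the sketch stays self-contained, and the API key is a placeholder:

```javascript
// Sketch: build a chat completion request with Bearer authentication.
const apiKey = process.env.OPENAI_API_KEY || "your-api-token";

const request = {
    url: "https://api.openai.com/v1/chat/completions",
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
        model: "gpt-3.5-turbo",
        messages: [{ role: "user", content: "hello" }],
    }),
};

// With a real key, you would send it, e.g.:
// const res = await fetch(request.url, request);
console.log(request.headers.Authorization.startsWith("Bearer ")); // true
```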

The official OpenAI Python library already supports the new endpoint, while the npm openai package is still catching up. To help developers, a simple wrapper called gpt-node has been published to npm.

Using the wrapper is easy: instantiate it with your token and call the api.completions method.

const api = new ChatGPT35("your token")

const result = await api.completions({
    messages: [
        { role: "assistant", content: "hello" },
        { role: "user", content: "Who are you?" }
    ]
})

console.log(result)

The ChatGPT experience continues to improve, with costs decreasing and capabilities expanding, suggesting the field will become increasingly competitive and innovative.

Artificial Intelligence · Node.js · API · OpenAI · pricing · GPT-3.5
Written by

Rare Earth Juejin Tech Community

Juejin, a tech community that helps developers grow.
