The DeepSeek API has just received a significant upgrade, equipping developers with powerful new interface features designed to unlock the full potential of large language models (LLMs). This update focuses on enhanced control over output formatting, expanded creative capabilities, and improved interaction with external tools and systems. Let's dive into the key improvements:
The latest DeepSeek API update introduces the following capabilities:
All of these new features are compatible with both the deepseek-chat and deepseek-coder models.
The /chat/completions interface has been significantly enhanced with the following key additions:
The new JSON Output feature provides a robust mechanism for ensuring that the DeepSeek API returns data in a consistent, easily parsable JSON format. This is particularly useful for tasks involving data extraction, analysis, or integration with other systems.
How to leverage JSON Output:
- Set the response_format parameter to {'type': 'json_object'}.
- Set a sufficiently large max_tokens parameter to prevent truncation of the JSON string.

For a deeper understanding, refer to the comprehensive JSON Output Guide.
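As an illustration, here is a minimal sketch using the OpenAI-compatible Python SDK; the model name comes from this update, while the prompt, placeholder API key, and expected output are purely illustrative:

```python
from openai import OpenAI

# Placeholder key; the base_url points at the standard DeepSeek endpoint.
client = OpenAI(api_key="<your_api_key>", base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        # Mentioning JSON and showing the desired shape in the prompt helps
        # the model produce exactly the structure you want to parse.
        {
            "role": "system",
            "content": 'Extract the city and temperature, replying only with JSON like {"city": "...", "temp_c": 0}.',
        },
        {"role": "user", "content": "It was 31 degrees Celsius in Madrid today."},
    ],
    response_format={"type": "json_object"},
    max_tokens=512,  # leave headroom so the JSON string is not cut off
)

print(response.choices[0].message.content)  # e.g. {"city": "Madrid", "temp_c": 31}
```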
DeepSeek API's new Function Calling capability allows models to interact with external tools and APIs, significantly expanding their functionality and application scope. This feature enables LLMs to perform tasks beyond simple text generation, such as retrieving real-time information, controlling devices, or automating complex workflows.
Consider exploring open-source frontends like LobeChat to integrate deepseek-coder with Function Calling.
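The sketch below shows one way to declare a tool and read back the model's tool call, assuming the OpenAI-compatible tools format; the get_weather function, its schema, and the prompt are hypothetical:

```python
import json

from openai import OpenAI

client = OpenAI(api_key="<your_api_key>", base_url="https://api.deepseek.com")

# A hypothetical tool definition the model may choose to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "How hot is it in Hangzhou right now?"}],
    tools=tools,
)

# Inspect the tool call the model requested; your code would then execute it
# and send the result back in a follow-up "tool" message.
tool_call = response.choices[0].message.tool_calls[0]
print(tool_call.function.name, json.loads(tool_call.function.arguments))
```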
The Dialogue Prefix Completion feature provides fine-grained control over the model's output by allowing users to specify a prefix for the final assistant message, which the model then continues.
Enabling and Utilizing Dialogue Prefix Completion:
- Set base_url to https://api.deepseek.com/beta to access the Beta features.
- Ensure that the last message in the messages list has the role of "assistant".
- Set the prefix parameter to True for that last message, as in the sketch below.
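Here is a minimal sketch, assuming the OpenAI-compatible Python SDK passes the extra prefix field through on the final assistant message; the prompt and the code-fence prefix are illustrative:

```python
from openai import OpenAI

# Note the Beta base_url, which is required for prefix completion.
client = OpenAI(api_key="<your_api_key>", base_url="https://api.deepseek.com/beta")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "user", "content": "Write a one-line Python function that adds two numbers."},
        # The final assistant message carries the prefix; generation continues
        # from this exact text, so the reply starts inside a Python code block.
        {"role": "assistant", "content": "```python\n", "prefix": True},
    ],
    stop=["```"],  # optional: stop once the code block is closed
)

print(response.choices[0].message.content)
```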
To cater to scenarios requiring more extensive text generation, the Beta version of the API extends the max_tokens parameter limit to 8K. This unlocks the potential for creating more detailed and nuanced content, accommodating complex narratives, in-depth analyses, or comprehensive code generation.
How to access extended output:
- Set base_url to https://api.deepseek.com/beta.
- Set the max_tokens parameter up to a maximum value of 8192, as shown in the sketch below.
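A minimal sketch of a long-form request on the Beta endpoint; the model name matches this update, while the prompt and placeholder API key are illustrative:

```python
from openai import OpenAI

client = OpenAI(api_key="<your_api_key>", base_url="https://api.deepseek.com/beta")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "user", "content": "Write a detailed design document for a URL-shortening service."}
    ],
    max_tokens=8192,  # Beta raises the output ceiling to 8K tokens
)

print(response.choices[0].message.content)
```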
The update also introduces a brand new /completions interface with the following features:
The FIM (Fill-In-the-Middle) Completion interface empowers users to seamlessly complete content by providing a custom prefix and suffix, with the model filling in the missing section. This is extremely valuable for tasks such as code completion, where the surrounding context is known and only a middle segment needs to be generated.
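As a rough sketch, and assuming the Beta /completions endpoint accepts prompt and suffix fields through the OpenAI-compatible SDK, a FIM request could look like this; the function being completed is illustrative:

```python
from openai import OpenAI

client = OpenAI(api_key="<your_api_key>", base_url="https://api.deepseek.com/beta")

response = client.completions.create(
    model="deepseek-chat",
    prompt="def fibonacci(n):\n",      # everything before the gap
    suffix="\nprint(fibonacci(10))",   # everything after the gap
    max_tokens=128,
)

print(response.choices[0].text)  # the generated middle section
```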
To start experimenting with the new Beta features, set base_url to https://api.deepseek.com/beta when making your request.

It's important to note that Beta interfaces are experimental and subject to change. Be sure to stay updated with the latest documentation and announcements on the DeepSeek Platform. You can view the original publications here.
The latest DeepSeek API update represents a significant step forward in the control and power available when working with LLMs. The new JSON Output, Function Calling, Dialogue Prefix Completion, FIM, and extended-generation capabilities give developers the resources they need to build smarter, more interactive, and more creative AI-powered applications.