Large language models (LLMs) are rapidly evolving, and function calling is a feature that significantly extends their capabilities. With the DeepSeek API, function calling allows the model to interact with external tools and services, making it more versatile and practical for real-world applications. This article delves into the concept of function calling within the DeepSeek API, explores its benefits, and provides a practical example to get you started.
Function calling is a feature that enables an LLM to dynamically request information from and utilize external tools. Instead of merely generating text, the model can now trigger specific actions based on the user's input. In essence, the model identifies when it needs external data or functionality and formulates a request to use a specific tool or function.
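Conceptually, when the model decides it needs a tool, its reply carries a structured tool call instead of text. A minimal, hand-built illustration of that shape in the OpenAI-compatible format the DeepSeek API follows (the field values here are illustrative; real responses are objects returned by the client library):

```python
import json

# A simplified example of an assistant message containing a tool call.
assistant_message = {
    "role": "assistant",
    "content": None,  # no text reply; the model asks to call a tool instead
    "tool_calls": [
        {
            "id": "call_0",
            "type": "function",
            "function": {
                "name": "get_weather",
                # arguments arrive as a JSON-encoded string, not a dict
                "arguments": json.dumps({"location": "Hangzhou"}),
            },
        }
    ],
}

# The application decodes the arguments before invoking the real function.
args = json.loads(assistant_message["tool_calls"][0]["function"]["arguments"])
print(args["location"])  # → Hangzhou
```

Note that the arguments field is a JSON string rather than a nested object, so the application must decode it before calling the actual function.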
The DeepSeek API offers a robust function calling feature, allowing developers to seamlessly integrate external tools into their applications.
Let's examine a practical example of using function calling with the DeepSeek API to get the current weather for a user's location. Note: this example requires the OpenAI Python library, since the DeepSeek API is OpenAI-compatible.
from openai import OpenAI

client = OpenAI(
    api_key="<your api key>",  # Replace with your actual API key
    base_url="https://api.deepseek.com",
)

# The tools list describes the functions the model may request.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a location; the user should supply a location first",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    }
                },
                "required": ["location"],
            },
        },
    },
]

def send_messages(messages):
    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=messages,
        tools=tools,
    )
    return response.choices[0].message

messages = [{"role": "user", "content": "How's the weather in Hangzhou?"}]
message = send_messages(messages)
print(f"User>\t {messages[0]['content']}")

# The model responds with a tool call instead of a text answer
tool = message.tool_calls[0]
messages.append(message)

# The application runs the tool and appends its result as a "tool" message;
# here the tool returns the current temperature
messages.append({"role": "tool", "tool_call_id": tool.id, "content": "24℃"})
message = send_messages(messages)
print(f"Model>\t {message.content}")
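In the example above the tool's result ("24℃") is hard-coded; in a real application you would implement get_weather yourself and feed its return value back to the model. A minimal sketch of that step, with a stand-in data source instead of a real weather service:

```python
import json

# Hypothetical local implementation; a real one would call a weather API.
def get_weather(location: str) -> str:
    fake_data = {"Hangzhou": "24℃", "San Francisco, CA": "18℃"}
    return fake_data.get(location, "unknown")

# Given a tool call's JSON-encoded arguments, as the API returns them...
arguments = '{"location": "Hangzhou"}'
result = get_weather(**json.loads(arguments))
print(result)  # → 24℃

# ...the result becomes the content of a "tool" message, as in the example:
tool_message = {"role": "tool", "tool_call_id": "call_0", "content": result}
```

The tool_call_id ties the result back to the specific call the model made, so the model knows which request the data answers.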
Explanation:

- The tools list defines the available functions. In this case there is a single function, get_weather, which requires a location parameter.
- When the user asks about the weather in Hangzhou, the model recognizes that it needs external data and responds with a tool call equivalent to get_weather({location: 'Hangzhou'}).
- The application executes the get_weather function, which fetches the weather data and returns it to the model. In this example, the function returns "24℃".
- The model then incorporates the tool's result into its final text answer.

This showcases the power of function calling: the LLM intelligently determines when it needs external data and seamlessly integrates it to provide a comprehensive answer. Please refer to the DeepSeek API Chat Completion documentation for detailed information on the API format for function calling.
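Once an application registers more than one tool, a small dispatch table keeps the handling generic. A sketch under the assumption of two illustrative stub handlers (get_weather and get_time are hypothetical names, not part of the DeepSeek API):

```python
import json

# Illustrative stub handlers; real ones would query external services.
def get_weather(location: str) -> str:
    return "24℃"

def get_time(timezone: str) -> str:
    return "14:30"

# Map each tool name from the tools list to its handler function.
HANDLERS = {"get_weather": get_weather, "get_time": get_time}

def run_tool_call(name: str, arguments_json: str) -> str:
    """Look up the handler for a tool call and invoke it with decoded args."""
    handler = HANDLERS[name]
    return handler(**json.loads(arguments_json))

print(run_tool_call("get_weather", '{"location": "Hangzhou"}'))  # → 24℃
```

With this pattern, adding a new capability means writing one handler and one entry in the tools list; the message loop itself does not change.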
Function calling is a significant advancement in LLM technology, and the DeepSeek API provides a powerful platform for leveraging this feature. By enabling models to interact with external tools, function calling unlocks new possibilities for building intelligent applications across various domains.