
DuckDuckGoSearch

This notebook provides a quick overview for getting started with DuckDuckGoSearch. For detailed documentation of all DuckDuckGoSearch features and configurations, head to the API reference.

DuckDuckGoSearch offers a privacy-focused search API designed for LLM Agents. It provides seamless integration with a wide range of data sources, prioritizing user privacy and relevant search results.

Overview​

Integration details​

| Class | Package | Serializable | PY support | Package latest |
| --- | --- | --- | --- | --- |
| DuckDuckGoSearch | @langchain/community | beta | ✅ | NPM - Version |

Setup​

The integration lives in the @langchain/community package, along with the duck-duck-scrape dependency:

yarn add @langchain/community duck-duck-scrape

Credentials​

This tool does not require any credentials to use. It’s also helpful (but not required) to set up LangSmith for best-in-class observability:

process.env.LANGCHAIN_TRACING_V2 = "true";
process.env.LANGCHAIN_API_KEY = "your-api-key";

Instantiation​

Here we show how to instantiate an instance of the DuckDuckGoSearch tool:

import { DuckDuckGoSearch } from "@langchain/community/tools/duckduckgo_search";

const tool = new DuckDuckGoSearch({ maxResults: 1 });

Invocation​

Invoke directly with args​

await tool.invoke("What is Anthropic's estimated revenue for 2024?");
[{"title":"Anthropic forecasts more than $850 mln in annualized revenue rate by ...","link":"https://www.reuters.com/technology/anthropic-forecasts-more-than-850-mln-annualized-revenue-rate-by-2024-end-report-2023-12-26/","snippet":"Dec 26 (Reuters) - Artificial intelligence startup <b>Anthropic</b> has projected it will generate more than $850 million in annualized <b>revenue</b> by the end of <b>2024</b>, the Information reported on Tuesday ..."}]
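Note that the tool returns its results as a JSON-encoded string rather than a parsed array, so downstream code will typically deserialize it before use. A minimal sketch of doing so (the `SearchResult` interface mirrors the fields in the output above; it is our own naming for illustration, not a type exported by the package):

```typescript
// Shape of one entry in the tool's JSON output (our own naming, not an exported type).
interface SearchResult {
  title: string;
  link: string;
  snippet: string;
}

// Example payload in the same shape as the tool output shown above.
const raw = JSON.stringify([
  {
    title: "Anthropic forecasts more than $850 mln in annualized revenue rate by ...",
    link: "https://www.reuters.com/technology/anthropic-forecasts-more-than-850-mln-annualized-revenue-rate-by-2024-end-report-2023-12-26/",
    snippet: "Dec 26 (Reuters) - Artificial intelligence startup Anthropic has projected ...",
  },
]);

// Deserialize and pull out just the links.
const results: SearchResult[] = JSON.parse(raw);
const links = results.map((r) => r.link);
```

In a real invocation you would pass the string returned by `tool.invoke(...)` to `JSON.parse` instead of the sample payload.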

Invoke with ToolCall​

We can also invoke the tool with a model-generated ToolCall, in which case a ToolMessage will be returned:

// This is usually generated by a model, but we'll create a tool call directly for demo purposes.
const modelGeneratedToolCall = {
  args: {
    input: "What is Anthropic's estimated revenue for 2024?",
  },
  id: "tool_call_id",
  name: tool.name,
  type: "tool_call",
};
await tool.invoke(modelGeneratedToolCall);
ToolMessage {
  "content": "[{\"title\":\"Anthropic forecasts more than $850 mln in annualized revenue rate by ...\",\"link\":\"https://www.reuters.com/technology/anthropic-forecasts-more-than-850-mln-annualized-revenue-rate-by-2024-end-report-2023-12-26/\",\"snippet\":\"Dec 26 (Reuters) - Artificial intelligence startup <b>Anthropic</b> has projected it will generate more than $850 million in annualized <b>revenue</b> by the end of <b>2024</b>, the Information reported on Tuesday ...\"}]",
  "name": "duckduckgo-search",
  "additional_kwargs": {},
  "response_metadata": {},
  "tool_call_id": "tool_call_id"
}

Chaining​

We can use our tool in a chain by first binding it to a tool-calling model and then calling it:

Pick your chat model:

Install dependencies

yarn add @langchain/openai 

Add environment variables

OPENAI_API_KEY=your-api-key

Instantiate the model

import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0,
});
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { RunnableConfig } from "@langchain/core/runnables";
import { AIMessage } from "@langchain/core/messages";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{user_input}"],
  ["placeholder", "{messages}"],
]);

// Specifying tool_choice will force the model to call this tool.
const llmWithTools = llm.bindTools([tool], {
  tool_choice: tool.name,
});

const llmChain = prompt.pipe(llmWithTools);

const toolChain = async (
  userInput: string,
  config?: RunnableConfig
): Promise<AIMessage> => {
  const input_ = { user_input: userInput };
  const aiMsg = await llmChain.invoke(input_, config);
  const toolMsgs = await tool.batch(aiMsg.tool_calls, config);
  return llmChain.invoke({ ...input_, messages: [aiMsg, ...toolMsgs] }, config);
};
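The `toolChain` helper implements the standard two-pass tool-calling loop: the first model call emits tool calls, the tool executes them, and a second model call answers with the tool results in context. The control flow can be sketched independently of any model (`fakeModel` and `fakeTool` below are hypothetical stand-ins for illustration, not LangChain APIs):

```typescript
// Hypothetical stand-ins for the model and tool, to show the control flow only.
type ToolCall = { name: string; args: { input: string }; id: string };

const fakeModel = async (
  messages: string[]
): Promise<{ toolCalls: ToolCall[]; content: string }> =>
  messages.some((m) => m.startsWith("tool:"))
    ? // Second pass: tool output is in context, so answer directly.
      { toolCalls: [], content: "final answer based on tool output" }
    : // First pass: no tool output yet, so request a tool call.
      { toolCalls: [{ name: "search", args: { input: messages[0] }, id: "call_1" }], content: "" };

const fakeTool = async (call: ToolCall): Promise<string> =>
  `tool:${call.name} result for "${call.args.input}"`;

// Pass 1: the model decides which tools to call.
const first = await fakeModel(["What is Anthropic's estimated revenue for 2024?"]);
// Run every requested tool call.
const toolOutputs = await Promise.all(first.toolCalls.map(fakeTool));
// Pass 2: the model answers with the tool results appended to the conversation.
const second = await fakeModel([
  "What is Anthropic's estimated revenue for 2024?",
  ...toolOutputs,
]);
```

`toolChain` above does exactly this, with `llmChain.invoke` in place of `fakeModel` and `tool.batch` in place of the `Promise.all` over tool calls.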

const toolChainResult = await toolChain(
  "What is Anthropic's estimated revenue for 2024?"
);
const { tool_calls, content } = toolChainResult;

console.log(
  "AIMessage",
  JSON.stringify(
    {
      tool_calls,
      content,
    },
    null,
    2
  )
);
AIMessage {
  "tool_calls": [
    {
      "name": "duckduckgo-search",
      "args": {
        "input": "Anthropic revenue 2024 forecast"
      },
      "type": "tool_call",
      "id": "call_E22L1T1bI6xPrMtL8wrKW5C5"
    }
  ],
  "content": ""
}

With an agent​

We can also pass the DuckDuckGoSearch tool to an agent:

import { DuckDuckGoSearch } from "@langchain/community/tools/duckduckgo_search";
import { ChatOpenAI } from "@langchain/openai";
import type { ChatPromptTemplate } from "@langchain/core/prompts";

import { pull } from "langchain/hub";
import { AgentExecutor, createOpenAIFunctionsAgent } from "langchain/agents";

// Define the tools the agent will have access to.
const toolsForAgent = [new DuckDuckGoSearch({ maxResults: 1 })];

// Get the prompt to use - you can modify this!
// If you want to see the prompt in full, you can view it at:
// https://smith.langchain.com/hub/hwchase17/openai-functions-agent
const promptForAgent = await pull<ChatPromptTemplate>(
  "hwchase17/openai-functions-agent"
);
const llmForAgent = new ChatOpenAI({
  model: "gpt-4-turbo-preview",
  temperature: 0,
});
const agent = await createOpenAIFunctionsAgent({
  llm: llmForAgent,
  tools: toolsForAgent,
  prompt: promptForAgent,
});
const agentExecutor = new AgentExecutor({
  agent,
  tools: toolsForAgent,
});
const agentResult = await agentExecutor.invoke({
  input: "What is Anthropic's estimated revenue for 2024?",
});

console.log(agentResult);
{
  input: "What is Anthropic's estimated revenue for 2024?",
  output: 'Anthropic has projected that it will generate more than $850 million in annualized revenue by the end of 2024.'
}

API reference​

For detailed documentation of all DuckDuckGoSearch features and configurations, head to the API reference.

