LangChain Integration

Pinecall integrates seamlessly with LangChain, allowing you to reuse your existing LangChain agents for voice interactions. Our plugin acts as a WebSocket interceptor, enabling real-time voice communication with your LangChain workflows.

Prerequisites:

  • A Pinecall account with API access
  • LangChain installed in your project
  • Familiarity with LangChain agents
  • Node.js (v14+) or Python (v3.8+)

Installing the Plugin

First, install the Pinecall LangChain plugin in your project:

terminal
# For JavaScript/TypeScript
npm install @pinecall/langchain-plugin
# For Python
pip install pinecall-langchain

WebSocket Interceptor Architecture

The Pinecall LangChain plugin works as a WebSocket interceptor that sits between your LangChain agents and Pinecall's voice AI infrastructure:

LangChain Agent  ⇄  Pinecall WebSocket Interceptor  ⇄  Pinecall Voice AI

This architecture provides several benefits:

  • Real-time bidirectional communication between your LangChain agent and the voice call
  • Seamless transformation of text to speech and speech to text
  • Preservation of your LangChain agent's context and memory throughout the conversation
  • Access to all of Pinecall's voice capabilities like emotion detection and voice customization
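
To make this flow concrete, here is a rough sketch of the per-turn loop the interceptor performs on your behalf. You never write this code yourself; it only illustrates how the caller's speech becomes agent input and how the agent's reply becomes speech. The WebSocket message shapes ('transcript', 'response') are assumptions for illustration, not Pinecall's actual wire protocol.

conceptual-flow.js (illustrative only)
import { WebSocketServer } from 'ws';
import { ConversationChain } from 'langchain/chains';
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { BufferMemory } from 'langchain/memory';

// A LangChain agent like the ones built in the sections below.
const chain = new ConversationChain({
  llm: new ChatOpenAI({ temperature: 0, modelName: 'gpt-4' }),
  memory: new BufferMemory(),
});

// Conceptual interceptor loop. The plugin implements this for you; the
// 'transcript' / 'response' message shapes are illustrative assumptions.
const wss = new WebSocketServer({ port: 3000 });

wss.on('connection', (socket) => {
  socket.on('message', async (raw) => {
    const event = JSON.parse(raw.toString());
    if (event.type !== 'transcript') return;

    // Pinecall transcribes the caller's speech to text, the agent produces a
    // reply (its memory persists across turns), and the reply is sent back
    // for text-to-speech playback.
    const result = await chain.call({ input: event.text });
    socket.send(JSON.stringify({ type: 'response', text: result.response }));
  });
});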

JavaScript/TypeScript Integration

Here's how to integrate your LangChain agent with Pinecall using our WebSocket interceptor:

langchain-integration.js
import { Pinecall } from '@pinecall/sdk';
import { PinecallLangChainPlugin } from '@pinecall/langchain-plugin';
import { ConversationChain } from 'langchain/chains';
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { BufferMemory } from 'langchain/memory';

// Initialize Pinecall
const pinecall = new Pinecall({
  apiKey: process.env.PINECALL_API_KEY
});

// Create your LangChain agent
const model = new ChatOpenAI({
  temperature: 0,
  modelName: 'gpt-4'
});
const memory = new BufferMemory();
const chain = new ConversationChain({ llm: model, memory });

// Initialize the Pinecall LangChain plugin
const plugin = new PinecallLangChainPlugin({
  pinecall,
  agent: chain,
  voiceConfig: {
    voice: 'sophia',
    language: 'en-US'
  },
  // Optional: Configure how to handle the conversation
  config: {
    interceptMode: 'realtime',    // 'realtime' or 'turnbased'
    contextPreservation: true,
    transcriptHandling: 'memory'  // Add transcript to agent memory
  }
});

// Start the WebSocket server
plugin.listen(3000);
console.log('LangChain interceptor running on ws://localhost:3000');

// Connect the plugin to a phone number (optional)
async function connectToPhoneNumber() {
  const phoneNumber = await pinecall.phoneNumbers.update('pn_12345', {
    agentConfig: {
      type: 'websocket',
      endpoint: 'ws://your-public-endpoint:3000'
    }
  });
  console.log(`Phone number ${phoneNumber.number} connected to LangChain agent`);
}

connectToPhoneNumber();

Python Integration

langchain_integration.py
import asyncio
import os

from pinecall import Pinecall
from pinecall_langchain import PinecallLangChainPlugin
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

# Initialize Pinecall
pinecall = Pinecall(api_key=os.environ["PINECALL_API_KEY"])

# Create your LangChain agent
llm = ChatOpenAI(temperature=0, model_name="gpt-4")
memory = ConversationBufferMemory()
conversation = ConversationChain(llm=llm, memory=memory)

# Initialize the Pinecall LangChain plugin
plugin = PinecallLangChainPlugin(
    pinecall=pinecall,
    agent=conversation,
    voice_config={
        "voice": "sophia",
        "language": "en-US"
    },
    # Optional: Configure how to handle the conversation
    config={
        "intercept_mode": "realtime",    # 'realtime' or 'turnbased'
        "context_preservation": True,
        "transcript_handling": "memory"  # Add transcript to agent memory
    }
)

# Start the WebSocket server
plugin.listen(3000)
print("LangChain interceptor running on ws://localhost:3000")

# Connect the plugin to a phone number (optional)
async def connect_to_phone_number():
    phone_number = await pinecall.phone_numbers.update("pn_12345", {
        "agent_config": {
            "type": "websocket",
            "endpoint": "ws://your-public-endpoint:3000"
        }
    })
    print(f"Phone number {phone_number.number} connected to LangChain agent")

asyncio.run(connect_to_phone_number())

Configuring the Interceptor

The Pinecall LangChain plugin offers several configuration options to fine-tune how it interacts with your LangChain agent:

Option                Description                                      Values                           Default
interceptMode         How the agent interacts with the conversation    'realtime', 'turnbased'          'realtime'
contextPreservation   Whether to maintain conversation context         true, false                      true
transcriptHandling    How to handle call transcripts                   'memory', 'none', 'function'     'memory'
streaming             Whether to use streaming responses               true, false                      true
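
For example, the sketch below configures a turn-based agent that neither streams partial responses nor stores the call transcript. It reuses the `pinecall` and `chain` objects from the JavaScript example above and uses only option names and values from the table; whether this combination fits your use case depends on how interactive the call needs to be, and defaults may vary by plugin version.

turnbased-config.js (example)
// Turn-based configuration sketch using only the options and values from the
// table above. `pinecall` and `chain` come from the earlier JavaScript example.
const plugin = new PinecallLangChainPlugin({
  pinecall,
  agent: chain,
  voiceConfig: { voice: 'sophia', language: 'en-US' },
  config: {
    interceptMode: 'turnbased',   // respond only after the caller finishes a turn
    contextPreservation: true,    // keep the agent's memory across turns
    transcriptHandling: 'none',   // don't write the call transcript into memory
    streaming: false              // send complete responses rather than streamed chunks
  }
});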

Advanced Usage: Custom Tools Integration

You can integrate LangChain tools with Pinecall to enable your voice agent to perform actions like querying databases, accessing APIs, and more:

advanced-langchain.js
import { Pinecall } from '@pinecall/sdk';
import { PinecallLangChainPlugin } from '@pinecall/langchain-plugin';
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { ConversationChain } from 'langchain/chains';
import { BufferMemory } from 'langchain/memory';
import { SerpAPI } from 'langchain/tools';
import { Calculator } from 'langchain/tools/calculator';
import { initializeAgentExecutorWithOptions } from 'langchain/agents';

// Initialize Pinecall
const pinecall = new Pinecall({
  apiKey: process.env.PINECALL_API_KEY
});

// Set up LangChain tools
const tools = [
  new SerpAPI(process.env.SERPAPI_API_KEY, {
    location: 'San Francisco,California,United States',
    hl: 'en',
    gl: 'us',
  }),
  new Calculator(),
];

// Create an agent with tools
const model = new ChatOpenAI({
  temperature: 0,
  modelName: 'gpt-4'
});

const executor = await initializeAgentExecutorWithOptions(tools, model, {
  agentType: 'chat-conversational-react-description',
  verbose: true,
  memory: new BufferMemory({
    returnMessages: true,
    memoryKey: 'chat_history',
  }),
});

// Initialize the Pinecall LangChain plugin with the agent executor
const plugin = new PinecallLangChainPlugin({
  pinecall,
  agent: executor,
  voiceConfig: {
    voice: 'sophia',
    language: 'en-US'
  }
});

// Start the WebSocket server
plugin.listen(3000);
console.log('LangChain agent with tools running on ws://localhost:3000');

Important Considerations

When using the WebSocket interceptor, keep these points in mind:

  • The WebSocket server needs to be publicly accessible for Pinecall to connect to it
  • Consider using secure WebSockets (WSS) for production environments
  • Real-time mode is recommended for natural conversation flow but requires more computing resources
  • Implement proper error handling for production deployments (see the sketch after this list)
  • Monitor connection status to ensure continuous availability
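
As one way to cover the last two points, you can wire the plugin's event hooks (documented in the Debugging and Monitoring section below) into your own alerting. In the sketch below, `notifyOps` is a hypothetical placeholder for your monitoring tool, and the shutdown handler assumes the plugin exposes a `close()` method; check the plugin reference for the exact API.

production-handlers.js (sketch)
// Sketch of production-oriented error handling and connection monitoring,
// built on the plugin events shown in the Debugging and Monitoring section.
// `notifyOps` is a hypothetical placeholder for your own alerting tool.
const notifyOps = (message) => {
  // Forward to your paging / monitoring system of choice.
  console.warn('[ops]', message);
};

plugin.on('error', (error) => {
  console.error('Interceptor error:', error);
  notifyOps(`Pinecall interceptor error: ${error.message}`);
});

plugin.on('disconnect', (reason) => {
  console.warn('Call connection closed:', reason);
  // Distinguish normal hang-ups from unexpected drops and alert on the latter.
});

// Shut the WebSocket server down cleanly on deploys and restarts.
// Assumes the plugin exposes a close() method — verify against the plugin docs.
process.on('SIGTERM', () => {
  if (typeof plugin.close === 'function') plugin.close();
  process.exit(0);
});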

Debugging and Monitoring

The plugin includes debugging capabilities to help you monitor and troubleshoot your integration:

debug.js
// Enable debug mode
const plugin = new PinecallLangChainPlugin({
  // ... configuration options
  debug: true,
  logLevel: 'verbose' // 'error', 'warn', 'info', 'verbose', 'debug'
});

// Add event listeners for monitoring
plugin.on('connect', (connectionInfo) => {
  console.log('New connection established:', connectionInfo);
});

plugin.on('message', (message) => {
  console.log('Message received:', message);
});

plugin.on('error', (error) => {
  console.error('Error occurred:', error);
});

plugin.on('disconnect', (reason) => {
  console.log('Connection closed:', reason);
});

Next Steps

Now that you've integrated LangChain with Pinecall, you can: