Enabling Multi-Step Tool Calls
Allow your chatbot to use tool results in follow-up responses
You may have noticed that while the tool results are visible in the chat interface, the model isn't using this information to answer your original query. This is because once the model generates a tool call, it has technically completed its generation.
To solve this, you can enable multi-step tool calls using stopWhen. By default, stopWhen is set to stepCountIs(1), which means generation stops after the first step when there are tool results. By changing this condition, you can allow the model to automatically send tool results back to itself to trigger additional generations until your specified stopping condition is met. In this case, you want the model to continue generating so it can use the weather tool results to answer your original question.
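The change itself is small: import stepCountIs from the ai package and pass a stopWhen condition to streamText. The relevant lines are shown in isolation below; the full route follows in the next step.

import { streamText, stepCountIs } from 'ai';

const result = streamText({
  // ...model, messages, and tools stay the same
  stopWhen: stepCountIs(5), // keep generating for up to 5 steps instead of stopping after the first tool call
});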
Update Your API Route
Modify your app/api/chat+api.ts file to include the stopWhen condition:
import {
streamText,
UIMessage,
convertToModelMessages,
tool,
stepCountIs,
} from 'ai';
__PROVIDER_IMPORT__;
import { z } from 'zod';
export async function POST(req: Request) {
const { messages }: { messages: UIMessage[] } = await req.json();
const result = streamText({
model: __MODEL__,
messages: await convertToModelMessages(messages),
stopWhen: stepCountIs(5),
tools: {
weather: tool({
description: 'Get the weather in a location (fahrenheit)',
inputSchema: z.object({
location: z.string().describe('The location to get the weather for'),
}),
execute: async ({ location }) => {
const temperature = Math.round(Math.random() * (90 - 32) + 32);
return {
location,
temperature,
};
},
}),
},
});
return result.toUIMessageStreamResponse({
headers: {
'Content-Type': 'application/octet-stream',
'Content-Encoding': 'none',
},
});
}

You may need to restart your development server for the changes to take effect.
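If you want to sanity-check the route outside the app, you can call it directly from a small script. The sketch below assumes the Expo dev server is running on its default port (8081) and that the request body matches the UIMessage shape that useChat sends (an id, a role, and a parts array); treat both as assumptions and adjust for your setup.

// test-chat-route.ts: a minimal sketch for exercising the API route directly.
// Assumes the Expo dev server is at http://localhost:8081 and that the body
// below matches the UIMessage wire shape used by useChat (id, role, parts).
async function main() {
  const res = await fetch('http://localhost:8081/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      messages: [
        {
          id: '1',
          role: 'user',
          parts: [{ type: 'text', text: 'What is the weather in Tokyo?' }],
        },
      ],
    }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`Request failed with status ${res.status}`);
  }

  // The response is a streamed UI message protocol; log the raw chunks.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    console.log(decoder.decode(value));
  }
}

main().catch(console.error);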
Head back to the Expo app and ask about the weather in a location. You should now see the model using the weather tool results to answer your question.
By setting stopWhen: stepCountIs(5), you're allowing the model to use up to 5 "steps" for any given generation. This enables more complex interactions and allows the model to gather and process information over several steps if needed. You can see this in action by adding another tool to convert the temperature from Fahrenheit to Celsius.
Add More Tools
Update your app/api/chat+api.ts file to add a new tool to convert the temperature from Fahrenheit to Celsius:
import {
streamText,
UIMessage,
convertToModelMessages,
tool,
stepCountIs,
} from 'ai';
__PROVIDER_IMPORT__;
import { z } from 'zod';
export async function POST(req: Request) {
const { messages }: { messages: UIMessage[] } = await req.json();
const result = streamText({
model: __MODEL__,
messages: await convertToModelMessages(messages),
stopWhen: stepCountIs(5),
tools: {
weather: tool({
description: 'Get the weather in a location (fahrenheit)',
inputSchema: z.object({
location: z.string().describe('The location to get the weather for'),
}),
execute: async ({ location }) => {
const temperature = Math.round(Math.random() * (90 - 32) + 32);
return {
location,
temperature,
};
},
}),
convertFahrenheitToCelsius: tool({
description: 'Convert a temperature in fahrenheit to celsius',
inputSchema: z.object({
temperature: z
.number()
.describe('The temperature in fahrenheit to convert'),
}),
execute: async ({ temperature }) => {
const celsius = Math.round((temperature - 32) * (5 / 9));
return {
celsius,
};
},
}),
},
});
return result.toUIMessageStreamResponse({
headers: {
'Content-Type': 'application/octet-stream',
'Content-Encoding': 'none',
},
});
}

You may need to restart your development server for the changes to take effect.
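Before updating the UI, note how tools surface on the client: each invocation arrives as a message part whose type is the tool name prefixed with tool-, so the new tool only needs one extra case in the existing switch. The snippet below shows that change in isolation; the full file follows in the next step.

// Each registered tool surfaces as a message part typed `tool-${toolName}`,
// so the new tool is matched alongside the existing weather case:
case 'tool-weather':
case 'tool-convertFahrenheitToCelsius':
  return (
    <Text key={`${m.id}-${i}`}>{JSON.stringify(part, null, 2)}</Text>
  );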
Update the UI for the new tool
To display the temperature conversion tool invocation in your UI, update your app/(tabs)/index.tsx file to handle the new tool part:
import { generateAPIUrl } from '@/utils';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';
import { fetch as expoFetch } from 'expo/fetch';
import { useState } from 'react';
import { View, TextInput, ScrollView, Text, SafeAreaView } from 'react-native';
export default function App() {
const [input, setInput] = useState('');
const { messages, error, sendMessage } = useChat({
transport: new DefaultChatTransport({
fetch: expoFetch as unknown as typeof globalThis.fetch,
api: generateAPIUrl('/api/chat'),
}),
onError: error => console.error(error, 'ERROR'),
});
if (error) return <Text>{error.message}</Text>;
return (
<SafeAreaView style={{ height: '100%' }}>
<View
style={{
height: '95%',
display: 'flex',
flexDirection: 'column',
paddingHorizontal: 8,
}}
>
<ScrollView style={{ flex: 1 }}>
{messages.map(m => (
<View key={m.id} style={{ marginVertical: 8 }}>
<View>
<Text style={{ fontWeight: 700 }}>{m.role}</Text>
{m.parts.map((part, i) => {
switch (part.type) {
case 'text':
return <Text key={`${m.id}-${i}`}>{part.text}</Text>;
case 'tool-weather':
case 'tool-convertFahrenheitToCelsius':
return (
<Text key={`${m.id}-${i}`}>
{JSON.stringify(part, null, 2)}
</Text>
);
default:
return null;
}
})}
</View>
</View>
))}
</ScrollView>
<View style={{ marginTop: 8 }}>
<TextInput
style={{ backgroundColor: 'white', padding: 8 }}
placeholder="Say something..."
value={input}
onChange={e => setInput(e.nativeEvent.text)}
onSubmitEditing={() => {
sendMessage({ text: input });
setInput('');
}}
autoFocus={true}
/>
</View>
</View>
</SafeAreaView>
);
}

You may need to restart your development server for the changes to take effect.
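Dumping the raw part with JSON.stringify is enough for this guide, but you may eventually want a friendlier rendering. As a sketch, assuming tool parts expose state and output fields (as they do in current versions of the SDK), you could branch on the part's state instead:

case 'tool-weather':
case 'tool-convertFahrenheitToCelsius':
  return (
    <Text key={`${m.id}-${i}`}>
      {/* Assumes the part has `state` and `output` fields; verify against your SDK version. */}
      {part.state === 'output-available'
        ? `Result: ${JSON.stringify(part.output)}`
        : 'Calling tool…'}
    </Text>
  );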
Now, when you ask "What's the weather in New York in celsius?", you should see a more complete interaction:
- The model calls the weather tool for New York.
- The tool result is displayed in the chat.
- The model then calls the temperature conversion tool to convert the temperature from Fahrenheit to Celsius.
- Finally, it uses that information to provide a natural language response about the weather in New York.
This multi-step approach allows the model to gather information and use it to provide more accurate and contextual responses, making your chatbot considerably more useful.
This simple example demonstrates how tools can expand your model's capabilities. You can create more complex tools that integrate with real APIs, databases, or any other external system, allowing the model to access and process real-world data in real time. Tools bridge the gap between the model's knowledge cutoff and current information.
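For instance, a production weather tool might call an external service instead of returning a random temperature. The sketch below is illustrative only: the endpoint, query parameter, and response shape are placeholders, not a real API contract.

// An illustrative tool backed by a real HTTP API. The endpoint and the
// response shape (a `tempF` field) are hypothetical placeholders.
weather: tool({
  description: 'Get the current weather in a location (fahrenheit)',
  inputSchema: z.object({
    location: z.string().describe('The location to get the weather for'),
  }),
  execute: async ({ location }) => {
    const res = await fetch(
      `https://example.com/v1/weather?q=${encodeURIComponent(location)}`,
    );
    if (!res.ok) {
      throw new Error(`Weather API request failed: ${res.status}`);
    }
    const data = (await res.json()) as { tempF: number };
    return { location, temperature: data.tempF };
  },
}),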