Non-streaming chat
Build reliable chat integrations with complete response patterns for batch processing and simple UIs
Overview
Build a chat integration that waits for the complete response before returning it. This pattern suits batch processing, simple UIs, and any workflow that needs the full response before proceeding — in short, integrations where real-time display isn't essential.
What You’ll Build:
- Simple request-response chat patterns with immediate complete responses
- Session-based conversations that maintain context across multiple chats
- Basic integration with predictable response timing
Prerequisites
- Completed Chat quickstart tutorial
- Understanding of basic HTTP requests and JSON handling
- Familiarity with JavaScript/TypeScript promises or async/await
Scenario
We’ll build a help desk system for “TechFlow” that processes support messages through text chat and maintains conversation history using sessions.
1. Basic Non-Streaming Implementation
Create a simple chat function
Start with a basic non-streaming chat implementation:
Basic Non-Streaming Request
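A minimal sketch of a non-streaming request. The base URL, auth header, and field names below are assumptions, not the platform's actual API — substitute the values from the Chat quickstart. Building the request separately from sending it keeps the payload logic easy to inspect and test without a network call.

```javascript
// Hypothetical endpoint, header, and field names -- replace with the
// values from your platform's Chat quickstart.
const API_BASE = "https://api.example.com/v1";
const API_KEY = "YOUR_API_KEY";

// Build the HTTP request without sending it.
function buildChatRequest(message) {
  return {
    url: `${API_BASE}/chat`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${API_KEY}`,
      },
      body: JSON.stringify({ message }),
    },
  };
}

// Send one message and wait for the complete response. Because this is
// non-streaming, the promise resolves only when the full reply is ready.
async function sendChat(message) {
  const { url, init } = buildChatRequest(message);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`Chat request failed: ${res.status}`);
  return res.json(); // e.g. { id, text } -- the shape depends on your API
}
```

A call like `sendChat("My printer is offline")` returns a promise that resolves to the full reply in one piece — no chunk handling, no partial-render logic.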
2. Context Management with Sessions
Create a session for persistent context
Sessions allow multiple chats to share the same conversation context:
Create Session
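A sketch of session-based context, assuming a `POST /sessions` endpoint that returns a `sessionId` and a chat endpoint that accepts it — both hypothetical; check your platform's session reference for the real paths and fields.

```javascript
// Hypothetical session API -- endpoint paths and field names are
// assumptions; check your platform's session reference.
const API_BASE = "https://api.example.com/v1";
const API_KEY = "YOUR_API_KEY";

const headers = {
  "Content-Type": "application/json",
  Authorization: `Bearer ${API_KEY}`,
};

// Create a session once; every chat that carries its id shares context.
async function createSession(userId) {
  const res = await fetch(`${API_BASE}/sessions`, {
    method: "POST",
    headers,
    body: JSON.stringify({ userId }),
  });
  if (!res.ok) throw new Error(`Session creation failed: ${res.status}`);
  const { sessionId } = await res.json();
  return sessionId;
}

// Subsequent chats reference the session, so the assistant can resolve
// follow-ups like "I already restarted it" against earlier messages.
async function chatInSession(sessionId, message) {
  const res = await fetch(`${API_BASE}/chat`, {
    method: "POST",
    headers,
    body: JSON.stringify({ sessionId, message }),
  });
  if (!res.ok) throw new Error(`Chat request failed: ${res.status}`);
  return res.json();
}

// Usage:
// const sessionId = await createSession("user-123");
// await chatInSession(sessionId, "My printer won't connect to Wi-Fi");
// await chatInSession(sessionId, "I already restarted it"); // context retained
```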
3. Using previousChatId for Context
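Where a persistent session is more than you need, chaining chats linearly is an alternative: include the id of the previous response in the next request. The sketch below assumes the response body includes the new chat's id and that the request accepts a `previousChatId` field, as the section title suggests — verify both against your platform's API.

```javascript
// Chaining chats with previousChatId -- an alternative to sessions for
// linear conversations. Endpoint and field names are assumptions.
const API_BASE = "https://api.example.com/v1";
const API_KEY = "YOUR_API_KEY";

async function chat(message, previousChatId) {
  const res = await fetch(`${API_BASE}/chat`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    // Omit previousChatId on the first message; include it afterwards so
    // each request inherits the context of the chat it replies to.
    body: JSON.stringify(
      previousChatId ? { message, previousChatId } : { message }
    ),
  });
  if (!res.ok) throw new Error(`Chat request failed: ${res.status}`);
  return res.json(); // assumed to include the new chat's id
}

// Usage:
// const first = await chat("My laptop won't boot");
// const followUp = await chat("It beeps three times", first.id);
```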
4. Custom Assistant Configuration
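A possible assistant configuration for the TechFlow help desk. The option names here (`instructions`, `model`, `temperature`) and the placeholder model id are assumptions — map them to whatever your platform's assistant API actually accepts.

```javascript
// Hypothetical assistant configuration for the TechFlow scenario.
const techFlowAssistant = {
  name: "TechFlow Support",
  model: "example-model-id", // placeholder -- use a model your platform offers
  temperature: 0.3,          // lower = more consistent support answers
  instructions: [
    "You are a support agent for TechFlow.",
    "Answer concisely and ask for an order number when relevant.",
    "Escalate billing disputes to a human agent.",
  ].join(" "),
};

// Attach the configuration (or a pre-created assistant id) to each chat
// request so every response follows the same persona and policies.
function buildConfiguredChatRequest(message) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message, assistant: techFlowAssistant }),
  };
}
```

Keeping the configuration in one object makes it easy to version-control the help desk's persona and reuse it across both session-based and chained chats.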
Next Steps
Enhance your non-streaming chat system further:
- Add streaming capabilities - Upgrade to real-time responses for better UX
- OpenAI compatibility - Use familiar OpenAI SDK patterns
- Integrate tools - Enable your assistant to call external APIs and databases
- Add voice capabilities - Extend your text chat to voice interactions