Customer Support
AI ticket triage with sentiment analysis and RAG-powered response drafting. Classifies by priority, blocks spam, and stores new solutions.
connic init my-project --templates=customer-support

Overview
Classifies incoming tickets by priority, category, and customer sentiment, then drafts empathetic responses using semantic search over your knowledge base. Middleware blocks spam senders with StopProcessing before the LLM runs, saving tokens. New solutions discovered during response drafting are stored back into the knowledge base for future tickets. Tools use context injection for run traceability.
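The spam-blocking middleware described above can be sketched as follows. This is a minimal illustration, not the actual Connic SDK API: the `StopProcessing` exception and the hook signature are assumptions modeled on the behavior described.

```python
# Hypothetical sketch of the spam-blocking middleware. StopProcessing and the
# hook shape are assumptions, not the real Connic SDK surface.

BLOCKED_SENDERS = {"spam@example.com", "noreply@phishy.example"}

class StopProcessing(Exception):
    """Stand-in for the SDK signal that halts the pipeline before the LLM runs."""

def block_spam(ticket: dict) -> dict:
    """Raise StopProcessing for known spam senders; otherwise pass the ticket on."""
    sender = ticket.get("sender", "").lower()
    if sender in BLOCKED_SENDERS:
        raise StopProcessing(f"blocked sender: {sender}")
    return ticket
```

Because the check runs before any model call, a blocked ticket costs no tokens at all.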
Use cases
Real-time chat support
Connect via WebSocket for streaming responses in a live chat widget with sentiment-aware tone matching.
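Streaming over a WebSocket means assembling partial chunks into a full reply as frames arrive. A minimal sketch of that consumer pattern, with the frame format assumed for illustration (substitute your connector's actual frames):

```python
import asyncio
from typing import AsyncIterator

async def fake_stream() -> AsyncIterator[dict]:
    # Stand-in for frames read off the WebSocket connector (format assumed).
    for chunk in ({"delta": "Sorry to hear that, "}, {"delta": "let's fix it."}, {"done": True}):
        await asyncio.sleep(0)  # yield control, as a real socket read would
        yield chunk

async def collect_reply(frames: AsyncIterator[dict]) -> str:
    """Accumulate streamed deltas into the final response text."""
    parts = []
    async for frame in frames:
        if frame.get("done"):
            break
        parts.append(frame["delta"])
    return "".join(parts)
```

A chat widget would render each delta as it arrives instead of waiting for the joined string.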
Ticketing system integration
Process tickets from Zendesk or Intercom via webhook, auto-classify and suggest first responses.
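Ticketing webhooks arrive in provider-specific shapes, so a small normalizer keeps the triage agent provider-agnostic. The field names below are illustrative assumptions, not the actual Zendesk or Intercom payload schemas:

```python
def normalize_ticket(provider: str, payload: dict) -> dict:
    """Map a provider webhook payload onto the fields the triager expects.
    Field names here are assumptions, not the real provider schemas."""
    if provider == "zendesk":
        return {
            "id": payload["ticket"]["id"],
            "sender": payload["ticket"]["requester_email"],
            "subject": payload["ticket"]["subject"],
            "body": payload["ticket"]["description"],
        }
    if provider == "intercom":
        return {
            "id": payload["data"]["item"]["id"],
            "sender": payload["data"]["item"]["author"],
            "subject": "",
            "body": payload["data"]["item"]["body"],
        }
    raise ValueError(f"unknown provider: {provider}")
```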
Knowledge base building
Automatically grow your support knowledge base as agents discover and store new solutions.
Architecture
Scaffolded project structure
Running connic init my-project --templates=customer-support creates this file tree.
customer-support/
agents/
support-triager.yaml
support-responder.yaml
support-pipeline.yaml
tools/
support_tools.py
middleware/
support-triager.py
schemas/
ticket-classification.json
requirements.txt
README.md

Get started
Install the template, create a Connic project, and deploy. Choose Git (automatic on push) or CLI (works with any provider).
Prerequisites
- Python 3.10+
- A Connic account (create a project first)
- API key for your LLM provider (e.g. Gemini, OpenAI) to add in project variables
Install and scaffold
Install the SDK and create a project from this template.
pip install connic-composer-sdk
connic init my-project --templates=customer-support
cd my-project
Deploy
Pick your deployment method. Git auto-deploys on push; CLI works with GitLab, Bitbucket, or no Git.
Git integration
- In Connic: Project Settings → Git Repository, connect your GitHub repo
- Settings → Environments: map a branch (e.g. main) to Production
- Push your scaffolded project to that repo
git add .
git commit -m "Add Customer Support template"
git push origin main

CLI deploy
- In Connic: Project Settings → CLI, create an API key and copy project ID
- Run connic login in your project folder, then connic test to try it with hot-reload, or connic deploy for production
connic login
connic test # Ephemeral dev env with hot-reload
connic deploy # Deploy to production

Connect and configure
Add a WebSocket connector for real-time chat with streaming responses. Optionally add an HTTP Webhook (outbound) connector to forward escalations to Slack or PagerDuty. Add your LLM provider API key in Project Settings → LLM Provider API Keys.
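An escalation forwarded through the outbound HTTP Webhook connector reduces to a JSON body; Slack's incoming-webhook format, for instance, only requires a `text` field. A minimal payload builder (the ticket fields are assumptions for illustration):

```python
import json

def escalation_payload(ticket_id: str, priority: str, summary: str) -> str:
    """Build a Slack incoming-webhook JSON body for an escalated ticket.
    The ticket_id/priority/summary fields are illustrative assumptions."""
    text = f":rotating_light: Ticket {ticket_id} escalated ({priority}): {summary}"
    return json.dumps({"text": text})
```

The same builder works for PagerDuty by swapping in its events payload shape.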
Template source
Browse the full template, contribute improvements, or fork for your own use.
connic-org/connic-awesome-agents/tree/main/customer-support