Connic

Customer Support

AI ticket triage with sentiment analysis and RAG-powered response drafting. Classifies by priority, blocks spam, and stores new solutions.

connic init my-project --templates=customer-support

Overview

Classifies incoming tickets by priority, category, and customer sentiment, then drafts empathetic responses using semantic search over your knowledge base. Middleware blocks spam senders with StopProcessing before the LLM runs, saving tokens. New solutions discovered during response drafting are stored back into the knowledge base for future tickets. Tools use context injection for run traceability.
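The spam gate is ordinary middleware that halts the pipeline before any model call. A minimal sketch of the idea (the StopProcessing name comes from this template, but the exception class, the `SPAM_SENDERS` list, and the middleware shape below are illustrative assumptions, not the SDK's actual API):

```python
class StopProcessing(Exception):
    """Raised by middleware to halt the pipeline before the LLM runs."""


# Illustrative blocklist; a real deployment would load this from config.
SPAM_SENDERS = {"noreply@spam.example", "promo@blast.example"}


def spam_gate(ticket: dict) -> dict:
    """Middleware step: reject tickets from known spam senders, saving LLM tokens."""
    if ticket.get("sender", "").lower() in SPAM_SENDERS:
        raise StopProcessing(f"blocked spam sender: {ticket['sender']}")
    return ticket  # pass the ticket through to the triager
```

Because the check runs before the LLM, blocked tickets cost no tokens at all.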

Use cases

Real-time chat support

Connect via WebSocket for streaming responses in a live chat widget with sentiment-aware tone matching.

Ticketing system integration

Process tickets from Zendesk or Intercom via webhook, auto-classify and suggest first responses.
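A webhook handler usually normalizes the provider's payload before triage. A hedged sketch of that mapping for a Zendesk-style payload (the field names are assumptions; real payloads differ per provider and should be checked against the provider's webhook docs):

```python
def normalize_ticket(payload: dict) -> dict:
    """Map a Zendesk-style webhook payload onto the fields the triager expects.

    Field names here are illustrative; adapt them to your provider's schema.
    """
    ticket = payload.get("ticket", {})
    return {
        "id": ticket.get("id"),
        "sender": ticket.get("requester_email", ""),
        "subject": ticket.get("subject", ""),
        "body": ticket.get("description", ""),
    }
```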

Knowledge base building

Automatically grow your support knowledge base as agents discover and store new solutions.
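The store-new-solutions loop is a write-back into whatever retrieval layer backs the responder. A toy in-memory version, assuming keyword overlap in place of the real semantic search (a real deployment would use embeddings and a vector index):

```python
class KnowledgeBase:
    """Toy solution store; stands in for the template's real knowledge base."""

    def __init__(self):
        self.solutions = []  # list of (problem, solution) pairs

    def store(self, problem: str, solution: str) -> None:
        """Write a newly discovered solution back for future tickets."""
        self.solutions.append((problem, solution))

    def search(self, query: str, k: int = 3):
        """Rank stored solutions by word overlap with the query."""
        words = set(query.lower().split())
        scored = [
            (len(words & set(problem.lower().split())), problem, solution)
            for problem, solution in self.solutions
        ]
        scored.sort(reverse=True)
        return [(p, s) for score, p, s in scored[:k] if score > 0]
```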

Architecture

WebSocket / Webhook → support-pipeline → support-triager → support-responder → Knowledge Base

Scaffolded project structure

Running connic init my-project --templates=customer-support creates this file tree.

customer-support/
  agents/
    support-triager.yaml
    support-responder.yaml
    support-pipeline.yaml
  tools/
    support_tools.py
  middleware/
    support-triager.py
  schemas/
    ticket-classification.json
  requirements.txt
  README.md

Get started

Install the template, create a Connic project, and deploy. Choose Git (automatic deploys on push) or the CLI (works with any Git host, or no Git at all).

Prerequisites

  • Python 3.10+
  • A Connic account (create a project first)
  • API key for your LLM provider (e.g. Gemini, OpenAI), to add to your project variables
1. Install and scaffold

Install the SDK and create a project from this template.

terminal
pip install connic-composer-sdk
connic init my-project --templates=customer-support

Then run cd my-project.

2. Deploy

Pick your deployment method. Git auto-deploys on push; CLI works with GitLab, Bitbucket, or no Git.

Git integration

  1. In Connic: Project Settings → Git Repository, connect your GitHub repo
  2. Settings → Environments: map branch (e.g. main) to Production
  3. Push your scaffolded project to that repo
terminal
git add .
git commit -m "Add Customer Support template"
git push origin main

CLI deploy

  1. In Connic: Project Settings → CLI, create an API key and copy project ID
  2. Run connic login in your project folder
  3. connic test to try with hot-reload, or connic deploy for production
terminal
connic login
connic test    # Ephemeral dev env with hot-reload
connic deploy # Deploy to production
3. Connect and configure

Add a WebSocket connector for real-time chat with streaming responses. Optionally add an HTTP Webhook (outbound) connector to forward escalations to Slack or PagerDuty. Add your LLM provider API key in Project Settings → LLM Provider API Keys.
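Over the WebSocket connector, responses arrive as incremental chunks that the chat widget assembles as they stream in. Independent of Connic's actual wire format (which this sketch does not assume), a client typically accumulates deltas until a done marker; the {"delta": ...} / {"done": True} message shape below is a hypothetical example:

```python
def consume_stream(messages):
    """Assemble streamed response chunks into the full reply.

    The {'delta': ...} / {'done': True} shape is an illustrative assumption,
    not Connic's documented protocol.
    """
    parts = []
    for msg in messages:
        if msg.get("done"):
            break
        parts.append(msg.get("delta", ""))
    return "".join(parts)
```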

Template source

Browse the full template, contribute improvements, or fork for your own use.

connic-org/connic-awesome-agents/tree/main/customer-support