Predefined, custom, or any MCP server
Pick from a growing catalog of typed, tested tools. Write your own as Python functions. Or expose any MCP server, and every agent in the project can use it.
Read the tools docs

Tools (6 attached)
- Custom: invoices.parse_pdf
- Custom: invoices.store
- Predefined: web_search
- MCP: linear.create_issue
- MCP: context7.search_docs
- Custom: billing.calculate_vat
The tools you'd otherwise reimplement
A built-in catalog for the things every agent needs: knowledge search, the project database, agent-to-agent orchestration, and web access.
- query_knowledge: Search the knowledge base for relevant information.
- store_knowledge: Queue new information for knowledge base indexing.
- delete_knowledge: Remove entries from the knowledge base.
- kb_list_namespaces: List knowledge base namespaces and their hierarchy.
- db_find: Query documents from a collection using filters.
- db_insert: Insert one or more documents into a collection.
- db_update: Update documents matching a filter.
- db_delete: Delete documents matching a filter.
- db_count: Count documents in a collection, optionally filtered.
- db_list_collections: List collections with counts and storage sizes.
- trigger_agent: Trigger another agent within the same project.
- trigger_agent_at: Schedule an agent to run at a future time.
- web_search: Search the web for real-time information.
- web_read_page: Fetch a web page and return its content as markdown.
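As an illustration of how an agent might mix catalog, custom, and wildcard tools, here is a hypothetical agent manifest. The page only confirms that agent YAML references custom tools as module.function and lists MCP servers under mcp_servers; the `tools` key layout below is an assumption for illustration.

```yaml
# invoice-agent (hypothetical): attaching predefined and custom tools.
tools:
  - web_search            # predefined, from the built-in catalog
  - db_find               # predefined, project database access
  - trigger_agent         # predefined, agent-to-agent orchestration
  - invoices.parse_pdf    # custom: tools/invoices.py, function parse_pdf
  - calculator.*          # wildcard: every function in tools/calculator.py
```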
A Python function is a tool
Plain functions in tools/. Type hints and a docstring are everything Connic needs to expose them to your agent.
```python
from datetime import date

def calculate_late_fee(
    invoice_total: float,
    due_date: date,
    paid_date: date,
) -> dict:
    """Compute the late fee for an overdue invoice.

    Args:
        invoice_total: The original invoice total in dollars
        due_date: The original due date
        paid_date: The date the invoice was paid

    Returns:
        Dict with the fee amount in cents and the days overdue.
    """
    days_overdue = max(0, (paid_date - due_date).days)
    fee_cents = int(invoice_total * 100 * 0.015 * days_overdue)
    return {"fee_cents": fee_cents, "days_overdue": days_overdue}
```

The agent sees a typed JSON schema generated from the function signature. No decorators, no manual schema authoring.
The LLM uses your docstring to decide when to call the tool. Document each parameter under Args so the model picks the right one.
Drop the file into your tools/ directory and reference it as module.function in the agent YAML. Wildcards like calculator.* work too.
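To picture what "a typed JSON schema generated from the function signature" means, here is a minimal sketch using only the standard library's inspect module. This is not Connic's actual generator; the type mapping and schema shape are assumptions for illustration.

```python
import inspect
from datetime import date

# Rough mapping from Python annotations to JSON Schema type names.
TYPE_MAP = {int: "integer", float: "number", str: "string",
            bool: "boolean", date: "string", dict: "object"}

def schema_for(func):
    """Derive a tool schema from a function's signature and docstring."""
    sig = inspect.signature(func)
    props = {
        name: {"type": TYPE_MAP.get(param.annotation, "string")}
        for name, param in sig.parameters.items()
    }
    return {
        "name": func.__name__,
        # First docstring line becomes the tool description the LLM sees.
        "description": (func.__doc__ or "").strip().splitlines()[0],
        "parameters": {"type": "object", "properties": props,
                       "required": list(props)},
    }

def calculate_late_fee(invoice_total: float, due_date: date,
                       paid_date: date) -> dict:
    """Compute the late fee for an overdue invoice."""
    days_overdue = max(0, (paid_date - due_date).days)
    return {"fee_cents": int(invoice_total * 100 * 0.015 * days_overdue),
            "days_overdue": days_overdue}

print(schema_for(calculate_late_fee)["parameters"]["properties"])
# {'invoice_total': {'type': 'number'}, 'due_date': {'type': 'string'}, 'paid_date': {'type': 'string'}}
```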
Any MCP server, every agent
Add servers to mcp_servers in the agent YAML. Connic connects, discovers tools, and exposes them to the LLM at runtime.
support-triage

```yaml
mcp_servers:
  - name: context7
    url: https://mcp.context7.com/mcp
  - name: research-hub
    url: https://mcp.example.com/research
    headers:
      Authorization: "Bearer ${RESEARCH_TOKEN}"
    tools:
      - search_papers
      - fetch_abstract
  - name: internal-tools
    url: https://mcp.internal.company.com/tools
    headers:
      Authorization: "Bearer ${INTERNAL_TOKEN}"
    discoverable: true # indexed for on-demand search, not loaded upfront
```

Production primitives, per call
Predefined, custom, and MCP tools all share the same discovery, observability, and lifecycle semantics.
- Drop a Python file in tools/ and reference module.function in your agent YAML. Subfolders and wildcards (calculator.*) both work.
- Raise StopProcessing from a tool to end the run successfully with a final message, or AbortTool from a hook to skip a single call.
- Every tool call shows up in run details with inputs, output, and duration. Prints, stderr, and tools.* loggers stream to the Logs tab.
- Mark rarely-used tools (or whole MCP servers) as discoverable. They stay out of the LLM context until the agent searches for them by description.
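The StopProcessing behavior can be sketched as follows. The real import path and constructor aren't shown on this page, so the exception class below is a local stand-in, and the duplicate check is invented for the example.

```python
class StopProcessing(Exception):
    """Stand-in for Connic's run-ending exception (real import path not shown here)."""
    def __init__(self, message: str):
        super().__init__(message)
        self.message = message

def store_invoice(invoice_id: str, total_cents: int) -> dict:
    """Store an invoice, ending the run early if it already exists."""
    already_stored = invoice_id.startswith("INV-")  # stand-in for a real duplicate check
    if already_stored:
        # Instead of returning a normal tool result, end the whole run
        # successfully, with this message as the run's final output.
        raise StopProcessing(f"{invoice_id} is already stored; nothing to do.")
    return {"stored": invoice_id, "total_cents": total_cents}
```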