Production-grade behavioral analysis without writing SQL? With Keboola MCP Server and Claude AI, it's now possible to build full data pipelines just by describing what you need. The AI interprets the request, generates the transformations, and Keboola executes them in a governed, scalable workflow.


Quick Summary:


  • Goal: Segment customers based on behavior - no SQL needed
  • Tools: Claude AI + Keboola MCP Server
  • How it works: Natural language → AI builds the logic → Keboola runs it
  • Outcome: Automated, repeatable pipelines without engineering bottlenecks

Let’s say you need to slice through raw usage data, segment users by behavior, and surface onboarding bottlenecks, all without writing a single line of SQL, because you’re not fluent in SQL and even simple queries take you a long time to debug. Normally, this means jumping through BI hoops, waiting on data teams, or building SQL views by hand.


But with Keboola MCP Server + Claude AI, I skipped the wait and built production-grade analysis pipelines myself. Just by describing what I wanted.

This combo turns any large language model into a full-stack data agent. I prompted. Claude engineered. Keboola ran the pipeline.

Here’s how it played out.


Wait, What Does MCP Server Mean?


The Keboola MCP Server connects large language models (like Claude or ChatGPT) directly to your Keboola project. It acts as a smart interface between natural language and data infrastructure. So instead of writing SQL or building pipelines manually, you just describe what you want. The AI interprets your intent, composes transformations, runs jobs, and pushes results. All through governed, production-ready workflows inside Keboola.


What I Wanted to Do


I needed to:


  • Filter customer projects based on specific logic (payments, requester emails, project type).

  • Measure onboarding behavior over multiple windows: 2h, 24h, 72h, 240h.

  • Track configs, job outcomes, flow creation, and execution.

  • Segment only the first projects per user to track new customer behavior.

  • Export results to CSV. Automate the process for repeatability.

  • Later, pivot from revenue-based logic to maturity-level segmentation (“recurring use”).

Traditionally? This would involve coordinating across analytics, data engineering, and product teams. Instead, I handled it solo — through prompts.


Why Not Use a BI Tool?


Sure, dashboards are fine for viewing data. But when it comes to composing logic, building transformations, or stitching multi-source pipelines - BI hits a ceiling.

Side-by-side infographic comparing traditional BI tools with Keboola MCP server and Claude, showcasing the transition from static dashboards to AI-driven data pipelines.
Prompt to Pipeline: How Keboola MCP Server Replaces Static BI with Real-Time AI Automation

Here’s where LLM + MCP + Keboola goes way further:


| Traditional BI | Keboola MCP + Claude |
| --- | --- |
| View-only access | Full write + orchestration |
| Locked to existing models | Direct access to raw telemetry |
| Requires SQL or analyst handoffs | Prompt → deployed pipeline |
| Static dashboards | Dynamic transformations |
| Changes = more backlog | Changes = new prompt |

You’re not just querying, you’re building. You’re not looking at what’s already there. You’re composing workflows, executing code, and shipping results directly.


How It Worked Step by Step


Let’s walk through what actually happened.


Step 1: Describe the outcome


I told Claude:


“Find all PayAsYouGo projects created after Jan 1, 2024, that made a real payment, exclude .cz domains, ignore test projects, and include only the first project per requester.”

Claude parsed that and built the logic: joins, filters, and aggregations in Keboola’s transformation engine, generating SQL I could barely understand. I checked the output against our BI tool, and it matched perfectly.
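For illustration, the generated query might have looked roughly like this. This is a sketch, not the actual output — the table and column names here are my assumptions, not Claude’s real schema:

```sql
-- Sketch only: table and column names are assumed, not the actual schema.
WITH ranked AS (
  SELECT
    p.project_id,
    p.requester_email,
    p.created_at,
    ROW_NUMBER() OVER (
      PARTITION BY p.requester_email
      ORDER BY p.created_at
    ) AS project_rank
  FROM projects p
  JOIN payments pay ON pay.project_id = p.project_id
  WHERE p.project_type = 'PayAsYouGo'
    AND p.created_at >= '2024-01-01'
    AND pay.amount > 0                     -- a real payment, not a credit grant
    AND p.requester_email NOT LIKE '%.cz'  -- exclude .cz domains
    AND p.is_test = FALSE                  -- ignore test projects
)
SELECT *
FROM ranked
WHERE project_rank = 1;  -- only the first project per requester
```

The `ROW_NUMBER() OVER (PARTITION BY ...)` pattern is the standard way to keep only the first row per group — exactly the kind of logic that is tedious to write by hand but trivial to describe in a prompt.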


Infographic showing Keboola data automation with conversational AI, visualizing steps from user query to SQL and dashboard output.
Keboola workflow powered by AI — from plain requests to automated SQL transformations, visualized through a four-step journey: Outcome → Windows → Business Logic → Automation.

Step 2: Layer analysis windows


Next, I added behavioral tracking:


  • Number of configs created

  • Job success/failure

  • Flow setup and execution

Each tracked over:


  • First 2 hours

  • First 24 hours

  • First 72 hours

  • First 240 hours

No need to calculate timestamps manually. Claude handled the time logic and added columns with those metrics.
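Under the hood, windowed metrics like these reduce to conditional counts against the project’s creation timestamp. A sketch, assuming a Snowflake-style backend (where `COUNT_IF` and `DATEADD` are available) and hypothetical table names:

```sql
-- Sketch only: assumed table names, Snowflake-style functions.
SELECT
  p.project_id,
  COUNT_IF(c.created_at <= DATEADD(hour, 2,   p.created_at)) AS configs_2h,
  COUNT_IF(c.created_at <= DATEADD(hour, 24,  p.created_at)) AS configs_24h,
  COUNT_IF(c.created_at <= DATEADD(hour, 72,  p.created_at)) AS configs_72h,
  COUNT_IF(c.created_at <= DATEADD(hour, 240, p.created_at)) AS configs_240h
FROM projects p
LEFT JOIN configs c ON c.project_id = p.project_id
GROUP BY p.project_id;
```

The same pattern repeats for job outcomes and flow executions — one conditional count per metric per window.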


Step 3: Add business logic


Later, I updated the query to:


  • Pull maturity_level from kbc_project_updated

  • Filter for "recurring use" instead of revenue

  • Add first_payment_date, time_to_first_payment_days, and time_to_first_successful_flow_days

Again, no coding. Just conversation.
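Roughly, that update amounts to a join against `kbc_project_updated` plus a few derived date columns. A sketch under the same caveat — all names except `kbc_project_updated` and the column names listed above are assumptions:

```sql
-- Sketch only: assumed table/column names except kbc_project_updated.
SELECT
  a.project_id,
  u.maturity_level,
  MIN(pay.paid_at)                                AS first_payment_date,
  DATEDIFF(day, a.created_at, MIN(pay.paid_at))   AS time_to_first_payment_days,
  DATEDIFF(day, a.created_at, MIN(f.finished_at)) AS time_to_first_successful_flow_days
FROM onboarding_analysis a
JOIN kbc_project_updated u ON u.project_id = a.project_id
LEFT JOIN payments pay ON pay.project_id = a.project_id
LEFT JOIN flow_runs f
  ON f.project_id = a.project_id AND f.status = 'success'
WHERE u.maturity_level = 'recurring use'  -- maturity, not revenue
GROUP BY a.project_id, u.maturity_level, a.created_at;
```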


Step 4: Automate it


I asked Claude:


“Can you turn this into a transformation?”


Done. It:


  • Creates a new output table (payg_customer_onboarding_analysis_v3)

  • Drops the old version to keep it clean

  • Has full input/output mappings

  • Runs on schedule inside Keboola
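The transformation’s output step can be as simple as a drop-and-recreate. The output table name comes from the article; the source query name is assumed:

```sql
-- Sketch of the transformation's output step:
-- drop the stale table, then recreate it from the analysis query.
DROP TABLE IF EXISTS payg_customer_onboarding_analysis_v3;

CREATE TABLE payg_customer_onboarding_analysis_v3 AS
SELECT * FROM onboarding_analysis;  -- the query built in steps 1-3
```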

What I Got from It


Here’s what this workflow unlocked:


  • Projects segmented by behavior and maturity

  • Full pipeline created and deployed without code

  • Zero need to involve engineering, analytics, or ops

  • All logic version-controlled and observable inside Keboola

I didn’t just analyze data. I shipped a pipeline that runs reliably in production. With logging. Scheduling. Governance. All the good stuff.


Why This Matters


This isn’t about avoiding SQL (though that’s nice). It’s about changing the shape of work:


  • Business users can now build pipelines — not just request them.

  • AI becomes a real contributor, not just a co-pilot.

  • Data infrastructure gets conversational, not code-bound.

In other words: no more waiting. No more backlogs. You describe the outcome — the platform builds the rest.


Lessons Learned from Real-World Use


Using Claude + the MCP Server is powerful, but not foolproof. Here are a few things that tripped me up, and how to avoid them:


  • False revenue signals: Some “paying” projects showed $0 revenue because the payments table included free credit grants. Lesson: make sure the LLM interprets data semantics the way you do — payment ≠ revenue.

  • Incorrect project joins: A project that should have shown a pharmacy appeared as a bank. Why? Claude hallucinated data when I asked it for a CSV export. Fix: always export data directly from Keboola Storage, never from the AI’s memory.

  • SQL syntax & table structure mismatches: A transformation failed because Claude tried to save results into a table that no longer matched the new structure. Solution: have Claude drop the table or rename it before updating output schemas.
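The third fix above can be a single statement run before the transformation writes its new schema. A sketch, reusing the output table named earlier; the backup name is hypothetical:

```sql
-- Sketch: either drop the stale output table outright...
DROP TABLE IF EXISTS payg_customer_onboarding_analysis_v3;

-- ...or rename it first if you want to keep the old results around.
ALTER TABLE payg_customer_onboarding_analysis_v3
  RENAME TO payg_customer_onboarding_analysis_v3_backup;
```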

For a full list of best practices and common pitfalls, check out the official Keboola MCP Server FAQ and guide: Keboola MCP Server – Best Practices and Frequently Asked Questions.


TL;DR - To Get the Best Results:


  • Iterate in small steps — don’t try to build a perfect pipeline in one go. Check the validity of the results after each step. This is probably the most important one.

  • Be specific and scoped in your prompt, including what to exclude

  • Test your logic in query mode before converting it into a transformation. It saves a lot of time.

  • Remember: the AI executes instructions, not intentions — clarity beats cleverness

Key Benefits of MCP + Claude + Keboola


  • Prompt-driven ETL — Skip the dev backlog and build real workflows instantly

  • Maturity-level insights — Segment users by real usage, not just revenue

  • Flexible analysis windows — Configure onboarding funnels without touching timestamps

  • No glue code needed — Everything runs on Keboola’s infrastructure

  • Faster than asking the data team — And more scalable

Final Thought


Keboola MCP + Claude turns “what if we could…” into “we just did.”

You go from data question → working solution in a single session. You skip meetings, tickets, and translations — and go straight to outcomes.

This isn’t the future of analytics. It’s already here.


FAQs


1. Do I need to know SQL to analyze data with Keboola MCP Server?


No. You simply describe what you want in natural language. Claude AI interprets your request, builds the necessary transformations, and Keboola executes them. SQL is generated behind the scenes — no manual coding required.


2. How is this different from a traditional BI tool?


BI tools are great for viewing data, but they fall short when it comes to building pipelines, composing logic, or working across multiple data sources. With MCP + Claude, you're not just querying data — you're building, orchestrating, and automating full workflows.


3. Is this reliable enough for production use?


Yes — with some care. You should validate outputs step by step and be specific in your prompts. AI follows instructions, not intentions. The good news: all logic is versioned, observable, and fully governed within Keboola, so you're always in control.