AI Data Analysis for Non-Python Teams
How product managers, analysts, and ops teams can use DataStoryBot without writing Python — via the playground, curl commands, or simple API calls from any language.
Most AI data analysis tutorials start with `import pandas`. This article doesn't. You don't need Python to analyze data with DataStoryBot. You need a CSV file and one of three access methods: the web playground, curl, or whatever programming language your team already uses.
DataStoryBot's API is HTTP + JSON. Any tool that can make HTTP requests can use it. Python is just the most common example in documentation — not a requirement.
Option 1: The Playground (Zero Code)
The fastest path from CSV to insights. Open datastory.bot, upload your file, and the analysis runs in the browser.
The playground flow:
- Upload — drag and drop your CSV or click to browse
- Review metadata — column names, row count, data types are shown automatically
- Get story angles — DataStoryBot analyzes the data and presents 2-4 story angles
- Pick a story — click the one that matters most
- Read the narrative — full analysis with charts, all in the browser
- Optionally refine — add a steering prompt to focus the analysis
No API key needed for the playground. No code. No installation. You work entirely in the browser while DataStoryBot's backend does the computation.
When to use the playground:
- Ad-hoc analysis of a single file
- Stakeholders who want to explore data themselves
- Quick validation before building an automated pipeline
- Demos and presentations
When the playground isn't enough:
- You need to analyze data on a recurring schedule
- The analysis needs to integrate with other systems (email, Slack, dashboards)
- You're processing multiple files programmatically
Option 2: curl (Command Line)
If you're comfortable with the terminal but not with Python, curl works for the entire API flow:
Upload
```shell
curl -X POST https://datastory.bot/api/upload \
  -F "file=@quarterly_sales.csv"
```
Save the `containerId` from the response:

```shell
CONTAINER_ID="ctr_abc123"
```
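If you'd rather not copy the ID by hand, jq (the same tool the complete bash script below relies on) can extract it from the upload response. A minimal sketch that runs offline, with a hypothetical response body inlined in place of the real curl call:

```shell
# Hypothetical /upload response body, inlined so this runs without the API
RESPONSE='{"containerId":"ctr_abc123"}'

# jq -r prints the raw string, without the surrounding JSON quotes
CONTAINER_ID=$(echo "$RESPONSE" | jq -r '.containerId')
echo "$CONTAINER_ID"
```

In a real pipeline you would pipe the curl output from the upload step straight into jq, exactly as the bash script later in this section does.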
Analyze
```shell
curl -X POST https://datastory.bot/api/analyze \
  -H "Content-Type: application/json" \
  -d "{
    \"containerId\": \"$CONTAINER_ID\",
    \"steeringPrompt\": \"Analyze revenue trends by region. Highlight any anomalies.\"
  }"
```
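The bash and JavaScript examples later in this article read the first story's title from this response (`.[0].title` / `stories[0].title`), which assumes `/analyze` returns a JSON array of story objects. A hypothetical illustration of that shape, not an authoritative schema; check the API reference for the real one:

```json
[
  {
    "title": "West Region Revenue Declines 12% While Other Regions Grow"
  }
]
```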
Refine
```shell
curl -X POST https://datastory.bot/api/refine \
  -H "Content-Type: application/json" \
  -d "{
    \"containerId\": \"$CONTAINER_ID\",
    \"selectedStoryTitle\": \"West Region Revenue Declines 12% While Other Regions Grow\"
  }"
```
Download Charts
```shell
curl -X GET "https://datastory.bot/api/files/$CONTAINER_ID/file-chart001" \
  --output chart_1.png
```
That's the entire API in four curl commands. Wrap them in a bash script and you have an automated analysis pipeline without writing a single line of Python.
Complete Bash Script
```shell
#!/bin/bash
# analyze.sh — CSV to data story in bash
CSV_FILE="${1:?Usage: analyze.sh <csv-file> [steering-prompt]}"
STEERING="${2:-Analyze this dataset and find the most interesting patterns.}"

# Upload
echo "Uploading $CSV_FILE..."
UPLOAD=$(curl -s -X POST https://datastory.bot/api/upload \
  -F "file=@$CSV_FILE")
CONTAINER_ID=$(echo "$UPLOAD" | jq -r '.containerId')
echo "Container: $CONTAINER_ID"

# Analyze
echo "Analyzing..."
STORIES=$(curl -s -X POST https://datastory.bot/api/analyze \
  -H "Content-Type: application/json" \
  -d "{\"containerId\": \"$CONTAINER_ID\", \"steeringPrompt\": \"$STEERING\"}")
TITLE=$(echo "$STORIES" | jq -r '.[0].title')
echo "Top story: $TITLE"

# Refine
echo "Generating full narrative..."
REPORT=$(curl -s -X POST https://datastory.bot/api/refine \
  -H "Content-Type: application/json" \
  -d "{\"containerId\": \"$CONTAINER_ID\", \"selectedStoryTitle\": \"$TITLE\"}")
echo "$REPORT" | jq -r '.narrative' > report.md
echo "Narrative saved to report.md"

# Download charts
echo "$REPORT" | jq -r '.charts[].fileId' | while read -r FID; do
  curl -s "https://datastory.bot/api/files/$CONTAINER_ID/$FID" \
    --output "${FID}.png"
  echo "Chart saved: ${FID}.png"
done

echo "Done."
```
Usage: `./analyze.sh sales_data.csv "Compare performance across regions"`
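To produce this report on a schedule, point cron at the script. A sketch assuming the script and CSV live in /opt/reports (both paths are placeholders for your own setup):

```
# Every Monday at 08:00: regenerate the weekly report, appending a log
0 8 * * 1 cd /opt/reports && ./analyze.sh sales_data.csv "Compare performance across regions" >> analyze.log 2>&1
```

Add the line with `crontab -e`; report.md and the chart PNGs then land in /opt/reports once a week.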
Option 3: Your Language's HTTP Client
DataStoryBot's API is REST + JSON. If your team uses JavaScript, Ruby, Go, Java, or anything else — use your language's HTTP library.
JavaScript (Node.js)
```javascript
// Node.js 18+: uses the built-in fetch and FormData, no extra packages needed
// (the form-data npm package is a stream type that the built-in fetch
// doesn't understand, so the multipart boundary would never be set)
const fs = require("fs");

const BASE_URL = "https://datastory.bot/api";

async function analyzeCSV(csvPath, steering) {
  // Upload: wrap the file in a Blob so the built-in FormData accepts it
  const form = new FormData();
  form.append("file", new Blob([fs.readFileSync(csvPath)]), csvPath);
  const uploadRes = await fetch(`${BASE_URL}/upload`, {
    method: "POST",
    body: form,
  });
  const { containerId } = await uploadRes.json();

  // Analyze
  const storiesRes = await fetch(`${BASE_URL}/analyze`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ containerId, steeringPrompt: steering }),
  });
  const stories = await storiesRes.json();

  // Refine: request the full narrative for the top story
  const reportRes = await fetch(`${BASE_URL}/refine`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      containerId,
      selectedStoryTitle: stories[0].title,
    }),
  });
  return reportRes.json();
}

analyzeCSV("sales.csv", "Analyze revenue trends").then((report) => {
  console.log(report.narrative);
});
```
Ruby
```ruby
require "net/http"
require "json"
require "uri"

BASE_URL = "https://datastory.bot/api"

# Upload
uri = URI("#{BASE_URL}/upload")
req = Net::HTTP::Post.new(uri)
form_data = [["file", File.open("sales.csv")]]
req.set_form(form_data, "multipart/form-data")
res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
container_id = JSON.parse(res.body)["containerId"]

# Analyze
uri = URI("#{BASE_URL}/analyze")
req = Net::HTTP::Post.new(uri, "Content-Type" => "application/json")
req.body = { containerId: container_id, steeringPrompt: "Analyze revenue trends" }.to_json
res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
stories = JSON.parse(res.body)

# Refine
uri = URI("#{BASE_URL}/refine")
req = Net::HTTP::Post.new(uri, "Content-Type" => "application/json")
req.body = { containerId: container_id, selectedStoryTitle: stories[0]["title"] }.to_json
res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
report = JSON.parse(res.body)

puts report["narrative"]
```
Google Apps Script (for Google Sheets teams)
```javascript
function analyzeSheet() {
  const sheet = SpreadsheetApp.getActiveSheet();
  const data = sheet.getDataRange().getValues();

  // Convert sheet data to CSV
  const csv = data.map(row => row.join(",")).join("\n");
  const blob = Utilities.newBlob(csv, "text/csv", "sheet_data.csv");

  // Upload
  const uploadRes = UrlFetchApp.fetch("https://datastory.bot/api/upload", {
    method: "post",
    payload: { file: blob }
  });
  const containerId = JSON.parse(uploadRes.getContentText()).containerId;

  // Analyze
  const storiesRes = UrlFetchApp.fetch("https://datastory.bot/api/analyze", {
    method: "post",
    contentType: "application/json",
    payload: JSON.stringify({
      containerId: containerId,
      steeringPrompt: "Analyze this data and find the key insights."
    })
  });
  const stories = JSON.parse(storiesRes.getContentText());

  // Refine
  const reportRes = UrlFetchApp.fetch("https://datastory.bot/api/refine", {
    method: "post",
    contentType: "application/json",
    payload: JSON.stringify({
      containerId: containerId,
      selectedStoryTitle: stories[0].title
    })
  });
  const report = JSON.parse(reportRes.getContentText());

  // Put narrative in a new sheet
  const resultSheet = SpreadsheetApp.getActiveSpreadsheet()
    .insertSheet("Analysis Results");
  resultSheet.getRange(1, 1).setValue(report.narrative);
}
```
This runs directly from Google Sheets — no server, no Python, no deployment. Select your data, run the script, get the analysis in a new tab.
No-Code Automation
For teams using no-code platforms, DataStoryBot's REST API integrates via HTTP request blocks:
Zapier / Make / n8n:
- Trigger: new file in Google Drive / Dropbox / email attachment
- Action: HTTP POST to `/upload` with the file
- Action: HTTP POST to `/analyze` with the container ID
- Action: HTTP POST to `/refine` with the top story title
- Action: Send the narrative via email / Slack / Notion
Retool / Appsmith: Build an internal tool with a file upload widget → API calls → display the narrative and charts in the UI. No backend code required.
Choosing the Right Access Method
| Method | Best For | Skill Required |
|---|---|---|
| Playground | One-off analysis, demos | None |
| curl / bash script | Scheduled reports, CLI users | Basic terminal |
| JavaScript | Node.js teams, serverless functions | JavaScript |
| Ruby / Go / Java | Backend teams in those languages | That language |
| Google Apps Script | Google Sheets-centric teams | Basic JavaScript |
| No-code platforms | Non-technical teams, simple automations | Zapier/Make familiarity |
The API doesn't care what calls it. Pick whatever your team already knows.
What to Read Next
For the Python-focused quickstart (if your team does use Python), see getting started with the DataStoryBot API.
For building automated report pipelines in any language, read automating weekly data reports.
For the complete API endpoint reference, see the DataStoryBot API reference.
Or start with the playground — no code required.
Ready to find your data story?
Upload a CSV and DataStoryBot will uncover the narrative in seconds.
Try DataStoryBot →