JSON to CSV Converter

Convert JSON data to CSV spreadsheet format instantly with support for nested object flattening, custom delimiters, and headers. All conversions happen locally in your browser.


How to Use JSON to CSV Converter

What is JSON to CSV Conversion?

JSON (JavaScript Object Notation) and CSV (Comma-Separated Values) are two fundamental data formats in software development and data analysis. JSON is the standard format for APIs and web applications, while CSV is the universal format for spreadsheets, databases, and data analysis tools.

This tool converts JSON data into CSV format entirely in your browser, with no server uploads or data storage. It supports nested object flattening, custom delimiters, and automatic header generation to match your exact needs.

Why Convert JSON to CSV?

Converting JSON to CSV is essential for many development and data workflows:

  • Data Analysis: Import API responses into Excel, Google Sheets, or data analysis tools
  • Debugging: Visualize API responses in tabular format for easier inspection
  • Reporting: Create reports from JSON logs or API data
  • Database Import: Bulk import JSON data into SQL databases via CSV
  • Spreadsheet Integration: Share JSON data with non-technical teams using Excel/Sheets
  • Quick Exports: Generate downloadable CSV files from web app data
  • Testing: Create CSV test data from JSON fixtures

Understanding JSON and CSV Formats

JSON Format

Structure: Hierarchical format with objects, arrays, and nested data.

Example:

[
  {
    "name": "Alice",
    "age": 30,
    "address": {
      "city": "London",
      "country": "UK"
    }
  },
  {
    "name": "Bob",
    "age": 25,
    "address": {
      "city": "Berlin",
      "country": "Germany"
    }
  }
]

Characteristics:

  • Nested objects and arrays
  • Native data types (strings, numbers, booleans, null)
  • Flexible structure per item
  • Standard for web APIs

CSV Format

Structure: Flat tabular format with rows and columns.

Example (with flattening):

address.city,address.country,age,name
London,UK,30,Alice
Berlin,Germany,25,Bob

Example (without flattening):

address,age,name
"{""city"":""London"",""country"":""UK""}",30,Alice
"{""city"":""Berlin"",""country"":""Germany""}",25,Bob

Characteristics:

  • Simple two-dimensional table
  • Text-based with delimiter-separated values
  • Header row describes columns
  • Universal compatibility with spreadsheet tools
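
The quoted cells in the second example follow the standard CSV escaping rule: any field containing the delimiter, a double quote, or a newline is wrapped in double quotes, and embedded quotes are doubled. A minimal JavaScript sketch of that rule (not this tool's exact implementation):

// Quote a CSV field when it contains the delimiter, a quote, or a newline.
function escapeCsvField(value, delimiter = ',') {
  const text = String(value ?? '');
  if (text.includes(delimiter) || text.includes('"') || text.includes('\n')) {
    return '"' + text.replace(/"/g, '""') + '"';
  }
  return text;
}

// A nested object kept as a JSON string becomes a single quoted cell:
escapeCsvField(JSON.stringify({ city: 'London', country: 'UK' }));
// → "{""city"":""London"",""country"":""UK""}"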

How to Use This Tool

Quick Start (30 seconds)

  1. Paste JSON data into the left panel
  2. Click "Convert to CSV" to transform it instantly
  3. View CSV output in the right panel
  4. Copy or download the result

Step-by-Step Workflow

Step 1: Input Your JSON Data

Paste JSON text:

  • Click in the JSON input textarea (left panel)
  • Paste your JSON data (Ctrl+V / Cmd+V)
  • Line numbers appear automatically for easy reference

Accepted JSON formats:

Array of objects (most common):

[
  {"name": "Alice", "age": 30},
  {"name": "Bob", "age": 25}
]

Single object (automatically wrapped):

{"name": "Alice", "age": 30}

→ Treated as: [{"name": "Alice", "age": 30}]

Tip: Copy JSON directly from API responses, browser DevTools, or your code editor.

Step 2: Configure Conversion Options

Delimiter Selection

Choose the character that separates values in the CSV output:

  • Comma (,): Standard CSV format (default)

    name,age,city
    Alice,30,London
    
  • Semicolon (;): Common in European locales

    name;age;city
    Alice;30;London
    
  • Tab: Tab-separated values (TSV)

    name    age    city
    Alice   30     London
    

When to use each:

  • Comma: Default for most use cases, Excel (US)
  • Semicolon: European Excel, data with commas in values
  • Tab: Data analysis tools, easy copy-paste

Include Header Row

Check this box to add column names as the first row:

With headers (checked, default):

name,age,city
Alice,30,London
Bob,25,Berlin

Without headers (unchecked):

Alice,30,London
Bob,25,Berlin

Tip: Enable for most use cases. Spreadsheets and databases expect headers.

Flatten Nested Objects

Check this box to convert nested JSON objects into flat columns using dot notation:

With flattening (checked, default):

JSON input:

[
  {
    "name": "Alice",
    "address": {
      "city": "London",
      "zip": "SW1"
    }
  }
]

CSV output:

address.city,address.zip,name
London,SW1,Alice

Without flattening (unchecked):

CSV output:

address,name
"{""city"":""London"",""zip"":""SW1""}",Alice

When to use:

  • Flatten ON: When you want each nested field as a separate column (recommended)
  • Flatten OFF: When you want to preserve JSON structure in cells

Arrays in flattened output:

{"orders": [{"id": 1}, {"id": 2}]}

→ Becomes: orders[0].id,orders[1].id
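
For reference, dot-notation flattening with bracketed array indices can be implemented along these lines; this is a sketch of the general technique, not necessarily this tool's code:

// Recursively flatten nested objects into dot-notation keys and
// arrays into bracket-index keys, e.g. address.city and orders[0].id.
function flatten(value, prefix = '', out = {}) {
  if (Array.isArray(value)) {
    value.forEach((item, i) => flatten(item, `${prefix}[${i}]`, out));
  } else if (value !== null && typeof value === 'object') {
    for (const [key, child] of Object.entries(value)) {
      flatten(child, prefix ? `${prefix}.${key}` : key, out);
    }
  } else {
    out[prefix] = value; // primitive: string, number, boolean, or null
  }
  return out;
}

flatten({ name: 'Alice', address: { city: 'London' }, orders: [{ id: 1 }] });
// → { name: 'Alice', 'address.city': 'London', 'orders[0].id': 1 }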

Step 3: Convert the Data

Click "Convert to CSV" button:

  • The tool parses your JSON and converts to CSV
  • CSV appears in the right panel
  • Row and column counts display above output
  • Any parsing errors show in a red alert box

Automatic handling:

  • Single objects are wrapped in arrays automatically
  • Missing keys across objects are handled gracefully (empty cells)
  • Inconsistent object structures are merged (all keys from all objects)

Step 4: Use the CSV Output

Review the output:

  • Scroll through the CSV preview in the right panel
  • Verify column names and data structure
  • Check that nested data is flattened as expected

Copy to clipboard:

  • Click "Copy" button above CSV output
  • The entire CSV is copied (even if not all visible)
  • "Copied!" confirmation appears briefly
  • Paste into Excel, Google Sheets, or any text editor

Download as file:

  • Click "Download" button
  • Saves as data.csv
  • Ready to import into spreadsheets, databases, or analysis tools
  • Open directly in Excel, LibreOffice Calc, or Numbers

Statistics:

  • Row count: Number of data rows
  • Column count: Number of columns (headers)
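
For developers curious about the mechanics: browser tools typically implement the copy action with the asynchronous Clipboard API. A rough sketch of that pattern (this tool's internals may differ):

// Copy the full CSV string to the clipboard; requires a secure (HTTPS) context.
async function copyCsv(csvText) {
  try {
    await navigator.clipboard.writeText(csvText);
    console.log('Copied!');
  } catch (err) {
    console.error('Clipboard copy failed:', err);
  }
}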

Step 5: Adjust and Reconvert (If Needed)

If output isn't right:

  1. Toggle "Flatten nested objects": Try with/without flattening
  2. Change delimiter: Switch between comma, semicolon, tab
  3. Toggle "Include header row": Add/remove column names
  4. Click "Convert to CSV" again to regenerate

Reset button: Clears all input and output to start fresh.

Common Use Cases

API Response Analysis

Scenario: You fetch data from a REST API and want to analyze it in Excel.

Workflow:

  1. Call your API and copy the JSON response
  2. Paste JSON into this tool
  3. Enable "Flatten nested objects" to expand all fields
  4. Convert to CSV
  5. Download and open in Excel for analysis

Example:

API response:

[
  {
    "user": {"id": 1, "name": "Alice"},
    "order": {"total": 99.99, "items": 3}
  },
  {
    "user": {"id": 2, "name": "Bob"},
    "order": {"total": 149.99, "items": 5}
  }
]

CSV output (flattened):

order.items,order.total,user.id,user.name
3,99.99,1,Alice
5,149.99,2,Bob

Use in Excel:

  • Calculate totals, averages
  • Create pivot tables
  • Generate charts and graphs

Database Export

Scenario: Export data from a NoSQL database (MongoDB, Firebase) into a SQL database.

Workflow:

  1. Export collection as JSON from NoSQL database
  2. Convert JSON to CSV using this tool
  3. Import CSV into SQL database (MySQL, PostgreSQL, SQLite)

Example (MongoDB → PostgreSQL):

# Export from MongoDB
mongoexport --db myapp --collection users --out users.json --jsonArray

# Convert JSON to CSV (using this tool)

# Import to PostgreSQL
psql -d myapp -c "COPY users FROM '/path/to/data.csv' CSV HEADER"

Log File Analysis

Scenario: Application logs in JSON format need to be analyzed in a spreadsheet.

Workflow:

  1. Aggregate JSON logs (from Elasticsearch, CloudWatch, etc.)
  2. Convert to CSV
  3. Analyze patterns, errors, timings in Excel

Example (JSON logs):

[
  {"timestamp": "2024-01-15T10:00:00Z", "level": "error", "message": "Connection timeout", "duration": 5000},
  {"timestamp": "2024-01-15T10:01:00Z", "level": "info", "message": "Request completed", "duration": 250}
]

CSV output:

duration,level,message,timestamp
5000,error,Connection timeout,2024-01-15T10:00:00Z
250,info,Request completed,2024-01-15T10:01:00Z

Sharing Data with Non-Technical Teams

Scenario: Developers have JSON data; non-technical team needs it in Excel.

Workflow:

  1. Convert JSON to CSV
  2. Share CSV file via email or cloud storage
  3. Team opens in Excel with familiar interface

Example (Sales data):

Developer has:

[
  {"product": "Laptop", "sales": 15, "revenue": 14999},
  {"product": "Mouse", "sales": 50, "revenue": 1250}
]

Team gets: sales.csv → Opens in Excel with proper columns

Creating Reports

Scenario: Generate CSV reports from web application data.

Workflow:

  1. Fetch data via API in JSON format
  2. Convert to CSV
  3. Email or serve CSV file as downloadable report

Example (User activity report):

// Fetch user activity
const response = await fetch('/api/reports/user-activity');
const data = await response.json();

// Convert to CSV (using this tool or library)
// Download or email the CSV report
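
In the browser, the download step of a workflow like this is commonly implemented with a Blob and a temporary object URL. A hedged sketch, assuming the CSV text has already been generated:

// Trigger a client-side download of a CSV string as report.csv.
function downloadCsv(csvText, filename = 'report.csv') {
  const blob = new Blob([csvText], { type: 'text/csv;charset=utf-8' });
  const url = URL.createObjectURL(blob);
  const link = document.createElement('a');
  link.href = url;
  link.download = filename;
  link.click();
  URL.revokeObjectURL(url);
}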

Data Visualization Prep

Scenario: Prepare JSON data for Tableau, Power BI, or other BI tools.

Workflow:

  1. Export data from application as JSON
  2. Convert to CSV with proper flattening
  3. Import CSV into BI tool for visualization

Example (Sales by region):

JSON:

[
  {"region": "North", "q1": 100000, "q2": 120000},
  {"region": "South", "q1": 90000, "q2": 95000}
]

Import into Tableau → Create regional performance dashboard

Bulk Data Import

Scenario: Import data into SQL database, CRM, or other systems.

Workflow:

  1. Prepare data in JSON (from scripts, APIs, exports)
  2. Convert to CSV
  3. Use bulk import features of target system

Example (Salesforce import):

Email,FirstName,LastName,Company
alice@example.com,Alice,Smith,Acme Corp
bob@example.com,Bob,Jones,Tech Inc

Import via Salesforce Data Loader

Testing and QA

Scenario: Generate test data in CSV format from JSON fixtures.

Workflow:

  1. Create JSON test fixtures
  2. Convert to CSV
  3. Use CSV for manual testing, test automation, or seeding databases

Example (Test users):

[
  {"email": "test1@example.com", "role": "admin"},
  {"email": "test2@example.com", "role": "user"}
]

Convert → Import into test database → Run QA tests

Advanced Techniques

Handling Deeply Nested JSON

Complex JSON:

[
  {
    "user": {
      "profile": {
        "contact": {
          "email": "alice@example.com",
          "phone": "555-1234"
        }
      }
    }
  }
]

Flattened CSV:

user.profile.contact.email,user.profile.contact.phone
alice@example.com,555-1234

Tip: Deep nesting creates long column names. Consider restructuring JSON before conversion if column names become unwieldy.

Arrays in JSON

JSON with arrays:

[
  {
    "name": "Alice",
    "tags": ["developer", "manager"]
  }
]

Flattened output:

name,tags[0],tags[1]
Alice,developer,manager

Without flattening:

name,tags
Alice,"[""developer"",""manager""]"

For many array items: Consider pre-processing to join array values or restructure data.
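
One simple pre-processing approach, sketched here for the tags example above (assuming the parsed JSON array is in data), is to join array values into a single delimited string before converting:

// Collapse the tags array into one cell instead of tags[0], tags[1], ...
const prepared = data.map(item => ({
  ...item,
  tags: Array.isArray(item.tags) ? item.tags.join('; ') : item.tags
}));
// → [{ name: 'Alice', tags: 'developer; manager' }]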

Inconsistent Object Shapes

JSON with varying keys:

[
  {"name": "Alice", "age": 30, "city": "London"},
  {"name": "Bob", "age": 25},
  {"name": "Charlie", "city": "Paris", "country": "France"}
]

CSV output (all keys merged):

age,city,country,name
30,London,,Alice
25,,,Bob
,Paris,France,Charlie

Empty cells for missing keys. Headers include all unique keys from all objects.
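
A minimal sketch of this key-merging behavior: collect the union of keys across all objects, then emit an empty value wherever a key is missing (the tool's actual implementation may differ):

// Build the header from all unique keys, then fill missing values with ''.
function toRows(objects) {
  const headers = [...new Set(objects.flatMap(obj => Object.keys(obj)))].sort();
  const rows = objects.map(obj => headers.map(key => obj[key] ?? ''));
  return { headers, rows };
}

toRows([{ name: 'Alice', age: 30 }, { name: 'Bob' }]);
// → { headers: ['age', 'name'], rows: [[30, 'Alice'], ['', 'Bob']] }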

Custom Data Transformations

Post-process in spreadsheet:

After conversion, use Excel/Sheets formulas:

  • Parse dates: =DATEVALUE(A2)
  • Calculate fields: =B2*C2
  • Clean data: =TRIM(A2), =UPPER(A2)

Or pre-process JSON:

const data = await response.json();

// Transform before converting
const transformed = data.map(item => ({
  ...item,
  fullName: `${item.firstName} ${item.lastName}`,
  total: item.quantity * item.price
}));

// Convert transformed data to CSV

Large JSON Files

For files > 10 MB:

  1. Split the JSON array into chunks (see the sketch below)
  2. Convert each chunk separately
  3. Combine CSV files (same headers)
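
A sketch of the splitting step in JavaScript, assuming the large file has already been parsed into an array:

// Split a large array into chunks of up to `size` items for separate conversion.
function chunk(array, size = 1000) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

// Convert each chunk separately, then concatenate the CSVs,
// keeping the header row from the first chunk only.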

Or use command-line tools:

# Node.js json2csv
npm install -g json2csv
json2csv -i data.json -o data.csv

# jq
jq -r '(map(keys) | add | unique) as $cols | $cols, (map([.[$cols[]]]))[] | @csv' data.json > data.csv

Troubleshooting Common Issues

"Failed to parse JSON" Error

Causes:

  • Invalid JSON syntax (missing quotes, commas, brackets)
  • Trailing commas (not allowed in JSON)
  • Single quotes instead of double quotes

Solutions:

  1. Validate JSON in a JSON validator first
  2. Copy from API responses or JSON.stringify() output (guaranteed valid)
  3. Use a code editor with JSON validation (VS Code, Sublime)
  4. Check for hidden characters or encoding issues

Empty CSV Output

Causes:

  • JSON is empty array: []
  • All objects are empty: [{}, {}]
  • JSON root is not object/array

Solutions:

  1. Verify JSON contains actual data
  2. Check that objects have properties
  3. Ensure JSON is array or object, not string/number/boolean

Missing Columns

Problem: Some fields don't appear in CSV output.

Causes:

  • Flattening is off but fields are nested
  • Keys are inconsistent across objects

Solutions:

  1. Enable "Flatten nested objects" for nested fields
  2. Verify all objects have the expected keys
  3. Check that key names match exactly (case-sensitive)

Too Many Columns

Problem: CSV has hundreds of columns from flattening.

Cause: Deeply nested JSON or large arrays flattened into separate columns.

Solutions:

  1. Disable flattening to keep nested data as JSON strings
  2. Pre-process JSON to limit nesting depth
  3. Extract only needed fields before conversion (see the sketch below)
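
A sketch of option 3, assuming the parsed JSON array is in data and that id, name, and email (hypothetical field names) are the only fields you need:

// Keep only the fields you actually need before converting to CSV.
const slim = data.map(({ id, name, email }) => ({ id, name, email }));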

Special Characters Broken

Problem: International characters or emojis appear corrupted.

Solution:

  1. Ensure browser encoding is UTF-8
  2. When opening CSV in Excel:
    • Don't double-click the file
    • Open Excel first
    • Go to Data → From Text/CSV
    • Select file and choose UTF-8 encoding

Excel Opens CSV Incorrectly

Problem: Excel combines columns or doesn't split properly.

Causes:

  • Wrong delimiter (Excel expects semicolon in European locales)
  • UTF-8 BOM missing

Solutions:

  1. Try semicolon delimiter instead of comma
  2. Use "Data → From Text/CSV" in Excel (not double-click)
  3. Set delimiter explicitly during import

Integration with Development Workflows

JavaScript / Node.js

In browser:

// Fetch API data
const response = await fetch('/api/users');
const data = await response.json();

// Convert using this tool or json2csv library

In Node.js:

const { parse } = require('json2csv');
const fs = require('fs');

const data = require('./data.json');
const csv = parse(data);
fs.writeFileSync('output.csv', csv);

Python

Using pandas:

import pandas as pd
import json

# Read JSON
with open('data.json', 'r') as f:
    data = json.load(f)

# Convert to DataFrame then CSV
df = pd.json_normalize(data)  # Flattens nested JSON
df.to_csv('output.csv', index=False)

REST API Integration

Export endpoint:

// Express.js example (assumes app and db are already configured)
const json2csv = require('json2csv');

app.get('/api/export', async (req, res) => {
  const data = await db.collection('users').find().toArray();

  // Convert to CSV using the json2csv library
  const csv = json2csv.parse(data);

  res.header('Content-Type', 'text/csv');
  res.attachment('export.csv');
  res.send(csv);
});

Automated Reports

Scheduled export:

// Node.js cron job
const cron = require('node-cron');

cron.schedule('0 9 * * 1', async () => {
  // Every Monday at 9 AM; fetchWeeklySalesData, convertToCSV, and emailReport are app-specific helpers
  const data = await fetchWeeklySalesData();
  const csv = convertToCSV(data);
  await emailReport(csv);
});

Best Practices

JSON Preparation

  • Consistent structure: Ensure all objects have the same shape
  • Flatten when possible: Minimize nesting for cleaner CSV output
  • Use meaningful keys: Keys become column names, make them descriptive
  • Avoid huge arrays: Arrays flatten to many columns, consider limiting
  • Clean data first: Remove null/undefined values or fill with defaults

CSV Output

  • Include headers: Almost always useful for spreadsheet import
  • Choose right delimiter: Comma for US, semicolon for Europe, tab for data analysis
  • Flatten for analysis: Enable flattening for spreadsheet use
  • Keep raw for storage: Disable flattening to preserve structure
  • Test with sample: Convert small sample first to verify output

Data Quality

  • Validate JSON first: Ensure JSON is valid before conversion
  • Check column count: Too many columns may indicate issues
  • Verify data types: Numbers should look like numbers, not strings
  • Handle nulls: Decide how to represent missing data
  • Consistent formatting: Use same date/number formats across objects

Performance

  • Limit JSON size: For very large files (> 50 MB), use command-line tools
  • Flatten carefully: Deep flattening can create many columns and slow conversion
  • Process in batches: For huge datasets, split into chunks

Command-Line Alternatives

For production workflows or very large files:

json2csv (Node.js)

npm install -g json2csv
json2csv -i input.json -o output.csv --flatten-objects

Benefits:

  • Handles huge files
  • Configurable options
  • Streaming support

jq + csvkit

# Convert JSON to CSV
cat data.json | in2csv -f json > data.csv

Python pandas

import pandas as pd

df = pd.read_json('data.json')
df.to_csv('output.csv', index=False)

Benefits:

  • Automatic type detection
  • Advanced data manipulation
  • Handles complex structures

Privacy & Security

Local Processing

All conversions happen in your browser:

  • No JSON data is uploaded to servers
  • No network requests are made
  • Files are processed entirely client-side
  • Data never leaves your machine

Verify (for security-conscious users):

  1. Open browser DevTools (F12)
  2. Go to Network tab
  3. Convert a JSON file
  4. Observe: No network requests to external servers

When to Use This Tool

Safe for:

  • Public API responses
  • Sample/test data
  • Non-sensitive business data
  • Data you'd share via email

Consider alternatives for:

  • Personal identifying information (PII)
  • Financial records
  • Medical records
  • Trade secrets
  • Confidential business data

For maximum security: Use command-line tools on your local machine with no internet connection.

Summary

This JSON to CSV converter provides:

✓ Local processing: No server uploads, complete privacy
✓ Nested flattening: Expand JSON objects into flat columns
✓ Flexible output: Custom delimiters and header options
✓ Smart handling: Wraps single objects, merges inconsistent keys
✓ One-click export: Copy to clipboard or download CSV
✓ Error handling: Clear messages for invalid JSON
✓ Developer-friendly: Line numbers, monospace fonts, stats

Perfect for API response analysis, database exports, log analysis, and sharing data with non-technical teams.
