Table Data Export Implementation Guide
This guide explains how to implement complete table data export functionality that downloads ALL data from a table, not just the current page. This is useful for exporting large datasets to CSV format.
Overview
The export functionality:
- ✅ Fetches ALL data through pagination (not just visible rows)
- ✅ Supports column selection
- ✅ Respects current table filters (search, segment, status, etc.)
- ✅ Shows progress during export
- ✅ Handles large datasets efficiently
- ✅ Downloads CSV file directly in browser
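Before diving into the full implementation, the core idea can be reduced to a loop that keeps requesting pages until one comes back empty or partial. Here is a stripped-down, dependency-free sketch; `fetchPage` is a stand-in for the real axios call used in Step 1:

```javascript
// Simplified sketch of the fetch-all loop. fetchPage(page, limit) is a
// placeholder for the real HTTP call (axios.get) used in the actual service.
async function fetchAllRows(fetchPage, pageSize) {
  const rows = [];
  let page = 1;
  while (true) {
    const items = await fetchPage(page, pageSize);
    rows.push(...items);
    // An empty or partial page means we have reached the end.
    if (items.length < pageSize) break;
    page += 1;
  }
  return rows;
}

// Demo: a fake paginated source with 5 records and a page size of 2.
const dataset = ['a', 'b', 'c', 'd', 'e'];
const fakeFetchPage = async (page, limit) =>
  dataset.slice((page - 1) * limit, page * limit);

fetchAllRows(fakeFetchPage, 2).then((rows) => {
  console.log(rows.length); // all 5 records, not just the first page
});
```

The real service adds filters, progress reporting, and CSV generation on top of this loop, but the termination logic is the same.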
Architecture
Step-by-Step Implementation
Step 1: Create Export Service
Create a new file: utils/exportTableData.js
import { message } from 'antd';
import axios from 'axios';
const DEFAULT_PAGE_SIZE = 200; // Adjust based on your API limits
/**
* Export all table data to CSV
*
* @param {Object} options
* @param {string} options.apiEndpoint - API endpoint to fetch data from
* @param {string[]} options.selectedColumns - Array of column names to export
* @param {number} options.pageSize - Number of records per page (default: 200)
* @param {Object} options.filters - Current table filters (search, segment, etc.)
* @param {Function} options.transformRow - Function to transform each row
* @param {Function} options.onProgress - Optional progress callback
*/
export async function exportAllTableDataToCSV({
apiEndpoint,
selectedColumns = [],
pageSize = DEFAULT_PAGE_SIZE,
filters = {},
transformRow = (row, columns) => columns.map((col) => row?.[col] ?? ''), // default: pick selected columns; each row must become an array for CSV
onProgress = null,
} = {}) {
if (!apiEndpoint) {
throw new Error('API endpoint is required');
}
let page = 1;
let total = Infinity;
const rows = [];
let hasMoreData = true;
try {
// Show initial loading message
message.loading({ content: 'Preparing export...', key: 'export', duration: 0 });
// Fetch ALL data by paginating through every page
while (hasMoreData) {
const params = {
page,
limit: pageSize,
...filters, // Include current table filters
};
const { data } = await axios.get(apiEndpoint, { params });
// Adjust these based on your API response structure
const items = data?.items ?? data?.data ?? data?.buildings ?? [];
const pagination = data?.pagination ?? {};
// On first page, determine total count
if (page === 1) {
if (items.length === 0) {
message.destroy('export');
message.info('No data to export');
return;
}
// Get total from pagination if available
const reportedTotal = typeof pagination.total === 'number' && Number.isFinite(pagination.total)
? pagination.total
: null;
total = reportedTotal ?? Infinity;
// Update progress message
if (reportedTotal) {
message.loading({
content: `Exporting ${reportedTotal} records... (0/${reportedTotal})`,
key: 'export',
duration: 0,
});
} else {
message.loading({
content: `Exporting records... (page ${page})`,
key: 'export',
duration: 0,
});
}
}
// Transform and add rows
rows.push(...items.map((item) => transformRow(item, selectedColumns)));
// Update progress
if (total !== Infinity) {
message.loading({
content: `Exporting ${total} records... (${rows.length}/${total})`,
key: 'export',
duration: 0,
});
} else {
message.loading({
content: `Exporting records... (${rows.length} so far, page ${page})`,
key: 'export',
duration: 0,
});
}
// Call progress callback if provided
if (onProgress) {
onProgress({ current: rows.length, total, page });
}
// Determine if there's more data
// Stop if:
// 1. No items returned (empty page)
// 2. Fewer items than pageSize (last page)
// 3. We've reached the total count (if known)
if (items.length === 0) {
hasMoreData = false;
} else if (items.length < pageSize) {
hasMoreData = false; // Last page (partial page)
} else if (total !== Infinity && rows.length >= total) {
hasMoreData = false; // Reached known total
} else {
page += 1; // Continue to next page
}
}
// Destroy loading message
message.destroy('export');
// Generate CSV
const csvContent = generateCSV(selectedColumns, rows);
// Download file
const filename = `export-${getCurrentDate()}.csv`;
downloadFile(csvContent, filename, 'text/csv');
message.success(`Exported ${rows.length} records to CSV`);
} catch (error) {
message.destroy('export');
console.error('Export failed:', error);
message.error('Failed to export data');
throw error;
}
}
/**
* Generate CSV content from headers and rows
*/
function generateCSV(headers, rows) {
const headerRow = headers.join(',');
const dataRows = rows.map((row) =>
row.map((cell) => formatCSVField(cell)).join(',')
);
return [headerRow, ...dataRows].join('\n');
}
/**
* Format a single CSV field (escape quotes, handle special chars)
*/
function formatCSVField(value) {
if (value === null || value === undefined) return '""';
const stringValue = String(value);
const sanitized = stringValue.replace(/"/g, '""'); // Escape quotes
return `"${sanitized}"`;
}
/**
* Download file in browser
*/
function downloadFile(content, filename, mimeType) {
const blob = new Blob([content], { type: `${mimeType};charset=utf-8;` });
const link = document.createElement('a');
const url = URL.createObjectURL(blob);
link.setAttribute('href', url);
link.setAttribute('download', filename);
link.style.visibility = 'hidden';
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
URL.revokeObjectURL(url);
}
/**
* Get current date in YYYY-MM-DD format
*/
function getCurrentDate() {
return new Date().toISOString().split('T')[0];
}Step 2: Create Data Transformation Utility
Create a new file: utils/transformExportData.js
/**
* Column mapping: display name -> data path
* Update this based on your data structure
*/
const COLUMN_MAPPING = {
'ID': 'id',
'Name': 'name',
'Email': 'email',
'Full Address': 'address.full',
'Street': 'address.street',
'City': 'address.city',
'Status': 'status',
'Created At': 'createdAt',
// Add your column mappings here
};
/**
* Transform a single row for export
*
* @param {Object} item - The data item
* @param {string[]} selectedColumns - Array of column display names
* @returns {Array} Array of cell values in the same order as selectedColumns
*/
export function transformRowForExport(item, selectedColumns) {
return selectedColumns.map((columnName) => {
return getColumnValue(item, columnName);
});
}
/**
* Get the value for a specific column from an item
*/
function getColumnValue(item, columnName) {
// Handle custom fields (e.g., "custom.fieldName")
if (columnName.startsWith('custom.')) {
const fieldName = columnName.replace('custom.', '');
return item[fieldName] ?? '';
}
// Handle nested datapoints (e.g., "datapoint.key")
if (columnName.startsWith('datapoint.')) {
const key = columnName.replace('datapoint.', '');
return item?.dataPoints?.[key] ?? '';
}
// Check column mapping
const columnKey = COLUMN_MAPPING[columnName];
if (!columnKey) {
return '';
}
// Handle nested paths (e.g., 'address.full')
if (columnKey.includes('.')) {
const parts = columnKey.split('.');
let value = item;
for (const part of parts) {
value = value?.[part];
if (value === undefined || value === null) {
return '';
}
}
return formatValue(value);
}
// Direct property access
return formatValue(item?.[columnKey]);
}
/**
* Format a value for CSV export
*/
function formatValue(value) {
if (value === undefined || value === null) return '';
if (Array.isArray(value)) return value.join('; ');
if (value instanceof Date) return value.toISOString();
if (typeof value === 'object') return JSON.stringify(value);
return String(value);
}
Step 3: Create Column Selection Modal
Create a new file: components/ExportColumnsModal.jsx
import { Modal, Select, Typography, message } from 'antd';
import { useState, useEffect } from 'react';
const { Text } = Typography;
/**
* Modal for selecting columns to export
*
* @param {boolean} open - Whether modal is open
* @param {Function} onCancel - Cancel handler
* @param {Function} onConfirm - Confirm handler (receives selectedColumns array)
* @param {Array} availableColumns - Array of {label, value} objects
* @param {Array} defaultColumns - Default selected columns
* @param {boolean} loading - Loading state
*/
export default function ExportColumnsModal({
open,
onCancel,
onConfirm,
availableColumns = [],
defaultColumns = [],
loading = false,
}) {
const [selectedColumns, setSelectedColumns] = useState(defaultColumns);
useEffect(() => {
if (open) {
setSelectedColumns(defaultColumns);
}
}, [open, defaultColumns]);
const handleConfirm = () => {
if (selectedColumns.length === 0) {
message.warning('Please select at least one column');
return;
}
onConfirm(selectedColumns);
};
return (
<Modal
title="Select Columns to Export"
open={open}
onCancel={onCancel}
onOk={handleConfirm}
okText="Export"
cancelText="Cancel"
width={600}
okButtonProps={{ loading: loading }}
cancelButtonProps={{ disabled: loading }}
>
<div style={{ marginBottom: 16 }}>
<Text type="secondary">
Choose which columns to include in the export file. You can add or remove columns as needed.
</Text>
</div>
<Select
mode="tags"
value={selectedColumns}
onChange={setSelectedColumns}
options={availableColumns.map((col) => ({
label: col.label || col.name,
value: col.value || col.key,
}))}
placeholder="Select columns to export"
style={{ width: '100%' }}
showSearch
filterOption={(input, option) =>
(option?.label ?? '').toLowerCase().includes(input.toLowerCase())
}
/>
<div style={{ marginTop: 12 }}>
<Text type="secondary" style={{ fontSize: '12px' }}>
{selectedColumns.length} column{selectedColumns.length !== 1 ? 's' : ''} selected
</Text>
</div>
</Modal>
);
}
Step 4: Integrate in Your Table Component
Update your table component to include export functionality:
// components/YourTableComponent.jsx
import { Button } from 'antd';
import { useState } from 'react';
import { exportAllTableDataToCSV } from '@/utils/exportTableData';
import { transformRowForExport } from '@/utils/transformExportData';
import ExportColumnsModal from '@/components/ExportColumnsModal';
export default function YourTableComponent() {
const [exportModalOpen, setExportModalOpen] = useState(false);
const [exporting, setExporting] = useState(false);
const [filters, setFilters] = useState({});
const [searchTerm, setSearchTerm] = useState('');
// Define available columns for export
const availableColumns = [
{ label: 'ID', value: 'ID' },
{ label: 'Name', value: 'Name' },
{ label: 'Email', value: 'Email' },
{ label: 'Status', value: 'Status' },
{ label: 'Created At', value: 'Created At' },
// Add all your table columns here
];
const defaultColumns = ['ID', 'Name', 'Email', 'Status'];
const handleExport = () => {
setExportModalOpen(true);
};
const handleExportConfirm = async (selectedColumns) => {
try {
setExporting(true);
// Build filters from current table state
const exportFilters = {
// Include search if active
...(searchTerm && { search: searchTerm }),
// Include other active filters (segment, status, etc.)
...filters,
};
await exportAllTableDataToCSV({
apiEndpoint: '/api/v1/your-endpoint', // Your API endpoint
selectedColumns,
filters: exportFilters,
transformRow: (item, columns) =>
transformRowForExport(item, columns),
onProgress: ({ current, total, page }) => {
console.log(`Progress: ${current}/${total} (page ${page})`);
},
});
setExportModalOpen(false);
} catch (error) {
console.error('Export failed:', error);
} finally {
setExporting(false);
}
};
return (
<div>
{/* Your table filters and search */}
<Button onClick={handleExport} loading={exporting}>
Export All Data
</Button>
<ExportColumnsModal
open={exportModalOpen}
onCancel={() => setExportModalOpen(false)}
onConfirm={handleExportConfirm}
availableColumns={availableColumns}
defaultColumns={defaultColumns}
loading={exporting}
/>
{/* Your table component */}
</div>
);
}
Backend API Requirements
Your API endpoint must support:
- Pagination parameters
  - page - page number (starts at 1)
  - limit - number of items per page
- Filter parameters
  - Accept any filters your table uses (search, segment, status, etc.)
- Response format:
  {
    "items": [...],  // or "data", "buildings", etc.
    "pagination": { "page": 1, "limit": 200, "total": 1500, "totalPages": 8 }
  }
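The skip/limit arithmetic that offset pagination relies on is simple but worth pinning down before writing the handler. A small framework-free sketch:

```javascript
// Offset pagination math: page is 1-based, skip is how many records
// to jump over before reading up to `limit` rows.
function paginationWindow(page, limit, total) {
  const skip = (page - 1) * limit;
  const totalPages = Math.ceil(total / limit);
  return { skip, limit, totalPages };
}

console.log(paginationWindow(1, 200, 1500)); // { skip: 0, limit: 200, totalPages: 8 }
console.log(paginationWindow(8, 200, 1500).skip); // 1400 -> last page holds the final 100 records
```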
Example API Handler
// pages/api/v1/your-endpoint.js
export default async function handler(req, res) {
const { page = 1, limit = 25, search, status } = req.query;
const pageNumber = parseInt(page, 10) || 1;
const limitNumber = parseInt(limit, 10) || 25;
const skip = (pageNumber - 1) * limitNumber;
// Build query from filters
const query = {};
if (search) {
query.$or = [
{ name: new RegExp(search, 'i') },
{ email: new RegExp(search, 'i') },
];
}
if (status) {
query.status = status;
}
// Fetch data
const items = await YourModel.find(query)
.skip(skip)
.limit(limitNumber)
.sort({ createdAt: -1 });
const total = await YourModel.countDocuments(query);
const totalPages = Math.ceil(total / limitNumber);
return res.status(200).json({
items,
pagination: {
page: pageNumber,
limit: limitNumber,
total,
totalPages,
},
});
}
Key Implementation Details
1. Pagination Strategy
The export function continues fetching pages until:
- An empty page is returned
- A partial page is returned (fewer items than pageSize)
- The total count is reached (if known)
2. Handling Missing Pagination Metadata
If your API doesn’t return pagination.total, the function will:
- Continue fetching until it receives an empty or partial page
- Show progress based on pages fetched rather than total count
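The three stop conditions can be isolated into a pure helper, which also makes them easy to unit-test. This is a sketch; the service in Step 1 inlines the same logic:

```javascript
// Decide whether another page should be fetched.
// total may be Infinity when the API does not report pagination.total.
function hasMorePages({ itemsOnPage, pageSize, fetchedSoFar, total }) {
  if (itemsOnPage === 0) return false;       // empty page
  if (itemsOnPage < pageSize) return false;  // partial page => last page
  if (Number.isFinite(total) && fetchedSoFar >= total) return false; // reached known total
  return true;
}

console.log(hasMorePages({ itemsOnPage: 200, pageSize: 200, fetchedSoFar: 200, total: 1500 })); // true
console.log(hasMorePages({ itemsOnPage: 50, pageSize: 200, fetchedSoFar: 250, total: Infinity })); // false
```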
3. Progress Feedback
The function shows loading messages with:
- Total count (if available): "Exporting 1500 records... (500/1500)"
- Page-based progress (if total unknown): "Exporting records... (500 so far, page 3)"
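Both message formats can be captured by one small formatter, mirroring the strings produced in Step 1:

```javascript
// Build the progress string shown while exporting.
// total is Infinity when the API did not report a total count.
function progressMessage(current, total, page) {
  return Number.isFinite(total)
    ? `Exporting ${total} records... (${current}/${total})`
    : `Exporting records... (${current} so far, page ${page})`;
}

console.log(progressMessage(500, 1500, 3));    // "Exporting 1500 records... (500/1500)"
console.log(progressMessage(500, Infinity, 3)); // "Exporting records... (500 so far, page 3)"
```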
4. Filter Integration
Always pass current table filters to the export function:
const exportFilters = {
...(searchTerm && { search: searchTerm }),
...(selectedSegment && { segment: selectedSegment }),
...(statusFilter && { status: statusFilter }),
};
This ensures the export matches what the user sees in the table.
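The conditional-spread pattern drops inactive filters entirely, instead of sending empty values to the API. A quick check of that behavior:

```javascript
// Build the filter object, omitting keys whose values are empty/falsy.
// Spreading a falsy value into an object literal contributes nothing.
function buildExportFilters({ searchTerm, selectedSegment, statusFilter }) {
  return {
    ...(searchTerm && { search: searchTerm }),
    ...(selectedSegment && { segment: selectedSegment }),
    ...(statusFilter && { status: statusFilter }),
  };
}

console.log(buildExportFilters({ searchTerm: 'acme', selectedSegment: '', statusFilter: 'active' }));
// no empty "segment" key is ever sent to the API
```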
Testing Checklist
- Export works with small datasets (< 1 page)
- Export works with large datasets (multiple pages)
- Export respects search filters
- Export respects segment/status filters
- Column selection works correctly
- CSV file downloads successfully
- CSV file contains all expected columns
- CSV file contains all data (not just first page)
- Progress messages display correctly
- Error handling works (no data, API errors)
Troubleshooting
Issue: Export stops after first page
Solution: Check that your API returns pagination.total or that the function correctly detects partial pages.
Issue: CSV has empty columns
Solution: Verify your COLUMN_MAPPING matches your data structure and that transformRowForExport correctly extracts values.
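Empty columns usually come from a mismatch between the mapped path and the actual object shape. The nested-path branch of getColumnValue in Step 2 amounts to the following standalone sketch, which you can run against a sample record to verify your mapping:

```javascript
// Walk a dotted path like "address.city" through a record, returning ''
// when any step is missing (same behavior as getColumnValue's nested branch).
function getByPath(item, path) {
  let value = item;
  for (const part of path.split('.')) {
    value = value?.[part];
    if (value === undefined || value === null) return '';
  }
  return value;
}

const record = { id: 7, address: { city: 'Berlin' } };
console.log(getByPath(record, 'address.city')); // "Berlin"
console.log(getByPath(record, 'address.zip'));  // "" (missing -> empty cell)
```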
Issue: Export is slow
Solution:
- Increase pageSize (try 500 instead of 200)
- Check API performance
- Consider server-side export for very large datasets (100k+ records)
Issue: Browser crashes on large exports
Solution:
- Reduce pageSize to 100
- Consider implementing server-side export
- Add memory checks and warnings for exports > 50k records
Performance Considerations
- Page Size: Use 200-500 records per page (balance between speed and memory)
- Large Datasets: For 100k+ records, consider server-side export
- Rate Limiting: Add delays if your API has rate limits:
await new Promise(resolve => setTimeout(resolve, 100)); // 100ms delay
Alternative: Excel Export
To export to Excel instead of CSV, use the xlsx library:
npm install xlsx
import * as XLSX from 'xlsx';
function exportToExcel(headers, rows, filename) {
const worksheet = XLSX.utils.aoa_to_sheet([headers, ...rows]);
const workbook = XLSX.utils.book_new();
XLSX.utils.book_append_sheet(workbook, worksheet, 'Sheet1');
XLSX.writeFile(workbook, `${filename}.xlsx`);
}
Summary
This implementation provides a complete solution for exporting all table data:
- ✅ Fetches ALL data through pagination
- ✅ Supports column selection
- ✅ Respects table filters
- ✅ Shows progress feedback
- ✅ Handles edge cases (empty data, API errors)
- ✅ Downloads CSV file in browser
Follow the steps above to implement this in any project. Adjust the column mappings and API endpoints to match your specific data structure.