
Security

Securing file uploads with Resumable.js: authentication, authorization, malware scanning, and input sanitization.

Guides · Updated 2025-11-08

File upload endpoints are among the highest-risk surfaces in any web application. They accept arbitrary binary data from untrusted users, write it to your infrastructure, and often trigger downstream processing pipelines. Every year, file upload vulnerabilities appear in major breach reports—path traversal, malware delivery, storage exhaustion, and authentication bypass are all well-documented attack vectors. This guide covers authentication and authorization for Resumable.js uploads, CSRF protection, server-side metadata validation, filename sanitization, malware scanning, rate limiting, storage quotas, and signed upload URLs. For the full set of Resumable.js patterns, see the guides hub.

Layered security diagram showing authentication and validation steps

Authentication: Tokens in Headers

Most upload endpoints sit behind authentication. Resumable.js supports custom headers on every request via the headers configuration:

const r = new Resumable({
  target: '/api/upload',
  headers: {
    'Authorization': `Bearer ${getAccessToken()}`,
  },
});

Every GET (test) and POST (upload) request includes these headers. For token refresh scenarios—where the access token might expire during a long upload—use a function instead of a static object:

const r = new Resumable({
  target: '/api/upload',
  headers: () => ({
    'Authorization': `Bearer ${getAccessToken()}`,
  }),
});

The function runs before each request, ensuring the latest token is always sent. This is essential for OAuth flows with short-lived tokens: with a 10-minute token lifetime and a 30-minute upload, chunks will start failing partway through unless the headers are refreshed dynamically.

Token scope

Apply the principle of least privilege. If your auth system supports scoped tokens, issue upload-specific tokens that can only write to the upload endpoint—not read data, not delete files, not access other APIs. A compromised upload token should be a contained incident, not a skeleton key.
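As a sketch of what least privilege can look like at the route level, assuming your auth middleware has already verified the token and attached its payload to the request (the `scope` claim name, the `upload:write` value, and `req.tokenPayload` are all assumptions; match them to what your auth server actually issues):

```javascript
// Hypothetical route guard: only tokens carrying an upload-specific
// scope reach the upload handler. Space-separated scope strings follow
// the common OAuth convention.
function hasUploadScope(tokenPayload) {
  const scopes = (tokenPayload.scope || '').split(' ');
  return scopes.includes('upload:write');
}

function requireUploadScope(req, res, next) {
  if (!req.tokenPayload || !hasUploadScope(req.tokenPayload)) {
    return res.status(403).json({ error: 'INSUFFICIENT_SCOPE' });
  }
  next();
}
```

Wired up as `app.post('/api/upload', requireUploadScope, handler)`, a leaked upload token is useless against read or delete endpoints.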

CSRF Protection

Cross-Site Request Forgery is particularly dangerous for upload endpoints. An attacker could embed a form on a malicious site that submits files to your upload endpoint using the victim's session cookies.

Include a CSRF token in Resumable.js headers:

const r = new Resumable({
  target: '/api/upload',
  headers: {
    'X-CSRF-Token': document.querySelector('meta[name="csrf-token"]').content,
  },
});

Your server validates this token on every request. Most web frameworks (Express with csurf, Django, Rails, Laravel) generate and verify CSRF tokens automatically—just make sure your chunked upload route isn't accidentally excluded from CSRF middleware.

A subtlety: if your CSRF token rotates per request (as some frameworks do), it might change between chunks. Use per-session tokens rather than per-request tokens for chunked uploads, or refresh the token from a cookie that your CSRF middleware updates.
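One way to implement the cookie-refresh approach is a small cookie parser combined with the function form of headers, so each chunk re-reads the current token. The cookie name XSRF-TOKEN is an assumption (Laravel and Angular use it by convention); adjust it to your framework:

```javascript
// Pull a named cookie value out of a raw cookie string.
function readCookie(cookieString, name) {
  const match = cookieString
    .split(';')
    .map((part) => part.trim())
    .find((part) => part.startsWith(`${name}=`));
  return match ? decodeURIComponent(match.slice(name.length + 1)) : null;
}

// Browser usage: headers-as-a-function re-reads the cookie per request,
// so a token the server rotated mid-upload is picked up automatically.
// const r = new Resumable({
//   target: '/api/upload',
//   headers: () => ({
//     'X-CSRF-Token': readCookie(document.cookie, 'XSRF-TOKEN'),
//   }),
// });
```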

Validating Chunk Metadata Server-Side

Never trust client-reported metadata. Resumable.js sends resumableChunkNumber, resumableChunkSize, resumableTotalSize, and resumableTotalChunks with every request. All of these can be tampered with.

Your server should validate:

function validateChunkMetadata(body) {
  const chunkNumber = parseInt(body.resumableChunkNumber, 10);
  const totalChunks = parseInt(body.resumableTotalChunks, 10);
  const chunkSize = parseInt(body.resumableChunkSize, 10);
  const totalSize = parseInt(body.resumableTotalSize, 10);

  // Reject non-numeric values first: NaN compares false against
  // everything, so the range checks below would never catch it
  if (![chunkNumber, totalChunks, chunkSize, totalSize].every(Number.isInteger)) {
    return false;
  }

  // Chunk number must be within expected range
  if (chunkNumber < 1 || chunkNumber > totalChunks) return false;

  // Total size must be within your limits
  if (totalSize > MAX_FILE_SIZE || totalSize < 1) return false;

  // Total chunks must be consistent with total size and chunk size
  const expectedChunks = Math.ceil(totalSize / chunkSize);
  if (totalChunks !== expectedChunks) return false;

  // Chunk size must match your configuration
  if (chunkSize !== EXPECTED_CHUNK_SIZE) return false;

  return true;
}

Why does this matter? Without validation, an attacker could claim a file has 1 chunk but send 10,000, exhausting disk space. Or claim a total size of 100 bytes while uploading gigabytes. The metadata should be internally consistent and match your server's expectations.

Preventing Path Traversal in Filenames

The resumableFilename parameter comes directly from the user's file system. On most operating systems, filenames can contain characters that create security issues on your server:

  • ../../etc/passwd — classic path traversal
  • file\x00.png — null byte injection
  • CON, PRN, AUX — reserved names on Windows
  • Extremely long filenames that overflow buffers

Never use resumableFilename directly in file paths. Sanitize aggressively:

function sanitizeFilename(filename) {
  // Remove path separators
  let safe = filename.replace(/[/\\]/g, '');

  // Remove null bytes
  safe = safe.replace(/\0/g, '');

  // Strip leading dots (hidden files, directory traversal)
  safe = safe.replace(/^\.+/, '');

  // Neutralize Windows reserved device names (CON, PRN, AUX, NUL, COM1-9, LPT1-9)
  if (/^(CON|PRN|AUX|NUL|COM[1-9]|LPT[1-9])(\..*)?$/i.test(safe)) {
    safe = `_${safe}`;
  }

  // Limit length
  safe = safe.substring(0, 200);

  // Fallback if empty
  if (!safe || safe.trim() === '') {
    safe = 'unnamed_upload';
  }

  return safe;
}

Better yet, don't use the original filename for storage at all. Generate a UUID or hash-based name on the server and store the original filename in metadata (a database record). This eliminates the entire class of path traversal attacks.

const storageKey = `${crypto.randomUUID()}${path.extname(sanitizedFilename)}`;

Malware Scanning Post-Assembly

Scanning individual chunks for malware is unreliable—a virus signature might span a chunk boundary. Scan the fully assembled file instead.

Integration approaches

Synchronous scanning — After assembly, run the file through ClamAV or a commercial scanner before making it available. The user waits for the scan to complete. Simple, but adds latency to the upload completion.

const fs = require('fs');
const NodeClam = require('clamscan');
const clam = await new NodeClam().init({
  clamdscan: { host: '127.0.0.1', port: 3310 },
});

async function scanFile(filePath) {
  const { isInfected, viruses } = await clam.scanFile(filePath);
  if (isInfected) {
    fs.unlinkSync(filePath); // delete immediately
    throw new Error(`Malware detected: ${viruses.join(', ')}`);
  }
}

Asynchronous scanning — Mark the file as "pending review" immediately after assembly. A background worker scans it and updates the status. The user gets a fast completion response, but the file isn't accessible until cleared. This is better for user experience and lets you use heavier scanning tools without blocking the upload response.
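The asynchronous flow reduces to a small state machine: files land in a "pending" status and only become "available" once a scan clears them. A minimal sketch, with an in-memory Map standing in for your database and the scanner passed in as a stub:

```javascript
// id -> { path, status }; statuses: pending -> available | rejected
const files = new Map();

function registerUpload(id, path) {
  files.set(id, { path, status: 'pending' }); // not downloadable yet
}

// Background worker: scan one pending file and update its status.
// scanFile is whatever scanner integration you use (e.g. clamscan).
async function scanWorker(id, scanFile) {
  const file = files.get(id);
  const { isInfected } = await scanFile(file.path);
  file.status = isInfected ? 'rejected' : 'available';
  return file.status;
}
```

Download routes then serve only files whose status is `available`, which is what keeps unscanned content out of reach.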

Cloud-based scanning — Services like AWS Macie, Google Cloud DLP, or dedicated file scanning APIs handle scanning in the cloud. Upload the assembled file to a quarantine bucket, trigger the scan, and move it to the production bucket only after it passes.

Rate Limiting Per User

Without rate limits, a single user (or bot) can monopolize your upload infrastructure. Implement limits at multiple levels:

  • Requests per minute: Cap the number of chunk uploads per user per minute. Resumable.js with 3 simultaneous uploads and 2MB chunks generates roughly 90 requests per minute on a fast connection. Set your limit above this but below abuse thresholds—say, 200 requests per minute per user.
  • Bytes per hour: Cap total data volume per user. If your service allows 500 MB uploads, maybe a user can upload 5 GB per hour total.
  • Concurrent uploads: Limit how many files a single user can upload simultaneously.

const rateLimit = require('express-rate-limit');

const uploadLimiter = rateLimit({
  windowMs: 60 * 1000,
  max: 200,
  keyGenerator: (req) => req.user?.id || req.ip,
  message: { error: 'Upload rate limit exceeded. Try again shortly.' },
});

app.use('/api/upload', uploadLimiter);

Return 429 Too Many Requests when limits are exceeded. Include 429 in Resumable.js's permanentErrors if you want the upload to halt, or leave it out to let retries handle brief limit bursts.
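In configuration, the choice looks like this (the base status list here is illustrative; check the defaults shipped with your Resumable.js version):

```javascript
const r = new Resumable({
  target: '/api/upload',
  maxChunkRetries: 5,
  // With 429 listed, a rate-limited chunk fails the upload immediately.
  // Remove 429 from this list to let maxChunkRetries absorb brief bursts.
  permanentErrors: [400, 404, 415, 429, 500, 501],
});
```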

Storage Quotas

Rate limiting controls speed. Quotas control total consumption. Track cumulative storage per user and reject uploads that would exceed the quota:

app.post('/api/upload', async (req, res) => {
  const userStorage = await getUserStorageUsed(req.user.id);
  const incomingTotal = parseInt(req.body.resumableTotalSize, 10);

  if (userStorage + incomingTotal > USER_QUOTA) {
    return res.status(413).json({
      error: 'QUOTA_EXCEEDED',
      message: `Storage quota exceeded. ${formatBytes(USER_QUOTA - userStorage)} remaining.`,
    });
  }

  // ... handle upload
});

Check on the first chunk rather than every chunk. Store the pending upload's declared size when it starts, and deduct it from available quota. If the upload completes, update the actual usage. If it's abandoned, a cleanup job reclaims the reserved space.
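The reserve/commit/reclaim lifecycle can be sketched with an in-memory ledger standing in for your database: reserve() runs on the first chunk, commit() after assembly, and reclaim() in the cleanup job for abandoned uploads. All names here are illustrative:

```javascript
const quota = { used: 0, reserved: new Map() }; // uploadId -> declared bytes

// Reject the upload up front if committed + pending + incoming > limit.
function reserve(uploadId, bytes, limit) {
  const pending = [...quota.reserved.values()].reduce((a, b) => a + b, 0);
  if (quota.used + pending + bytes > limit) return false;
  quota.reserved.set(uploadId, bytes);
  return true;
}

// Upload assembled: convert the reservation into actual usage.
function commit(uploadId, actualBytes) {
  quota.reserved.delete(uploadId);
  quota.used += actualBytes;
}

// Upload abandoned: free the reserved space.
function reclaim(uploadId) {
  quota.reserved.delete(uploadId);
}
```

Counting pending reservations in reserve() is what stops a user from starting ten parallel uploads that each fit the quota individually but blow past it together.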

Signed Upload URLs

For architectures where the upload endpoint is separate from the authentication server, signed URLs provide a secure handoff:

  1. The authenticated client requests an upload session from your API.
  2. Your API generates a time-limited, signed URL with embedded permissions.
  3. The client configures Resumable.js to upload to the signed URL.
  4. The upload server validates the signature without needing access to your auth database.

// Server: generate signed upload URL
const token = jwt.sign(
  { userId: user.id, maxSize: 500 * 1024 * 1024 },
  SIGNING_SECRET,
  { expiresIn: '1h' } // standard exp claim; jwt.verify() rejects it after expiry
);
const uploadUrl = `/api/upload?token=${token}`;

The upload server verifies the JWT signature, checks expiration, and enforces the embedded constraints (max size, allowed file types). This decouples authentication from upload handling—useful when your upload servers are separate infrastructure that shouldn't have database access.

Content Type Verification

Don't trust resumableType. As covered in the file validation guide, MIME types reported by the browser are unreliable. After file assembly, verify the actual content:

// file-type ships as ESM only from v17, so load it with a dynamic import
const { fileTypeFromFile } = await import('file-type');

const detected = await fileTypeFromFile(assembledFilePath);
if (!ALLOWED_TYPES.includes(detected?.mime)) {
  fs.unlinkSync(assembledFilePath);
  throw new Error(`Unexpected file type: ${detected?.mime}`);
}

This catches renamed executables, polyglot files, and content that doesn't match its extension. Combined with malware scanning, it forms a robust post-upload verification pipeline.

Security is never a single check—it's layers. Client-side validation improves user experience. Server-side validation enforces policy. Authentication controls access. Rate limiting prevents abuse. Malware scanning catches threats. Each layer catches what the others miss. Build all of them, because attackers only need to find one gap.