Unix Timestamps Explained: Converting Epoch Time Like a Pro

You see them in API responses, database records, log files, and JWT tokens — long numbers like 1711234567 that represent a point in time. These are Unix timestamps, and understanding how they work makes debugging time-related issues much easier.

What Is a Unix Timestamp?

A Unix timestamp (also called epoch time or POSIX time) is the number of seconds elapsed since January 1, 1970 00:00:00 UTC — a moment known as the Unix epoch.

Timestamp:  0            → Jan 1, 1970 00:00:00 UTC
Timestamp:  1000000000   → Sep 9, 2001 01:46:40 UTC
Timestamp:  1711234567   → Mar 23, 2024 22:56:07 UTC
Timestamp:  2000000000   → May 18, 2033 03:33:20 UTC

It's a single integer that uniquely identifies a moment in time, regardless of timezone. That simplicity is why it's used everywhere.

Why Use Unix Timestamps?

1. Timezone-Independent

A Unix timestamp represents an absolute point in time. 1711234567 means the same instant whether you're in New York, Tokyo, or London. The human-readable format changes with timezone, but the number doesn't.
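Here's a minimal Python sketch of the idea, using fixed UTC offsets (EDT and JST, the offsets in effect on that date) rather than named zones, so it runs without a timezone database:

```python
from datetime import datetime, timezone, timedelta

ts = 1711234567  # one absolute instant

utc   = datetime.fromtimestamp(ts, tz=timezone.utc)
nyc   = utc.astimezone(timezone(timedelta(hours=-4)))  # EDT (UTC-4)
tokyo = utc.astimezone(timezone(timedelta(hours=9)))   # JST (UTC+9)

print(utc)    # 2024-03-23 22:56:07+00:00
print(nyc)    # 2024-03-23 18:56:07-04:00
print(tokyo)  # 2024-03-24 07:56:07+09:00

# Three different clock readings, one identical instant
assert utc == nyc == tokyo
```

Comparing aware datetimes compares the underlying instant, so all three are equal even though they print differently.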

2. Easy to Compare & Calculate

Is event A before event B? Just compare two integers. How much time between them? Subtract. No date parsing, no format matching, no timezone conversion.

# Time between two events
duration_seconds = event_b_timestamp - event_a_timestamp

# Is token expired?
is_expired = current_timestamp > token_exp_timestamp

3. Compact Storage

A timestamp is a single 32-bit or 64-bit integer — 4 or 8 bytes. Compare that to storing "2024-03-23T22:56:07.000Z" as a 24-byte string.
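The size difference is easy to see in Python with the stdlib struct module (a sketch, not a storage recommendation):

```python
import struct

ts = 1711234567
packed = struct.pack(">q", ts)    # 8 bytes: signed 64-bit integer, big-endian
iso = "2024-03-23T22:56:07.000Z"  # the same instant as an ISO 8601 string

print(len(packed))  # 8
print(len(iso))     # 24

# The binary form round-trips losslessly
assert struct.unpack(">q", packed)[0] == ts
```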

4. Universal Support

Every programming language, database, and operating system supports Unix timestamps natively.

Seconds vs Milliseconds

This is the most common source of confusion. Some systems use seconds, others use milliseconds:

Seconds:      1711234567       (10 digits)
Milliseconds: 1711234567000    (13 digits)

  • Seconds: Unix/Linux, Python (time.time()), PHP, Ruby, JWT exp/iat claims
  • Milliseconds: JavaScript (Date.now()), Java (System.currentTimeMillis()), most frontend APIs

💡 Quick check: If the number has 10 digits, it's seconds. If it has 13 digits, it's milliseconds. If you get a date in 1970 when converting, you probably mixed up the two.
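The digit-count heuristic can be sketched in Python. The 10**12 threshold is an assumption that holds for timestamps between September 2001 (when values reached 10 digits) and roughly the year 2286:

```python
def to_seconds(ts: int) -> float:
    """Normalize a timestamp to seconds, guessing the unit from magnitude.

    Heuristic: values with 13+ digits (>= 10**12) are treated as milliseconds.
    """
    if abs(ts) >= 10**12:  # 13+ digits → milliseconds
        return ts / 1000
    return float(ts)

print(to_seconds(1711234567))     # 1711234567.0  (already seconds)
print(to_seconds(1711234567000))  # 1711234567.0  (was milliseconds)
```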

Converting in Common Languages

JavaScript

// Current timestamp (milliseconds)
Date.now()                    // 1711234567000

// Current timestamp (seconds)
Math.floor(Date.now() / 1000) // 1711234567

// Timestamp → Date
new Date(1711234567 * 1000)   // Sat Mar 23 2024 ...

// Date → Timestamp (seconds)
Math.floor(new Date("2024-03-23").getTime() / 1000)

Python

import time
from datetime import datetime, timezone

# Current timestamp (seconds, float)
time.time()                           # 1711234567.123

# Timestamp → datetime (UTC)
datetime.fromtimestamp(1711234567, tz=timezone.utc)

# datetime → timestamp
datetime(2024, 3, 23, tzinfo=timezone.utc).timestamp()

Shell (Bash)

# Current timestamp
date +%s                              # 1711234567

# Timestamp → human-readable (GNU date)
date -d @1711234567                   # Sat Mar 23 22:56:07 UTC 2024

# Timestamp → human-readable (macOS)
date -r 1711234567

SQL (PostgreSQL)

-- Current timestamp
SELECT EXTRACT(EPOCH FROM NOW());

-- Timestamp → date
SELECT TO_TIMESTAMP(1711234567);

-- Date → timestamp
SELECT EXTRACT(EPOCH FROM '2024-03-23'::timestamp);

The Year 2038 Problem

Systems storing Unix timestamps as 32-bit signed integers will overflow just after January 19, 2038 at 03:14:07 UTC — the last second a 32-bit value can represent. One second later, the timestamp wraps to a large negative number, interpreted as December 1901.

Max 32-bit signed: 2,147,483,647 → Jan 19, 2038 03:14:07 UTC
Next second:       -2,147,483,648 → Dec 13, 1901 20:45:52 UTC

Most modern systems use 64-bit timestamps, which won't overflow for another 292 billion years. But embedded systems, old databases, and legacy code may still use 32-bit values. If you're building something that stores dates beyond 2038, verify your storage uses 64-bit integers.
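You can reproduce the overflow boundary in Python with the stdlib struct module, which enforces fixed-width integer limits (a demonstration sketch, not production code):

```python
import struct

MAX_32BIT = 2**31 - 1  # 2,147,483,647 → Jan 19, 2038 03:14:07 UTC

# A 32-bit signed field holds the maximum value...
struct.pack("<i", MAX_32BIT)

# ...but one second later no longer fits.
try:
    struct.pack("<i", MAX_32BIT + 1)
    overflowed = False
except struct.error:
    overflowed = True
print(overflowed)  # True

# The same value is no problem for a 64-bit field.
struct.pack("<q", MAX_32BIT + 1)
```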

Common Gotchas

1. Timezone Confusion

Unix timestamps are always UTC. If you create a Date object in JavaScript, it displays in local time by default. Use .toUTCString() or .toISOString() to see UTC.
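Python has the same trap: datetime.fromtimestamp() without a tz argument returns local time as a naive datetime. A quick comparison:

```python
from datetime import datetime, timezone

ts = 1711234567

local_dt = datetime.fromtimestamp(ts)                   # local time, naive
utc_dt   = datetime.fromtimestamp(ts, tz=timezone.utc)  # what the timestamp means

print(local_dt)  # varies by machine
print(utc_dt)    # 2024-03-23 22:56:07+00:00
```

Note that datetime.utcfromtimestamp() is deprecated as of Python 3.12; pass tz=timezone.utc instead.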

2. Seconds vs Milliseconds (Again)

If your converted date shows January 1, 1970, you passed seconds to a function expecting milliseconds (or vice versa). Multiply by 1000 or divide by 1000.

3. Leap Seconds

Unix time does not count leap seconds. A Unix day is always exactly 86,400 seconds. In practice, this means UTC clocks occasionally disagree with Unix time by a second or two, but it rarely matters outside of high-precision scientific applications.

4. Negative Timestamps

Dates before January 1, 1970 have negative timestamps. -86400 is December 31, 1969. Not all systems handle negative timestamps correctly.
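For example, Python's fromtimestamp() can raise OSError for negative values on some platforms (notably Windows). One portable workaround is plain timedelta arithmetic from the epoch — a sketch:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def from_timestamp(ts: int) -> datetime:
    # timedelta arithmetic handles negative values on every platform
    return EPOCH + timedelta(seconds=ts)

print(from_timestamp(-86400))  # 1969-12-31 00:00:00+00:00
print(from_timestamp(0))       # 1970-01-01 00:00:00+00:00
```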

Try It Yourself

Convert any timestamp to a human-readable date (or vice versa) with the Timestamp Converter — all processing happens in your browser.

Need to convert a timestamp?

Open Timestamp Converter →