Dev Tools · March 30, 2026

Epoch Timestamp Converter Guide: Unix Time, Formats & Examples

By The hakaru Team · Last updated March 2026

Quick Answer

• A Unix epoch timestamp counts seconds since January 1, 1970, 00:00:00 UTC.
• 10-digit timestamps are in seconds; 13-digit timestamps are in milliseconds.
• JavaScript: Date.now() returns milliseconds. Python: time.time() returns seconds.
• The Year 2038 problem affects 32-bit systems; most modern platforms have moved to 64-bit.

What Is Unix Epoch Time?

Unix epoch time (also called POSIX time or Unix time) is a system for tracking time as a running count of seconds. The count starts at January 1, 1970, 00:00:00 UTC, a date known as the Unix epoch.

Right now, the epoch timestamp is somewhere around 1.78 billion. That single integer tells you exactly when something happened, anywhere in the world, with no ambiguity about time zones or date formats.

The concept was introduced in the early 1970s with the Unix operating system at Bell Labs. According to the Open Group Base Specifications (IEEE Std 1003.1), epoch time deliberately excludes leap seconds — each day is treated as exactly 86,400 seconds.

Seconds vs Milliseconds vs Microseconds

Different systems use different precisions. Here is how to tell them apart:

| Precision | Digits | Example | Used By |
|---|---|---|---|
| Seconds | 10 | 1700000000 | Unix/Linux, Python time.time(), PHP, Ruby |
| Milliseconds | 13 | 1700000000000 | JavaScript, Java, Kafka, Elasticsearch |
| Microseconds | 16 | 1700000000000000 | PostgreSQL, Go time.UnixMicro() |
| Nanoseconds | 19 | 1700000000000000000 | Go time.UnixNano(), InfluxDB |

A common bug: feeding a millisecond timestamp (13 digits) into a function expecting seconds produces a date thousands of years in the future. Always check your digit count. If you see 13 digits, divide by 1,000 before treating it as seconds.
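The digit-count check can be sketched in Python. The `normalize_to_seconds` helper below is hypothetical, and the thresholds only map cleanly to precisions for dates between roughly 2001 and 2286, where the digit counts in the table above hold:

```python
def normalize_to_seconds(ts: int) -> float:
    """Guess a timestamp's precision from its digit count and
    normalize to seconds. Heuristic only: digit counts map cleanly
    to precisions for dates between roughly 2001 and 2286."""
    digits = len(str(abs(ts)))
    if digits <= 10:                    # seconds
        return float(ts)
    if digits <= 13:                    # milliseconds
        return ts / 1_000
    if digits <= 16:                    # microseconds
        return ts / 1_000_000
    return ts / 1_000_000_000           # nanoseconds

print(normalize_to_seconds(1700000000))     # 1700000000.0
print(normalize_to_seconds(1700000000000))  # 1700000000.0
```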

Converting Epoch Timestamps by Language

JavaScript

JavaScript's Date object works in milliseconds internally. Date.now() returns the current time as milliseconds since epoch. To create a Date from a seconds-based timestamp, multiply by 1,000: new Date(1700000000 * 1000). To get seconds from a Date: Math.floor(date.getTime() / 1000).

Python

The time module returns seconds as a float: time.time() gives something like 1700000000.123456. For conversion: datetime.fromtimestamp(1700000000, tz=timezone.utc). The tz parameter is critical: without it, Python uses the local timezone, which leads to subtle bugs in server environments.
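Putting those pieces together, a short round-trip sketch:

```python
from datetime import datetime, timezone

ts = 1700000000

# Epoch seconds -> aware datetime. Always pass tz=timezone.utc;
# omitting it silently uses the machine's local timezone.
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00

# Aware datetime -> epoch seconds.
print(int(dt.timestamp()))  # 1700000000
```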

SQL (PostgreSQL)

PostgreSQL stores timestamps with microsecond precision. To convert epoch seconds to a timestamp: TO_TIMESTAMP(1700000000). To extract epoch from a timestamp column: EXTRACT(EPOCH FROM created_at). MySQL uses FROM_UNIXTIME() instead.

Go

Go's time.Now().Unix() returns seconds, time.Now().UnixMilli() returns milliseconds. To parse: time.Unix(1700000000, 0). Go's time package is timezone-aware by default and handles DST correctly.

Bash / Command Line

Current epoch: date +%s. Convert epoch to readable date on Linux: date -d @1700000000. On macOS: date -r 1700000000. The difference between Linux and macOS flags trips up many developers.

ISO 8601 vs Epoch: When to Use Which

ISO 8601 dates (like 2023-11-14T22:13:20Z) are human-readable but come with parsing overhead. Epoch timestamps are compact and fast to compare. Here is a practical guide:

| Use Case | Preferred Format | Reason |
|---|---|---|
| Database storage | Native timestamp type | Let the DB handle timezone math |
| API responses | ISO 8601 with timezone | Human-readable, widely supported |
| Log files | ISO 8601 or epoch | ISO for readability, epoch for fast sorting |
| JWT tokens | Epoch seconds | RFC 7519 mandates NumericDate (epoch) |
| Cron jobs / scheduling | Epoch seconds | No timezone ambiguity |
| User-facing display | Localized date string | Users don't read epoch |

According to the 2024 Postman State of the API Report, 73% of REST APIs use ISO 8601 for date/time fields, while 22% use epoch timestamps. The remaining 5% use other formats. JWT, gRPC, and event streaming systems overwhelmingly prefer epoch.
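Converting between the two formats is cheap in most languages. A minimal Python sketch:

```python
from datetime import datetime, timezone

# Epoch seconds -> ISO 8601 with a trailing Z.
iso = datetime.fromtimestamp(1700000000, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
print(iso)  # 2023-11-14T22:13:20Z

# ISO 8601 -> epoch seconds. fromisoformat() accepts the "Z" suffix
# only on Python 3.11+; the "+00:00" form works on older versions too.
back = int(datetime.fromisoformat("2023-11-14T22:13:20+00:00").timestamp())
print(back)  # 1700000000
```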

The Year 2038 Problem

On January 19, 2038 at 03:14:07 UTC, 32-bit signed integer timestamps hit their maximum value: 2,147,483,647. One second later, the value overflows and wraps to a negative number, which systems interpret as December 13, 1901.

This is analogous to the Y2K bug but affects a different layer of the stack. Linux kernels since version 5.6 (March 2020) use 64-bit timestamps internally. However, the Linux Foundation's 2023 audit found that roughly 15% of embedded systems still rely on 32-bit time — particularly in IoT devices, automotive firmware, and legacy industrial controls.

64-bit timestamps won't overflow for another 292 billion years, well past the expected lifetime of the Sun.
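The wraparound is easy to reproduce with plain arithmetic. Python integers don't overflow, so the sketch below simulates 32-bit signed wrap explicitly (the `wrap_int32` helper is for illustration):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647

def wrap_int32(n: int) -> int:
    """Simulate signed 32-bit integer overflow."""
    return (n + 2**31) % 2**32 - 2**31

# The last second a 32-bit timestamp can represent.
last_good = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(last_good)  # 2038-01-19 03:14:07+00:00

# One second later, the counter wraps negative.
wrapped = wrap_int32(INT32_MAX + 1)
print(wrapped)                                           # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```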

Common Pitfalls

Timezone Confusion

Epoch timestamps are always UTC. But converting to a local time without specifying the target timezone leads to incorrect results. A server in US-East and one in UTC will produce different "local" dates from the same timestamp. Always store and transmit in UTC, convert to local only at the display layer.
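The display-layer conversion can be done with the standard-library zoneinfo module (Python 3.9+, on a system with tzdata installed); America/New_York below is just an example zone:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1700000000  # one instant, stored and transmitted as UTC

utc_view = datetime.fromtimestamp(ts, tz=timezone.utc)
local_view = utc_view.astimezone(ZoneInfo("America/New_York"))

print(utc_view)    # 2023-11-14 22:13:20+00:00
print(local_view)  # 2023-11-14 17:13:20-05:00
# Same timestamp, different wall-clock views -- convert only for display.
```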

Leap Seconds

Unix time ignores leap seconds by design. UTC has added 27 leap seconds since 1972 (the most recent in December 2016), so a count of elapsed Unix seconds understates the true elapsed atomic time by 27 seconds. For most applications this is irrelevant, but high-frequency trading and satellite systems must account for it separately.

Negative Timestamps

Dates before January 1, 1970 are represented as negative epoch values. December 31, 1969 23:59:59 UTC is timestamp -1. Some systems and languages don't handle negative timestamps gracefully — always test edge cases if your data includes pre-1970 dates.
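A quick sketch of the edge case. Note that datetime.fromtimestamp raises OSError for negative values on Windows, so this assumes a Unix-like platform:

```python
from datetime import datetime, timezone

# Timestamp -1 is one second before the epoch.
print(datetime.fromtimestamp(-1, tz=timezone.utc))  # 1969-12-31 23:59:59+00:00

# A pre-1970 date maps to a negative timestamp:
# the Apollo 11 landing, July 20, 1969, 20:17 UTC.
apollo = datetime(1969, 7, 20, 20, 17, tzinfo=timezone.utc)
print(int(apollo.timestamp()))  # -14182980
```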

Convert timestamps instantly

Use our free Epoch Timestamp Converter →

Frequently Asked Questions

What is a Unix epoch timestamp?

A Unix epoch timestamp is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC (the Unix epoch). For example, the timestamp 1700000000 represents November 14, 2023, at 10:13:20 PM UTC. It is a single integer that unambiguously represents a moment in time, independent of time zones.

What is the difference between seconds and milliseconds in epoch time?

Unix timestamps in seconds are 10 digits long (e.g., 1700000000), while millisecond timestamps are 13 digits long (e.g., 1700000000000). JavaScript's Date.now() returns milliseconds. Python's time.time() returns seconds as a float. To convert milliseconds to seconds, divide by 1,000.

What is the Year 2038 problem?

The Year 2038 problem occurs on January 19, 2038 at 03:14:07 UTC, when 32-bit signed integer timestamps overflow. The maximum value a signed 32-bit integer can hold is 2,147,483,647. Systems still using 32-bit time will wrap around to negative values. Most modern systems now use 64-bit timestamps, which won't overflow for another 292 billion years.

How do I get the current epoch timestamp?

In JavaScript: Math.floor(Date.now() / 1000) for seconds, or Date.now() for milliseconds. In Python: import time; int(time.time()). In Bash: date +%s. In SQL (PostgreSQL): SELECT EXTRACT(EPOCH FROM NOW())::INTEGER.

Why do developers use epoch timestamps instead of date strings?

Epoch timestamps are timezone-independent, sort naturally as integers, require no parsing, take up less storage space (4 or 8 bytes vs variable-length strings), and make arithmetic trivial — subtracting two timestamps gives you the duration in seconds. They eliminate ambiguity from date format differences like MM/DD/YYYY vs DD/MM/YYYY.
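The arithmetic and sorting points are easy to see in a few lines of Python:

```python
start = 1700000000
end = 1700003600

# Subtracting two epoch timestamps gives the duration directly,
# in seconds; no parsing, no timezone handling.
duration = end - start
print(duration)       # 3600
print(duration / 60)  # 60.0 (minutes)

# Sorting is plain integer comparison.
events = [1700003600, 1700000000, 1700001800]
print(sorted(events))  # [1700000000, 1700001800, 1700003600]
```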