Unix Time Converter — Epoch Timestamp

Free Unix time converter. Convert between Unix/Epoch timestamps and human-readable dates instantly. Live clock, millisecond support, and code snippets.

Current Unix Timestamp

Convert

How to Use

Choose a mode: "Unix → Date" to convert a timestamp to a human date, or "Date → Unix" to convert a date to a timestamp.

Enter a Unix timestamp (seconds or milliseconds — auto-detected) or pick a date, time, and timezone.

Click Convert to see the result with UTC date, ISO 8601 format, day of week, and code snippets in 5 languages.

What is a Unix Timestamp?

A Unix timestamp is the number of seconds since January 1, 1970, 00:00:00 UTC (the "Unix Epoch"). It is used by all major operating systems and programming languages to store and compare dates. For example, timestamp 1700000000 = November 14, 2023. The current timestamp updates every second and is shown in the live clock above.

Complete Guide to Unix Time

How Unix Time Works

Unix time is a simple, universal way to represent any point in time as a single number. Instead of dealing with years, months, days, hours, minutes, and timezones separately, everything is reduced to one integer: the number of seconds since the epoch (January 1, 1970 UTC). This makes time comparison trivial — if timestamp A is greater than B, then A is later. It also eliminates timezone confusion since Unix time is always in UTC.
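The "greater means later" property is plain integer comparison. A minimal Python sketch, using hypothetical timestamp values:

```python
# Two points in time as plain integers (hypothetical example values)
start = 1700000000   # 2023-11-14 22:13:20 UTC
end = 1700003600     # exactly one hour later

# "Later than" is just integer comparison -- no calendar or timezone logic
print(end > start)   # True
print(end - start)   # elapsed seconds: 3600
```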

How to Convert Unix Timestamp to Date

In every major programming language, conversion is built in:

JavaScript: new Date(timestamp * 1000) — JavaScript uses milliseconds, so multiply by 1000.

Python: datetime.fromtimestamp(ts, tz=timezone.utc) — returns a timezone-aware datetime in UTC (the older datetime.utcfromtimestamp is deprecated since Python 3.12).

PHP: date('Y-m-d H:i:s', $ts) — formats the timestamp to your desired format.

SQL: FROM_UNIXTIME(ts) in MySQL, or to_timestamp(ts) in PostgreSQL.
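Putting the Python variant above into a runnable form:

```python
from datetime import datetime, timezone

ts = 1700000000  # example timestamp from the text above

# Convert to a timezone-aware datetime in UTC
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00
```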

Seconds vs Milliseconds

Most systems use seconds (10 digits, e.g. 1700000000). JavaScript and Java use milliseconds (13 digits, e.g. 1700000000000). Our converter auto-detects: if you enter 13+ digits, it divides by 1000 automatically. When working with APIs, always check their documentation to know which format they expect.
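The auto-detection heuristic can be sketched as follows; `normalize` is a hypothetical helper illustrating the idea, not this tool's actual source:

```python
def normalize(ts: int) -> int:
    """Treat 13-or-more-digit values as milliseconds; return seconds."""
    if abs(ts) >= 10**12:  # 10**12 is the smallest 13-digit number
        return ts // 1000
    return ts

print(normalize(1700000000))     # 1700000000 (already seconds)
print(normalize(1700000000000))  # 1700000000 (milliseconds -> seconds)
```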

The Year 2038 Problem

Systems using a 32-bit signed integer for Unix time will overflow on January 19, 2038 at 03:14:07 UTC — the maximum value is 2,147,483,647 seconds. After this, the timestamp wraps to a negative number representing December 13, 1901. Modern 64-bit systems store timestamps as 64-bit integers, supporting dates up to 292 billion years in the future. Most modern software has already migrated, but legacy embedded systems (ATMs, IoT devices, older databases) may still be affected.
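The overflow can be simulated in Python by forcing the counter into a 32-bit signed integer. (Converting the negative result back to a date may raise OSError on some platforms, such as Windows.)

```python
import ctypes
from datetime import datetime, timezone

INT32_MAX = 2_147_483_647
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, a 32-bit signed counter wraps around:
wrapped = ctypes.c_int32(INT32_MAX + 1).value
print(wrapped)  # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```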

Unix Time and Timezones

Unix timestamps have no timezone — they always represent UTC. When you display a timestamp to a user, you convert it to their local timezone. This is why Unix time is ideal for storing dates in databases: store once in UTC, display in any timezone. The formula is: local time = UTC time + timezone offset. Our "Date → Unix" converter lets you specify a timezone so the conversion is accurate.
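The store-in-UTC, display-anywhere pattern looks like this in Python (assumes the zoneinfo timezone database is available, as it is on most systems):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1700000000  # stored once, timezone-free

# Render the same instant in any timezone on demand
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
tokyo = datetime.fromtimestamp(ts, tz=ZoneInfo("Asia/Tokyo"))

print(utc)    # 2023-11-14 22:13:20+00:00
print(tokyo)  # 2023-11-15 07:13:20+09:00
```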

Notable Unix Timestamps

0 — January 1, 1970 00:00:00 UTC (the Epoch)

1000000000 — September 9, 2001 01:46:40 UTC (the billennium)

1234567890 — February 13, 2009 23:31:30 UTC

2000000000 — May 18, 2033 03:33:20 UTC

2147483647 — January 19, 2038 03:14:07 UTC (32-bit max — Y2K38)
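Each entry above can be checked in a few lines of Python:

```python
from datetime import datetime, timezone

# The notable timestamps listed above
notable = [0, 1_000_000_000, 1_234_567_890, 2_000_000_000, 2_147_483_647]
for ts in notable:
    print(ts, datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
```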

Frequently Asked Questions

What is Unix time?
Unix time (also called Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC. It is used by virtually all computers and programming languages as a standard way to track time. For example, Unix timestamp 1700000000 represents November 14, 2023 at 10:13:20 PM UTC.
Why does Unix time start on January 1, 1970?
January 1, 1970 was chosen as the "epoch" (starting point) by the creators of the Unix operating system at Bell Labs. It was a convenient recent date at the time of development. The choice was arbitrary but became the universal standard adopted by Linux, macOS, Windows, and nearly all programming languages.
What is the current Unix timestamp?
The current Unix timestamp is shown in the live clock at the top of this page. It increases by 1 every second. You can also get it programmatically: in JavaScript use Math.floor(Date.now()/1000), in Python use time.time(), in PHP use time(), and in Bash use date +%s.
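In Python, for instance:

```python
import time

# Whole seconds since the epoch, equivalent to `date +%s` in Bash
now = int(time.time())
print(now)
```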
How do I convert a Unix timestamp to a date?
Enter the Unix timestamp in the converter above and it will instantly show the corresponding date and time in both UTC and your local timezone. You can also do it in code: JavaScript new Date(timestamp * 1000), Python datetime.fromtimestamp(timestamp), PHP date("Y-m-d", $ts).
How do I convert a date to a Unix timestamp?
Use the "Date to Unix" tab above — select the date and time, and the tool will calculate the Unix timestamp. In code: JavaScript Math.floor(new Date("2024-01-01").getTime()/1000), Python int(datetime(2024,1,1).timestamp()), PHP strtotime("2024-01-01").
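One caution on the Python snippet above: a naive datetime(2024,1,1) is interpreted in the machine's local timezone, so the result varies by machine. Attaching an explicit UTC timezone makes it reproducible:

```python
from datetime import datetime, timezone

# Midnight UTC on 2024-01-01, pinned to UTC so every machine agrees
dt = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(int(dt.timestamp()))  # 1704067200
```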
What is the Year 2038 problem?
Many older systems store Unix time as a signed 32-bit integer, which can hold a maximum value of 2,147,483,647. This corresponds to January 19, 2038 at 03:14:07 UTC. After that, the counter overflows and wraps to a negative number (December 13, 1901). Modern 64-bit systems are not affected — their counters cover roughly 292 billion years.
What is the difference between Unix time and UTC?
Unix time is a single number counting seconds since the epoch — it has no timezone. UTC (Coordinated Universal Time) is the world standard time zone. Unix time always represents UTC. When you convert to your local time, you add or subtract your timezone offset from the UTC result.
Do Unix timestamps include leap seconds?
No. Unix time does not count leap seconds. Each day is assumed to have exactly 86,400 seconds. When a leap second occurs, Unix time either repeats a second or skips one, depending on the system. This means Unix time is not perfectly in sync with astronomical time, but the difference is negligible for most applications.
Can Unix timestamps be negative?
Yes. Negative Unix timestamps represent dates before January 1, 1970. For example, -86400 represents December 31, 1969. This works on most modern systems, though some older 32-bit systems may not support negative timestamps correctly.
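For example, in Python (on platforms whose C library accepts negative timestamps; some Windows builds raise OSError here):

```python
from datetime import datetime, timezone

# One day (86,400 seconds) before the epoch
dt = datetime.fromtimestamp(-86400, tz=timezone.utc)
print(dt)  # 1969-12-31 00:00:00+00:00
```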
What is a Unix timestamp in milliseconds?
Some systems (like JavaScript Date.now()) use milliseconds instead of seconds. A millisecond Unix timestamp is 1000 times larger. To convert: divide by 1000 to get seconds. Our converter automatically detects timestamps in milliseconds (13+ digits) and converts them correctly.

Share This Tool

Found it useful? Share it with your friends, classmates, or colleagues.