Convert between timestamps and dates
A Unix timestamp (also called epoch time) is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC. It provides a universal, timezone-independent way to represent a point in time as a single integer.
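As a minimal sketch of the conversion this page describes, using Python's standard-library datetime module:

```python
from datetime import datetime, timezone

# Convert a Unix timestamp (seconds) to a timezone-aware UTC datetime
ts = 1672531200
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2023-01-01T00:00:00+00:00

# Convert the datetime back to a Unix timestamp
print(int(dt.timestamp()))  # 1672531200
```

Passing tz=timezone.utc makes the result unambiguous; without it, fromtimestamp() converts to the machine's local timezone.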
For contemporary dates, a seconds-based Unix timestamp is 10 digits long (e.g., 1672531200), while a milliseconds-based timestamp is 13 digits (e.g., 1672531200000). JavaScript uses milliseconds by default, while Python and most Unix systems use seconds.
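Mixing the two units is a common bug. One rough heuristic, valid only for contemporary dates, is to guess the unit from the magnitude; the helper below is a hypothetical illustration, not a library function:

```python
def to_seconds(ts: int) -> float:
    """Normalize a timestamp to seconds, guessing the unit by magnitude.

    Heuristic only: values of 13+ digits are assumed to be milliseconds,
    10-digit values seconds. It breaks for dates far in the past or future.
    """
    if ts >= 1_000_000_000_000:  # 13+ digits: treat as milliseconds
        return ts / 1000
    return float(ts)

print(to_seconds(1672531200))     # 1672531200.0 (already seconds)
print(to_seconds(1672531200000))  # 1672531200.0 (milliseconds converted)
```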
The Year 2038 problem occurs because 32-bit systems store Unix timestamps as signed integers, which max out at 2,147,483,647, a value reached at 03:14:07 UTC on January 19, 2038. One second later, the value overflows and wraps to a negative number. Most modern systems now use 64-bit integers to avoid this.
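You can verify the rollover moment directly. Python's integers are arbitrary-precision, so the arithmetic itself never overflows here; the demonstration just shows which instant a signed 32-bit counter can no longer represent:

```python
from datetime import datetime, timezone

# The largest value a signed 32-bit integer can hold
INT32_MAX = 2**31 - 1  # 2147483647

# Interpreted as a Unix timestamp in seconds:
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later would overflow a 32-bit time_t
print(datetime.fromtimestamp(INT32_MAX + 1, tz=timezone.utc))
# 2038-01-19 03:14:08+00:00
```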
In Python, import the time module and call time.time() to get the current timestamp as a float in seconds. For an integer, use int(time.time()); for milliseconds, use int(time.time() * 1000).
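For example:

```python
import time

print(time.time())              # current timestamp as a float, in seconds
print(int(time.time()))         # integer seconds
print(int(time.time() * 1000))  # integer milliseconds

# Python 3.7+ also offers time.time_ns(), which avoids float rounding:
print(time.time_ns() // 1_000_000)  # integer milliseconds
```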
Unix timestamps are timezone-independent, sort naturally as numbers, take up less storage space, and avoid ambiguity from different date formats. They are the standard for APIs, databases, and log files.
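The sorting claim is easy to demonstrate: plain numeric comparison puts timestamps in chronological order, while date strings in a non-ISO format sort lexicographically instead. The example values below are arbitrary:

```python
# Timestamps sort chronologically with plain numeric comparison
events = [1672531200, 1640995200, 1656633600]
print(sorted(events))  # [1640995200, 1656633600, 1672531200]

# The same instants as MM/DD/YYYY strings sort lexicographically,
# which is not chronological order
dates = ["01/01/2023", "01/01/2022", "07/01/2022"]
print(sorted(dates))   # ['01/01/2022', '01/01/2023', '07/01/2022']
```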