Unix timestamp to human-readable date in microseconds.

SageT

I need to take a bunch of Unix timestamps and convert them to dates with microsecond precision so that I can graph them.
I find that if I have a timestamp in milliseconds I can get it to work. Here is my timestamp: 1516354232225350.
This will not fit into an int, and all the functions/methods that I have seen take ints.
This is what I need for the output: "Friday, January 19, 2018 9:30:32.225 AM", or better yet all the way down to the microsecond, since this only shows milliseconds.
The above timestamp conversion was taken from Epoch Converter.

How the heck do I do this? I have tried dividing the timestamp but ended up with 1970-01-18 13:12:34.232225. I did that conversion using a Python method.
I really don't know Python, but it at least gave me an output in microseconds, just with the wrong date. The above was done by dividing: timestamp / 1000000000.
 
Use long instead of int.

Out of curiosity, as I recall Unix timestamps have a resolution of 1 second, but you are talking about microseconds. Is this a new convention?

Anyway this should get you partway there if you convert your microseconds to milliseconds first:
C#:
DateTimeOffset.FromUnixTimeMilliseconds(ts).ToString("hh:mm:ss.ffffff tt")
 
Use long instead of int.

This is what happens when I go to long:
C#:
long ts = 1516354232225350;
string TimeStamp = epoch2string(ts);

public static string epoch2string(long epoch)
{
    return new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc).AddSeconds(epoch).ToShortDateString();
}
System.ArgumentOutOfRangeException: 'Value to add was out of range.' It seems it needs an int.

The link that I gave for Epoch Converter says that it:
"Supports Unix timestamps in seconds, milliseconds, microseconds and nanoseconds."
 
I thought that you grabbed some code online which used ints and were trying to convert it over to take that 64-bit number. So it would have just been a matter of changing that code to use long instead of int (barring any overflow issues). I guess I misread your original statement and should have taken it at face value that you were actually looking at the DateTime methods.

Anyway, I gave you the path forward in post #2:
C#:
DateTimeOffset.FromUnixTimeMilliseconds(ts).ToString("hh:mm:ss.ffffff tt")

Feel free to modify the format string above to fit what you need.
 
long ts = 1516354232225350;
That's microseconds. Divide by 1000d to get milliseconds as a double, then AddMilliseconds to your 1970 date.
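In code, that suggestion looks something like this (a minimal sketch; note that AddMilliseconds rounds to the nearest whole millisecond, which comes up later in the thread):
C#:
long ts = 1516354232225350;          // microseconds
double ms = ts / 1000d;              // 1516354232225.35 milliseconds
var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
var result = epoch.AddMilliseconds(ms);

// Prints 2018-01-19 09:30:32.225000 - the 350 microseconds are rounded away.
Console.WriteLine(result.ToString("yyyy-MM-dd HH:mm:ss.ffffff"));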
 
Anyway, I gave you the path forward in post #2:
C#:
DateTimeOffset.FromUnixTimeMilliseconds(ts).ToString("hh:mm:ss.ffffff tt")
Yep, tried that two days ago; this is in reply to post #2.
System.ArgumentOutOfRangeException: 'Valid values are between -62135596800000 and 253402300799999, inclusive.
Parameter name: milliseconds'

I will try reply #3, but I need to put the microseconds back in.

I am not supposed to hand-decode the timestamp; I am supposed to use built-in methods or a library. I may have to figure out how to decode it by hand anyway and then write a DLL for it. Then I can add code to handle nanosecond timestamps, although that technically breaks the rules that I have been given for this task.
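For what it's worth, here is a minimal sketch of what such a helper could look like, using DateTime ticks (1 tick = 100 ns); the class and method names are made up for illustration:
C#:
public static class UnixTime
{
    private static readonly DateTime Epoch =
        new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

    // 1 microsecond = 10 ticks, so full precision is preserved.
    public static DateTime FromMicroseconds(long microseconds) =>
        Epoch.AddTicks(microseconds * 10);

    // 1 tick = 100 ns; nanoseconds below that resolution are truncated.
    public static DateTime FromNanoseconds(long nanoseconds) =>
        Epoch.AddTicks(nanoseconds / 100);
}

// UnixTime.FromMicroseconds(1516354232225350)
//     .ToString("dddd, MMMM d, yyyy h:mm:ss.ffffff tt")
// -> "Friday, January 19, 2018 9:30:32.225350 AM"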
 
See post #5. Take the time in microseconds, then divide by 1000 to get milliseconds. Then you can pass the milliseconds to DateTimeOffset.FromUnixTimeMilliseconds(), or use them in something like your code in post #3, but add milliseconds instead of seconds.
 
Yep, tried that two days ago.
System.ArgumentOutOfRangeException: 'Valid values are between -62135596800000 and 253402300799999, inclusive.
Parameter name: milliseconds'
That's because you were passing in microseconds instead of milliseconds, as the method name indicates.
 
C#:
var stamp = 1516354232225350;           // microseconds

var msWhole = stamp / 1000;             // whole milliseconds
var msRemainder = stamp % 1000 / 1000d; // fractional millisecond (0.35)
var d1 = DateTimeOffset.FromUnixTimeMilliseconds(msWhole).AddMilliseconds(msRemainder);

var msFraction = stamp / 1000d;         // milliseconds as a double
var d2 = new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeKind.Utc).AddMilliseconds(msFraction);

var format = "dddd d. MMMM yyyy HH.mm.ss.ffffff";
Console.WriteLine(d1.ToString(format));
Console.WriteLine(d2.ToString(format));
 
As a quick aside, adding the remainder won't quite work the way our OP wants. The values displayed still won't go down to the microsecond.

This is explained in the documentation:
The fractional part of the milliseconds parameter is the fractional part of a millisecond. For example, 4.5 is equivalent to 4 milliseconds and 5000 ticks, where one millisecond equals 10,000 ticks. However, milliseconds is rounded to the nearest millisecond; all values of .5 or greater are rounded up.

(Yes, I was puzzled too. It's why I looked up the docs and then it suddenly became clear. If the OP wants microsecond level precision, he's going to have to roll his own or do some hackery. I would do the hackery personally.)
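To see the rounding in action (a small demonstration; the format string is only for display):
C#:
var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

// AddMilliseconds rounds 0.350 ms down to 0, dropping the 350 microseconds:
Console.WriteLine(epoch.AddMilliseconds(0.350).ToString("HH:mm:ss.ffffff")); // 00:00:00.000000

// AddTicks keeps them (350 microseconds = 3500 ticks):
Console.WriteLine(epoch.AddTicks(3500).ToString("HH:mm:ss.ffffff"));         // 00:00:00.000350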
 
So we can go for ticks
C#:
var ticksPerMicrosecond = TimeSpan.TicksPerMillisecond / 1000; // TicksPerMillisecond = 10_000
var stamp = 1516354232225350;

var msWhole = stamp / 1000;
var ticksRemainder = stamp % 1000 * ticksPerMicrosecond;
var d1 = DateTimeOffset.FromUnixTimeMilliseconds(msWhole).AddTicks(ticksRemainder);

var ticks = stamp * ticksPerMicrosecond;
var d2 = new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeKind.Utc).AddTicks(ticks);

var format = "dddd d. MMMM yyyy HH.mm.ss.ffffff";
Console.WriteLine(d1.ToString(format));
Console.WriteLine(d2.ToString(format));
 
So we can go for ticks
Interesting! Both d1 and d2 output the same thing. Thanks very much!
 
Also confirming that they output 225350 microseconds for that stamp :)
 
In this number of total microseconds, 1516354232225350, the 225 part is the milliseconds and the 350 part is the microseconds; the format pattern ffffff shows all six of these digits, the millionths of a second.
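Spelled out as arithmetic (a tiny sketch):
C#:
long stamp = 1516354232225350;            // total microseconds
long milliseconds = stamp / 1000 % 1000;  // 225
long microseconds = stamp % 1000;         // 350
Console.WriteLine($"{milliseconds} ms + {microseconds} us");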
 