On 23 October 2017 at 01:06, Wes Turner <wes.turner@gmail.com> wrote:
> On Saturday, October 21, 2017, Nick Coghlan <ncoghlan@gmail.com> wrote:
>> So yeah, for nanosecond resolution to not be good enough for programs
>> running in Python, we're going to be talking about some genuinely
>> fundamental changes in the nature of computing hardware, and it's
>> currently unclear if or how established programming languages will make
>> that jump (see [3] for a gentle introduction to the current state of
>> practical quantum computing). At that point, picoseconds vs nanoseconds
>> is likely to be the least of our conceptual modeling challenges :)
>
> There are current applications with greater-than nanosecond precision:
>
> - relativity experiments
> - particle experiments
>
> Must they always use their own implementations of time., datetime.
> __init__, fromordinal, fromtimestamp ?!

Yes, as time is a critical part of their experimental setup - when you're
operating at relativistic speeds and the kinds of energy levels that
particle accelerators hit, it's a bad idea to assume that regular time
libraries that assume Newtonian physics applies are going to be up to the
task.
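As a rough illustration of the resolution ceiling being discussed (a
minimal sketch assuming CPython with IEEE 754 double floats; the
time.time_ns() call is the PEP 564 API this thread is about and needs
Python 3.7+):

    import time
    from datetime import datetime

    # Near current epoch values (~1.5e9 s), adjacent doubles are about
    # 2**-22 s apart (~238 ns), so a float timestamp cannot carry
    # single-nanosecond detail.
    t = 1_500_000_000.000_000_001   # one nanosecond past a round value
    print(t == 1_500_000_000.0)     # True - the nanosecond is lost

    # datetime objects stop at microseconds, so they cannot help either.
    print(datetime(2017, 10, 23).resolution)   # 0:00:00.000001

    # time.time_ns() returns an int count of nanoseconds, so nothing is
    # lost to float rounding.
    print(time.time_ns())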
Normal software assumes a nanosecond is almost no time at all - in high energy particle physics, a nanosecond is enough time for light to travel 30 centimetres, and a high energy particle that stuck around that long before decaying into a lower energy state would be classified as "long lived".
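A quick back-of-the-envelope check of that 30 cm figure (nothing assumed
here beyond the SI value of c):

    # Light travel distance in one nanosecond.
    c = 299_792_458         # speed of light in m/s (exact by definition)
    one_ns = 1e-9           # seconds
    print(f"{c * one_ns * 100:.1f} cm")   # -> 30.0 cm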
Cheers,
Nick.
P.S. "Don't take code out of the environment it was
designed for and assume it will just keep working normally" is one of
the main lessons folks learned from the destruction of the first Ariane 5
launch rocket in 1996 (see
the first paragraph in https://en.wikipedia.org/wiki/Ariane_5#Notable_launches )
--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia