Inconsistent behavior in Timestamp for large negative values · Issue #14415 · pandas-dev/pandas
A small, complete example of the issue
In [1]: import numpy as np; import pandas as pd

In [3]: pd.Timestamp(np.iinfo(np.int64).min + 80000000000000)
Out[3]: Timestamp('2262-04-11 22:26:03.145224192')
In [4]: pd.Timestamp(np.iinfo(np.int64).min + 90000000000000)
Out[4]: Timestamp('1677-09-22 01:12:43.145224192')
In [5]: pd.Timestamp(np.iinfo(np.int64).min + 85000000000000)
---------------------------------------------------------------------------
OutOfBoundsDatetime                       Traceback (most recent call last)
<ipython-input-5-...> in <module>()
----> 1 pd.Timestamp(np.iinfo(np.int64).min + 85000000000000)
pandas/tslib.pyx in pandas.tslib.Timestamp.new (pandas/tslib.c:9932)()
pandas/tslib.pyx in pandas.tslib.convert_to_tsobject (pandas/tslib.c:26453)()
pandas/tslib.pyx in pandas.tslib._check_dts_bounds (pandas/tslib.c:30034)()
OutOfBoundsDatetime: Out of bounds nanosecond timestamp: 2262-04-11 23:49:23
Expected Output
I would expect all three of the values above to evaluate to dates in the 1600s. The behavior here looks like there might be a signed integer overflow happening somewhere in the C layer, which is a little nerve-wracking, since signed overflow is undefined behavior unless pandas is compiled with -fwrapv.
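To make the expectation concrete, here is a minimal sketch (not pandas internals) that computes the same three nanosecond offsets using Python's arbitrary-precision integers and plain datetime arithmetic. All three land in September 1677, so all three Timestamp calls above should agree rather than wrapping into 2262 or raising.

import numpy as np
from datetime import datetime, timedelta

EPOCH = datetime(1970, 1, 1)
INT64_MIN = np.iinfo(np.int64).min  # -9223372036854775808, returned as a Python int

for offset in (80000000000000, 85000000000000, 90000000000000):
    # Plain Python int arithmetic: no fixed-width wraparound can occur here.
    ns = INT64_MIN + offset  # nanoseconds relative to the Unix epoch
    # timedelta only resolves microseconds, so the sub-microsecond part is dropped,
    # but that is more than enough to show which century the result falls in.
    print(EPOCH + timedelta(microseconds=ns // 1000))

This prints 1677-09-21 22:26:03.145224, 1677-09-21 23:49:23.145224, and 1677-09-22 01:12:43.145224, which is consistent with Out[4] above and with the time-of-day shown in the OutOfBoundsDatetime message (only the year is off by the 2262/1677 wrap).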
Output of pd.show_versions()
In [8]: pd.show_versions()
INSTALLED VERSIONS
commit: None
python: 2.7.10.final.0
python-bits: 64
OS: Linux
OS-release: 4.2.0-16-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: None.None
pandas: 0.19.0
nose: None
pip: 8.1.2
setuptools: 28.3.0
Cython: None
numpy: 1.11.2
scipy: None
statsmodels: None
xarray: None
IPython: 5.1.0
sphinx: None
patsy: None
dateutil: 2.5.3
pytz: 2016.7
blosc: None
bottleneck: None
tables: None
numexpr: None
matplotlib: None
openpyxl: None
xlrd: None
xlwt: None
xlsxwriter: None
lxml: None
bs4: None
html5lib: None
httplib2: None
apiclient: None
sqlalchemy: None
pymysql: None
psycopg2: None
jinja2: None
boto: None
pandas_datareader: None