BUG: Overflow encountered when using .loc with UInt64Index · Issue #20722 · pandas-dev/pandas

Code Sample, a copy-pastable example if possible

import pandas as pd
import numpy as np

s = pd.Series([1, 2], index=[np.iinfo('uint64').max - 1, np.iinfo('uint64').max])

s[np.iinfo('uint64').max]
2

s.loc[np.iinfo('uint64').max]

...
OverflowError: Python int too large to convert to C long

During handling of the above exception, another exception occurred:

KeyError
...

s.index
UInt64Index([18446744073709551614, 18446744073709551615], dtype='uint64')
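As a workaround sketch (continuing from the Series built above), plain [] label indexing and positional .iloc both avoid the failing .loc lookup here:

key = np.iinfo('uint64').max

s[key]       # label indexing via [] returns 2, as shown above
s.iloc[-1]   # positional indexing also returns 2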

Problem description

.loc raises an overflow error even though the dtypes are consistent (the Series has a uint64 index and the value passed to .loc fits in uint64), while plain [] indexing with the same key works.
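The traceback suggests the label is being coerced to a signed C long somewhere in the .loc lookup before the KeyError fallback kicks in; the exact pandas code path is an assumption here, but the coercion failure itself is easy to reproduce at the NumPy level:

import numpy as np

key = np.iinfo('uint64').max   # 18446744073709551615

np.uint64(key)                 # fine: the value fits in an unsigned 64-bit int

# Converting the same Python int to a signed 64-bit int (a C long on this
# 64-bit Linux build) overflows, matching the error in the traceback above
# (the exact message may vary with the NumPy version).
try:
    np.int64(key)
except OverflowError as exc:
    print(exc)                 # e.g. "Python int too large to convert to C long"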

Expected Output

s.loc[np.iinfo('uint64').max]
2

Output of pd.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.4.final.0
python-bits: 64
OS: Linux
OS-release: 4.13.0-38-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8

pandas: 0.22.0
pytest: 3.3.2
pip: 9.0.1
setuptools: 38.5.1
Cython: 0.27.3
numpy: 1.14.0
scipy: 1.0.0
pyarrow: None
xarray: 0.10.1
IPython: 6.2.1
sphinx: 1.6.6
patsy: 0.5.0
dateutil: 2.6.1
pytz: 2017.3
blosc: None
bottleneck: None
tables: None
numexpr: None
feather: None
matplotlib: 2.1.2
openpyxl: None
xlrd: 1.1.0
xlwt: None
xlsxwriter: None
lxml: None
bs4: 4.6.0
html5lib: 0.9999999
sqlalchemy: None
pymysql: None
psycopg2: None
jinja2: 2.10
s3fs: None
fastparquet: None
pandas_gbq: None
pandas_datareader: None