Use expanduser when writing dataframes to files? · Issue #23473 · pandas-dev/pandas
Code Sample, a copy-pastable example if possible

The following `to_XXX` calls do not work:

```python
import pandas as pd

df = pd.DataFrame([[1, 1], [2, 2]])
df.to_pickle('~/df.pkl')
df.to_json('~/df.json')
# etc.
```
Problem description
The `to_XXX` methods do not work when the path argument includes a `~`. This may be fine, but it is strange that reading dataframes from files does support `~`; e.g. `pd.read_json('~/df.json')` works fine.
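As a workaround until the writers expand `~` themselves, the caller can expand the path explicitly with `os.path.expanduser`; a minimal sketch:

```python
import os
import pandas as pd

df = pd.DataFrame([[1, 1], [2, 2]])

# Expand '~' to the user's home directory before handing the path to the writer.
path = os.path.expanduser('~/df.pkl')
df.to_pickle(path)

# Reading already expands '~' internally, so the tilde form works on this side.
assert pd.read_pickle('~/df.pkl').equals(df)

os.remove(path)  # clean up
```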
Expected Output

I expect `to_json` and friends to support expanding the home directory.
Perhaps the fix is to add `expanduser` to

```python
def _stringify_path(filepath_or_buffer):
```

as we have in

```python
return _expand_user(filepath_or_buffer), None, compression, False
```
Output of pd.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.6.final.0
python-bits: 64
OS: Darwin
OS-release: 17.7.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
pandas: 0.23.4
pytest: 3.9.1
pip: 9.0.1
setuptools: 40.4.1
Cython: 0.27.3
numpy: 1.15.2
scipy: 1.1.0
pyarrow: 0.10.0
xarray: None
IPython: 5.3.0
sphinx: 1.5.1
patsy: 0.4.1
dateutil: 2.7.3
pytz: 2018.5
blosc: None
bottleneck: 1.2.1
tables: 3.4.4
numexpr: 2.6.5
feather: None
matplotlib: 2.2.2
openpyxl: 2.4.1
xlrd: 1.0.0
xlwt: 1.2.0
xlsxwriter: 0.9.6
lxml: 4.1.1
bs4: 4.5.3
html5lib: 0.9999999
sqlalchemy: 1.1.15
pymysql: None
psycopg2: 2.7.1 (dt dec pq3 ext lo64)
jinja2: 2.10
s3fs: 0.1.5
fastparquet: 0.1.3
pandas_gbq: None
pandas_datareader: None