pandas-dev / pandas

Flexible and powerful data analysis / manipulation library for Python, providing labeled data structures similar to R data.frame objects, statistical functions, and much more
https://pandas.pydata.org
BSD 3-Clause "New" or "Revised" License

BUG: Cannot use numpy FLS as indices since pandas 2.2.1 #57645

Open jday1 opened 8 months ago

jday1 commented 8 months ago

Pandas version checks

Reproducible Example

import numpy as np
import pandas as pd

# List of fixed-length strings
fixed_strings = ["apple", "banana", "orange", "grape"]

# Define the fixed length for the strings
string_length = 6  # Adjust as needed

# Create NumPy array of fixed-length strings
arr = np.array(fixed_strings, dtype=f"S{string_length}")

df = pd.DataFrame(pd.Series(arr), columns=["fruit"])

# Raises NotImplementedError: |S6
df.set_index("fruit", inplace=True)

Issue Description

When attempting to set the index of the DataFrame with df.set_index("fruit", inplace=True), a NotImplementedError is raised with the message |S6.

The error occurs because the dtype |S6 is not supported as an index type in pandas. When a NumPy array of fixed-length strings (created with np.array(..., dtype="S6")) is used as an index, pandas tries to convert it to a suitable index type, but |S6 is not recognized as a valid option.

This applies to all S dtypes. The offending logic was introduced in pandas 2.2.1 by these lines of this commit: https://github.com/pandas-dev/pandas/commit/b6fb90574631c84f19f2dbdc68c26d6ce97446b4#diff-fb8a9c322624b0777f3ff7e3ef8320d746b15a2a0b80b7cab3dfbe2e12e06daaR239-R240

The code works as expected in pandas 2.2.0.
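
A possible workaround on 2.2.1 (a sketch, not an officially supported path) is to cast the |S6 column to object dtype, or decode it to Python strings, before calling set_index:

import numpy as np
import pandas as pd

arr = np.array(["apple", "banana", "orange", "grape"], dtype="S6")
df = pd.DataFrame(pd.Series(arr), columns=["fruit"])

# Option 1: keep the bytes, but as object dtype
df["fruit"] = df["fruit"].astype(object)

# Option 2: decode the bytes to regular Python strings instead
# df["fruit"] = df["fruit"].astype(object).str.decode("utf-8")

df.set_index("fruit", inplace=True)
print(df.index.dtype)  # object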

Expected Behavior

What happens when using 2.2.0:

import numpy as np
import pandas as pd

# List of fixed-length strings
fixed_strings = ["apple", "banana", "orange", "grape"]

# Define the fixed length for the strings
string_length = 6  # Adjust as needed

# Create NumPy array of fixed-length strings
arr = np.array(fixed_strings, dtype=f"S{string_length}")

df = pd.DataFrame(pd.Series(arr), columns=["fruit"])

df.set_index("fruit", inplace=True)

print(df)

Prints:

Empty DataFrame
Columns: []
Index: [b'apple', b'banana', b'orange', b'grape']

Installed Versions

INSTALLED VERSIONS
------------------
commit : bdc79c146c2e32f2cab629be240f01658cfb6cc2
python : 3.11.1.final.0
python-bits : 64
OS : Linux
OS-release : 6.6.10-76060610-generic
Version : #202401051437~1704728131~22.04~24d69e2 SMP PREEMPT_DYNAMIC Mon J
machine : x86_64
processor : x86_64
byteorder : little
LC_ALL : None
LANG : en_GB.UTF-8
LOCALE : en_GB.UTF-8

pandas : 2.2.1
numpy : 1.26.4
pytz : 2024.1
dateutil : 2.8.2
setuptools : 67.7.2
pip : 24.0
Cython : None
pytest : 7.4.4
hypothesis : None
sphinx : None
blosc : None
feather : 0.4.1
xlsxwriter : None
lxml.etree : 4.9.3
html5lib : None
pymysql : None
psycopg2 : None
jinja2 : 3.1.2
IPython : 8.15.0
pandas_datareader : None
adbc-driver-postgresql : None
adbc-driver-sqlite : None
bs4 : 4.12.2
bottleneck : None
dataframe-api-compat : None
fastparquet : None
fsspec : 2023.10.0
gcsfs : None
matplotlib : 3.8.3
numba : 0.59.0
numexpr : 2.9.0
odfpy : None
openpyxl : 3.1.2
pandas_gbq : None
pyarrow : 15.0.0
pyreadstat : None
python-calamine : None
pyxlsb : None
s3fs : 2023.10.0
scipy : 1.12.0
sqlalchemy : 2.0.20
tables : 3.9.2
tabulate : 0.9.0
xarray : None
xlrd : None
zstandard : None
tzdata : 2023.3
qtpy : None
pyqt5 : None
rhshadrach commented 8 months ago

cc @phofl

PF2100 commented 7 months ago

take

WillAyd commented 7 months ago

pandas does not support NumPy fixed length strings. Does this exhibit the same issue when you use the object dtype?

PF2100 commented 7 months ago

Using 'object' dtype for the array creation works correctly. However, pandas 2.2.1 changed how 'S' dtype arrays are handled during Index creation, so they are no longer automatically converted to 'object' dtype. Shouldn't this functionality still be implemented?
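
For reference, a minimal sketch of the object-dtype variant that works (the same construction as the report, only the array dtype changed):

import numpy as np
import pandas as pd

# Same construction as the report, but with dtype=object instead of "S6"
arr = np.array(["apple", "banana", "orange", "grape"], dtype=object)

df = pd.DataFrame(pd.Series(arr), columns=["fruit"])
df.set_index("fruit", inplace=True)  # no NotImplementedError on 2.2.1
print(df.index.dtype)  # object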

WillAyd commented 7 months ago

Is that behavior documented somewhere? If not then no... I'd say it's not worth doing too much here. We have never offered support for fixed-length NumPy strings. object has been the historically supported data type, but we also now have "string", "string[pyarrow]" and "string[pyarrow_numpy]", and maybe even a NumPy native string in the future. I don't see fixed-length string support as something worth additionally investing in.
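
For context, a sketch of how the string dtypes mentioned above can be requested explicitly (the pyarrow-backed variants assume pyarrow is installed):

import pandas as pd

s_obj = pd.Series(["apple", "banana"], dtype=object)                # historical default
s_str = pd.Series(["apple", "banana"], dtype="string")              # pandas StringDtype
s_arrow = pd.Series(["apple", "banana"], dtype="string[pyarrow]")   # pyarrow-backed
# s_np = pd.Series(["apple", "banana"], dtype="string[pyarrow_numpy]")  # pandas >= 2.1, also pyarrow-backed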

phofl commented 7 months ago

It should convert automatically to object; you should never end up with NumPy fixed-length strings.

PF2100 commented 7 months ago

In the commit previously mentioned in the issue description, the branch

elif isinstance(values, ABCSeries):
    return values._values

was added to this function in pandas/core/common.py:

def asarray_tuplesafe(values: Iterable, dtype: NpDtype | None = None) -> ArrayLike:
    if not (isinstance(values, (list, tuple)) or hasattr(values, "__array__")):
        values = list(values)
    elif isinstance(values, ABCIndex):
        return values._values
    elif isinstance(values, ABCSeries):
        return values._values

    if isinstance(values, list) and dtype in [np.object_, object]:
        return construct_1d_object_array_from_listlike(values)

    try:
        with warnings.catch_warnings():
            # Can remove warning filter once NumPy 1.24 is min version
            if not np_version_gte1p24:
                warnings.simplefilter("ignore", np.VisibleDeprecationWarning)
            result = np.asarray(values, dtype=dtype)
    except ValueError:
        # Using try/except since it's more performant than checking is_list_like
        # over each element
        # error: Argument 1 to "construct_1d_object_array_from_listlike"
        # has incompatible type "Iterable[Any]"; expected "Sized"
        return construct_1d_object_array_from_listlike(values)  # type: ignore[arg-type]

    if issubclass(result.dtype.type, str):
        result = np.asarray(values, dtype=object)

    if result.ndim == 2:
        # Avoid building an array of arrays:
        values = [tuple(x) for x in values]
        result = construct_1d_object_array_from_listlike(values)

    return result

It seems that with the addition of that branch, any values argument that is a Series now returns values._values directly, without checking its contents, so the function no longer produces an object-dtype array the way it used to. Would something like the following be an acceptable solution?

elif isinstance(values, ABCSeries) and values._values.dtype.kind != "S":
    return values._values
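
For what it's worth, a quick sketch (an assumption about the proposed guard, not something run against the pandas test suite) of what dtype.kind distinguishes here:

import numpy as np
import pandas as pd

s_fixed = pd.Series(np.array([b"apple", b"banana"], dtype="S6"))
s_object = pd.Series(["apple", "banana"], dtype=object)

print(s_fixed._values.dtype.kind)   # 'S' -> would skip the fast path and fall through to the np.asarray conversion
print(s_object._values.dtype.kind)  # 'O' -> would still return values._values directly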