pyarrow.Scalar — Apache Arrow v20.0.0

class pyarrow.Scalar#

Bases: _Weakrefable

The base class for scalars.

__init__(*args, **kwargs)#


as_py(self, *, maps_as_pydicts=None)#

Return this value as a Python representation.

Parameters:

maps_as_pydicts : str, optional, default None

Valid values are None, ‘lossy’, or ‘strict’. The default behavior (None) is to convert Arrow Map arrays to Python association lists (list-of-tuples) in the same order as the Arrow Map, as in [(key1, value1), (key2, value2), …].

If ‘lossy’ or ‘strict’, convert Arrow Map arrays to native Python dicts.

If ‘lossy’, a warning is printed whenever duplicate keys are detected, and the last seen value of each duplicate key ends up in the Python dictionary. If ‘strict’, duplicate keys instead raise an exception.
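For example (an illustrative sketch, assuming pyarrow is imported as pa), a map-typed scalar converts to an association list by default and to a dict when maps_as_pydicts is set:

>>> import pyarrow as pa
>>> m = pa.scalar([("a", 1), ("b", 2)], type=pa.map_(pa.string(), pa.int64()))
>>> m.as_py()
[('a', 1), ('b', 2)]
>>> m.as_py(maps_as_pydicts="strict")
{'a': 1, 'b': 2}
>>> pa.scalar(42).as_py()
42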

cast(self, target_type=None, safe=None, options=None, memory_pool=None)#

Cast scalar value to another data type.

See pyarrow.compute.cast() for usage.

Parameters:

target_type : DataType, default None

Type to cast scalar to.

safe : bool, default True

Whether to check for conversion errors such as overflow.

options : CastOptions, default None

Additional checks passed via CastOptions.

memory_pool : MemoryPool, optional

Memory pool to use for allocations during function execution.

Returns:

scalar : A Scalar of the given target data type.
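For example (a minimal sketch with illustrative values, assuming pyarrow is imported as pa):

>>> import pyarrow as pa
>>> pa.scalar(1234).cast(pa.string()).as_py()
'1234'
>>> pa.scalar(1.7).cast(pa.int64(), safe=False).as_py()  # unsafe cast truncates
1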

equals(self, Scalar other)#

Parameters:

other : pyarrow.Scalar

Returns:

bool
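A minimal sketch of comparing two scalars (assuming pyarrow is imported as pa):

>>> import pyarrow as pa
>>> pa.scalar("a").equals(pa.scalar("a"))
True
>>> pa.scalar("a").equals(pa.scalar("b"))
False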

is_valid#

Holds a valid (non-null) value.

type#

Data type of the Scalar object.
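For example, a null scalar is typed but not valid (sketch assuming pyarrow is imported as pa):

>>> import pyarrow as pa
>>> s = pa.scalar(None, type=pa.int64())
>>> s.is_valid
False
>>> s.type
DataType(int64)
>>> pa.scalar("hello").is_valid
True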

validate(self, *, full=False)#

Perform validation checks. An exception is raised if validation fails.

By default only cheap validation checks are run. Pass full=True for thorough validation checks (potentially O(n)).

Parameters:

full : bool, default False

If True, run expensive checks, otherwise cheap checks only.

Raises:

ArrowInvalid
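For example (a sketch assuming pyarrow is imported as pa; validate() returns None and raises ArrowInvalid only on failure):

>>> import pyarrow as pa
>>> s = pa.scalar([1, 2, 3])
>>> s.validate()            # cheap structural checks
>>> s.validate(full=True)   # thorough, potentially O(n) checks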