Description
Is your feature request related to a problem or challenge? Please describe what you are trying to do.
The Arrow PyCapsule Interface is a new spec to simplify Arrow interop between compiled Python libraries.
For example, there's a from_arrow_table method (datafusion-python/src/context.rs, lines 467 to 473 at faa5a3f) that is specific to pyarrow Table objects (it calls their to_batches method). This would fail on a pyarrow.RecordBatchReader or any non-pyarrow Arrow objects.
A from_arrow method that looks for the __arrow_c_stream__ method would work out of the box on any Arrow-based Python library implementing this spec. That includes pyarrow Tables and RecordBatchReaders, ibis Tables (ibis-project/ibis#9143), nanoarrow objects, and hopefully soon duckdb and polars objects as well (pola-rs/polars#12530).
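As a rough sketch of the consuming side (the from_arrow name and the use of pyarrow as an intermediary are illustrative assumptions here, not existing datafusion-python API), all the entry point has to do is duck-type on the dunder:

```python
import pyarrow as pa


def from_arrow(data):
    """Hypothetical generic constructor accepting any Arrow stream producer."""
    if not hasattr(data, "__arrow_c_stream__"):
        raise TypeError(
            f"expected an object implementing __arrow_c_stream__, got {type(data).__name__}"
        )
    # Recent pyarrow releases can consume any PyCapsule-interface producer
    # directly; a native implementation would instead pass the capsule from
    # data.__arrow_c_stream__() straight to the Rust side.
    return pa.table(data)
```

The same function then accepts a pyarrow Table, a RecordBatchReader, an ibis Table, or anything else that implements the protocol, with no per-library special casing.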
Implementing __arrow_c_stream__ on datafusion exports means that any of those other libraries would just work on datafusion classes, without needing to know anything specific about datafusion.
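On the export side, the protocol only asks a class to return a PyCapsule named "arrow_array_stream" from __arrow_c_stream__. A minimal Python illustration, using a hypothetical wrapper class and delegating capsule creation to pyarrow purely for demonstration (a real datafusion-python implementation would build the capsule on the Rust side):

```python
import pyarrow as pa


class ResultSet:
    """Hypothetical results wrapper, shown only to illustrate export."""

    def __init__(self, schema: pa.Schema, batches: list[pa.RecordBatch]):
        self._schema = schema
        self._batches = batches

    def __arrow_c_stream__(self, requested_schema=None):
        # Schema negotiation is ignored in this sketch. Consumers only ever
        # see a PyCapsule named "arrow_array_stream", so they need no
        # knowledge of this class (or of datafusion) at all.
        reader = pa.RecordBatchReader.from_batches(self._schema, self._batches)
        return reader.__arrow_c_stream__()
```

Any consumer that understands the protocol (recent pyarrow, ibis, nanoarrow, and eventually polars/duckdb) could then ingest such an object directly, which is exactly the interop the spec is after.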
Describe the solution you'd like
PyCapsule import has been implemented in arrow upstream, but export hasn't been. I've implemented both import and export in pyo3-arrow (kept separate for a few reasons). I'm not sure whether datafusion-python wants another dependency, but the contents of pyo3-arrow could also be copied in here. Exporting raw PyCapsule objects could be implemented in upstream arrow if preferred.
Describe alternatives you've considered
Additional context