- polars.read_ipc(source: str | BinaryIO | BytesIO | Path | bytes, *, columns: list[int] | list[str] | None = None, n_rows: int | None = None, use_pyarrow: bool = False, memory_map: bool = True, storage_options: dict[str, Any] | None = None, row_count_name: str | None = None, row_count_offset: int = 0, rechunk: bool = True) → DataFrame
Read into a DataFrame from an Arrow IPC (Feather v2) file.
Parameters:
- source : str | BinaryIO | BytesIO | Path | bytes — Path to a file or a file-like object. If fsspec is installed, it will be used to open remote files.
- columns : list[int] | list[str] | None — Columns to select. Accepts a list of column indices (starting at zero) or a list of column names.
- n_rows : int | None — Stop reading from the IPC file after reading n_rows. Only valid when use_pyarrow=False.
- use_pyarrow : bool — Use pyarrow or the native Rust reader.
- memory_map : bool — Try to memory map the file. This can greatly improve performance on repeated queries, as the OS may cache pages. Only uncompressed IPC files can be memory mapped.
- storage_options : dict[str, Any] | None — Extra options that make sense for fsspec.open() or a particular storage connection, e.g. host, port, username, password, etc.
- row_count_name : str | None — If not None, insert a row count column with the given name into the DataFrame.
- row_count_offset : int — Offset to start the row count column at (only used if row_count_name is set).
- rechunk : bool — Make sure that all data is contiguous.
Warning: if memory_map is set, the bytes on disk are mapped 1:1 to memory, so you cannot write to the same filename while the returned DataFrame is alive.