FlatGeobuf
Read and write FlatGeobuf files.
geoarrow.rust.io.read_flatgeobuf
read_flatgeobuf(
file: Union[str, Path, BinaryIO],
*,
store: Optional[ObjectStore] = None,
batch_size: int = 65536,
bbox: Tuple[float, float, float, float] | None = None
) -> Table
Read a FlatGeobuf file from a path on disk or a remote location into an Arrow Table.
Example:
Reading from a local path:
from geoarrow.rust.io import read_flatgeobuf
table = read_flatgeobuf("path/to/file.fgb")
Reading from a Python file object:
from geoarrow.rust.io import read_flatgeobuf
with open("path/to/file.fgb", "rb") as file:
    table = read_flatgeobuf(file)
Reading from an HTTP(S) url:
from geoarrow.rust.io import read_flatgeobuf
url = "http://flatgeobuf.org/test/data/UScounties.fgb"
table = read_flatgeobuf(url)
Reading from a remote file on an S3 bucket:
from geoarrow.rust.io import ObjectStore, read_flatgeobuf
options = {
    "aws_access_key_id": "...",
    "aws_secret_access_key": "...",
    "aws_region": "...",
}
store = ObjectStore('s3://bucket', options=options)
table = read_flatgeobuf("path/in/bucket.fgb", store=store)
Parameters:
- file (Union[str, Path, BinaryIO]) – the path to the file or a Python file object in binary read mode.
Other Parameters:
- store (Optional[ObjectStore]) – an ObjectStore instance for this url. This is required only if the file is at a remote location.
- batch_size (int) – the number of rows to include in each internal batch of the table.
- bbox (Tuple[float, float, float, float] | None) – a spatial filter for reading rows, of the format (minx, miny, maxx, maxy). If set to None, no spatial filter is applied.
Returns:
- Table – Table from the FlatGeobuf file.
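Because the bbox filter is just a plain (minx, miny, maxx, maxy) tuple, it can be built with ordinary Python before passing it to read_flatgeobuf. A minimal sketch; the bbox_around helper is hypothetical, not part of the library:

```python
def bbox_around(x: float, y: float, radius: float) -> tuple[float, float, float, float]:
    """Build a (minx, miny, maxx, maxy) tuple centered on a point.

    Hypothetical helper for illustration only; any 4-tuple in this
    order works as the bbox argument.
    """
    return (x - radius, y - radius, x + radius, y + radius)

bbox = bbox_around(-95.0, 37.0, 2.0)
# bbox == (-97.0, 35.0, -93.0, 39.0)

# With the library installed, the tuple would be passed straight through:
# table = read_flatgeobuf("path/to/file.fgb", bbox=bbox)
```

Rows whose geometries fall outside the tuple's extent are skipped during the read, which can substantially reduce I/O for large files.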
geoarrow.rust.io.read_flatgeobuf_async (async)
read_flatgeobuf_async(
path: str,
*,
store: Optional[ObjectStore] = None,
batch_size: int = 65536,
bbox: Tuple[float, float, float, float] | None = None
) -> Table
Read a FlatGeobuf file from a url into an Arrow Table.
Example:
Reading from an HTTP(S) url:
from geoarrow.rust.io import read_flatgeobuf_async
url = "http://flatgeobuf.org/test/data/UScounties.fgb"
table = await read_flatgeobuf_async(url)
Reading from an S3 bucket:
from geoarrow.rust.io import ObjectStore, read_flatgeobuf_async
options = {
    "aws_access_key_id": "...",
    "aws_secret_access_key": "...",
    "aws_region": "...",
}
store = ObjectStore('s3://bucket', options=options)
table = await read_flatgeobuf_async("path/in/bucket.fgb", store=store)
Parameters:
- path (str) – the url or relative path to a remote FlatGeobuf file. If an argument is passed for store, this should be a path fragment relative to the root passed to the ObjectStore constructor.
Other Parameters:
- store (Optional[ObjectStore]) – an ObjectStore instance for this url. This is required for non-HTTP urls.
- batch_size (int) – the number of rows to include in each internal batch of the table.
- bbox (Tuple[float, float, float, float] | None) – a spatial filter for reading rows, of the format (minx, miny, maxx, maxy). If set to None, no spatial filter is applied.
Returns:
- Table – Table from the FlatGeobuf file.
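Because read_flatgeobuf_async returns an awaitable, several remote files can be fetched concurrently with asyncio.gather. The sketch below uses a stand-in coroutine (read_stub) so the pattern runs without the library installed; with geoarrow.rust.io available you would gather read_flatgeobuf_async calls instead:

```python
import asyncio

# Stand-in for read_flatgeobuf_async, used here only to show the pattern.
async def read_stub(path: str) -> str:
    await asyncio.sleep(0)  # placeholder for the actual network read
    return path

async def main() -> list:
    # gather() awaits both reads concurrently and preserves argument order.
    return await asyncio.gather(
        read_stub("a.fgb"),
        read_stub("b.fgb"),
    )

tables = asyncio.run(main())
# tables == ["a.fgb", "b.fgb"]
```

With the real function, each element of the gathered list would be an Arrow Table rather than a path string.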
geoarrow.rust.io.write_flatgeobuf
write_flatgeobuf(
table: ArrowStreamExportable,
file: str | Path | BinaryIO,
*,
write_index: bool = True
) -> None
Write to a FlatGeobuf file on disk or to a binary file object.
Parameters:
- table (ArrowStreamExportable) – the Arrow RecordBatch, Table, or RecordBatchReader to write.
- file (str | Path | BinaryIO) – the path to the file or a Python file object in binary write mode.
Other Parameters:
- write_index (bool) – whether to write a spatial index in the FlatGeobuf file. Defaults to True.
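The file argument mirrors read_flatgeobuf: a string path, a pathlib.Path, or a binary-mode writable object are all accepted. A minimal sketch of the three forms; the write_flatgeobuf calls are left as comments since they require a table to write:

```python
import io
from pathlib import Path

# Three destination forms that match the str | Path | BinaryIO signature:
as_str = "counties.fgb"          # plain string path
as_path = Path("counties.fgb")   # pathlib.Path
as_fileobj = io.BytesIO()        # any binary-mode writable object

# With a table in hand, each would be passed the same way, e.g.:
# write_flatgeobuf(table, as_path)
# write_flatgeobuf(table, as_fileobj, write_index=False)
```

Skipping the spatial index (write_index=False) produces a smaller file but gives up bbox-filtered reads on the result.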