Upload a GeoDataFrame to a PostGIS database.
This method requires SQLAlchemy, GeoAlchemy2, and a PostgreSQL Python driver (psycopg or psycopg2) to be installed.
It is also possible to use to_file() to write to a database; for file-based geodatabases such as GeoPackage or SpatiaLite this can be easier, as sketched below.
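As a minimal sketch of that alternative (the file name and layer name below are illustrative, not part of this documentation):

>>> # Write the same GeoDataFrame to a GeoPackage layer instead of PostGIS
>>> gdf.to_file("my_data.gpkg", layer="my_table", driver="GPKG")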
Parameters
name : str
    Name of the target table.
con : sqlalchemy.engine.Connection or sqlalchemy.engine.Engine
    Active connection to the PostGIS database.
if_exists : {'fail', 'replace', 'append'}, default 'fail'
    How to behave if the table already exists:
    fail: Raise a ValueError.
    replace: Drop the table before inserting new values.
    append: Insert new values into the existing table.
schema : str, optional
    Specify the schema. If None, use the default schema: 'public'.
index : bool, default False
    Write the DataFrame index as a column. Uses index_label as the column name in the table.
index_label : str or sequence, default None
    Column label for index column(s). If None is given (default) and index is True, the index names are used.
chunksize : int, optional
    Rows will be written in batches of this size at a time. By default, all rows are written at once.
dtype : dict of column name to SQL type, optional
    Specify the datatype for columns. The keys should be the column names and the values should be SQLAlchemy types (see the sketch after this parameter list).
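For illustration, several of these options can be combined in a single call. This is a sketch only; the engine URL, schema name, chunk size, column name, and SQLAlchemy type are assumptions, not values required by the method.

>>> from sqlalchemy import create_engine, types
>>> engine = create_engine("postgresql://myusername:mypassword@myhost:5432/mydatabase")
>>> gdf.to_postgis(
...     "my_table",
...     engine,
...     if_exists="replace",           # drop the existing table and recreate it
...     schema="gis",                  # write into the 'gis' schema instead of 'public'
...     chunksize=10000,               # insert rows in batches of 10,000
...     dtype={"name": types.Text()},  # force the 'name' column to a TEXT type
... )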
Examples
>>> from sqlalchemy import create_engine
>>> engine = create_engine("postgresql://myusername:mypassword@myhost:5432/mydatabase")
>>> gdf.to_postgis("my_table", engine)
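A table written this way can be read back into a GeoDataFrame with geopandas.read_postgis(). The query below is a sketch; the geometry column is assumed to keep the GeoDataFrame's default name "geometry".

>>> import geopandas
>>> sql = "SELECT * FROM my_table"
>>> gdf2 = geopandas.read_postgis(sql, engine, geom_col="geometry")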