Write Tabulator data to a database

Welcome to the community :slight_smile: I set up a connection to a PostgreSQL database using the SQLAlchemy package. In my use case I create a new relation for every pandas DataFrame. Using the method psql_insert_copy makes it a lot faster. It's copied from pandas.DataFrame.to_sql — pandas 1.2.4 documentation

SQLAlchemy can be used with MySQL and other databases as well.
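For example, switching to MySQL is mostly a matter of changing the connection URL. A minimal sketch, assuming the pymysql driver is installed (user, password and dbname are placeholders):

from sqlalchemy import create_engine

# hypothetical MySQL connection string; user, password and dbname are placeholders
engine = create_engine('mysql+pymysql://user:password@localhost/dbname')

Note that psql_insert_copy below is PostgreSQL-specific (it relies on psycopg2's copy_expert), so with other databases you would drop the method argument and use the default insert.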

You can find a use case by @hyamanieu here: IoT measurements data?

from sqlalchemy import create_engine
import pandas as pd
from io import StringIO
import csv

df = yourtabulardata  # placeholder for the DataFrame you want to store
engine = create_engine(
    'postgresql+psycopg2://postgres:password@localhost/dbname')

def psql_insert_copy(table, conn, keys, data_iter):
    """
    Execute SQL statement inserting data

    Parameters
    ----------
    table : pandas.io.sql.SQLTable
    conn : sqlalchemy.engine.Engine or sqlalchemy.engine.Connection
    keys : list of str
        Column names
    data_iter : Iterable that iterates the values to be inserted
    """
    # gets a DBAPI connection that can provide a cursor
    dbapi_conn = conn.connection
    with dbapi_conn.cursor() as cur:
        s_buf = StringIO()
        writer = csv.writer(s_buf)
        writer.writerows(data_iter)
        s_buf.seek(0)

        columns = ', '.join('"{}"'.format(k) for k in keys)
        if table.schema:
            table_name = '{}.{}'.format(table.schema, table.name)
        else:
            table_name = table.name

        # stream the CSV buffer through PostgreSQL's COPY command
        sql = 'COPY {} ({}) FROM STDIN WITH CSV'.format(
            table_name, columns)
        cur.copy_expert(sql=sql, file=s_buf)

Insert your tabular data into the database; each DataFrame is stored as a new relation.

name_relation = "my_table"  # placeholder: choose the name of the target relation

df.to_sql(name=str(name_relation),
          con=engine,
          if_exists="replace",
          index=False,
          method=psql_insert_copy)
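If you are getting the data from Panel's Tabulator widget, a minimal sketch of wiring the two together could look like the following. It assumes the engine and psql_insert_copy defined above; the sample data, the button and the table name 'my_table' are just placeholders. The widget's current data, including user edits, is available as a DataFrame via its value parameter:

import panel as pn
import pandas as pd

pn.extension('tabulator')

df = pd.DataFrame({'a': [1, 2], 'b': [3, 4]})  # example data
tabulator = pn.widgets.Tabulator(df)

def save_to_db(event):
    # tabulator.value holds the widget's current DataFrame, including edits
    tabulator.value.to_sql(name='my_table',
                           con=engine,
                           if_exists="replace",
                           index=False,
                           method=psql_insert_copy)

button = pn.widgets.Button(name='Save to database')
button.on_click(save_to_db)

pn.Column(tabulator, button).servable()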