The Internet seems to have trouble telling you how to load an arbitrary comma-separated values datafile into a database. Assuming the CSV has a header row, and given that sqlite3 doesn't enforce column data types, this ought to be a doddle.
import csv
import sqlite3
import sys

def import_csv_to_db(table_name: str):
    """Load an arbitrary CSV file into the database."""
    filename = f"{table_name}.csv"
    columns = None
    count = None
    with open(filename, newline="") as f:
        rows = []
        reader = csv.reader(f)
        for row in reader:
            if columns:
                rows.append(row)
                if len(row) != count:
                    print(f"{table_name} column count mismatch; {len(row)} columns in this row, {count} in header")
                    print(row)
                    sys.exit("Halting")
            else:
                # First row is the header
                columns = row
                count = len(columns)
    col_list = ",".join(columns)
    create_sql = f"CREATE TABLE IF NOT EXISTS {table_name} ({col_list})"
    insert_sql = f"INSERT INTO {table_name} ({col_list}) VALUES ({','.join('?' * len(columns))})"
    return create_sql, insert_sql, rows

conn = sqlite3.connect("import.db")  # any database path will do
c = conn.cursor()
create_sql, insert_sql, rows = import_csv_to_db('import_filename')
c.execute(create_sql)
c.executemany(insert_sql, rows)
conn.commit()
conn.close()
You could do what I did, and add a column-name:datatype lookup, but rather than throw that in here I've left it as an exercise for the reader. Equally, halting the program when there are missing columns is kinda extreme, but in my particular use case it's reasonable. Fix it if you don't like it.
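For what it's worth, here is one way that exercise could go: a small dict mapping known column names to SQLite type affinities, with everything else falling back to TEXT. The names in the lookup are made-up examples, not anything from my actual import; quoting the identifiers also guards against headers with spaces in them.

    import sqlite3

    # Hypothetical column-name -> type lookup; these names are just
    # illustrative. Unknown columns fall back to TEXT, which is what
    # sqlite3 effectively gives you anyway.
    COLUMN_TYPES = {
        "id": "INTEGER",
        "price": "REAL",
    }

    def typed_create_sql(table_name: str, columns: list) -> str:
        """Build a CREATE TABLE statement, typing columns via the lookup."""
        defs = ", ".join(f'"{c}" {COLUMN_TYPES.get(c, "TEXT")}' for c in columns)
        return f'CREATE TABLE IF NOT EXISTS "{table_name}" ({defs})'

    # Sanity check against a throwaway in-memory database
    conn = sqlite3.connect(":memory:")
    conn.execute(typed_create_sql("imports", ["id", "name", "price"]))
    conn.close()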