Converting Pandas Columns To Postgresql List?
I'm working with a CSV of a few hundred columns, many of which are just enumerations, i.e.: [['code_1', 'code_2', 'code_3', ..., 'code_50'], [1, 2, 3, ..., 50], [2, 3, 4, ..., 51], ...]
Solution 1:
This assumes you have already connected to PostgreSQL and the table already exists there; if not, see https://wiki.postgresql.org/wiki/Psycopg2_Tutorial
import psycopg2

try:
    conn = psycopg2.connect("host='localhost' dbname='template1' user='dbuser' password='dbpass'")
except psycopg2.OperationalError:
    print("I am unable to connect to the database")
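If the table does not exist yet, here is a minimal sketch of creating one that matches the INSERT statements used below; the table name test and the columns num and data are placeholders for your own schema:
>>> cur = conn.cursor()
>>> cur.execute("CREATE TABLE IF NOT EXISTS test (num integer, data varchar)")
>>> conn.commit()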
First, open the .csv file.
>>> import csv
>>> with open('names.csv') as csvfile:
...     reader = csv.DictReader(csvfile)
...     for row in reader:
...         print(row['first_name'], row['last_name'])
...
That example is from https://docs.python.org/2/library/csv.html. Replace the print line with an INSERT into PostgreSQL:
>>> import psycopg2
>>> cur.execute("INSERT INTO test (num, data) VALUES (%s, %s)",
...             (100, "abc'def"))
You can replace (100, "abc'def") with (variable1, variable2); see http://initd.org/psycopg/docs/usage.html for details on passing parameters. Or, as a full sample:
>>> import csv
>>> import psycopg2
>>> cur = conn.cursor()
>>> with open('names.csv') as csvfile:
...     reader = csv.DictReader(csvfile)
...     for row in reader:
...         cur.execute("INSERT INTO test (num, data) VALUES (%s, %s)", (variable1, variable2))  # substitute values taken from row
...
>>> conn.commit()
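Since the question mentions pandas, the same INSERT can also be driven from a DataFrame instead of csv.DictReader. A minimal sketch, assuming hypothetical CSV columns col_a and col_b that map onto num and data (these names are placeholders, not part of the original answer):
>>> import pandas as pd
>>> df = pd.read_csv('names.csv')
>>> rows = list(df[['col_a', 'col_b']].itertuples(index=False, name=None))  # plain tuples, one per row
>>> cur = conn.cursor()
>>> cur.executemany("INSERT INTO test (num, data) VALUES (%s, %s)", rows)
>>> conn.commit()
executemany issues one INSERT per row; for very large CSVs the psycopg2.extras.execute_values helper is faster, but the basic pattern is the same.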
Hope this will help...