Since I always forget (database management isn't my day job): a short guide on how to quickly import large TXT or CSV datasets into PostgreSQL. For smaller sets I still use Ruby and FasterCSV, but nothing beats native DB imports in terms of speed. Speed hardly matters when you're importing a few megabytes of data, but it certainly does once the files grow beyond that.
In this example I'll use my current use case: importing a large Drive-Time Matrix table, with drive times and distances between pairs of postal codes. The head of the TXT file is formatted as follows:
"Frompc4","Topc4","Time","Distance"
"1011","1011",0,0
"1011","1012",6,1737
"1011","1013",9,3378
"1011","1014",13,6056
"1011","1015",10,3198
"1011","1016",10,3112
"1011","1017",6,1706
"1011","1018",5,1791
"1011","1019",7,3146
Now let's assume we want to import this into a table 'DTM' with the following columns: from_pc (integer), to_pc (integer), time (integer) and distance (integer).
This article from murblog by Maarten Brouwers (murb) is licensed under a Creative Commons Attribution 3.0 Netherlands license.