So, I’ve been quite busy recently.
I’ve made some decisions in my personal life that have resulted in a bit of a change of direction and focus for me.
It’s been exciting, and has necessarily meant some changes. This has given me the opportunity to “sharpen my tools”, and so I’ve been getting around to playing with a bunch of technologies that have always been on my “weekend project” list but never made it to the top of the priority list.
This isn’t directly related to the title of this article, but it provides some context: as part of one of these projects, I found it necessary to populate a database with the contents of a bunch of rather unwieldy CSV files (within a Docker container, so that their contents could be exposed by a NodeJS/Express REST API, but I digress).
It was while reading the various man pages, before launching into writing a script, that I found this little gem. This meant I could do the import and provisioning of my Docker instance straight from the /docker-entrypoint-initdb.d SQL scripts.
First, create your tables in the normal way, defining the data types that are represented in your CSV file. For example:
CREATE TABLE locations (
    location  VARCHAR(128),
    latitude  DECIMAL(11,8),
    longitude DECIMAL(11,8)
);
Into this table, as you might expect, you’d want to import a long list of location coordinates from a CSV file structured as follows:
location,latitude,longitude
"Oxford",51.7520,-1.2577
"Edinburgh",55.9533,-3.1883
Now, in your SQL, execute the following statement:
LOAD DATA LOCAL INFILE 'locations.csv'
INTO TABLE locations
FIELDS
    TERMINATED BY ','
    ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(location, latitude, longitude);
This tells MariaDB to read each line of locations.csv into the locations table, skipping over the first line (which contains the header).
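A couple of practical notes: LOAD DATA LOCAL reads the file from the client side, and depending on how your server and client are configured the local_infile option may need to be enabled before the statement is allowed to run. And once the import has finished, a quick sanity check confirms the rows actually landed. A minimal sketch (ordinary SQL, nothing specific to this setup):

-- Check whether LOCAL infile loading is permitted; enable it only if your
-- configuration has it switched off and you trust the data source.
SHOW VARIABLES LIKE 'local_infile';
-- SET GLOBAL local_infile = 1;

-- Sanity-check the import: row count plus a peek at the first few rows.
SELECT COUNT(*) FROM locations;
SELECT * FROM locations LIMIT 5;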
This little trick meant I was able to provision my API’s backend quickly and easily, without needing to hack together some arduous import script.
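For completeness, here’s a rough sketch of how the pieces might fit together in a single init script; the file name and paths are illustrative rather than lifted from my actual project. Any .sql file placed in /docker-entrypoint-initdb.d is executed automatically the first time the container initialises its data directory, so the schema and the import can live side by side:

-- /docker-entrypoint-initdb.d/01-locations.sql (illustrative name)
-- Runs automatically on the container's first start-up.
CREATE TABLE locations (
    location  VARCHAR(128),
    latitude  DECIMAL(11,8),
    longitude DECIMAL(11,8)
);

-- locations.csv is copied or mounted into the container alongside this script.
LOAD DATA LOCAL INFILE '/docker-entrypoint-initdb.d/locations.csv'
INTO TABLE locations
FIELDS
    TERMINATED BY ','
    ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(location, latitude, longitude);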
Hope you find this useful too!