Convert a huge SQL file to CSV

Martin Tschierschke mt at smartdolphin.de
Fri Jun 1 10:15:11 UTC 2018


On Friday, 1 June 2018 at 09:49:23 UTC, biocyberman wrote:
> I need to convert a compressed 17GB SQL dump to CSV. A workable
> solution is to create a temporary MySQL database, import the
> dump, query it with Python, and export. But I wonder if there is
> some way in D to parse the SQL file directly, query it, and
> export the data. I imagine this will involve both parsing and
> querying, because the data is stored in several tables. I am in
> the process of downloading the dump now, so I can't give an
> excerpt of the data.

You don't need Python:
https://michaelrigart.be/export-directly-mysql-csv/

SELECT field1, field2
FROM table1
INTO OUTFILE '/path/to/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n';

Most important:

INTO OUTFILE: here you state the path where you want MySQL to
store the CSV file. Keep in mind that the path needs to be
writable by the MySQL server user.

You can write a parser for the SQL dump in D, but even if the
import into MySQL takes some time, it is only compute time, not
your time. Still, if you want to try the D route, a rough sketch
is below.
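
Here is a rough, untested sketch of that idea: stream the
decompressed dump (e.g. zcat dump.sql.gz | ./sqldump2csv) and pull
the row tuples out of the extended INSERT statements for one
table. The table name `table1` is just a placeholder, and quoted
strings, embedded commas and NULLs would still need proper
handling before the output is real CSV.

import std.stdio;
import std.algorithm.iteration : splitter;
import std.algorithm.searching : startsWith;
import std.string : chomp;

// Placeholder: the table whose rows we want; mysqldump emits lines
// like: INSERT INTO `table1` VALUES (1,'a'),(2,'b'),...;
enum prefix = "INSERT INTO `table1` VALUES ";

void main()
{
    foreach (line; stdin.byLine)
    {
        if (!line.startsWith(prefix))
            continue;

        // Drop the statement prefix and the trailing ";".
        auto tuples = line[prefix.length .. $].chomp(";");

        // Naively split "(1,'a'),(2,'b')" into individual rows.
        // This breaks if a string value itself contains "),(" and
        // it leaves SQL quoting and NULLs untouched, so it is only
        // a starting point, not a full CSV converter.
        foreach (row; tuples.splitter("),("))
        {
            if (row.length && row[0] == '(')
                row = row[1 .. $];
            if (row.length && row[$ - 1] == ')')
                row = row[0 .. $ - 1];
            writeln(row);
        }
    }
}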


Regards mt.

