Convert a huge SQL file to CSV

biocyberman biocyberman at gmail.com
Sun Jun 3 06:04:36 UTC 2018


On Friday, 1 June 2018 at 10:15:11 UTC, Martin Tschierschke wrote:
> On Friday, 1 June 2018 at 09:49:23 UTC, biocyberman wrote:
>> I need to convert a compressed 17GB SQL dump to CSV. A 
>> workable solution is to create a temporary mysql database, 
>> import the dump, query with Python, and export. But I wonder 
>> if there is some way in D to parse the SQL file directly, 
>> query it, and export the data. I imagine this will involve 
>> both parsing and querying because the data is stored in 
>> several tables. I am in the process of downloading the dump 
>> now, so I can't give an excerpt of the data.
>
> You don't need python:
> https://michaelrigart.be/export-directly-mysql-csv/
>
> SELECT field1, field2
> FROM table1
> INTO OUTFILE '/path/to/file.csv'
> FIELDS TERMINATED BY ','
> ENCLOSED BY '"'
> ESCAPED BY '\\'
> LINES TERMINATED BY '\n';
>
> Most important:
>
> INTO OUTFILE: here you state the path where you want MySQL to 
> store the CSV file. Keep in mind that the path needs to be 
> writable by the MySQL user.
>
> You can write a parser for SQL in D, but even if the import 
> into mysql takes some time, it's only compute time, not yours.
>
>
> Regards mt.

Ah yes, thank you Martin. I forgot that we can run a "batch" SQL 
query and let the mysql server parse and execute the commands, so 
there is no need for Python. But I am still waiting for the import 
of the mysql dump to finish. It has taken 18 hours and counting! 
The whole mysql database is 68GB at the moment. Can we avoid the 
import and query the database dump directly?
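
For the record, here is a minimal sketch of what parsing the dump 
directly could look like in D. Everything specific in it is an 
assumption: the dump is already decompressed, each INSERT statement 
fits on one line (as mysqldump emits by default with extended 
inserts), and `mytable` and the file names are placeholders. The 
naive row split breaks if a string value happens to contain "),(", 
so a real converter would need a proper tokenizer, and the cells 
keep their SQL quoting rather than proper CSV quoting.

import std.stdio;
import std.algorithm : startsWith;
import std.array : split;
import std.string : chomp;

void main(string[] args)
{
    // Hypothetical file names; pass the decompressed dump as argv[1].
    auto input  = File(args.length > 1 ? args[1] : "dump.sql", "r");
    auto output = File("out.csv", "w");

    // mysqldump packs many rows into one extended INSERT per line.
    enum prefix = "INSERT INTO `mytable` VALUES ";

    foreach (line; input.byLine)
    {
        if (!line.startsWith(prefix))
            continue;

        // "INSERT INTO `mytable` VALUES (1,'a'),(2,'b');"
        //  -> payload is "(1,'a'),(2,'b')"
        auto payload = line[prefix.length .. $].chomp(";");

        // Naive split into rows; breaks if a value contains "),(".
        foreach (row; payload.split("),("))
        {
            // Trim the parentheses left on the first and last row.
            if (row.startsWith("("))
                row = row[1 .. $];
            if (row.length && row[$ - 1] == ')')
                row = row[0 .. $ - 1];
            output.writeln(row);
        }
    }
}

Since byLine streams the file, memory use stays flat even on a 17GB 
dump; reading from stdin instead would let you pipe zcat output 
straight in and skip decompressing to disk first.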

