kedarvj / mysqldumpsplitter

MySQL Dump splitter to split / extract databases & tables from mysqldump
http://kedar.nitty-witty.com/blog
MIT License
505 stars 133 forks

Big MYSQLDUMP containing only one table #24

Open chinmoybasak opened 1 month ago

chinmoybasak commented 1 month ago

I have a 40GB mysqldump containing only one table. ALLTABLES does not work. Is there any way to split the data?

ramonfincken commented 1 month ago

Does each row have its own INSERT?

chinmoybasak commented 1 month ago

Thank you for the reply. No, I think each INSERT has multiple rows.

ramonfincken commented 1 month ago

If you could re-do the export so that 1 line = 1 INSERT (the file will be bigger, of course), then you can easily split it.

chinmoybasak commented 1 month ago

I have exported with ALLTABLES; the biggest gz file is 10GB and the others are tiny. So are you saying it now has 1 line = 1 INSERT? And how can I split that one? Can you please share?

ramonfincken commented 1 month ago

https://stackoverflow.com/questions/12439353/mysqldump-one-insert-statement-for-each-data-row
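
In short (a sketch based on that thread; USER, DBNAME and TABLENAME are placeholders for your own values):

# --skip-extended-insert writes one INSERT statement per row
# (bigger file, but trivially splittable line by line)
mysqldump --skip-extended-insert -u USER -p DBNAME TABLENAME > BACKUP.sql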

kedarvj commented 1 month ago

Hi @chinmoybasak, if you have a single table of 40G, WHY do you want to split the inserts into multiple files? Note that multi-row inserts are faster than single-row inserts.

Now, if you do want to split, note that the dump will still contain multiple INSERT statements. You can confirm this by checking:

head -1000 BACKUP.sql | grep INSERT | wc -l

If you want to split the INSERTs into batches of 10 per file, you can use:

# write each INSERT line into nittywitty_NN.sql, 10 INSERTs per file
awk 'BEGIN {i=1} /^INSERT/ {file="nittywitty_" sprintf("%02d", int((i-1)/10)+1) ".sql"; print >> file; close(file); i++}' BACKUP.sql

You will end up with a number of files there... Consider testing before copy-pasting the code ;)
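
To load the split files back, a sketch (the chunks contain only the INSERT lines, so the table structure must already exist; USER and DBNAME are placeholders):

# stream all chunks in order; a single password prompt
cat nittywitty_*.sql | mysql -u USER -p DBNAME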

Thanks, Kedar.

kedarvj commented 1 month ago

@chinmoybasak, I read your comment re: "testing my codes to work on the database". If you're only testing, you don't need the whole dump! Just dump a few rows and complete your testing. In your mysqldump you may consider using --where="true LIMIT 100" to dump only 100 records of the full table and then perform your tests! If you're going to load 40G of data into MySQL, it is going to take time!
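
For example (a sketch; USER, DBNAME and TABLENAME are placeholders):

# "true LIMIT 100" lands inside the WHERE clause mysqldump generates,
# so only the first 100 rows of the table are dumped
mysqldump -u USER -p DBNAME TABLENAME --where="true LIMIT 100" > sample.sql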

Thanks