I have a table like this:
CREATE TABLE IF NOT EXISTS `dnddata` (
`numbers` varchar(10) NOT NULL,
`opstype` char(1) NOT NULL,
PRIMARY KEY (`numbers`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1
/*!50100 PARTITION BY KEY (numbers)
PARTITIONS 25 */;
I have to insert 300 million records. I am loading 10 million records at a time with LOAD DATA INFILE from a CSV file.
The first 10 million records take nearly 5 minutes to insert, and each subsequent batch takes gradually longer. After about 30 million records the inserts stall, memory usage hits 100%, and the server stops responding.
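Each batch is loaded with a statement roughly like the one below (a sketch only; the file path, delimiters, and column list are illustrative, not my exact values):

-- Illustrative only: the real file name and delimiters differ.
LOAD DATA INFILE '/data/dnd_chunk_01.csv'
INTO TABLE dnddata
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(numbers, opstype);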
Below are my my.cnf settings:
bulk_insert_buffer_size = 100M
key_buffer = 100M
sort_buffer_size = 50M
read_buffer = 50M
The server has 2 GB of RAM.
Details after 30 million records:
Space usage
Type    Usage
Data    545.3 MiB
Index   694.8 MiB
Total   1,240.1 MiB
MySQL client version: 5.5.14
Without the index, loading works fine: 10 million records in about 50 seconds.
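The "without index" test used the same table minus the primary key, roughly like this (a sketch; the _noindex table name is illustrative, and the KEY partitioning is omitted here since it was tied to the key):

-- Sketch of the no-index table used for the comparison (name is illustrative).
CREATE TABLE IF NOT EXISTS `dnddata_noindex` (
  `numbers` varchar(10) NOT NULL,
  `opstype` char(1) NOT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1;
-- Loaded with the same LOAD DATA INFILE statement as above, just a different table name.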
Please tell me what kind of settings I need to change.
Edit (based on the answers):
I have changed my my.cnf settings to the following:
key_buffer_size = 1G
sort_buffer_size = 4M
read_buffer_size = 1M
read_rnd_buffer_size = 1M
join_buffer_size = 2M
bulk_insert_buffer_size = 32M
myisam_sort_buffer_size = 256M
No use; the issue is not resolved.
I have tried the following methods to load the data:
SET autocommit = 0;  -- for InnoDB
LOAD DATA INFILE … INTO TABLE …;
COMMIT;

START TRANSACTION;
LOAD DATA INFILE … INTO TABLE …;
COMMIT;
ALTER TABLE dnddata DISABLE KEYS;
LOAD DATA INFILE … INTO TABLE dnddata …;
ALTER TABLE dnddata ENABLE KEYS;
No luck.
numbers being loaded in order? Or randomly? This will make a big difference in the building of PRIMARY KEY(numbers). – Rick James Jul 01 '18 at 17:25