A method for rapidly inserting a million rows of test data into MySQL

  • 2020-05-13 03:37:18
  • OfStack

For comparison, the first approach uses a MySQL stored procedure:
 
mysql> delimiter $$ 
mysql> SET AUTOCOMMIT = 0$$ 
mysql> create procedure test() 
begin 
declare i decimal(10) default 0; 
dd:loop 
INSERT INTO `million` (`categ_id`, `categ_fid`, `SortPath`, `address`, `p_identifier`, `pro_specification`, `name`, `add_date`, `picture_url`, `thumb_url`, `is_display_front`, `create_html_time`, `hit`, `buy_sum`, `athor`, `templete _style`, `is_hot`, `is_new`, `is_best`) VALUES 
(268, 2, '0,262,268,', 0, '2342', '423423', '123123', '2012-01-09 09:55:43', 'upload/product/20111205153432_53211.jpg', 'upload/product/thumb_20111205153432_53211.jpg', 1, 0, 0, 0, 'admin', '0', 0, 0, 0); 
commit; -- commits after every single row 
set i = i + 1; 
if i = 1000000 then leave dd; 
end if; 
end loop dd; 
end$$ 
mysql> delimiter ; 
mysql> call test(); 

The result:
mysql> call test(); Query OK, 0 rows affected (58 min 30.83 sec)
It is very time-consuming.
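A large part of that time likely comes from committing after every single row. A variant that keeps AUTOCOMMIT at 0 (as set above) and commits only every 10,000 rows should be noticeably faster; the sketch below is untested here, and test_batch is just an illustrative name:
 
mysql> delimiter $$ 
mysql> create procedure test_batch() 
begin 
declare i decimal(10) default 0; 
dd:loop 
INSERT INTO `million` (`categ_id`, `categ_fid`, `SortPath`, `address`, `p_identifier`, `pro_specification`, `name`, `add_date`, `picture_url`, `thumb_url`, `is_display_front`, `create_html_time`, `hit`, `buy_sum`, `athor`, `templete _style`, `is_hot`, `is_new`, `is_best`) VALUES 
(268, 2, '0,262,268,', 0, '2342', '423423', '123123', '2012-01-09 09:55:43', 'upload/product/20111205153432_53211.jpg', 'upload/product/thumb_20111205153432_53211.jpg', 1, 0, 0, 0, 'admin', '0', 0, 0, 0); 
set i = i + 1; 
if mod(i, 10000) = 0 then 
commit; -- flush only once per 10,000 rows instead of once per row 
end if; 
if i = 1000000 then leave dd; 
end if; 
end loop dd; 
commit; -- final commit for any remaining rows 
end$$ 
mysql> delimiter ; 
mysql> call test_batch(); 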
So I looked for another way:
generate the data with PHP first, then import it:
 
<?php 
$t = time(); 
set_time_limit(1000); 
$myFile = "e:/insert.sql"; 
$fhandler = fopen($myFile, 'wb'); 
if ($fhandler) { 
// one tab-separated record; the fields match the column list used by LOAD DATA INFILE below 
$sql = "268\t2\t'0,262,268,'\t0\t'2342'\t'423423'\t'123123'\t'23423423'\t'2012-01-09 09:55:43'\t'upload/product/20111205153432_53211.jpg'\t'upload/product/thumb_20111205153432_53211.jpg'\tNULL\tNULL\t38\t' a '\t''\t123\t123\t0"; 
$i = 0; 
while ($i < 1000000) { // 1,000,000 records 
$i++; 
fwrite($fhandler, $sql . "\r\n"); 
} 
fclose($fhandler); 
echo "Write successful. Time: ", time() - $t; 
} 
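To generate the file, save the script (the file name is up to you; gen_insert.php is just an example) and run it with the PHP command-line interpreter, e.g. php gen_insert.php; it writes one tab-separated record per line to e:/insert.sql.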

Then import it:
 
LOAD DATA local INFILE 'e:/insert.sql' INTO TABLE tenmillion(`categ_id`, `categ_fid`, `SortPath`, `address`, `p_identifier`, `pro_specification`, `name`, `description`, `add_date`, `picture_url`, `thumb_url`, `shop_url`, `shop_thumb_url`, `brand_id`, `unit`, `square_meters_unit`, `market_price`, `true_price`, `square_meters_price`); 

Note that the fields are no longer separated by commas but by \t, and records are terminated by \r\n. In my tests, importing 1,000,000 rows this way took about one minute on average over ten runs.
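For reference, the same import can spell out both separators instead of relying on the defaults (the default line terminator is \n, so stating \r\n explicitly keeps a stray \r out of the last column). This is only a sketch using the same file, table and column names as above:
 
LOAD DATA LOCAL INFILE 'e:/insert.sql' INTO TABLE tenmillion 
FIELDS TERMINATED BY '\t' 
LINES TERMINATED BY '\r\n' 
(`categ_id`, `categ_fid`, `SortPath`, `address`, `p_identifier`, `pro_specification`, `name`, `description`, `add_date`, `picture_url`, `thumb_url`, `shop_url`, `shop_thumb_url`, `brand_id`, `unit`, `square_meters_unit`, `market_price`, `true_price`, `square_meters_price`); 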
With the second method, MySQL skips many intermediate steps, so the insert is much faster than with the first method; I have not studied the exact reasons.

Quickly generating millions of rows of MySQL test data
For testing I needed more data, but the original table had only 10,000 rows, so I randomly copied existing records and re-inserted them, which quickly brought the table to 1,000,000 rows.

itemid is the primary key.

Run the following statement a few times; each run picks a random starting point and inserts 1,000 copied rows:

insert into downitems (chid,catid,softid,....)
SELECT chid,catid,softid... FROM `downitems` WHERE itemid >= (SELECT floor(RAND() * (SELECT MAX(itemid) FROM `downitems`))) ORDER BY itemid LIMIT 1000;

Then increase the number 1,000, say to 5,000 or 10,000, and you will soon have a million rows.
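If you would rather not rerun the statement by hand, a small stored procedure can loop it for you. This is only a sketch: fill_downitems is a made-up name, and the abbreviated column lists must be replaced with the real columns of downitems (everything except the auto-increment itemid):
 
delimiter $$ 
create procedure fill_downitems(in n int) 
begin 
declare j int default 0; 
while j < n do 
insert into downitems (chid, catid, softid /* , ... remaining columns */) 
SELECT chid, catid, softid /* , ... remaining columns */ FROM `downitems` 
WHERE itemid >= (SELECT floor(RAND() * (SELECT MAX(itemid) FROM `downitems`))) 
ORDER BY itemid LIMIT 1000; 
set j = j + 1; 
end while; 
end$$ 
delimiter ; 
call fill_downitems(100); -- 100 runs x 1,000 rows = 100,000 new rows per call 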
