Four Methods for Bulk Inserting Data into MySQL, with Examples

  • 2021-12-04 20:02:29

Preface

This article introduces four methods for bulk inserting data into MySQL, shared for your reference and study. Without further ado, let's get into the details.

Method 1: Loop insertion

This is the most common approach. It is fine when the data volume is small, but every statement costs a round trip over the database connection.

The general idea is as follows (pseudo-code; adapt it to your own business logic or framework syntax):


for ($i = 1; $i <= 100; $i++) {
    $sql = 'insert...............';
    // querysql
}

foreach ($arr as $key => $value) {
    $sql = 'insert...............';
    // querysql
}

while ($i <= 100) {
    $sql = 'insert...............';
    // querysql
    $i++;
}

Because this approach is so common and straightforward, it is not my focus today, so I won't dwell on it; a minimal concrete sketch follows.
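For reference, here is a minimal runnable sketch of the loop approach using PDO with a prepared statement. The table user(username, password) and the sample values are hypothetical; adapt them to your own schema.

// Sketch of method 1: one INSERT per loop iteration.
// Assumes an existing PDO connection $pdo and a hypothetical
// table user(username, password).
$stmt = $pdo->prepare('INSERT INTO user (username, password) VALUES (?, ?)');
for ($i = 1; $i <= 100; $i++) {
    // each execute() is a separate round trip to the database
    $stmt->execute(['user' . $i, 'pwd' . $i]);
}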

Method 2: Reduce connection overhead by splicing one SQL statement

The pseudo code is as follows


// Assume the keys of $arr match the database column names; most PHP
// frameworks design their database layer this way. Here $arr is a list
// of rows, each row an associative array of column => value.
$arr_keys = array_keys($arr[0]);
$sql      = 'INSERT INTO tablename (' . implode(',', $arr_keys) . ') VALUES ';
foreach ($arr as $row) {
    $sql .= "('" . implode("','", array_values($row)) . "'),";
}
$sql = substr($sql, 0, -1); // strip the trailing comma
// After splicing, the statement looks roughly like:
// INSERT INTO tablename (username,password) VALUES
// ('xxx','xxx'),('xxx','xxx'),('xxx','xxx'),('xxx','xxx'),('xxx','xxx'),('xxx','xxx')
// .......
// querysql

Inserting 10,000 rows this way is normally not a problem, unless the individual rows are very long. It is enough to cope with ordinary batch inserts, such as generating card numbers or random codes in bulk.
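If you execute the spliced statement through PDO, a slightly safer variant is to splice placeholders instead of raw values and bind everything in one call. A sketch, assuming $arr is a list of rows keyed by column name and $pdo is an existing PDO connection:

// Method 2 with placeholders: still one multi-row INSERT, one round trip
$arr_keys     = array_keys($arr[0]);
$placeholders = '(' . implode(',', array_fill(0, count($arr_keys), '?')) . ')';
$sql  = 'INSERT INTO tablename (' . implode(',', $arr_keys) . ') VALUES ';
$sql .= implode(',', array_fill(0, count($arr), $placeholders));
$params = [];
foreach ($arr as $row) {
    foreach ($row as $value) {
        $params[] = $value; // flatten all values in column order
    }
}
$stmt = $pdo->prepare($sql);
$stmt->execute($params);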

Method 3: Use stored procedures

Here I'm simply pasting the SQL I currently have in use; combine it with your own business logic as needed.


delimiter $$$
create procedure zqtest()
begin
  declare i int default 0;
  set i = 0;
  start transaction;
  while i < 80000 do
    -- your insert sql
    set i = i + 1;
  end while;
  commit;
end
$$$
delimiter ;
call zqtest();

This is just test code; define the specific parameters to suit your own situation.

Here I insert 80,000 rows at a time. That is not a huge number, but each row carries a lot of data, with many varchar(4000) and text fields.

Time consumed: 6.524 s
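If you prefer to drive the procedure from PHP rather than the mysql command-line client, a minimal sketch with PDO might look like this. Note that DELIMITER is a client-side feature, so it is not needed when the whole CREATE PROCEDURE is sent as a single statement; $pdo is an assumed existing connection.

// Sketch: create and call the stored procedure through PDO
$pdo->exec('DROP PROCEDURE IF EXISTS zqtest');
$pdo->exec(
    'CREATE PROCEDURE zqtest()
     BEGIN
       DECLARE i INT DEFAULT 0;
       START TRANSACTION;
       WHILE i < 80000 DO
         -- your insert sql
         SET i = i + 1;
       END WHILE;
       COMMIT;
     END'
);
$pdo->exec('CALL zqtest()');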

Method 4: Use MySQL's LOAD DATA LOCAL INFILE

This is the method I am currently using, so I'll include the PDO code as well for your reference.


// Set up PDO with MYSQL_ATTR_LOCAL_INFILE enabled
public function pdo_local_info ()
{
    global $system_dbserver;
    $dbname = 'hs96.cn@gmail.com';
    $ip     = 'hs96.cn@gmail.com';
    $user   = 'hs96.cn@gmail.com';
    $pwd    = 'hs96.cn@gmail.com';
    $dsn    = 'mysql:dbname=' . $dbname . ';host=' . $ip . ';port=3306';
    $options = [PDO::MYSQL_ATTR_LOCAL_INFILE => true];
    $db  = new PDO($dsn, $user, $pwd, $options);
    return $db;
}

// The pseudo code is as follows
public function test()
{
    // $arr: one row of data whose keys match the column names;
    // $order['OrderNo'] is used to name the temporary data file
    $arr_keys = array_keys($arr);
    $root_dir = $_SERVER["DOCUMENT_ROOT"] . '/';
    $my_file  = $root_dir . "hs96.cn@gmail.com/sql_cache/" . $order['OrderNo'] . '.sql';
    $fhandler = fopen($my_file, 'a+');
    if ($fhandler) {
        // LOAD DATA expects tab-separated columns, one row per line
        $sql = implode("\t", $arr);
        $i = 1;
        while ($i <= 80000) {
            $i++;
            fwrite($fhandler, $sql . "\r\n");
        }
        fclose($fhandler);
        $sql  = "LOAD DATA LOCAL INFILE '" . $my_file . "' INTO TABLE ";
        $sql .= "tablename (" . implode(',', $arr_keys) . ")";
        $pdo = $this->pdo_local_info();
        $res = $pdo->exec($sql);
        if (!$res) {
            // TODO: handle insert failure
        }
        @unlink($my_file);
    }
}

Each row here is also very large, again with many varchar(4000) and text fields.

Time consumed: 2.160 s
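One caveat: PDO::MYSQL_ATTR_LOCAL_INFILE only enables the client side. The MySQL server must also have the local_infile system variable switched on, otherwise the LOAD DATA LOCAL statement will be rejected.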

The above covers the basic needs, and up to about 1 million rows it is not a big problem. Beyond that, the data volume becomes large enough that you get into splitting databases and tables, or inserting through a queue.
