Java: batch-inserting 100,000 rows into a MySQL database with JDBC to test efficiency

  • 2020-05-27 04:46:33
  • OfStack

How do you improve efficiency when using JDBC to insert data into a MySQL database, especially when continuously inserting a large volume of data (100,000 rows)?

Two Statement methods in the JDBC API are particularly noteworthy: addBatch() and executeBatch(). Together, this pair of methods enables batch processing of data.

Note, however, that you first need to switch the connection to manual commit with connection.setAutoCommit(false), and then call connection.commit() after the batch has been executed.


import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
public class MysqlBatchUtil {
 private String sql="INSERT INTO db_test (param1,param2,param3,param4,param5) VALUES (?,?,?,?,?)"; 
 private String connectStr="jdbc:mysql://localhost:3306/test";
 private String username="root"; 
 private String password="123456"; 
 private void doStore() throws ClassNotFoundException, SQLException { 
  Class.forName("com.mysql.jdbc.Driver"); 
  // These parameters enable the efficient batch insert; remove this line to run the plain batch-insert test
  connectStr += "?useServerPrepStmts=false&rewriteBatchedStatements=true";
  Connection conn = DriverManager.getConnection(connectStr, username, password); 
  conn.setAutoCommit(false); // switch to manual commit
  int count = 0; 
  PreparedStatement psts = conn.prepareStatement(sql); 
  long begin = System.currentTimeMillis();
  for(int i=0;i<=100000;i++){ // note: the inclusive bound inserts 100,001 rows
   psts.setString(1, i+"param1"); 
   psts.setString(2, i+"param2"); 
   psts.setString(3, i+"param3"); 
   psts.setString(4, i+"param4"); 
   psts.setString(5, i+"param5"); 
   psts.addBatch();   // add to the batch
   count++;  
  } 
  psts.executeBatch(); // execute the batch
  conn.commit(); // commit
  long end = System.currentTimeMillis();
  System.out.println("Row count = "+count); 
  System.out.println("Elapsed time (ms) = "+(end-begin));
  psts.close();
  conn.close(); 
 } 
 public static void main(String[] args) {
  try {
   new MysqlBatchUtil().doStore();
  } catch (ClassNotFoundException | SQLException e) {
   e.printStackTrace();
  }
 }
}

Test results:

Row count = 100001
Elapsed time (ms) = 4725

That is about 100,000 rows in roughly 4.7 seconds.

This is still not very efficient and falls short of the desired result, so it needs further improvement.

You can also add parameters to the MySQL JDBC connection string:


rewriteBatchedStatements=true

MySQL disables this batch rewriting by default; with this parameter, the driver rewrites the batched statements into a single multi-row INSERT before submitting the SQL to the database.


useServerPrepStmts=false

With useServerPrepStmts=false, the driver uses com.mysql.jdbc.PreparedStatement to assemble the SQL locally on the client side, substituting the actual values for each ? placeholder before sending the final SQL to the database.
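To illustrate the rewrite (a hypothetical sketch, not actual driver code), the helper below builds the multi-row INSERT shape that batched single-row statements get collapsed into. The table and column names are just examples; the real driver also inlines the literal values when useServerPrepStmts=false, whereas this sketch keeps ? placeholders:

```java
public class RewriteDemo {
    // Builds the multi-row INSERT form that rewriteBatchedStatements=true
    // produces from a batch of single-row INSERTs (placeholder form only).
    static String rewrite(String table, String[] cols, int rows) {
        StringBuilder sb = new StringBuilder("INSERT INTO ").append(table)
            .append(" (").append(String.join(",", cols)).append(") VALUES ");
        for (int i = 0; i < rows; i++) {
            sb.append("(");
            for (int c = 0; c < cols.length; c++) {
                sb.append("?");
                if (c < cols.length - 1) sb.append(",");
            }
            sb.append(")");
            if (i < rows - 1) sb.append(",");
        }
        return sb.toString();
    }
    public static void main(String[] args) {
        // Three batched rows over two columns become one statement
        System.out.println(rewrite("db_test", new String[]{"param1", "param2"}, 3));
    }
}
```

Sending one such statement per batch avoids a network round trip per row, which is where the speedup comes from.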

A slight improvement, then, is to append the following to the connection string (the line already shown in the code above):


connectStr += "?useServerPrepStmts=false&rewriteBatchedStatements=true";

The test results are as follows:

Row count = 100001
Elapsed time (ms) = 1213

For the same amount of data, this run took only about 1.2 seconds, so processing efficiency is greatly improved.
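A further tuning step (my own suggestion, not part of the original test) is to flush the batch in fixed-size chunks rather than accumulating all 100,000 rows before a single executeBatch(), which bounds the driver's memory use. The sketch below simulates only the chunking logic, with comments marking where the real JDBC calls would go; BATCH_SIZE is a value you would tune for your workload:

```java
public class ChunkedBatchDemo {
    static final int BATCH_SIZE = 1000; // tune for your workload

    // Simulates flushing a JDBC batch in chunks; returns how many
    // executeBatch() calls the loop would make for totalRows inserts.
    static int countFlushes(int totalRows) {
        int pending = 0, flushes = 0;
        for (int i = 0; i < totalRows; i++) {
            pending++;                    // psts.addBatch();
            if (pending == BATCH_SIZE) {  // chunk is full, flush it
                flushes++;                // psts.executeBatch();
                pending = 0;
            }
        }
        if (pending > 0) flushes++;       // flush the remainder
        return flushes;
    }

    public static void main(String[] args) {
        // 100,000 rows in chunks of 1,000 -> 100 flushes
        System.out.println(countFlushes(100000));
    }
}
```

In the real loop you would call conn.commit() once at the end (or after every flush, trading some speed for smaller transactions).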

