Solving slow query times in mybatis-plus

  • 2021-10-16 01:41:32
  • OfStack

mybatis-plus queries are slow

1. Symptoms

A preliminary query returned more than 30,000 ids, which were then fed into an IN condition:


EntityWrapper ew = new EntityWrapper<>();
ew.in(TableFieldConstant.F_AUTH_RESULT_ID, ids);

The resulting query was very slow.

2. The cause

Stepping through the mybatis-plus source leads to formatSqlIfNeed:


protected String formatSqlIfNeed(boolean need, String sqlStr, Object... params) {
    if (need && !StringUtils.isEmpty(sqlStr)) {
        if (ArrayUtils.isNotEmpty(params)) {
            for (int i = 0; i < params.length; ++i) {
                String genParamName = "MPGENVAL" + this.paramNameSeq.incrementAndGet();
                sqlStr = sqlStr.replace(String.format("{%s}", i), String.format("#{%s.paramNameValuePairs.%s}", this.getParamAlias(), genParamName));
                this.paramNameValuePairs.put(genParamName, params[i]);
            }
        }
        return sqlStr;
    } else {
        return null;
    }
}

The bottleneck is this line:


sqlStr = sqlStr.replace(String.format("{%s}", i), String.format("#{%s.paramNameValuePairs.%s}", this.getParamAlias(), genParamName));

Testing showed that String.replace becomes very time-consuming when the parameter list is large: replacing 30,000 placeholders one by one (splicing values 1 through 30,000) took more than 20 seconds.

Swapping in StringUtils.replace from apache-commons-lang still took more than 7 seconds.
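The cost can be reproduced in isolation. The sketch below (a standalone imitation, not the mybatis-plus code itself; the `ew` alias and SQL text are assumptions) mimics the placeholder-by-placeholder replacement: each call to replace rescans the entire SQL string, so replacing n placeholders costs O(n²) character work overall.

```java
// Demonstrates why formatSqlIfNeed is slow for huge IN lists:
// each String.replace rescans the whole SQL string, so replacing
// n placeholders one by one is O(n^2) in total. (Keep n small here;
// the article measured 20+ seconds at n = 30,000.)
public class ReplaceCostDemo {
    static String formatSql(int n) {
        // Build "SELECT * FROM t WHERE id IN ({0},{1},...,{n-1})"
        StringBuilder sb = new StringBuilder("SELECT * FROM t WHERE id IN (");
        for (int i = 0; i < n; i++) {
            if (i > 0) sb.append(",");
            sb.append("{").append(i).append("}");
        }
        String sql = sb.append(")").toString();
        // Mirrors mybatis-plus: one full-string replace per parameter
        for (int i = 0; i < n; i++) {
            sql = sql.replace("{" + i + "}",
                    "#{ew.paramNameValuePairs.MPGENVAL" + (i + 1) + "}");
        }
        return sql;
    }
}
```

At n = 30,000 the final string is megabytes long and is rescanned 30,000 times, which matches the 20-second measurement above.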

3. Summary

The problem was solved by replacing the mybatis-plus batch query with a handwritten SQL query.

When using mybatis-plus batch operations on large data sets, be careful; prefer writing the SQL yourself.
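The author's handwritten SQL is not shown; a sketch of what it might look like as a mapper fragment (the statement id, result type, and table/column names are assumptions for illustration):

```xml
<select id="selectByAuthResultIds" resultType="com.example.AuthResult">
    SELECT *
    FROM auth_result
    WHERE auth_result_id IN
    <foreach collection="ids" item="id" open="(" separator="," close=")">
        #{id}
    </foreach>
</select>
```

Here foreach builds the IN list in one pass, so there is no repeated full-string replacement.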

This mybatis-plus issue has since been fixed; upgrading to version 3.x resolves it.
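If upgrading is not immediately possible, another workaround is to chunk the id list and issue several smaller IN queries instead of one 30,000-element condition. A minimal sketch (the mapper call in the comment is hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedIn {
    /** Split a list into consecutive chunks of at most {@code size} elements. */
    static <T> List<List<T>> partition(List<T> list, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < list.size(); i += size) {
            // Copy the view so each chunk is independent of the source list
            chunks.add(new ArrayList<>(list.subList(i, Math.min(i + size, list.size()))));
        }
        return chunks;
    }

    // Usage sketch (selectByIds is a hypothetical mapper method):
    // for (List<Long> chunk : partition(ids, 1000)) {
    //     results.addAll(mapper.selectByIds(chunk));
    // }
}
```

Smaller IN lists also tend to be friendlier to the database's query planner than one enormous list.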

mybatis-plus is too slow when processing large amounts of data

The mass-insert method in use was Mybatis foreach SQL splicing.

I found that no matter whether I changed it to Mybatis batch submission or native JDBC batching, records were still being inserted one at a time, and the speed was far below the original foreach-spliced SQL.
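The foreach splicing referred to above might look like this mapper fragment (the statement id and table/column names are assumptions):

```xml
<insert id="insertGroupUsers">
    INSERT INTO group_users (user_id, user_name, user_group_id)
    VALUES
    <foreach collection="list" item="gu" separator=",">
        (#{gu.userId}, #{gu.userName}, #{gu.userGroupId})
    </foreach>
</insert>
```

This sends a single multi-row INSERT instead of many single-row statements; for very large lists, keep MySQL's max_allowed_packet limit in mind.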


// Step 1: decide whether this is an insert or an update
String[] splitUserId = userGroup.getUserId().split(",");
String[] splitUserName = userGroup.getUserName().split(",");
if (StringUtils.isBlank(userGroup.getId())) {
    userGroup.setNum(splitUserName.length);
    userGroupMapper.insert(userGroup);
} else {
    userGroup.setNum(splitUserName.length);
    userGroupMapper.updateById(userGroup);
}

/* Step 2: delete the existing intermediate-table rows (the fields are redundant) */
Map<String, Object> columnMap = new HashMap<String, Object>();
columnMap.put("USER_GROUP_ID", userGroup.getId());
groupUsersService.removeByMap(columnMap);

/* Step 3: batch-save the intermediate table */
if (splitUserId.length != 0) {
    List<GroupUsers> groupUsersList = Lists.newArrayList();
    for (int i = 0; i < splitUserId.length; i++) {
        GroupUsers gu = new GroupUsers();
        gu.setUserId(splitUserId[i]);
        gu.setUserName(splitUserName[i]);
        gu.setUserGroupId(userGroup.getId());
        groupUsersList.add(gu);
    }
    groupUsersService.saveBatch(groupUsersList);
}

1. A scenario like this matches most development use cases, yet with only 1,000 rows it took about 8 seconds. Machine performance plays some part, but even allowing for that it should not take 8 seconds, and the user experience is miserable.

2. One parameter needs to be added to the JDBC connection URL:

rewriteBatchedStatements=true, for example: jdbc:mysql://192.168.1.143:3306/rt_xxxxxx_test?useUnicode=true&characterEncoding=utf-8&rewriteBatchedStatements=true

3. Add the rewriteBatchedStatements parameter to the MySQL JDBC connection and make sure the driver is version 5.1.13 or later; only then can high-performance batch inserts be achieved.

4. By default, the MySQL JDBC driver ignores the intent of executeBatch(): it breaks up the group of SQL statements we expect to execute as a batch and sends them to the database one by one. The "batch" insert is actually a series of single inserts, which directly causes the poor performance.

Only when the rewriteBatchedStatements parameter is set to true will the driver execute the SQL in true batches for you.

This option is also effective for INSERT/UPDATE/DELETE statements.
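To make points 2 through 4 concrete, here is a minimal JDBC sketch (the host, database, table, and column names are assumptions; running it requires a MySQL Connector/J driver on the classpath):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsertDemo {
    // rewriteBatchedStatements=true lets the MySQL driver rewrite the
    // batch into multi-row INSERTs instead of single-row round trips.
    // (Hypothetical host/database for illustration.)
    static final String URL = "jdbc:mysql://localhost:3306/demo"
            + "?useUnicode=true&characterEncoding=utf-8"
            + "&rewriteBatchedStatements=true";

    static void insertGroupUsers(Connection conn, List<String[]> rows) throws SQLException {
        String sql = "INSERT INTO group_users (user_id, user_name, user_group_id) VALUES (?, ?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int count = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.setString(3, row[2]);
                ps.addBatch();
                // Flush periodically so the batch doesn't grow unbounded
                if (++count % 1000 == 0) ps.executeBatch();
            }
            ps.executeBatch(); // flush the remainder
        }
    }
}
```

Without the URL flag, the same code runs but each addBatch entry is sent as its own statement, reproducing the slow behavior described above.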
