Several ways to implement a Java thread pool and their differences, explained in detail


The following example code demonstrates several ways to create a Java thread pool and the differences between them:


import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Random;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
public class TestThreadPool {
 // - newFixedThreadPool is similar to newCachedThreadPool: threads are reused, but new threads cannot be created on demand at any time
 // - Its distinguishing feature: at any point in time there are at most a fixed number of active threads; additional tasks wait in a queue until a running thread finishes and picks them up
 // - Unlike the cached pool, the fixed pool has no idle-timeout mechanism (its keep-alive is effectively unlimited),
 //   so it is best suited to a stable, fixed level of concurrency, for example on servers
 // - Judging from the factory methods' source, the cached pool and the fixed pool use the same underlying ThreadPoolExecutor, just with different parameters (see the first sketch after this listing):
 //   fixed pool: a fixed number of threads with a 0-second keep-alive (no idle timeout)
 //   cached pool: thread count from 0 up to Integer.MAX_VALUE (host resource limits are clearly not taken into account), with a 60-second idle timeout
 private static ExecutorService fixedService = Executors.newFixedThreadPool(6);
 // - Cached pool: when a task arrives, an idle thread already in the pool is reused if one exists; otherwise a new thread is created and added to the pool
 // - Cached pools are typically used for large numbers of short-lived asynchronous tasks,
 //   so they are rarely used in connection-oriented, daemon-style servers
 // - Only threads whose idle time is below the timeout can be reused; the default timeout is 60s, and threads idle longer than that are terminated and removed from the pool
 //   Note: threads placed in a CachedThreadPool need no explicit cleanup; once idle past the timeout they are terminated automatically
 private static ExecutorService cacheService = Executors.newCachedThreadPool();
 // - Single-thread executor: the pool contains exactly one thread at any point in time
 // - Uses the same underlying pool as the cached and fixed pools, with core and maximum size of 1 and a 0-second keep-alive (no idle timeout)
 private static ExecutorService singleService = Executors.newSingleThreadExecutor();
 // - Scheduled thread pool
 // - Threads in this pool can run tasks on a schedule, after a delay, or periodically (see the second sketch after this listing)
 private static ExecutorService scheduledService = Executors.newScheduledThreadPool(10);
 public static void main(String[] args) {
 DateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
 List<Integer> customerList = new ArrayList<Integer>();
 System.out.println(format.format(new Date()));
 testFixedThreadPool(fixedService, customerList);
 System.out.println("--------------------------");
 testFixedThreadPool(fixedService, customerList);
 fixedService.shutdown();
 System.out.println(fixedService.isShutdown());
 System.out.println("----------------------------------------------------");
 testCacheThreadPool(cacheService, customerList);
 System.out.println("----------------------------------------------------");
 testCacheThreadPool(cacheService, customerList);
 cacheService.shutdownNow();
 System.out.println("----------------------------------------------------");
 testSingleServiceThreadPool(singleService, customerList);
 testSingleServiceThreadPool(singleService, customerList);
 singleService.shutdown();
 System.out.println("----------------------------------------------------");
 testScheduledServiceThreadPool(scheduledService, customerList);
 testScheduledServiceThreadPool(scheduledService, customerList);
 scheduledService.shutdown();
 } 
 public static void testScheduledServiceThreadPool(ExecutorService service, List<Integer> customerList) {
 List<Callable<Integer>> listCallable = new ArrayList<Callable<Integer>>();
 for (int i = 0; i < 10; i++) {
  Callable<Integer> callable = new Callable<Integer>() {
  @Override
  public Integer call() throws Exception {
   return new Random().nextInt(10);
  }
  };
  listCallable.add(callable);
 }
 try {
  List<Future<Integer>> listFuture = service.invokeAll(listCallable);
  for (Future<Integer> future : listFuture) {
  Integer id = future.get();
  customerList.add(id);
  }
 } catch (Exception e) {
  e.printStackTrace();
 }
 System.out.println(customerList.toString());
 }
 public static void testSingleServiceThreadPool(ExecutorService service, List<Integer> customerList) {
 List<Callable<List<Integer>>> listCallable = new ArrayList<Callable<List<Integer>>>();
 for (int i = 0; i < 10; i++) {
  Callable<List<Integer>> callable = new Callable<List<Integer>>() {
  @Override
  public List<Integer> call() throws Exception {
   List<Integer> list = getList(new Random().nextInt(10));
   boolean isStop = false;
   while (list.size() > 0 && !isStop) {
   System.out.println(Thread.currentThread().getId() + " -- sleep:1000");
   isStop = true;
   }
   return list;
  }
  };
  listCallable.add(callable);
 }
 try {
  List<Future<List<Integer>>> listFuture = service.invokeAll(listCallable);
  for (Future<List<Integer>> future : listFuture) {
  List<Integer> list = future.get();
  customerList.addAll(list);
  }
 } catch (Exception e) {
  e.printStackTrace();
 }
 System.out.println(customerList.toString());
 }
 public static void testCacheThreadPool(ExecutorService service, List<Integer> customerList) {
 List<Callable<List<Integer>>> listCallable = new ArrayList<Callable<List<Integer>>>();
 for (int i = 0; i < 10; i++) {
  Callable<List<Integer>> callable = new Callable<List<Integer>>() {
  @Override
  public List<Integer> call() throws Exception {
   List<Integer> list = getList(new Random().nextInt(10));
   boolean isStop = false;
   while (list.size() > 0 && !isStop) {
   System.out.println(Thread.currentThread().getId() + " -- sleep:1000");
   isStop = true;
   }
   return list;
  }
  };
  listCallable.add(callable);
 }
 try {
  List<Future<List<Integer>>> listFuture = service.invokeAll(listCallable);
  for (Future<List<Integer>> future : listFuture) {
  List<Integer> list = future.get();
  customerList.addAll(list);
  }
 } catch (Exception e) {
  e.printStackTrace();
 }
 System.out.println(customerList.toString());
 }
 public static void testFixedThreadPool(ExecutorService service, List<Integer> customerList) {
 List<Callable<List<Integer>>> listCallable = new ArrayList<Callable<List<Integer>>>();
 for (int i = 0; i < 10; i++) {
  Callable<List<Integer>> callable = new Callable<List<Integer>>() {
  @Override
  public List<Integer> call() throws Exception {
   List<Integer> list = getList(new Random().nextInt(10));
   boolean isStop = false;
   while (list.size() > 0 && !isStop) {
   System.out.println(Thread.currentThread().getId() + " -- sleep:1000");
   isStop = true;
   }
   return list;
  }
  };
  listCallable.add(callable);
 }
 try {
  List<Future<List<Integer>>> listFuture = service.invokeAll(listCallable);
  for (Future<List<Integer>> future : listFuture) {
  List<Integer> list = future.get();
  customerList.addAll(list);
  }
 } catch (Exception e) {
  e.printStackTrace();
 }
 System.out.println(customerList.toString());
 }
 public static List<Integer> getList(int x) {
 List<Integer> list = new ArrayList<Integer>();
 list.add(x);
 list.add(x * x);
 return list;
 }
}
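
As the comments above note, the fixed, cached and single-thread pools are all thin wrappers around the same ThreadPoolExecutor constructor, differing only in core/maximum size, keep-alive time and work queue. The sketch below shows roughly what the Executors factory methods expand to in the OpenJDK source (the JDK additionally wraps the single-thread executor in a non-reconfigurable delegate, which is omitted here):


import java.util.concurrent.ExecutorService;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class FactoryEquivalents {
 // Executors.newFixedThreadPool(n): n core threads, n maximum threads,
 // 0-second keep-alive (no idle timeout), unbounded work queue
 static ExecutorService fixed(int n) {
  return new ThreadPoolExecutor(n, n, 0L, TimeUnit.MILLISECONDS,
    new LinkedBlockingQueue<Runnable>());
 }
 // Executors.newCachedThreadPool(): 0 core threads, up to Integer.MAX_VALUE threads,
 // 60-second idle timeout, direct hand-off queue with no capacity
 static ExecutorService cached() {
  return new ThreadPoolExecutor(0, Integer.MAX_VALUE, 60L, TimeUnit.SECONDS,
    new SynchronousQueue<Runnable>());
 }
 // Executors.newSingleThreadExecutor(): exactly one thread, no idle timeout
 static ExecutorService single() {
  return new ThreadPoolExecutor(1, 1, 0L, TimeUnit.MILLISECONDS,
    new LinkedBlockingQueue<Runnable>());
 }
}

The example program above only submits its Callables to the scheduled pool through invokeAll, so the scheduling features are never actually exercised. A minimal sketch of delayed and periodic execution with ScheduledExecutorService (the class name ScheduledDemo and the printed messages are made up for this illustration):


import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledDemo {
 public static void main(String[] args) throws InterruptedException {
  ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);
  // Run once, 1 second from now
  scheduler.schedule(
    () -> System.out.println("delayed task on " + Thread.currentThread().getName()),
    1, TimeUnit.SECONDS);
  // Run every 2 seconds, after an initial 1-second delay
  scheduler.scheduleAtFixedRate(
    () -> System.out.println("periodic task on " + Thread.currentThread().getName()),
    1, 2, TimeUnit.SECONDS);
  // Let the periodic task fire a few times, then shut down
  Thread.sleep(7000);
  scheduler.shutdown();
 }
}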

Building a thread pool directly with ThreadPoolExecutor and a LinkedBlockingQueue


// Example: corePoolSize=3, maximumPoolSize=6, LinkedBlockingQueue(10)

// The default RejectedExecutionHandler is: ThreadPoolExecutor.AbortPolicy

//ThreadPoolExecutor executorService = new ThreadPoolExecutor(corePoolSize, maximumPoolSize, 1L, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>(10));

//1. As long as the number of tasks submitted via executorService.execute does not exceed the queue capacity (10) plus corePoolSize (3), at most corePoolSize (3) threads are running: the first 3 tasks start core threads and the remaining tasks wait in the queue (assuming the earlier tasks are still running).

//2. If the number of submitted tasks exceeds 13 (queue capacity 10 + corePoolSize 3) but is at most 16 (queue capacity 10 + maximumPoolSize 6), extra threads are started beyond the core, so the number of running threads is (number of submitted tasks - 10).

//3. If the number of submitted tasks exceeds 16 (queue capacity 10 + maximumPoolSize 6), the overflow is handled according to the RejectedExecutionHandler.
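
A minimal, self-contained sketch of the behavior described above, under the assumption that the tasks are still running while new ones are submitted (the class name BoundedPoolDemo and the printed messages are made up for this illustration):


import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedPoolDemo {
 public static void main(String[] args) {
  // corePoolSize=3, maximumPoolSize=6, keep-alive 1 second, bounded queue of 10,
  // default rejection handler (ThreadPoolExecutor.AbortPolicy)
  ThreadPoolExecutor executor = new ThreadPoolExecutor(
    3, 6, 1L, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>(10));
  // Submit 20 long-running tasks: the first 3 start core threads, tasks 4-13 wait
  // in the queue, tasks 14-16 cause threads 4-6 to be created, and tasks 17-20
  // are rejected with a RejectedExecutionException.
  for (int i = 1; i <= 20; i++) {
   final int taskId = i;
   try {
    executor.execute(() -> {
     try {
      Thread.sleep(2000); // keep the task running so the pool and queue fill up
     } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
     }
    });
    System.out.println("task " + taskId + " accepted, pool size = "
      + executor.getPoolSize() + ", queued = " + executor.getQueue().size());
   } catch (RejectedExecutionException e) {
    System.out.println("task " + taskId + " rejected");
   }
  }
  executor.shutdown();
 }
}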

About the built-in RejectedExecutionHandler implementations


// Default: ThreadPoolExecutor.AbortPolicy - the handler throws a runtime RejectedExecutionException when a task is rejected.
      RejectedExecutionHandler policy = new ThreadPoolExecutor.AbortPolicy();
//     // ThreadPoolExecutor.CallerRunsPolicy - the thread that calls execute runs the rejected task itself. This provides a simple feedback mechanism that slows down the submission of new tasks.
//     policy = new ThreadPoolExecutor.CallerRunsPolicy();
//     // ThreadPoolExecutor.DiscardPolicy - the rejected task is silently dropped.
//     policy = new ThreadPoolExecutor.DiscardPolicy();
//     // ThreadPoolExecutor.DiscardOldestPolicy - if the executor is not shut down, the task at the head of the work queue is dropped and the rejected task is retried (which can fail again, in which case the process repeats).
//     policy = new ThreadPoolExecutor.DiscardOldestPolicy();
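
For illustration, here is a small sketch that plugs CallerRunsPolicy into the bounded pool from the previous section (the class name CallerRunsDemo and the printed messages are made up for this example): once the queue and the maximum number of threads are saturated, the calling thread runs the overflow tasks itself, which naturally slows down submission.


import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class CallerRunsDemo {
 public static void main(String[] args) {
  // Same bounded pool as above, but with CallerRunsPolicy instead of the default AbortPolicy
  ThreadPoolExecutor executor = new ThreadPoolExecutor(
    3, 6, 1L, TimeUnit.SECONDS,
    new LinkedBlockingQueue<Runnable>(10),
    new ThreadPoolExecutor.CallerRunsPolicy());
  for (int i = 1; i <= 20; i++) {
   final int taskId = i;
   executor.execute(() -> {
    try {
     Thread.sleep(100); // simulate some work so the pool and queue can fill up
    } catch (InterruptedException e) {
     Thread.currentThread().interrupt();
    }
    // Overflow tasks print the name of the submitting thread (e.g. "main")
    System.out.println("task " + taskId + " ran on " + Thread.currentThread().getName());
   });
  }
  executor.shutdown();
 }
}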

I hope you found this article helpful.

