  • Java thread pool ExecutorService: code notes

    ExecutorService fixedThreadPool = Executors.newFixedThreadPool(5);
    Creates a fixed-size thread pool with an upper limit on the number of concurrently running threads; tasks submitted beyond that limit wait in the pool's queue (a sketch of what this factory builds is shown after the code below).
    package UnitTest;
    
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;
    
    public class test {
        public static void main(String[] args) {
            int sum = 0;
            long start = System.currentTimeMillis();
            // Pool with 2 worker threads; the remaining submitted tasks wait in the queue.
            ExecutorService fixedThreadPool = Executors.newFixedThreadPool(2);
            List<Future<Integer>> resultList = new ArrayList<>();
            for (int i = 1; i < 101; i++) {
                // submit() returns a Future that will eventually hold the task's result.
                Future<Integer> future = fixedThreadPool.submit(new ThreadWithRunnable(i));
                resultList.add(future);
            }
            // Stop accepting new tasks; tasks already submitted keep running.
            fixedThreadPool.shutdown();

            for (Future<Integer> fs : resultList) {
                try {
                    // get() blocks until the corresponding task has completed.
                    int value = fs.get();
                    System.out.println(value);
                    sum += value;
                } catch (InterruptedException | ExecutionException e) {
                    e.printStackTrace();
                }
            }

            System.out.println(sum);
            System.out.println("Elapsed time: " + (System.currentTimeMillis() - start) / 1000f + " s");
    
        }
    }
    package UnitTest;
    
    import java.util.concurrent.Callable;
    
    public class ThreadWithRunnable implements Callable<Integer> {
        private final int number;

        public ThreadWithRunnable(int number) {
            this.number = number;
        }

        @Override
        public Integer call() throws Exception {
            // Simulate roughly one second of work, then return this task's number.
            for (int i = 1; i < 1000; i++) {
                Thread.sleep(1);
            }
            return number;
        }
    }
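    For context on the queueing note above: in the JDK, Executors.newFixedThreadPool(n) is a thin factory around ThreadPoolExecutor with core size equal to maximum size and an unbounded LinkedBlockingQueue, which is why tasks beyond the thread count simply wait in the queue rather than being rejected. Below is a minimal sketch of an equivalent pool built directly from those classes; the class name FixedPoolSketch and the toy tasks are just for illustration.

    package UnitTest;

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    public class FixedPoolSketch {
        public static void main(String[] args) throws InterruptedException {
            // Roughly what Executors.newFixedThreadPool(2) constructs:
            // corePoolSize == maximumPoolSize == 2, idle threads never time out,
            // and an unbounded queue holds the tasks that cannot run yet.
            ExecutorService pool = new ThreadPoolExecutor(
                    2, 2,
                    0L, TimeUnit.MILLISECONDS,
                    new LinkedBlockingQueue<Runnable>());

            for (int i = 1; i <= 5; i++) {
                final int taskId = i;
                pool.submit(() -> System.out.println(
                        "task " + taskId + " on " + Thread.currentThread().getName()));
            }

            pool.shutdown();                            // stop accepting new tasks
            pool.awaitTermination(5, TimeUnit.SECONDS); // wait for queued tasks to finish
        }
    }

    With a bounded queue or a different RejectedExecutionHandler the behavior for excess tasks would change; the Executors factory keeps the simple wait-in-queue behavior described at the top of this note.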
  • Original article: https://www.cnblogs.com/kgdxpr/p/10383291.html