A summary of some lesser-known tricks of Stream, Java's efficiency-boosting API
- 2021-10-16 01:57:15
- OfStack
Contents: Filter · forEach · Map · Sorted · Match · count · reduce · parallelStream · IntStream.range(a, b) · new Random().ints() · Supplier · Consumer (1. accept method, 2. andThen method) · ifPresent · Collect (1. Functions, 2. Collector interface, 3. Tool functions: 1. toList(), 2. joining(), 3. groupingBy(), 4. reducing()) · Summary
Stream
Use this method to create a Stream object.
new ArrayList<>().stream()
Filter
filter takes a predicate function: if the function returns true for an element, that element is kept; otherwise it is discarded.
stringCollection
.stream()
.filter((s) -> s.startsWith("a"))
.forEach(System.out::println);
forEach
Traverses the stream and consumes each element.
stringCollection
.stream()
.filter((s) -> s.startsWith("a"))
.forEach(System.out::println);
Map
map also traverses the stream, but it produces a return value for each element, whereas forEach simply consumes elements and returns nothing. Because forEach has no return value it cannot be chained further, while map can.
stringCollection
.stream()
.map(String::toUpperCase)
.sorted(Comparator.reverseOrder())
.forEach(System.out::println);
Sorted
This method sorts the stream. The function passed in is a comparator; you can also omit the argument and use the natural ordering.
stringCollection
.stream()
.sorted((x, y) -> y.length() - x.length())
.filter((s) -> s.startsWith("a"))
.forEach(System.out::println);
Match
Returns true or false depending on whether the elements of the given stream satisfy the specified condition.
Specifically, there are:
allMatch, anyMatch, noneMatch
boolean anyStartsWithA = stringCollection
.stream()
.anyMatch((s) -> s.startsWith("a"));
boolean allStartsWithA = stringCollection
.stream()
.allMatch((s) -> s.startsWith("a"));
boolean noneStartsWithZ = stringCollection
.stream()
.noneMatch((s) -> s.startsWith("z"));
count
Counts the number of elements in the collection.
long startsWithB = stringCollection
.stream()
.filter((s) -> s.startsWith("b"))
.count();
reduce
reduce works like an accumulator: each step combines the result of the previous step with the next element taken from the stream. By default, the first step takes the first and second elements.
A simple example: the first step takes 0 and 1, the second step takes the result of that first reduce as its first parameter and 2 as its second parameter, and so on.
Optional<String> reduced =
stringCollection
.stream()
.sorted()
.reduce((s1, s2) -> s1 + "#" + s2);
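As a runnable sketch of the snippet above, with hypothetical sample data standing in for stringCollection:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

public class ReduceJoinDemo {
    public static void main(String[] args) {
        // Hypothetical sample data standing in for stringCollection
        List<String> stringCollection = Arrays.asList("bbb", "aaa", "ccc");
        Optional<String> reduced = stringCollection
                .stream()
                .sorted()
                .reduce((s1, s2) -> s1 + "#" + s2);
        // Each step joins the previous result with the next element
        reduced.ifPresent(System.out::println);  // prints aaa#bbb#ccc
    }
}
```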
parallelStream
A parallel stream processes elements in parallel, which can be more efficient. With stream().forEach, the traversal has no thread-safety problem, but with parallelStream it does: any external mutable state used inside a parallelStream, such as a Set, must be a thread-safe collection, otherwise you get concurrency bugs. To stay safe you need to use reduce or collect instead, though that is more cumbersome to write.
long count = values.parallelStream().sorted().count();
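To avoid shared mutable state altogether, a minimal sketch of the collect-based alternative mentioned above (class name is illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ParallelCollectDemo {
    public static void main(String[] args) {
        // Safe: collect() gives each worker thread its own container and
        // merges them, so no external thread-safe collection is needed.
        List<Integer> result = IntStream.range(0, 1000)
                .parallel()
                .boxed()
                .collect(Collectors.toList());
        System.out.println(result.size());  // always 1000
    }
}
```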
IntStream.range(a,b)
Directly generates the integers from a to b, following the usual programming convention: the start is inclusive and the end is exclusive.
IntStream.range(0, 10)
.forEach(System.out::println);
The result of the output is
0
1
2
3
4
5
6
7
8
9
new Random().ints()
Generates a stream of random values. The stream is infinite, so you need to use limit() to cap it.
new Random().ints()
.limit(10)
.forEach(System.out::println);
Supplier
Supplier<String> stringSupplier = String::new;
stringSupplier.get();
This interface returns an instance of the generic type T without taking any parameters, much like a no-argument constructor.
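A small runnable sketch of Supplier as deferred, no-argument construction (class name is illustrative):

```java
import java.util.function.Supplier;

public class SupplierDemo {
    public static void main(String[] args) {
        // Nothing is constructed until get() is called
        Supplier<StringBuilder> sbSupplier = StringBuilder::new;
        StringBuilder sb = sbSupplier.get();  // like calling new StringBuilder()
        sb.append("hello");
        System.out.println(sb.length());  // 5
    }
}
```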
Consumer
1. accept method
accept is the only abstract method of this functional interface; it takes one parameter and returns nothing.
2. andThen method
andThen runs the caller's accept first, then the accept of the Consumer passed in as the parameter.
import java.util.function.Consumer;

public class ConsumerTest {
    public static void main(String[] args) {
        Consumer<Integer> consumer = (x) -> {
            int num = x * 2;
            System.out.println(num);
        };
        Consumer<Integer> consumer1 = (x) -> {
            int num = x * 3;
            System.out.println(num);
        };
        consumer.andThen(consumer1).accept(10);
    }
}
This executes consumer.accept(10) and then consumer1.accept(10), printing 20 and then 30.
ifPresent
For an Optional: if a value is present, the given action is executed; otherwise nothing happens.
IntStream.range(0, 10)
.average()
.ifPresent(System.out::println);
The result of average() is an Optional (an OptionalDouble), so the println only runs when the stream was non-empty.
Collect
collect can be called in two ways:
<R> R collect(Supplier<R> supplier,
              BiConsumer<R, ? super T> accumulator,
              BiConsumer<R, R> combiner);

<R, A> R collect(Collector<? super T, A, R> collector);
The following mainly introduces the use of these two methods:
1. Functions
The signature of the first calling style is as follows:
<R> R collect(Supplier<R> supplier,
              BiConsumer<R, ? super T> accumulator,
              BiConsumer<R, R> combiner);
The supplier parameter provides a container. The final result of the collect operation is a value of type R, and the supplier must return an R, so what is returned here is the container that will collect the elements.
The accumulator parameter takes an R container followed by a T element and puts the T into the R container; in other words, this step adds elements to the container.
The combiner parameter takes two containers and defines how containers are merged when more than one container appears (as in parallel execution).
A simple example:
List<String> filtered = stringCollection
.stream()
.filter((s) -> s.startsWith("a"))
.collect(ArrayList::new, ArrayList::add, ArrayList::addAll);
2. Collector interface
The second scenario uses the Collector interface for more advanced usage:
<R, A> R collect(Collector<? super T, A, R> collector);
You can see that it returns a variable of type R, that is, a container.
The Collector interface is the ultimate weapon that makes the collect operation powerful. Most operations decompose into its main steps: provide an initial container → add elements to the container → merge containers under concurrency → apply a finishing operation to the aggregated result.
static class CollectorImpl<T, A, R> implements Collector<T, A, R> {
    private final Supplier<A> supplier;
    private final BiConsumer<A, T> accumulator;
    private final BinaryOperator<A> combiner;
    private final Function<A, R> finisher;
    private final Set<Characteristics> characteristics;

    CollectorImpl(Supplier<A> supplier,
                  BiConsumer<A, T> accumulator,
                  BinaryOperator<A> combiner,
                  Function<A, R> finisher,
                  Set<Characteristics> characteristics) {
        this.supplier = supplier;
        this.accumulator = accumulator;
        this.combiner = combiner;
        this.finisher = finisher;
        this.characteristics = characteristics;
    }

    CollectorImpl(Supplier<A> supplier,
                  BiConsumer<A, T> accumulator,
                  BinaryOperator<A> combiner,
                  Set<Characteristics> characteristics) {
        this(supplier, accumulator, combiner, castingIdentity(), characteristics);
    }

    @Override
    public BiConsumer<A, T> accumulator() {
        return accumulator;
    }

    @Override
    public Supplier<A> supplier() {
        return supplier;
    }

    @Override
    public BinaryOperator<A> combiner() {
        return combiner;
    }

    @Override
    public Function<A, R> finisher() {
        return finisher;
    }

    @Override
    public Set<Characteristics> characteristics() {
        return characteristics;
    }
}
You can pass these functions directly to new CollectorImpl(...); a simpler alternative is Collector.of(), which also takes the functions directly and is equivalent to constructing a CollectorImpl.
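For illustration, a sketch using Collector.of() to wire up the same four pieces (supplier, accumulator, combiner, finisher); the joiner name and sample data are arbitrary:

```java
import java.util.StringJoiner;
import java.util.stream.Collector;
import java.util.stream.Stream;

public class CollectorOfDemo {
    public static void main(String[] args) {
        Collector<String, StringJoiner, String> joiner = Collector.of(
                () -> new StringJoiner(", "),  // supplier: initial container
                StringJoiner::add,             // accumulator: add one element
                StringJoiner::merge,           // combiner: merge two containers
                StringJoiner::toString);       // finisher: produce the result
        String out = Stream.of("a", "b", "c").collect(joiner);
        System.out.println(out);  // a, b, c
    }
}
```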
3. Tool functions
1. toList()
Container: ArrayList::new
Add-to-container operation: List::add
Multi-container merge: left.addAll(right); return left;
public static <T> Collector<T, ?, List<T>> toList() {
    return new CollectorImpl<>((Supplier<List<T>>) ArrayList::new,
                               List::add,
                               (left, right) -> { left.addAll(right); return left; },
                               CH_ID);
}
2.joining()
Container: StringBuilder::new
Add-to-container operation: StringBuilder::append
Multi-container merge: r1.append(r2); return r1;
Finishing operation after aggregation: StringBuilder::toString
public static Collector<CharSequence, ?, String> joining() {
return new CollectorImpl<CharSequence, StringBuilder, String>(
StringBuilder::new, StringBuilder::append,
(r1, r2) -> { r1.append(r2); return r1; },
StringBuilder::toString, CH_NOID);
}
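Using it is straightforward; a minimal sketch with the delimiter overload:

```java
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class JoiningDemo {
    public static void main(String[] args) {
        // joining() concatenates; the overload with a delimiter inserts it
        String joined = Stream.of("a", "b", "c")
                .collect(Collectors.joining("#"));
        System.out.println(joined);  // a#b#c
    }
}
```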
3.groupingBy()
groupingBy is an advanced form of toMap, making up for toMap's inability to apply varied collection operations to the values; for example, returning a Map<T, List<E>> is not so convenient with toMap. So the focus of groupingBy is the processing and packaging of keys and values. In the code below, classifier processes the key, mapFactory specifies the concrete Map container type, and downstream is the collection operation applied to the values.
public static <T, K, D, A, M extends Map<K, D>>
Collector<T, ?, M> groupingBy(Function<? super T, ? extends K> classifier,
Supplier<M> mapFactory,
Collector<? super T, A, D> downstream) {
.......
}
A simple example:
// Native form
Lists.<Person>newArrayList().stream()
.collect(() -> new HashMap<Integer,List<Person>>(),
(h, x) -> {
List<Person> value = h.getOrDefault(x.getType(), Lists.newArrayList());
value.add(x);
h.put(x.getType(), value);
},
HashMap::putAll
);
// groupingBy form
Lists.<Person>newArrayList().stream()
.collect(Collectors.groupingBy(Person::getType, HashMap::new, Collectors.toList()));
// Because the value goes through a downstream collector, values can be transformed flexibly
Lists.<Person>newArrayList().stream()
.collect(Collectors.groupingBy(Person::getType, HashMap::new, Collectors.mapping(Person::getName,Collectors.toSet())));
// A simpler overload: pass only the classifier to group by key
Map<Integer, List<Person>> personsByAge = persons
.stream()
.collect(Collectors.groupingBy(p -> p.age));
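The examples above lean on the Person class and Guava's Lists helper; a self-contained sketch with illustrative data (grouping words by first letter, with a downstream collector counting each group):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupingByDemo {
    public static void main(String[] args) {
        List<String> words = Arrays.asList("apple", "ant", "bee", "bear");
        // classifier: first letter; downstream: count the group's elements
        Map<Character, Long> counts = words.stream()
                .collect(Collectors.groupingBy(w -> w.charAt(0),
                                               Collectors.counting()));
        System.out.println(counts.get('a') + " " + counts.get('b'));  // 2 2
    }
}
```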
4.reducing()
reducing collects into a single value: the result is not a collection type but a single entity of type T.
Container: boxSupplier(identity), which wraps the identity in an Object[] of length 1; the wrapping is needed because T may be an immutable type.
Add-to-container operation: a[0] = op.apply(a[0], t)
Multi-container merge: a[0] = op.apply(a[0], b[0]); return a;
Finishing operation after aggregation: unwrap the value from the array, a -> a[0]
Characteristics field: CH_NOID
public static <T> Collector<T, ?, T>
reducing(T identity, BinaryOperator<T> op) {
return new CollectorImpl<>(
boxSupplier(identity),
(a, t) -> { a[0] = op.apply(a[0], t); },
(a, b) -> { a[0] = op.apply(a[0], b[0]); return a; },
a -> a[0],
CH_NOID);
}
Simply put, this does the same thing as reduce: the identity parameter supplies the initial value of the reduction, wrapped in an array of length 1.
// Native operation
final Integer[] integers = Lists.newArrayList(1, 2, 3, 4, 5)
    .stream()
    .collect(() -> new Integer[]{0}, (a, x) -> a[0] += x, (a1, a2) -> a1[0] += a2[0]);
// reducing operation
final Integer collect = Lists.newArrayList(1, 2, 3, 4, 5)
    .stream()
    .collect(Collectors.reducing(0, Integer::sum));
// Stream itself also provides a reduce operation
final Integer sum = Lists.newArrayList(1, 2, 3, 4, 5)
    .stream()
    .reduce(0, Integer::sum);
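Without the Guava Lists helper, the same comparison can be run with plain JDK collections (sketch):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class ReducingDemo {
    public static void main(String[] args) {
        List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5);
        // Collectors.reducing with an identity value
        Integer viaCollector = nums.stream()
                .collect(Collectors.reducing(0, Integer::sum));
        // Stream.reduce with an identity returns T directly (no Optional)
        Integer viaReduce = nums.stream().reduce(0, Integer::sum);
        System.out.println(viaCollector + " " + viaReduce);  // 15 15
    }
}
```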