einops: a tensor-operation power tool for PyTorch, explained with examples

  • 2021-12-12 05:02:06
  • OfStack

Contents

  • Basic usage
  • Advanced usage

Today, while doing vision-transformer research, I discovered einops, a real gem, and decided to give it an enthusiastic plug.

Look at the link first: https://github.com/arogozhnikov/einops

Installation:


pip install einops

Basic usage

The strength of einops is that it makes tensor dimension manipulation explicit and readable, so developers can write code the way they think about it. For example:


from einops import rearrange
 
# rearrange elements according to the pattern
output_tensor = rearrange(input_tensor, 'h w c -> c h w')

The pattern 'h w c -> c h w' performs a dimension permutation, similar to permute in PyTorch. However, einops's rearrange can do much more:


from einops import rearrange
import torch
 
a = torch.randn(3, 9, 9)  # [3, 9, 9]
output = rearrange(a, 'c (r p) w -> c r p w', p=3)
print(output.shape)   # [3, 3, 3, 9]

This is the more advanced usage: the middle dimension is treated as r × p, and once the value of p is given, einops automatically splits it into 3 × 3. This completes the dimension conversion from [3, 9, 9] to [3, 3, 3, 9].

No single PyTorch built-in does this in one call.

In addition, there are reduce and repeat, which are also very easy to use.


from einops import repeat
import torch
 
a = torch.randn(9, 9)  # [9, 9]
output_tensor = repeat(a, 'h w -> c h w', c=3)  # [3, 9, 9]

By specifying c, you choose how many copies to stack along the new axis.

Now look at reduce:


from einops import reduce
import torch
 
a = torch.randn(1, 3, 8, 8)  # [b, c, h, w] = [1, 3, 8, 8]
output_tensor = reduce(a, 'b c (h h2) (w w2) -> b h w c', 'mean', h2=2, w2=2)  # [1, 4, 4, 3]

Here 'mean' specifies the pooling method: each non-overlapping 2 × 2 block is averaged. I think this is easy to follow; if not, feel free to leave a comment and ask ~

Advanced usage

einops also provides layers that can be embedded directly in a PyTorch model. See:


# example given for pytorch, but code in other frameworks is almost identical  
from torch.nn import Sequential, Conv2d, MaxPool2d, Linear, ReLU
from einops.layers.torch import Rearrange
 
model = Sequential(
    Conv2d(3, 6, kernel_size=5),
    MaxPool2d(kernel_size=2),
    Conv2d(6, 16, kernel_size=5),
    MaxPool2d(kernel_size=2),
    # flattening
    Rearrange('b c h w -> b (c h w)'),  
    Linear(16*5*5, 120), 
    ReLU(),
    Linear(120, 10), 
)

Rearrange here is a subclass of nn.Module, so it can be dropped into the model directly as a network layer ~

In one word: brilliant.

That concludes this walkthrough of einops tensor operations with PyTorch. For more on using einops, see the other related articles on this site!
