Generation and scanning of QR codes on iOS

  • 2021-11-30 01:40:37
  • OfStack

In this article, we share the specific code for generating and scanning QR codes on iOS, for your reference. The details are as follows.

Properties


@property (strong, nonatomic) AVCaptureDevice *device;
@property (strong, nonatomic) AVCaptureDeviceInput *input;
@property (strong, nonatomic) AVCaptureMetadataOutput *output;
@property (strong, nonatomic) AVCaptureSession *session;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *layer;
@property (nonatomic, strong) UIImageView *imageView;
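
These properties are assumed to live in the view controller that drives the scanner. A minimal sketch of the surrounding interface, using a hypothetical class name; the class must adopt AVCaptureMetadataOutputObjectsDelegate so the scan callback shown later in this article can be delivered:


#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical host class; adopting the metadata output delegate protocol here
// is what allows the scan result method at the end of this article to be called.
@interface ScanViewController : UIViewController <AVCaptureMetadataOutputObjectsDelegate>
// ... the properties listed above ...
@end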

Generating a QR code


// 1. Create the QR code generator filter
  CIFilter *filter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
 
  // 2. Restore the filter's default settings
  [filter setDefaults];
 
  // 3. Give the filter the data to encode (a URL, an account/password string, etc.)
  NSString *dataString = @"http://www.520it.com";
  NSData *data = [dataString dataUsingEncoding:NSUTF8StringEncoding];
  [filter setValue:data forKeyPath:@"inputMessage"];
 
  // 4. Take the generated QR code image from the filter
  CIImage *outputImage = [filter outputImage];
 
  // The CIImage produced here is very small and looks blurry when scaled up,
  // so it is passed through createNonInterpolatedUIImageFormCIImage:withSize:
  // to get a sharp QR code picture.
 
  // 5. Display the QR code
  self.imageView.image = [self createNonInterpolatedUIImageFormCIImage:outputImage withSize:200];
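
The CIQRCodeGenerator filter also accepts an inputCorrectionLevel parameter ("L", "M", "Q" or "H", defaulting to "M") that controls how much error-correction data is embedded; a higher level makes the code denser but more tolerant of damage or an overlaid logo. A small optional addition to step 3:


  // Optional: raise the error correction level from the default "M" to "H"
  [filter setValue:@"H" forKey:@"inputCorrectionLevel"];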

* Implementation of the createNonInterpolatedUIImageFormCIImage:withSize: method


/**
 *  Creates a non-interpolated (sharp-edged) UIImage of the specified size from a CIImage
 *
 * @param image the CIImage produced by the filter
 * @param size  the desired width/height of the output image, in points
 */
- (UIImage *)createNonInterpolatedUIImageFormCIImage:(CIImage *)image withSize:(CGFloat)size
{
  CGRect extent = CGRectIntegral(image.extent);
  CGFloat scale = MIN(size / CGRectGetWidth(extent), size / CGRectGetHeight(extent));
 
  // 1. Create a grayscale bitmap context and draw the CIImage into it without interpolation
  size_t width = CGRectGetWidth(extent) * scale;
  size_t height = CGRectGetHeight(extent) * scale;
  CGColorSpaceRef cs = CGColorSpaceCreateDeviceGray();
  CGContextRef bitmapRef = CGBitmapContextCreate(nil, width, height, 8, 0, cs, (CGBitmapInfo)kCGImageAlphaNone);
  CGColorSpaceRelease(cs);
  CIContext *context = [CIContext contextWithOptions:nil];
  CGImageRef bitmapImage = [context createCGImage:image fromRect:extent];
  CGContextSetInterpolationQuality(bitmapRef, kCGInterpolationNone);
  CGContextScaleCTM(bitmapRef, scale, scale);
  CGContextDrawImage(bitmapRef, extent, bitmapImage);
 
  // 2. Turn the bitmap into a UIImage, releasing the Core Graphics objects created above
  CGImageRef scaledImage = CGBitmapContextCreateImage(bitmapRef);
  CGContextRelease(bitmapRef);
  CGImageRelease(bitmapImage);
  UIImage *result = [UIImage imageWithCGImage:scaledImage];
  CGImageRelease(scaledImage);
  return result;
}
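
If the grayscale bitmap route is not needed, a shorter alternative (a sketch working with the filter's outputImage from the generation step above) is to scale the CIImage with an affine transform before wrapping it in a UIImage; because the generator output is enlarged before rasterization, the QR modules stay sharp:


  // Simpler alternative: scale the CIImage itself, then wrap it in a UIImage.
  CGAffineTransform transform = CGAffineTransformMakeScale(10, 10); // scale factor chosen for illustration
  CIImage *scaled = [outputImage imageByApplyingTransform:transform];
  self.imageView.image = [UIImage imageWithCIImage:scaled];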

Scanning a QR code


// 1. Create a capture session
  AVCaptureSession *session = [[AVCaptureSession alloc] init];
  self.session = session;
 
  // 2. Add an input device (video data from the camera)
  AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
  AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
  [session addInput:input];
 
  // 3. Add a metadata output and have its delegate called on the main queue
  AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
  [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
  [session addOutput:output];
 
  // 3.1. Set the metadata type to recognize (QR codes); this must be done
  //      after the output has been added to the session
  [output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];
 
  // 4. Add a preview layer so the camera feed is visible while scanning
  AVCaptureVideoPreviewLayer *layer = [AVCaptureVideoPreviewLayer layerWithSession:session];
  layer.frame = self.view.bounds;
  [self.view.layer addSublayer:layer];
  self.layer = layer;
 
  // 5. Start scanning
  [session startRunning];
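
The setup above assumes the app already has permission to use the camera. On iOS 10 and later the NSCameraUsageDescription key must also be present in Info.plist, and access can be checked and requested before the session is built; a minimal sketch (startScanning is a hypothetical method wrapping the session code above):


  // Ask for camera access before building the capture session.
  [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
      if (granted) {
        [self startScanning]; // hypothetical method containing the session setup above
      } else {
        NSLog(@"Camera access was denied");
      }
    });
  }];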

* The delegate method that is called when a scan result is obtained


//  This delegate method is called whenever metadata (here, a QR code) is scanned
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
  if (metadataObjects.count > 0) {
    // Data was scanned; take the most recently scanned object
    AVMetadataMachineReadableCodeObject *object = [metadataObjects lastObject];
    NSLog(@"%@", object.stringValue);
 
    //  Stop scanning
    [self.session stopRunning];
 
    //  Remove the preview layer
    [self.layer removeFromSuperlayer];
  } else {
    NSLog(@"No data was scanned");
  }
}
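
What the app does with object.stringValue is up to the caller; a small sketch, assuming the scanned content is an http(s) URL that should be handed to the system browser (any other payload could just as well be shown in an alert):


  NSURL *url = [NSURL URLWithString:object.stringValue];
  if (url && ([url.scheme isEqualToString:@"http"] || [url.scheme isEqualToString:@"https"])) {
    // Hand the scanned URL off to the system browser (iOS 10+ API).
    [[UIApplication sharedApplication] openURL:url options:@{} completionHandler:nil];
  }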
