<h1>Reading Barcodes on iOS 7</h1>
<p><em>24 September 2013</em></p>
<p>iOS 7 introduced APIs to read barcodes using the camera. I could easily have overlooked them without this excellent post from <a href="http://www.doubleencore.com/2013/09/ios-7-additions-omg-finally/">doubleencore</a>.</p>
<p><a href="http://nshipster.com/ios7/">NSHipster</a> provided sample code, but a few details were missing to make it work. Here is a sample which does.</p>
<h2>Sample code</h2>
<pre class="lang:objc decode:true">@import AVFoundation;

@interface CEViewController () &lt;AVCaptureMetadataOutputObjectsDelegate&gt;

@property (strong) AVCaptureSession *captureSession;

@end

@implementation CEViewController

- (void)viewDidLoad
{
	[super viewDidLoad];

	self.captureSession = [[AVCaptureSession alloc] init];
	AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
	NSError *error = nil;
	AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&amp;error];
	if(videoInput)
		[self.captureSession addInput:videoInput];
	else
		NSLog(@"Error: %@", error);

	// The output must be added to the session *before* setting the metadata types.
	AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
	[self.captureSession addOutput:metadataOutput];
	[metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
	[metadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];

	AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc]
		initWithSession:self.captureSession];
	previewLayer.frame = self.view.layer.bounds;
	[self.view.layer addSublayer:previewLayer];

	[self.captureSession startRunning];
}

#pragma mark AVCaptureMetadataOutputObjectsDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
	for(AVMetadataObject *metadataObject in metadataObjects)
	{
		AVMetadataMachineReadableCodeObject *readableObject = (AVMetadataMachineReadableCodeObject *)metadataObject;
		if([metadataObject.type isEqualToString:AVMetadataObjectTypeQRCode])
		{
			NSLog(@"QR Code = %@", readableObject.stringValue);
		}
		else if([metadataObject.type isEqualToString:AVMetadataObjectTypeEAN13Code])
		{
			NSLog(@"EAN 13 = %@", readableObject.stringValue);
		}
	}
}

@end</pre>
<h2>Pitfalls</h2>
<p>The issue I had with NSHipster&#8217;s sample code was that the delegate method was never called. I quickly understood this was because the AVCaptureMetadataOutput must be told which types of metadata to recognise.</p>
<p>But, and this was not obvious to me, -[AVCaptureMetadataOutput setMetadataObjectTypes:] must be called <em>after</em> -[AVCaptureSession addOutput:]. Otherwise, the following message shows in the console:</p>
<p><code>*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureMetadataOutput setMetadataObjectTypes:] - unsupported type found.  Use -availableMetadataObjectTypes.'</code></p>
<p>I did look at the output of -availableMetadataObjectTypes: before the output has been added to a session, it returns an empty array.</p>
<p>Therefore, -addOutput: must be called before -setMetadataObjectTypes:.
In retrospect, this makes sense: the output object must know it is attached to a capture session before it can tell which metadata types it may provide.</p>
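<p>A defensive variant is to intersect the desired types with -availableMetadataObjectTypes once the output has been added to the session, so that an unsupported type degrades gracefully instead of throwing. This is a minimal sketch of my own, not taken from NSHipster&#8217;s post:</p>
<pre class="lang:objc decode:true">// Only request types the session can actually deliver.
// availableMetadataObjectTypes is meaningful only after
// [self.captureSession addOutput:metadataOutput] has been called.
NSArray *desiredTypes = @[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code];
NSMutableArray *supportedTypes = [NSMutableArray array];
for(NSString *type in desiredTypes)
{
	if([metadataOutput.availableMetadataObjectTypes containsObject:type])
		[supportedTypes addObject:type];
	else
		NSLog(@"Metadata type not available: %@", type);
}
[metadataOutput setMetadataObjectTypes:supportedTypes];</pre>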