Creating a Custom Camera View
Accessing the built-in Image Picker Controller is a quick and easy way to add image and video capture to your app. However, when you need style or functionality beyond what the stock Image Picker Controller offers, you will need to create a custom camera view.
Add the following view elements to the ViewController in Storyboard:
- UIView: this will serve as the "view finder" of your camera.
- UIImageView: this will hold the captured still image after you take a picture.
- UIButton: this button will "take a picture".
At the top of your ViewController file, `import AVFoundation`.

Create Outlets for the UIView and UIImageView.
- Name the UIView `previewView`.
- Name the UIImageView `captureImageView`.

Create an Action for the UIButton.
- Name the method `didTakePhoto`.
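With the outlets and action wired up in Storyboard, the top of the ViewController might look like this minimal sketch (names assumed from the steps above):

```swift
import AVFoundation
import UIKit

class ViewController: UIViewController {

    // The "view finder" and the captured-still holder from Storyboard
    @IBOutlet weak var previewView: UIView!
    @IBOutlet weak var captureImageView: UIImageView!

    @IBAction func didTakePhoto(_ sender: Any) {
        // Capture a still photo here...
    }
}
```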
Above the `viewDidLoad` method, where you create variables you want to be accessible anywhere in the ViewController file, create the following instance variables:
```swift
var captureSession: AVCaptureSession!
var stillImageOutput: AVCapturePhotoOutput!
var videoPreviewLayer: AVCaptureVideoPreviewLayer!
```
```objc
@interface CameraViewController ()

@property (nonatomic) AVCaptureSession *session;
@property (nonatomic) AVCapturePhotoOutput *stillImageOutput;
@property (nonatomic) AVCaptureVideoPreviewLayer *videoPreviewLayer;

@end
```
The bulk of the camera setup will happen in `viewDidAppear`.
- NOTE: Make sure to also call `super.viewDidAppear(animated)`.
```swift
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Setup your camera here...
}
```
```objc
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Setup your camera here...
}
```
The session will coordinate the input and output data from the device's camera.
Still in `viewDidAppear`:
- Create a new session.
- Configure the session for high resolution still photo capture. We'll use a convenient preset for that.
```swift
captureSession = AVCaptureSession()
captureSession.sessionPreset = .photo
```
```objc
self.session = [AVCaptureSession new];
self.session.sessionPreset = AVCaptureSessionPresetPhoto;
```
- NOTE: If you plan to upload your photo to Parse, you will likely need to change your preset to `.high` or `.medium` to keep the size under the 10 MB Parse max.
In this example, we will be using the rear camera. The front camera and microphone are additional input devices at your disposal. We print a debug message in case fetching the rear camera fails.

Still in `viewDidAppear`:
```swift
guard let backCamera = AVCaptureDevice.default(for: AVMediaType.video) else {
    print("Unable to access back camera!")
    return
}
```
```objc
AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (!backCamera) {
    NSLog(@"Unable to access back camera!");
    return;
}
```
We now need to make an AVCaptureDeviceInput. The AVCaptureDeviceInput will serve as the "middle man" to attach the input device, `backCamera`, to the session.
- We will make a new AVCaptureDeviceInput and attempt to associate it with our `backCamera` input device.
- There is a chance that the input device might not be available, so we will set up a `do`/`catch` block to handle any potential errors we might encounter. In Objective-C, errors use the traditional NSError pattern.

Still in `viewDidAppear`:
```swift
do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    // Step 9
}
catch let error {
    print("Error Unable to initialize back camera: \(error.localizedDescription)")
}
```
```objc
NSError *error;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera
                                                                    error:&error];
if (!error) {
    // Step 9
}
else {
    NSLog(@"Error Unable to initialize back camera: %@", error.localizedDescription);
}
```
If our last step did not throw an error and the session is able to accept input, then go ahead and add the input to the session.

```swift
if captureSession.canAddInput(input) {
    captureSession.addInput(input)
    // ...
    // The remainder of the session setup will go here...
}
```
```objc
// Continue from above, inside the if (!error) block
if ([self.session canAddInput:input]) {
    [self.session addInput:input];
    // The remainder of the session setup will go here...
}
```
Just like we created an AVCaptureDeviceInput to be the "middle man" to attach the input device, we will use AVCapturePhotoOutput to help us attach the output to the session.
- Create a new AVCapturePhotoOutput object. Unlike the older AVCaptureStillImageOutput, the JPEG output format is configured per capture (via AVCapturePhotoSettings) rather than on the output itself.

```swift
stillImageOutput = AVCapturePhotoOutput()
```
```objc
self.stillImageOutput = [AVCapturePhotoOutput new];
```
If the session is able to accept our output, then we will attach the output to the session.
```swift
if captureSession.canAddOutput(stillImageOutput) {
    captureSession.addOutput(stillImageOutput)
    // ...
    // Configure the Live Preview here...
}
```
```objc
if ([self.session canAddOutput:self.stillImageOutput]) {
    [self.session addOutput:self.stillImageOutput];
    // Configure the Live Preview here...
}
```
Also let the ViewController conform to `AVCapturePhotoCaptureDelegate`. In Objective-C, update the header:

```objc
#import <UIKit/UIKit.h>
@import AVFoundation;

@interface CameraViewController : UIViewController <AVCapturePhotoCaptureDelegate>
```
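In Swift, the equivalent is to declare the conformance on the class itself (class name assumed from this guide's setup):

```swift
import AVFoundation
import UIKit

// Conform to AVCapturePhotoCaptureDelegate to receive captured photos
class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {
    // ...
}
```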
Now that the input and output are all hooked up with our session, we just need to get our Live Preview going so we can actually display what the camera sees on the screen in our UIView, `previewView`.
- Create an AVCaptureVideoPreviewLayer and associate it with our session.
- Configure the layer to resize while maintaining its original aspect ratio.
- Fix the orientation to portrait.
- Add the preview layer as a sublayer of our `previewView`.
- Finally, start the session!
```swift
videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
videoPreviewLayer.videoGravity = .resizeAspect
videoPreviewLayer.connection?.videoOrientation = .portrait
previewView.layer.addSublayer(videoPreviewLayer)
// Note: startRunning() is a blocking call; in production, consider
// dispatching it to a background queue.
captureSession.startRunning()
```
```objc
self.videoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
if (self.videoPreviewLayer) {
    self.videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    self.videoPreviewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    [self.previewView.layer addSublayer:self.videoPreviewLayer];
    [self.session startRunning];
}
```
- Finally, set the size and origin of the preview layer so it fits inside the preview view. We will do this using the bounds property. By `viewDidAppear`, the view's layout is complete, so add this at the end of the same method:

```swift
videoPreviewLayer.frame = previewView.bounds
```

```objc
self.videoPreviewLayer.frame = self.previewView.bounds;
```
NOTE: The simulator does NOT have a camera so you need to run your app on an Actual Device to see the magic!
- At this point, you should see a live "video" stream of your phone camera's input within your `previewView`.
- In `didTakePhoto`, set up the capture settings for this particular capture.

```objc
- (IBAction)didTakePhoto:(id)sender {
    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey: AVVideoCodecTypeJPEG}];
    // Capture Photo
}
```
- Call the `capturePhotoWithSettings:delegate:` method on the `stillImageOutput`.

```objc
// Don't forget to conform to AVCapturePhotoCaptureDelegate
[self.stillImageOutput capturePhotoWithSettings:settings delegate:self];
```
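A Swift sketch of the same capture action, assuming the `stillImageOutput` property created earlier:

```swift
@IBAction func didTakePhoto(_ sender: Any) {
    // Set up capture settings for this particular capture (JPEG format)
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    // The delegate (self) must conform to AVCapturePhotoCaptureDelegate
    stillImageOutput.capturePhoto(with: settings, delegate: self)
}
```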
- The AVCapturePhotoCaptureDelegate methods provide us with an AVCapturePhoto object, which has all we need to get a UIImage that we can insert into our `captureImageView` and easily use elsewhere in our app.
```objc
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(nullable NSError *)error {
    NSData *imageData = photo.fileDataRepresentation;
    UIImage *image = [UIImage imageWithData:imageData];
    // Add the image to captureImageView here...
}
```
- The Swift version implements the corresponding `AVCapturePhotoCaptureDelegate` method, `photoOutput(_:didFinishProcessingPhoto:error:)`. The photo's `fileDataRepresentation()` gives us JPEG data we can turn straight into a UIImage:

```swift
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    guard let imageData = photo.fileDataRepresentation() else { return }
    let image = UIImage(data: imageData)
    // ...
    // Add the image to captureImageView here...
}
```

- ATTENTION: the resulting UIImage carries its rotation in its `imageOrientation` property, but the backing CGImage may still be rotated the wrong way, so account for this if you manipulate the CGImage directly.
- Finally, add the image to `captureImageView`.

```swift
self.captureImageView.image = image
```

```objc
self.captureImageView.image = image;
```
- At this point, you should see a live "video" stream of your phone camera's input within your `previewView`, and have the ability to "Snap" a photo and see the still image within your `captureImageView`.
- NOTE: We are just adding the image to the `captureImageView` to illustrate the technique of capturing a still image. Once you have the still image, you can do all kinds of cool things, like save it into your photo library, or upload it to Parse for use elsewhere in your app.
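For example, saving the captured still to the photo library is a one-liner; this sketch assumes you have added the NSPhotoLibraryAddUsageDescription key to your Info.plist:

```swift
// Save the captured image to the user's photo library.
// Requires NSPhotoLibraryAddUsageDescription in Info.plist.
if let image = captureImageView.image {
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
```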