iOS4: Take photos with live video preview using AVFoundation

I’m writing this because – as of April 2011 – Apple’s official documentation is badly wrong. Some of their sample code won’t even compile (obvious typos that a quick check would have caught), and some of their instructions are hugely over-complicated yet simply don’t work.

This is a step-by-step guide to taking photos with live image preview. It’s also a good starting point for doing much more advanced video and image capture on iOS 4.

What are we trying to do?

It’s easy to write an app that takes photos. It takes quite a lot of code, but the API has been built into iOS/iPhone OS for a few years now – and it still works.

But … with iOS 4, the new “AV Foundation” library offers a much more powerful way of taking photos, which lets you put the camera view inside your own app. So, for instance, you can make an app that looks like this:

Gotchas

0. Requires a 3GS, iPod Touch 3, or better…

The entire AV Foundation library is not available on the oldest iPhone and iPod Touch devices. I believe this is because Apple is doing a lot of the work in hardware, making use of features that didn’t exist in the chips of the original iPhone and the iPhone 3G.

Interestingly, the AV Foundation library *is* available on the Simulator – which suggests that Apple certainly *could* have implemented AV F for older phones, but decided not to. It’s very useful that you can test most of your AV F app on the Simulator (so long as you copy/paste some videos into the Simulator to work with).
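If you want to cope gracefully with a device (or the Simulator) that has no usable camera, rather than just crashing, here’s a minimal sketch – all it assumes is that you only care whether a camera exists at all – that asks AVFoundation for its default video device before you build anything:

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (camera == nil)
{
	// No usable camera on this device/Simulator: hide your "take photo" UI,
	// or fall back to e.g. a bundled video instead of the live feed.
	NSLog(@"No video capture device available");
}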

1. Apple doesn’t tell you the necessary Frameworks

You need *all* the following frameworks (all come with Xcode, but you have to manually add them to your project):

  1. CoreVideo
  2. CoreMedia
  3. AVFoundation (of course…)
  4. ImageIO
  5. QuartzCore (maybe)
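Adding the frameworks to the project isn’t quite the whole story – you also need the matching #import lines in any file that uses these classes. These are the standard headers for the frameworks listed above (CGImageProperties.h is the ImageIO header that declares kCGImagePropertyExifDictionary, which we’ll need later in this post):

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <ImageIO/CGImageProperties.h>
#import <QuartzCore/QuartzCore.h>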

How do we: get live video from camera straight onto the screen?

Create a new UIViewController and add its view to the screen (either in IB or through code – if you don’t know how to add a ViewController’s view, you should work through some more basic iPhone tutorials first).

Add a UIView object to the NIB (or as a subview), and create a @property in your controller:

@property(nonatomic, retain) IBOutlet UIView *vImagePreview;

Connect the UIView to the outlet above in IB, or assign it directly if you’re using code instead of a NIB.
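If you’re skipping the NIB entirely, a minimal all-code sketch looks something like this (the frame is just an arbitrary example – size it however you like):

// e.g. in viewDidLoad, instead of wiring up an IBOutlet:
self.vImagePreview = [[[UIView alloc] initWithFrame:CGRectMake(0, 0, 160, 240)] autorelease];
[self.view addSubview:self.vImagePreview];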

Then edit your UIViewController, and give it the following viewDidAppear method:

-(void) viewDidAppear:(BOOL)animated
{
	[super viewDidAppear:animated]; // don't forget to call super
	
	AVCaptureSession *session = [[AVCaptureSession alloc] init];
	session.sessionPreset = AVCaptureSessionPresetMedium;
	
	CALayer *viewLayer = self.vImagePreview.layer;
	NSLog(@"viewLayer = %@", viewLayer);
	
	AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
	
	captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
	[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
	
	AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
	
	NSError *error = nil;
	AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
	if (!input) {
		// Handle the error appropriately.
		NSLog(@"ERROR: trying to open camera: %@", error);
		return; // don't fall through and add a nil input - that would throw an exception
	}
	[session addInput:input];
		
	[session startRunning];
}

Run your app on a device (NB: this will NOT run on the Simulator – Apple doesn’t support cameras in the Simulator (yet)), and … you should see the live camera view appear in your subview.
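One thing to note: the session above is a local variable, so once viewDidAppear returns there’s no easy way to stop it. If you want to turn the camera off when the view goes away, a sketch is to keep the session in a retained property instead (I’m calling it captureSession here – that name isn’t part of the code above) and stop it in viewWillDisappear:

// Assumes: @property(nonatomic, retain) AVCaptureSession *captureSession;
// ...and that viewDidAppear assigned the session to it instead of using a local variable.
-(void) viewWillDisappear:(BOOL)animated
{
	[super viewWillDisappear:animated];
	
	[self.captureSession stopRunning]; // turns the camera off and stops the preview updating
	self.captureSession = nil;
}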

Gotchas

2. Apple’s example code for live-video doesn’t work

In the AVFoundation docs, Apple has a whole section on trying to do what we did above. Here’s a link: AV Foundation Programming Guide – Video Preview. But it doesn’t work.

UPDATE: c.f. Robert’s comment below. This method does work, you just have to use it in a different way.

“The method “imageFromSampleBuffer” does work when you send a sample buffer from “AVCaptureVideoDataOutput” which is “32BGRA”. You tried to send a sample buffer from “AVCaptureStillImageOutput” which is “AVVideoCodecJPEG”.”

(more details + source code in Robert’s comment at the end of this post)
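For context, here’s roughly what Robert means by a “32BGRA” buffer from AVCaptureVideoDataOutput – a sketch of setting up that kind of per-frame output (the queue name is just a placeholder, and your class would have to adopt AVCaptureVideoDataOutputSampleBufferDelegate to actually receive the frames):

// A per-frame video output that delivers 32BGRA pixel buffers -
// the format Apple's imageFromSampleBuffer() code actually expects.
AVCaptureVideoDataOutput *videoOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
	[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
	forKey:(id)kCVPixelBufferPixelFormatTypeKey];

dispatch_queue_t queue = dispatch_queue_create("cameraFrameQueue", NULL); // placeholder name
[videoOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue); // pre-ARC: the output retains the queue

[session addOutput:videoOutput];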

If you look in the docs for AVCaptureVideoPreviewLayer, you’ll find a *different* source code example, which works without having to change codecs:

captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

3. Apple’s image-capture docs are also wrong

In the AV Foundation docs, there’s also a section on how to get Images from the camera. This is mostly correct, and then at the last minute it goes horribly wrong.

Apple provides a link to another part of the docs, with the following source code:

{
    ...
    UIImage* image = imageFromSampleBuffer(imageSampleBuffer);
    ...
}

UIImage *imageFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{	
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer.
    CVPixelBufferLockBaseAddress(imageBuffer,0);
	
    // Get the number of bytes per row for the pixel buffer.
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height.
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
	
    // Create a device-dependent RGB color space.
    static CGColorSpaceRef colorSpace = NULL;
    if (colorSpace == NULL) {
        colorSpace = CGColorSpaceCreateDeviceRGB();
		if (colorSpace == NULL) {
            // Handle the error appropriately.
            return nil;
        }
    }
	
    // Get the base address of the pixel buffer.
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the data size for contiguous planes of the pixel buffer.
    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
	
    // Create a Quartz direct-access data provider that uses data we supply.
    CGDataProviderRef dataProvider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize, NULL);
    // Create a bitmap image from data supplied by the data provider.
    CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace, kCGImageAlphaNoneSkipFirst | 
kCGBitmapByteOrder32Little, dataProvider, NULL, true, kCGRenderingIntentDefault);
    CGDataProviderRelease(dataProvider);
	
    // Create and return an image object to represent the Quartz image.
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
	
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
	
    return image;
}

This code has never worked for me – it always returns an empty 0×0 image, which is useless. That’s 45 lines of useless code that everyone is expected to re-implement in every app they write.

Or maybe not.

Instead, if you look at the WWDC videos, you find an alternative approach that takes just two lines of source code:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];

Even better … this actually works!

How do we: take a photo of what’s in the live video feed?

There are two halves to this. Obviously, we’ll need a button to capture a photo, and a UIImageView to display it. Less obviously, we’ll also have to alter our existing camera-setup routine.

To make this work, we have to create an “output source” for the camera when we start it, and then later on when we want to take a photo we ask that “output” object to give us a single image.

Part 1: Add buttons and views and image-capture routine

So, create a new @property to hold a reference to our output object:

@property(nonatomic, retain) AVCaptureStillImageOutput *stillImageOutput;

Then make a UIImageView where we’ll display the captured photo. Add this to your NIB, or programmatically.

Hook it up to another @property, or assign it manually, e.g.:

@property(nonatomic, retain) IBOutlet UIImageView *vImage;

Finally, create a UIButton, so that you can take the photo.

Again, add it to your NIB (or programmatically to your screen), and hook it up to the following method:

-(IBAction) captureNow
{
	AVCaptureConnection *videoConnection = nil;
	for (AVCaptureConnection *connection in stillImageOutput.connections)
	{
		for (AVCaptureInputPort *port in [connection inputPorts])
		{
			if ([[port mediaType] isEqual:AVMediaTypeVideo] )
			{
				videoConnection = connection;
				break;
			}
		}
		if (videoConnection) { break; }
	}
	
	NSLog(@"about to request a capture from: %@", stillImageOutput);
	[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
	{
		 CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
		 if (exifAttachments)
		 {
			// Do something with the attachments.
			NSLog(@"attachements: %@", exifAttachments);
		 }
		else
			NSLog(@"no attachments");
		
		NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
		UIImage *image = [[UIImage alloc] initWithData:imageData];

		self.vImage.image = image;
		[image release]; // the property retains the image; balance the alloc/init above (manual reference counting)
	 }];
}
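If you also want the photo to end up in the device’s photo library – not something this post needs, just an optional extra – UIKit has a single call you could drop into the completion handler, right after self.vImage.image = image; :

// Optional: also write the captured photo to the Saved Photos album.
// (The three nil arguments mean we don't ask for a completion callback.)
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);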

Part 2: modify the camera-setup routine

Go back to the viewDidAppear method you created at the start of this post. The very last line ([session startRunning]) must REMAIN the last line, so we’ll insert the new code immediately above it. Here’s the new code to insert:

stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[outputSettings release]; // the output retains its settings; balance the alloc/init (manual reference counting)

[session addOutput:stillImageOutput];

Run the app, and you should get something like the image I showed at the start, where the part on the left is a live-preview from the camera, and the part on the right updates each time you click the “take photo” button:
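A final tweak, since several people ask about it below: the sessionPreset we chose back in viewDidAppear controls the quality of the captured stills as well as the preview. If you’d rather have full-resolution photos (at the cost of bigger buffers), you could swap the preset – a one-line change, sketched here:

// Instead of AVCaptureSessionPresetMedium, ask for the camera's full photo resolution.
// (The live preview keeps working; the stills just come out much larger.)
session.sessionPreset = AVCaptureSessionPresetPhoto;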

83 thoughts on “iOS4: Take photos with live video preview using AVFoundation”

  1. Thanks for that. I have a question about this tutorial regarding the view: how do I change the view scale so the camera output is scaled to fit the view?

  2. I got an error:
    Undefined symbols for architecture armv7:
    “_CMGetAttachment”, referenced from:

    Do you know why?

    I’m using iOS 5, but I don’t think that makes a big difference. The first part works without problems – I can show the camera – but when I added the captureNow method, I got this little problem :)

    Anyway, the tutorial is great.

  3. Can you tell me how one goes about displaying the preview as indicated in this code, but having the captured still image be the full camera resolution instead of the resolution of the preview?

  4. I am aware that this question has been asked a few times here, but without an answer.
    Why is the camera preview not 320px wide? And can it somehow be set to 320px?

  5. Very nice and love the simplicity of it all.

    BUT… the camera FOV is zoomed in so much that it is hard to get a decent picture, while the still image capture outputs the correct FOV.

    I’ve been trying to correct this for my augmented reality app and can’t seem to figure out why this works this way.

  6. I can’t get the video preview to work. I copied the code exactly and I’m not getting any errors. Am I missing an import or something?

  7. I figured out my problem. My UIView was a small rectangle, similar to the FaceTime front camera preview. If you don’t set the frame, it doesn’t appear.

    CGRect frame = CGRectMake(88, 152, 144, 202);
    [vImagePreview setFrame:frame];

  8. Hi all.
    At the beginning of the post Adam said that it is possible to copy a video into some folder of the Simulator to better test AVFoundation on the Simulator.
    Has anyone done this already? Is it possible to use this “file” as a substitute for the live camera feed?
    Any tip about documentation, or about the folder and file name to use, would be very appreciated.
    Thanks :)

  9. If you google for it, there are extensive answers on StackOverflow that show how to copy a video onto the Simulator.

    Re: substituting for the live feed – this should be easy: just load it as a raw AVAsset and play it directly.

  10. I am getting one error with the above solution: undeclared identifier kCGImagePropertyExifDictionary. What have I missed? Where do I put this to fix the error?

  11. @Nav – you haven’t imported all your frameworks. kCGImagePropertyExifDictionary comes from ImageIO.

  12. I just want to say this tutorial is very straight to the point. The trouble for me with Apple’s sample code is that it always involves too many aspects of Obj-C programming – delegates, categories… It confuses the heck out of me. I just needed a very simple, straightforward tutorial to teach me how to set up and use AVFoundation and the camera, and this tutorial did exactly that.
    Thank you Adam!! :)

  13. Absolutely awesome tutorial! I would love it if you made one that takes a photo using the front camera in the appDelegate in ‘didFinishLaunchingWithOptions’ section. Thanks for the great tutorial :)

  14. The code above needs to be revised.

    What are
    kCGImagePropertyExifDictionary
    self.vImage.image = image;

    ? They give build errors…

  15. Then you’re doing it wrong.

    Check your imports.

    Check you followed every step.

    The code works fine.
