Topic: <iOS4> How to capture video frames from the camera as images (AVFoundation)
Post #0: 2010-07-23 21:45 · from Web Page

<iOS4> How to capture video frames from the camera as images (AVFoundation)    (browse this post in the iOS code library)

Q: How do I capture video frames from the camera as images using AV Foundation?

A: To perform a real-time capture, first create a capture session by instantiating an AVCaptureSession object. You use an AVCaptureSession object to coordinate the flow of data from AV input devices to outputs.
Next, create an input data source that provides video data to the capture session by instantiating an AVCaptureDeviceInput object. Call addInput: to add that input to the AVCaptureSession object.
Create an output destination by instantiating an AVCaptureVideoDataOutput object, and add it to the capture session using addOutput:.
AVCaptureVideoDataOutput is used to process uncompressed frames from the video being captured. An instance of AVCaptureVideoDataOutput produces video frames you can process using other media APIs. You can access the frames with the captureOutput:didOutputSampleBuffer:fromConnection: delegate method. Use setSampleBufferDelegate:queue: to set the sample buffer delegate and the queue on which callbacks should be invoked. The delegate of an AVCaptureVideoDataOutput object must adopt the AVCaptureVideoDataOutputSampleBufferDelegate protocol. Use the sessionPreset property to customize the quality of the output.
You invoke the capture session startRunning method to start the flow of data from the inputs to the outputs, and stopRunning to stop the flow.
Listing 1 shows an example of this. setupCaptureSession creates a capture session, adds a video input to provide video frames, adds an output destination to access the captured frames, then starts the flow of data from the inputs to the outputs. While the capture session is running, the captured video sample buffers are sent to the sample buffer delegate using captureOutput:didOutputSampleBuffer:fromConnection:. Each sample buffer (CMSampleBufferRef) is then converted to a UIImage in imageFromSampleBuffer:.
Listing 1: Configuring a capture device to record video with AV Foundation and saving the frames as UIImage objects.





#import <AVFoundation/AVFoundation.h>

// Create and configure a capture session and start it running
- (void)setupCaptureSession 
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice
                             defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                    error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [session addOutput:output];

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    output.videoSettings = 
                [NSDictionary dictionaryWithObject:
                    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];


    // If you wish to cap the frame rate to a known value, such as 15 fps, set 
    // minFrameDuration.
    output.minFrameDuration = CMTimeMake(1, 15);

    // Start the session running to start the flow of data
    [session startRunning];

    // Assign session to an ivar.
    [self setSession:session];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput 
         didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
         fromConnection:(AVCaptureConnection *)connection
{ 
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

     // < Add your code here that uses the image >

}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    if (!colorSpace) 
    {
        NSLog(@"CGColorSpaceCreateDeviceRGB failure");
        // Unlock the pixel buffer before bailing out, since it was
        // locked above.
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        return nil;
    }

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the data size for contiguous planes of the pixel buffer.
    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer); 

    // Create a Quartz direct-access data provider that uses data we supply
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize, 
                                                            NULL);
    // Create a bitmap image from data supplied by our data provider
    CGImageRef cgImage = 
        CGImageCreate(width,
                        height,
                        8,
                        32,
                        bytesPerRow,
                        colorSpace,
                        kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                        provider,
                        NULL,
                        true,
                        kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    // Create and return an image object representing the specified Quartz image
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return image;
}
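The listing assumes a containing class that adopts the sample-buffer delegate protocol and holds the session in an ivar via setSession:, and it never stops the session. A minimal sketch of that surrounding class, in the same pre-ARC style as the listing (the class name MyCaptureViewController and the teardownCaptureSession method are illustrative assumptions, not part of the Q&A):

```objc
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical class wrapping the methods shown in Listing 1.
// It must adopt AVCaptureVideoDataOutputSampleBufferDelegate so that
// setSampleBufferDelegate:queue: can deliver frames to it.
@interface MyCaptureViewController : UIViewController
        <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession *session;
}
- (void)setSession:(AVCaptureSession *)newSession;
@end

@implementation MyCaptureViewController

// Retain the session so it keeps running after setupCaptureSession returns.
- (void)setSession:(AVCaptureSession *)newSession
{
    [newSession retain];
    [session release];
    session = newSession;
}

// Counterpart to startRunning: stop the flow of data when finished.
- (void)teardownCaptureSession
{
    [session stopRunning];
    [session release];
    session = nil;
}

@end
```

The manual retain/release matches the era of the listing (note the autorelease on the output object); under ARC the memory-management calls would simply be dropped.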


http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html
Post #1: 2010-07-23 21:53 · from Web Page
Re: <iOS4> How to capture video frames from the camera as images (AVFoundation)

Couldn't the previous SDK already do this: invoke the camera, take a photo, save, and upload all in one go?

Android has been able to do this for ages.
Post #2: 2010-09-01 22:33 · from Web Page
UIImageWriteToSavedPhotosAlbum doesn't seem to work anymore.
http://itunes.apple.com/us/app/autocamera/id472265214?mt=8
Post #3: 2010-09-20 21:39 · from Web Page
Bookmarking this. Thank you.
Post #4: 2010-11-15 11:58 · from Web Page
- (void)captureOutput:(AVCaptureOutput *)captureOutput
         didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
         fromConnection:(AVCaptureConnection *)connection


How is this delegate method used?

Post #5: 2010-11-17 18:51 · from Web Page
Quote (post #2, apple.dev, 2010-09-01 22:33):
UIImageWriteToSavedPhotosAlbum doesn't seem to work anymore.

The problem with UIImageWriteToSavedPhotosAlbum not working has been resolved by Apple. See Technical Q&A QA1714, "How do I take a screenshot of my app that contains both UIKit and Camera elements?"
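For reference, saving a captured UIImage to the Saved Photos album goes through UIImageWriteToSavedPhotosAlbum; a minimal sketch, assuming a frame already converted by imageFromSampleBuffer: (the saveImage: method name is an assumption, the completion selector signature is the one UIKit requires):

```objc
#import <UIKit/UIKit.h>

// Hypothetical helper: hand a captured frame to the photos album.
// UIKit calls the completion selector when the write finishes or fails.
- (void)saveImage:(UIImage *)image
{
    UIImageWriteToSavedPhotosAlbum(image, self,
        @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}

// Completion selector; its signature is fixed by UIKit.
- (void)image:(UIImage *)image
        didFinishSavingWithError:(NSError *)error
        contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Saving to photos album failed: %@", error);
    }
}
```

Note that the completion callback is the only way to learn whether the write succeeded; the function itself returns void.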
Post #6: 2011-05-07 01:44 · from Web Page
Bookmarking; thanks very much.
Post #7: 2011-08-03 16:53 · from Web Page
How do you save the video itself?
Post #8: 2011-10-26 10:25 · from Web Page
Bookmarked.
Post #9: 2011-10-27 16:01 · from Web Page
Great stuff, this is exactly the screen-capture method I've been looking for.
