AVCaptureSession: specify the resolution and quality of captured images (Obj-C iPhone app)

Hi, I want to configure an AV capture session to capture images at a specific resolution (and, if possible, at a specific quality) using the iPhone camera. Here is the AVCaptureSession setup code:
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    self.captureSession = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *device;
    if ([UserDefaults camera] == UIImagePickerControllerCameraDeviceFront) {
        device = [cameras objectAtIndex:1];
    } else {
        device = [cameras objectAtIndex:0];
    }

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"PANIC: no media input: %@", error);
        return;
    }
    [self.captureSession addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [self.captureSession addOutput:output];
    NSLog(@"connections: %@", output.connections);

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.

    // Assign session to an ivar.
    [self setSession:self.captureSession];
    [self.captureSession startRunning];
}
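For reference, the frame size the session delivers is governed by the `sessionPreset` line above, so that is the place to change the resolution. A hedged sketch of switching presets (preset availability varies by device, so check support before assigning):

```objc
// Ask for 720p frames if the device supports it; otherwise fall back to the
// highest quality the device offers. Replace the AVCaptureSessionPresetMedium
// assignment above with something like this.
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
} else if ([self.captureSession canSetSessionPreset:AVCaptureSessionPresetHigh]) {
    self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
}
```

Other fixed-size presets include `AVCaptureSessionPreset640x480` and `AVCaptureSessionPreset352x288`; `AVCaptureSessionPresetPhoto` delivers full photo resolution but is intended for still capture rather than video output.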
and setSession:
- (void)setSession:(AVCaptureSession *)session
{
    NSLog(@"setting session...");
    self.captureSession = session;

    NSLog(@"setting camera view");
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    CGRect videoRect = CGRectMake(20.0, 20.0, 280.0, 255.0);
    self.previewLayer.frame = videoRect;
    [self.previewLayer setBackgroundColor:[[UIColor grayColor] CGColor]];
    [self.view.layer addSublayer:self.previewLayer];
}
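Since the preview layer is pinned to a fixed rect here, an optional tweak (my assumption about the intent, unrelated to the resolution question itself) is to set a video gravity so the feed fills the rect without distortion:

```objc
// Fill the fixed preview rect while preserving the feed's aspect ratio
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
```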
and the output methods:
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    self.currentImage = [self imageFromSampleBuffer:sampleBuffer];
    // < Add your code here that uses the image >
}
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create a UIImage from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
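If by "quality" the question means JPEG compression when a captured `UIImage` is persisted, that is applied at encode time rather than in the session. A minimal sketch (the destination path is a hypothetical placeholder, not something defined in this question):

```objc
// Encode the captured frame as JPEG; the second argument (0.0-1.0) trades
// file size against image quality.
NSData *jpegData = UIImageJPEGRepresentation(self.currentImage, 0.8);

// 'photoPath' is a hypothetical destination path for illustration only.
[jpegData writeToFile:photoPath atomically:YES];
```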
All of this is fairly standard. But where, and what, should I change to specify the resolution and quality of the captured image? Please help.
[I asked a similar question](http://stackoverflow.com/questions/24758407/ios-capture-high-resolution-photo-while-using-a-low-avcapturesessionpreset-for-v/40609268#40609268). This might help. –
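A sketch of the approach the linked answer appears to describe (assumed, not verified against it): keep a modest preset for the live video feed, and add an `AVCaptureStillImageOutput` to the same session for full-resolution stills. Note that this class was later deprecated in favor of `AVCapturePhotoOutput`, but it matches the era of the code above:

```objc
// Add a still image output alongside the existing video data output, so the
// video feed can stay at a low preset while stills capture at full quality.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
if ([self.captureSession canAddOutput:stillOutput]) {
    [self.captureSession addOutput:stillOutput];
}

// Later, when a high-resolution photo is wanted:
AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *err) {
    if (imageDataSampleBuffer) {
        NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        // Save or display 'jpeg' here.
    }
}];
```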