Overview
I spent some time looking into the commonly used video features — capturing, recording, compressing, and extracting frame images — and I'm sharing the results here. I believe you'll find this article helpful.
This article covers the following features:
Note that video recording must be tested on a real device!
Screenshots

Recording Video
First, we present the system's video-recording interface, implemented with the UIImagePickerController controller. Before that, we must check the user's authorization: only with camera permission can we proceed.
We also need to check whether UIImagePickerControllerSourceTypeCamera is available — the simulator, for example, does not support it, and we cannot rule out real devices that don't either, so checking first is the safer approach. You can set the quality (i.e. resolution) of the recorded video via the videoQuality property, and limit the maximum recording duration via the videoMaximumDuration property — here we cap it at 5 minutes.
// Available since iOS 7.0; requires <AVFoundation/AVFoundation.h> and <MobileCoreServices/MobileCoreServices.h>
AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (authStatus == AVAuthorizationStatusRestricted
    || authStatus == AVAuthorizationStatusDenied) {
    NSLog(@"Camera access is disabled; you can enable it in the Settings app");
    return;
}
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    picker.allowsEditing = YES;
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.videoQuality = UIImagePickerControllerQualityType640x480; // recording quality
    picker.videoMaximumDuration = 5 * 60.0f; // limit recording to at most 5 minutes
    picker.mediaTypes = @[(NSString *)kUTTypeMovie];
    [self presentViewController:picker animated:YES completion:NULL];
    self.shouldAsync = YES;
} else {
    NSLog(@"This device does not support camera capture");
}
Then implement the delegate methods and you can obtain the recorded video.
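The article doesn't show the delegate methods themselves; a minimal sketch, assuming the presenting view controller adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate, might look like this:

```objc
// Called when the user finishes recording (or finishes picking from the album)
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = info[UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        // Local file URL of the recorded/selected video
        NSURL *videoURL = info[UIImagePickerControllerMediaURL];
        NSLog(@"Got video at %@", videoURL);
    }
    [picker dismissViewControllerAnimated:YES completion:NULL];
}

// Called when the user cancels
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [picker dismissViewControllerAnimated:YES completion:NULL];
}
```

The same pair of callbacks also serves the album-picking flow below; only the source of the info dictionary differs.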
Selecting a Video from the Album
Selecting a video from the album is almost identical to presenting the recording screen; only the sourceType differs. As before, we check authorization first — if the user denies it, there is nothing more we can do.
Setting sourceType to UIImagePickerControllerSourceTypeSavedPhotosAlbum fetches the media saved in the album. We also need to specify mediaTypes; setting it to kUTTypeMovie is enough.
// Note: this checks the camera permission, mirroring the recording flow above;
// strictly speaking, album access is governed by the photo library permission
// ([ALAssetsLibrary authorizationStatus]).
AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (authStatus == AVAuthorizationStatusRestricted
    || authStatus == AVAuthorizationStatusDenied) {
    NSLog(@"Camera access is disabled; you can enable it in the Settings app");
    return;
}
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum]) {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    picker.allowsEditing = YES;
    picker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    picker.mediaTypes = @[(NSString *)kUTTypeMovie];
    [self presentViewController:picker animated:YES completion:NULL];
    self.shouldAsync = NO;
} else {
    NSLog(@"This device does not support camera capture");
}
Likewise, implement the delegate methods and you can obtain the selected video.
Saving a Video to the Album
Writing to the album can be done with the ALAssetsLibrary class, which provides an asynchronous write API; on completion, return to the main thread to update the UI. (Note that ALAssetsLibrary was deprecated in iOS 9 in favor of the Photos framework.)
NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
dispatch_async(dispatch_get_global_queue(0, 0), ^{
    // Only videos compatible with the Saved Photos album can be written to it
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
        [library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:^(NSURL *assetURL, NSError *error) {
            // Back to the main thread to update the UI
            dispatch_async(dispatch_get_main_queue(), ^{
                if (error == nil) {
                    NSLog(@"Saved to the album successfully");
                } else {
                    NSLog(@"Failed to save to the album");
                }
            });
        }];
    }
});
Extracting Frame Images from a Video
Extracting a Frame Synchronously
To grab the middle frame synchronously, you specify the time of the frame you want. The returned image object is CFRetained, so the caller must CGImageRelease it manually to avoid leaking memory. We access the video resource through AVAsset, then generate the frame image with AVAssetImageGenerator:
// Get the video's center frame as video poster image
- (UIImage *)frameImageFromVideoURL:(NSURL *)videoURL {
    // result
    UIImage *image = nil;
    // AVAssetImageGenerator
    AVAsset *asset = [AVAsset assetWithURL:videoURL];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;
    // calculate the midpoint time of video
    Float64 duration = CMTimeGetSeconds([asset duration]);
    // CMTimeMakeWithSeconds takes the time in seconds and a timescale (units per second).
    // 600 is a commonly used timescale; as Apple notes:
    // 24 frames per second (fps) for film, 30 fps for NTSC (used for TV in North America and
    // Japan), and 25 fps for PAL (used for TV in Europe).
    // Using a timescale of 600, you can exactly represent any number of frames in these systems.
    CMTime midpoint = CMTimeMakeWithSeconds(duration / 2.0, 600);
    // get the image at that time
    NSError *error = nil;
    CMTime actualTime;
    // Returns a CFRetained CGImageRef for an asset at or near the specified time,
    // so we must release it manually.
    CGImageRef centerFrameImage = [imageGenerator copyCGImageAtTime:midpoint
                                                         actualTime:&actualTime
                                                              error:&error];
    if (centerFrameImage != NULL) {
        image = [[UIImage alloc] initWithCGImage:centerFrameImage];
        // Release the CFRetained image
        CGImageRelease(centerFrameImage);
    }
    return image;
}
Extracting Frames Asynchronously
Fetching a frame asynchronously differs from the synchronous version only in the API call: you can pass multiple time points, and the handler reports the actual time used and returns the image — which, unlike before, we do not need to release manually. Extraction can fail, so check that the result is AVAssetImageGeneratorSucceeded before converting the image:
// Fetch frame images asynchronously; multiple frames can be requested at once
- (void)centerFrameImageWithVideoURL:(NSURL *)videoURL completion:(void (^)(UIImage *image))completion {
    // AVAssetImageGenerator
    AVAsset *asset = [AVAsset assetWithURL:videoURL];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;
    // calculate the midpoint time of video
    Float64 duration = CMTimeGetSeconds([asset duration]);
    // 600 is a commonly used timescale: it exactly represents frame counts for
    // 24 fps (film), 30 fps (NTSC), and 25 fps (PAL)
    CMTime midpoint = CMTimeMakeWithSeconds(duration / 2.0, 600);
    // request the frame(s) asynchronously
    NSValue *midTime = [NSValue valueWithCMTime:midpoint];
    [imageGenerator generateCGImagesAsynchronouslyForTimes:@[midTime] completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
        if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
            UIImage *centerFrameImage = [[UIImage alloc] initWithCGImage:image];
            dispatch_async(dispatch_get_main_queue(), ^{
                if (completion) {
                    completion(centerFrameImage);
                }
            });
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (completion) {
                    completion(nil);
                }
            });
        }
    }];
}
Compressing and Exporting Video
We compress video because high resolutions produce files that are simply too large: mobile devices cannot afford huge memory footprints, and if the server does not support chunked uploads, streaming uploads, or file uploads — only form uploads — then you must cap the size by compressing the video.
For example, one platform we integrated with still supports only form uploads, neither streaming nor file upload, so a slightly large video would crash the app. A streaming upload succeeded on our side, but their backend did not recognize it; a direct file upload also reached 100%, yet was still treated as a failure. Frustrating!
Back to the point: compressing and exporting video is done with AVAssetExportSession. We need to choose a preset and verify that it is supported; only then can we use it.
Here we use AVAssetExportPreset640x480, which is fairly aggressive compression — choose the preset according to what your server accepts for video uploads. Then compress and export the video asynchronously:
- (void)compressVideoWithVideoURL:(NSURL *)videoURL
                        savedName:(NSString *)savedName
                       completion:(void (^)(NSString *savedPath))completion {
    // Access the video by URL
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    // Find the presets compatible with this video asset
    NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:videoAsset];
    // Begin to compress the video.
    // Here we just compress to a low resolution if it is supported.
    // If your server doesn't support streaming uploads, you can compress to an even
    // lower resolution; otherwise you can afford a higher one.
    if ([presets containsObject:AVAssetExportPreset640x480]) {
        AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:AVAssetExportPreset640x480];
        NSString *doc = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
        NSString *folder = [doc stringByAppendingPathComponent:@"HYBVideos"];
        BOOL isDir = NO;
        BOOL isExist = [[NSFileManager defaultManager] fileExistsAtPath:folder isDirectory:&isDir];
        if (!isExist || (isExist && !isDir)) {
            NSError *error = nil;
            [[NSFileManager defaultManager] createDirectoryAtPath:folder
                                  withIntermediateDirectories:YES
                                                   attributes:nil
                                                        error:&error];
            if (error == nil) {
                NSLog(@"Directory created successfully");
            } else {
                NSLog(@"Failed to create directory");
            }
        }
        NSString *outputPath = [folder stringByAppendingPathComponent:savedName];
        session.outputURL = [NSURL fileURLWithPath:outputPath];
        // Optimize for network use
        session.shouldOptimizeForNetworkUse = YES;
        NSArray *supportedTypeArray = session.supportedFileTypes;
        if ([supportedTypeArray containsObject:AVFileTypeMPEG4]) {
            session.outputFileType = AVFileTypeMPEG4;
        } else if (supportedTypeArray.count == 0) {
            NSLog(@"No supported file types");
            return;
        } else {
            session.outputFileType = [supportedTypeArray objectAtIndex:0];
        }
        // Export the video to the output path asynchronously
        [session exportAsynchronouslyWithCompletionHandler:^{
            if ([session status] == AVAssetExportSessionStatusCompleted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (completion) {
                        completion([session.outputURL path]);
                    }
                });
            } else {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (completion) {
                        completion(nil);
                    }
                });
            }
        }];
    }
}
Fixing the iOS 8 View-Offset Bug After Recording Video
On iOS 8 there is a bug: after presenting the video-recording screen and returning, the whole view shifts downward. There may be many fixes online; here is one of them:
[picker dismissViewControllerAnimated:YES completion:^{
    // for fixing iOS 8.0 problem that frame changed when open camera to record video.
    self.tabBarController.view.frame = [[UIScreen mainScreen] bounds];
    [self.tabBarController.view layoutIfNeeded];
}];
Tip: remember to call this in both the pick and cancel delegate methods!
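Concretely, the tip means routing both UIImagePickerControllerDelegate callbacks through the same dismissal path. A sketch, where hyb_dismissPickerFixingFrame: is a hypothetical helper name:

```objc
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // ...handle the picked/recorded video first...
    [self hyb_dismissPickerFixingFrame:picker];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [self hyb_dismissPickerFixingFrame:picker];
}

// Shared helper so the iOS 8 frame fix runs on every dismissal path
- (void)hyb_dismissPickerFixingFrame:(UIImagePickerController *)picker {
    [picker dismissViewControllerAnimated:YES completion:^{
        self.tabBarController.view.frame = [[UIScreen mainScreen] bounds];
        [self.tabBarController.view layoutIfNeeded];
    }];
}
```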
Summary
Every kind of requirement has its pitfalls, but no number of pitfalls can defeat a heart devoted to technology — it will flatten them all. I hadn't done much video-related work before, and those who walked this road before us rarely tell later travelers where the pits are; at most they privately remember that there was one.
I'm sharing this today to help anyone stuck on these problems — consider this a sign planted here reading "pit: please don't fall in again." Now that you've read this article, do you have a clearer picture? If you're working on similar requirements, feel free to copy the code and use it directly!
Source Code
Download the source code, and remember to star it and share it:
標哥的技術博客: [VideoCaptureDemo]
