I have a question about AVFoundation's AVPlayer (probably applies to both iOS and macOS).
I am trying to play audio (uncompressed WAV) data that arrives over a channel other than a standard HTTP Live Stream.
The case:
Audio packets arrive compressed over a channel, together with other data the application needs. For example, video and audio come over the same channel and are separated by headers.
After filtering, I get the audio data and decompress it to WAV format (no headers at this stage).
Once the packets are ready (9600 bytes each, for 24 kHz stereo 16-bit audio), they are passed to an AVPlayer instance (according to Apple, AVAudioPlayer is not suitable for streaming audio).
Given that AVPlayer (item or asset) does not load from memory (there is no initWithData:(NSData *)) and needs either an HTTP Live Stream URL or a file URL, I create a file on disk (macOS or iOS), prepend the WAV header, and append the uncompressed data to it.
Back to AVPlayer. I create the following:
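For reference, a minimal sketch (plain C, with little-endian fields as required by the WAV/RIFF format) of the 44-byte header that gets prepended. At 24 kHz stereo 16-bit the byte rate is 96000 B/s, so each 9600-byte packet is exactly 100 ms of audio. Note the two length fields: a player that trusts them will treat the file as finished even while more data is being appended after them.

```c
#include <stdint.h>
#include <string.h>

/* Build the canonical 44-byte PCM WAV header for 24 kHz, stereo, 16-bit.
 * dataSize is the number of PCM bytes that follow; for a live stream its
 * final value is not known when the header is written. */
static void write_le16(uint8_t *p, uint16_t v) { p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8); }
static void write_le32(uint8_t *p, uint32_t v) {
    p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8);
    p[2] = (uint8_t)(v >> 16); p[3] = (uint8_t)(v >> 24);
}

void make_wav_header(uint8_t h[44], uint32_t dataSize) {
    const uint16_t channels = 2, bitsPerSample = 16;
    const uint32_t sampleRate = 24000;
    const uint32_t byteRate = sampleRate * channels * bitsPerSample / 8; /* 96000 B/s */
    const uint16_t blockAlign = channels * bitsPerSample / 8;            /* 4 bytes/frame */

    memcpy(h, "RIFF", 4);
    write_le32(h + 4, 36 + dataSize);   /* length field #1: rest of file */
    memcpy(h + 8, "WAVE", 4);
    memcpy(h + 12, "fmt ", 4);
    write_le32(h + 16, 16);             /* fmt chunk size */
    write_le16(h + 20, 1);              /* audio format: PCM */
    write_le16(h + 22, channels);
    write_le32(h + 24, sampleRate);
    write_le32(h + 28, byteRate);
    write_le16(h + 32, blockAlign);
    write_le16(h + 34, bitsPerSample);
    memcpy(h + 36, "data", 4);
    write_le32(h + 40, dataSize);       /* length field #2: PCM byte count */
}
```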
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:tempAudioFile] options:nil];
AVPlayerItem *audioItem = [[AVPlayerItem alloc] initWithAsset:audioAsset];
AVPlayer *audioPlayer = [[AVPlayer alloc] initWithPlayerItem:audioItem];
I add KVO and then try to start playback:
[audioPlayer play];
The result is that the audio plays for 1-2 seconds and then stops (with AVPlayerItemDidPlayToEndTimeNotification, to be exact), while data keeps being appended to the file. Since the whole thing runs in a loop, [audioPlayer play] starts and pauses (rate == 0) many times.
The whole concept, in simplified form:
-(void)PlayAudioWithData:(NSData *)data //data in encoded format
{
    NSData *decodedSound = [AudioDecoder DecodeData:data]; //decodes the data from the compressed format (Opus) to WAV
    [Player CreateTemporaryFiles]; //This creates the temporary file by appending the header and waiting for input.
    [Player SendDataToPlayer:decodedSound]; //this sends the decoded data to the Player to be stored to file. See below for appending.
    Boolean prepared = [Player isPrepared]; //a check if AVPlayer, Item and Asset are initialized
    if (!prepared) [Player Prepare]; //creates the objects like above
    Boolean playing = [Player isAudioPlaying]; //a check done on the AVPlayer if rate == 1
    if (!playing) [Player startPlay]; //this is actually [audioPlayer play]; on the AVPlayer instance
}
-(void)SendDataToPlayer:(NSData *)data
{
    //Two different methods here. First with NSFileHandle — not so sure about this though, as it definitely locks the file.
    //Initializations and deallocations happen elsewhere; the code is condensed to give you an idea.
    NSFileHandle *audioFile = [NSFileHandle fileHandleForWritingAtPath:_tempAudioFile]; //happens elsewhere
    [audioFile seekToEndOfFile];
    [audioFile writeData:data];
    [audioFile closeFile]; //happens elsewhere
    //Second method is
    NSOutputStream *audioFileStream = [NSOutputStream outputStreamWithURL:[NSURL fileURLWithPath:_tempStreamFile] append:YES];
    [audioFileStream open];
    [audioFileStream write:(const uint8_t *)data.bytes maxLength:data.length];
    [audioFileStream close];
}
Both NSFileHandle and NSOutputStream produce WAV files that QuickTime, iTunes, VLC, etc. play fine.
Also, if I bypass [Player SendDataToPlayer:decodedSound] and preload the temporary audio file with a standard WAV, it also plays fine.
So far there are two points:
a) I decompress the audio data and have it ready to play;
b) I save the data correctly.
What I am trying to do is a continuous send-write-read cycle: read right after each write.
This makes me think that saving the data to the file takes exclusive access to the file resource and does not let AVPlayer keep playing.
Does anyone know how to keep the file available to both NSFileHandle/NSOutputStream and AVPlayer?
Or even better... is there an AVPlayer initWithData? (heh...)
Any help is greatly appreciated!
Thanks in advance.
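As a quick sanity check on that theory, here is a sketch at the POSIX layer that Foundation's file APIs sit on (the path and function name are illustrative): open() takes no exclusive lock by itself, so a reader can open and read the file while a writer still holds it open for appending.

```c
#include <fcntl.h>
#include <sys/types.h>
#include <unistd.h>

/* Open the file for appending, write a few bytes, then read them back
 * through a second, independent descriptor opened while the writer is
 * still open. Returns the number of bytes the reader saw (-1 on error). */
ssize_t append_then_read(const char *path) {
    int w = open(path, O_WRONLY | O_CREAT | O_APPEND | O_TRUNC, 0644);
    if (w < 0) return -1;
    int r = open(path, O_RDONLY); /* succeeds: no mandatory lock is held */
    if (r < 0) { close(w); return -1; }
    if (write(w, "abc", 3) != 3) { close(w); close(r); return -1; }
    char buf[4] = {0};
    ssize_t n = read(r, buf, sizeof buf - 1); /* sees the appended bytes */
    close(w);
    close(r);
    return n;
}
```

If this reads all 3 bytes on your system, plain file access is not the blocker, which would suggest the stall comes from the player's view of the file (e.g. the length fields recorded in the WAV header when it was opened) rather than from file locking.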
You can use AVAssetResourceLoader to feed your own data and metadata to an AVAsset, which you can then play with an AVPlayer. In effect, this gives you a [[AVPlayer alloc] initWithData:...]:
- (AVPlayer *)playerWithWavData:(NSData* )wavData {
self.strongDelegateReference = [[NSDataAssetResourceLoaderDelegate alloc] initWithData:wavData contentType:AVFileTypeWAVE];
NSURL *url = [NSURL URLWithString:@"ns-data-scheme://"];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
// or some other queue != main queue
[asset.resourceLoader setDelegate:self.strongDelegateReference queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
return [[AVPlayer alloc] initWithPlayerItem:item];
}
You can use it like this:
[self setupAudioSession];
NSURL *wavUrl = [[NSBundle mainBundle] URLForResource:@"foo" withExtension:@"wav"];
NSData *wavData = [NSData dataWithContentsOfURL:wavUrl];
self.player = [self playerWithWavData:wavData];
[self.player play];
The thing is, AVAssetResourceLoader is very powerful (unless you want to use AirPlay), so you can probably do better than handing the audio data to AVPlayer in one shot: you can stream it into the AVAssetResourceLoader delegate as it becomes available.
Here is the simple "one chunk" AVAssetResourceLoader delegate. To modify it for streaming, it should be enough to set a contentLength larger than the amount of data you currently have.
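In the streaming variant, the delegate will receive range requests that may extend past the bytes that have arrived so far. The clamping arithmetic looks roughly like this (plain C sketch; `servable_now` is an illustrative name, not an AVFoundation API). You would serve this many bytes with respondWithData: and call finishLoading only once the whole request has been satisfied:

```c
#include <stddef.h>

/* Given a requested [offset, offset + length) window and `available` bytes
 * received so far, return how many bytes can be served right now. */
size_t servable_now(size_t offset, size_t length, size_t available) {
    if (offset >= available) return 0;          /* nothing to serve yet */
    size_t remaining = available - offset;
    return remaining < length ? remaining : length;
}
```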
The header file:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@interface NSDataAssetResourceLoaderDelegate : NSObject <AVAssetResourceLoaderDelegate>
- (instancetype)initWithData:(NSData *)data contentType:(NSString *)contentType;
@end
The implementation file:
#import "NSDataAssetResourceLoaderDelegate.h"

@interface NSDataAssetResourceLoaderDelegate()
@property (nonatomic) NSData *data;
@property (nonatomic) NSString *contentType;
@end
@implementation NSDataAssetResourceLoaderDelegate
- (instancetype)initWithData:(NSData *)data contentType:(NSString *)contentType {
if (self = [super init]) {
self.data = data;
self.contentType = contentType;
}
return self;
}
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
AVAssetResourceLoadingContentInformationRequest* contentRequest = loadingRequest.contentInformationRequest;
// TODO: check that loadingRequest.request is actually our custom scheme
if (contentRequest) {
contentRequest.contentType = self.contentType;
contentRequest.contentLength = self.data.length;
contentRequest.byteRangeAccessSupported = YES;
}
AVAssetResourceLoadingDataRequest* dataRequest = loadingRequest.dataRequest;
if (dataRequest) {
// TODO: handle requestsAllDataToEndOfResource
// Clamp the range so a request past the end of the data cannot throw.
NSUInteger offset = (NSUInteger)dataRequest.requestedOffset;
NSUInteger available = offset < self.data.length ? self.data.length - offset : 0;
NSUInteger length = MIN((NSUInteger)dataRequest.requestedLength, available);
[dataRequest respondWithData:[self.data subdataWithRange:NSMakeRange(offset, length)]];
[loadingRequest finishLoading];
}
return YES;
}
@end