iOS平台RTSP|RTMP直播播放器技术接入说明

 技术背景

大牛直播SDK自2015年发布RTSP、RTMP直播播放模块以来,迭代从未停止。SmartPlayer功能完善、性能强劲、稳定性高,延迟和资源占用都非常低,是全自研内核、行业内广泛认可的跨平台RTSP、RTMP直播播放器。本文以iOS平台为例,介绍如何集成RTSP、RTMP播放模块。

技术对接

 系统要求

  • SDK支持iOS 9.0及以上版本;
  • 支持的CPU架构:arm64(真机调试)。

准备工作

  • 相关库:libSmartPlayerSDK.a
  • 相关头文件:
    1. nt_common_media_define.h(如需转发或第三方数据对接)
    2. nt_event_define.h
    3. SmartPlayerSDK.h
  • 集成需要引入的framework:
    1. libbz.tbd
    2. libbz2.tbd
    3. libiconv.tbd
    4. libstdc++.tbd
    5. libc++.tbd
    6. Accelerate.framework
    7. AssetsLibrary.framework
    8. AudioToolbox.framework
    9. AVFoundation.framework
    10. CoreMedia.framework
    11. Foundation.framework
    12. GLKit.framework
    13. OpenGLES.framework
    14. UIKit.framework
    15. VideoToolbox.framework
  • 如需集成到自己系统测试,请用大牛直播SDK的app name:

Info.plist -> 右键Open As -> Source Code

添加或者编辑

<key>CFBundleName</key>
<string>SmartiOSPlayer</string>

  • 如需将快照保存到系统“照片”,需添加相册访问权限:

Info.plist -> 右键Open As -> Source Code 添加:

<key>NSPhotoLibraryUsageDescription</key>
<string>1</string>

  • 如需后台继续播放音频,需添加后台音频播放权限(Background Modes):
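
以常见的iOS后台音频配置为例(是否需要取决于具体业务场景),可在Info.plist中添加UIBackgroundModes,即Xcode中的Background Modes -> Audio, AirPlay, and Picture in Picture:

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>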

功能支持

iOS端,RTMP|RTSP直播播放,我们设计实现的功能如下:

  • 音频:AAC/PCMA/PCMU/SPEEX(RTMP);
  • 视频:H.264;
  • 播放协议:RTMP或RTSP;
  • 支持纯音频、纯视频、音视频播放;
  • 支持多实例播放;
  • 支持网络状态、buffer状态等回调;
  • [RTSP协议]支持RTSP TCP/UDP模式设置;
  • [RTSP协议]支持RTSP TCP、UDP模式自动切换;
  • [RTSP协议]支持RTSP超时时间设置,单位:秒;
  • [RTSP协议]支持上报RTSP 401事件,如URL携带鉴权信息,会自动处理;
  • 支持buffer time设置;
  • 支持实时静音、取消静音;
  • 支持首屏秒开功能(需服务器缓存GOP);
  • 支持超低延迟模式;
  • 支持断网自动重连、视频追赶;
  • 支持视频view实时旋转(0° 90° 180° 270°);
  • 支持视频view水平反转、垂直反转;
  • 支持图像等比例缩放绘制;
  • 支持实时快照;
  • 支持实时音量调节;
  • 支持YUV数据回调;
  • 支持H.264|H.265数据回调;
  • 支持AAC/SPEEX/PCMA/PCMU数据回调;
  • 支持RTMP扩展H.265播放(Enhanced RTMP);
  • 支持扩展录像功能;
  • 支持Unity3D接口;
  • 支持H.264扩展SEI接收模块;
  • 支持iOS 9.0及以上版本。

播放模块接口详解

iOS播放端SDK接口详解

调用描述 接口 接口描述
最先调用,创建播放实例,如成功返回player实例 SmartPlayerInitPlayer 初始化,创建player实例,此接口请第一个调用
Event回调 SmartPlayerDelegate 设置event callback,上层由handleSmartPlayerEvent处理
软、硬解码设置 SmartPlayerSetVideoDecoderMode 设置是否用硬解码播放(0:软解码; 1:硬解码),如硬解码不支持,自动适配到软解码
创建播放view SmartPlayerCreatePlayView 创建播放view,通过x、y、width、height指定播放位置和大小
设置播放view SmartPlayerSetPlayView 设置播放view到底层SDK
释放播放view SmartPlayeReleasePlayView 释放播放view
视频回调 设置YUV回调 SmartPlayerSetYuvBlock 设置拉流时,视频YUV数据回调
YUV回调 PlayerYuvDataBlock 提供解码后YUV/RGB数据接口,供用户自己render或进一步处理(如视频分析)
播放模式 缓冲时间设置 SmartPlayerSetBuffer 设置播放端缓存数据buffer,单位:毫秒,如不需buffer,设置为0
首屏秒开 SmartPlayerSetFastStartup 设置快速启动后,如果CDN缓存GOP,实现首屏秒开
低延迟模式 SmartPlayerSetLowLatencyMode 针对类似于直播娃娃机等期待超低延迟的使用场景,超低延迟播放模式下,延迟可达到200~400ms
快速切换URL SmartPlayerSwitchPlaybackUrl 快速切换播放url,快速切换时,只换播放source部分,适用于不同数据流之间,快速切换
RTSP TCP/UDP模式设置 SmartPlayerSetRTSPTcpMode 设置RTSP TCP/UDP模式,如不设置,默认UDP模式
RTSP超时时间设置 SmartPlayerSetRTSPTimeout 设置RTSP超时时间,timeout单位为秒,必须大于0
设置RTSP TCP/UDP自动切换 SmartPlayerSetRTSPAutoSwitchTcpUdp 有些RTSP流只支持rtp over udp,有些只支持rtp over tcp;为方便使用,可开启自动切换,打开后如果udp无法播放,SDK会自动尝试tcp,如果tcp播放不了,SDK会自动尝试udp
实时静音 SmartPlayerSetMute 实时静音
设置播放音量 SmartPlayerSetAudioVolume 播放端音量实时调节,范围[0,100],0时为静音,100为原始流数据最大音量
视频镜像旋转 旋转 SmartPlayerSetRotation 设置顺时针旋转, 注意除了0度之外, 其他角度都会额外消耗性能,当前支持 0度,90度, 180度, 270度 旋转
水平反转 SmartPlayerSetFlipHorizontal 设置视频水平反转
垂直反转 SmartPlayerSetFlipVertical 设置视频垂直反转
设置URL SmartPlayerSetPlayURL 设置播放或录像的url
开始播放 SmartPlayerStart 开始播放RTSP/RTMP流
停止播放 SmartPlayerStop 停止播放RTSP/RTMP流
销毁播放实例 SmartPlayerUnInitPlayer 结束时必须调用,释放播放实例资源

录像模块接口详解

如需录像,录像相关的接口如下:

iOS播放端录像SDK接口详解

调用描述 接口 接口描述
录像设置 设置录像目录 SmartPlayerSetRecorderDirectory 设置录像文件目录
设置录像文件大小 SmartPlayerSetRecorderFileMaxSize 设置每个录像文件的大小,比如100M,超过这个大小后,会自动生成下一个录像文件
音频转码 SmartPlayerSetRecorderAudioTranscodeAAC 设置录像时音频转AAC编码的开关(AAC较通用,SDK支持将speex、pcmu、pcma等其他音频编码转成AAC后录制)
录制视频 SmartPlayerSetRecorderVideo 设置是否录视频,默认的话,如果视频源有视频就录,没有就不录, 但有些场景下可能不想录制视频,只想录音频,所以增加个开关
录制音频 SmartPlayerSetRecorderAudio 设置是否录音频,默认的话,如果视频源有音频就录,没有就不录, 但有些场景下可能不想录制音频,只想录视频,所以增加个开关
开始录像 SmartPlayerStartRecorder 开始录像
停止录像 SmartPlayerStopRecorder 停止录像

Event回调详解

由于iOS播放录像SDK和播放端SDK可组合使用,相关Event同步更新在iOS播放端SDK(如下表):

iOS播放端SDK Event回调说明

事件ID 事件描述
EVENT_DANIULIVE_ERC_PLAYER_STARTED 开始播放
EVENT_DANIULIVE_ERC_PLAYER_CONNECTING 播放端连接中
EVENT_DANIULIVE_ERC_PLAYER_CONNECTION_FAILED 播放端连接失败
EVENT_DANIULIVE_ERC_PLAYER_CONNECTED 播放端连接成功
EVENT_DANIULIVE_ERC_PLAYER_DISCONNECTED 播放端连接断开
EVENT_DANIULIVE_ERC_PLAYER_STOP 停止播放
EVENT_DANIULIVE_ERC_PLAYER_RESOLUTION_INFO 返回视频宽、高信息
EVENT_DANIULIVE_ERC_PLAYER_NO_MEDIADATA_RECEIVED 收不到媒体数据(可能是URL错误)
EVENT_DANIULIVE_ERC_PLAYER_SWITCH_URL 快速切换URL
EVENT_DANIULIVE_ERC_PLAYER_RECORDER_START_NEW_FILE 开始一个新的录像文件(param3返回包含录像路径在内的录像文件名)
EVENT_DANIULIVE_ERC_PLAYER_ONE_RECORDER_FILE_FINISHED 已生成一个录像文件(param3返回包含录像路径在内的录像文件名)
EVENT_DANIULIVE_ERC_PLAYER_CAPTURE_IMAGE 播放端实时快照
EVENT_DANIULIVE_ERC_PLAYER_START_BUFFERING 开始缓冲数据
EVENT_DANIULIVE_ERC_PLAYER_BUFFERING 缓冲中(param1返回缓冲百分比)
EVENT_DANIULIVE_ERC_PLAYER_STOP_BUFFERING 停止缓冲数据
EVENT_DANIULIVE_ERC_PLAYER_DOWNLOAD_SPEED 返回当前RTSP/RTMP流实时下载速度
EVENT_DANIULIVE_ERC_PLAYER_RTSP_STATUS_CODE RTSP收到错误码,可能是用户名、密码不对

逻辑调用

先说开始播放:

//
//  ViewController.m
//  SmartiOSPlayerV2
//
//  Author: daniusdk.com
//  WeChat: xinsheng120
//  Created by daniulive on 2016/01/03.
//
- (void)playBtn:(UIButton *)button {
    
    NSLog(@"playBtn only++");
    
    button.selected = !button.selected;
    
    if (button.selected)
    {
        if(is_playing_)
            return;
        
        [self InitPlayer];
        
        //如需处理回调的用户数据+++++++++
        __weak __typeof(self) weakSelf = self;
        
        _smart_player_sdk.spUserDataCallBack = ^(int data_type, unsigned char *data, unsigned int size, unsigned long long timestamp, unsigned long long reserve1, long long reserve2, unsigned char *reserve3)
        {
            [weakSelf OnUserDataCallBack:data_type data:data size:size timestamp:timestamp reserve1:reserve1 reserve2:reserve2 reserve3:reserve3];
        };
        
        Boolean enableUserDataCallback = YES;
        [_smart_player_sdk SmartPlayerSetUserDataCallback:enableUserDataCallback];
         //如需处理回调的用户数据---------
        
        if(![self StartPlayer])
        {
            NSLog(@"Call StartPlayer failed..");
        }
        
        [playbackButton setTitle:@"停止播放" forState:UIControlStateNormal];
        
        is_playing_ = YES;
    }
    else
    {
        if ( !is_playing_ )
            return;
        
        [self StopPlayer];
        
        if(!is_recording_)
        {
            [self UnInitPlayer];
        }
        
        [playbackButton setTitle:@"开始播放" forState:UIControlStateNormal];
        
        is_mute_ = NO;
        [muteButton setTitle:@"实时静音" forState:UIControlStateNormal];
        
        is_playing_ = NO;
    }
}

其中,InitPlayer实现如下:

-(bool)InitPlayer
{
    NSLog(@"InitPlayer++");
    
    if(is_inited_player_)
    {
        NSLog(@"InitPlayer: has inited before..");
        return true;
    }
    
    //NSString* in_cid = @"";
    //NSString* in_key = @"";
    
    //[SmartPlayerSDK SmartPlayerSetSDKClientKey:in_cid in_key:in_key reserve1:0 reserve2:nil];
    
    _smart_player_sdk = [[SmartPlayerSDK alloc] init];
    
    if (_smart_player_sdk ==nil ) {
        NSLog(@"SmartPlayerSDK init failed..");
        return false;
    }
    
    if (playback_url_.length == 0) {
        NSLog(@"playback url is nil..");
        return false;
    }
    
    if (_smart_player_sdk.delegate == nil)
    {
        _smart_player_sdk.delegate = self;
        NSLog(@"SmartPlayerSDK _player.delegate:%@", _smart_player_sdk);
    }
    
    NSInteger initRet = [_smart_player_sdk SmartPlayerInitPlayer];
    if ( initRet != DANIULIVE_RETURN_OK )
    {
        NSLog(@"SmartPlayerSDK call SmartPlayerInitPlayer failed, ret=%ld", (long)initRet);
        return false;
    }
    
    [_smart_player_sdk SmartPlayerSetPlayURL:playback_url_];
    //[self try_set_rtsp_url:playback_url_];
    
    //超低延迟模式设置
    [_smart_player_sdk SmartPlayerSetLowLatencyMode:(NSInteger)is_low_latency_mode_];
    
    //buffer time设置
    if(buffer_time_ >= 0)
    {
        [_smart_player_sdk SmartPlayerSetBuffer:buffer_time_];
    }
    
    //快速启动模式设置
    [_smart_player_sdk SmartPlayerSetFastStartup:(NSInteger)is_fast_startup_];
    
    NSLog(@"[SmartPlayerV2]is_fast_startup_:%d, buffer_time_:%ld", is_fast_startup_, (long)buffer_time_);
    
    //RTSP TCP还是UDP模式
    [_smart_player_sdk SmartPlayerSetRTSPTcpMode:is_rtsp_tcp_mode_];
 
    //设置RTSP超时时间
    NSInteger rtsp_timeout = 10;
    [_smart_player_sdk SmartPlayerSetRTSPTimeout:rtsp_timeout];
    
    //设置RTSP TCP/UDP自动切换
    NSInteger is_tcp_udp_auto_switch = 1;
    [_smart_player_sdk SmartPlayerSetRTSPAutoSwitchTcpUdp:is_tcp_udp_auto_switch];
    
    //快照设置 如需快照 参数传1
    [_smart_player_sdk SmartPlayerSaveImageFlag:save_image_flag_];
    
    //如需查看实时流量信息,可打开以下接口
    NSInteger is_report = 1;
    NSInteger report_interval = 3;
    [_smart_player_sdk SmartPlayerSetReportDownloadSpeed:is_report report_interval:report_interval];
    
    //录像端音频,是否转AAC后保存
    NSInteger is_transcode = 1;
    [_smart_player_sdk SmartPlayerSetRecorderAudioTranscodeAAC:is_transcode];
    
    //录制MP4文件 是否录制视频
    NSInteger is_record_video = 1;
    [_smart_player_sdk SmartPlayerSetRecorderVideo:is_record_video];
    
    //录制MP4文件 是否录制音频
    NSInteger is_record_audio = 1;
    [_smart_player_sdk SmartPlayerSetRecorderAudio:is_record_audio];
    
    
    is_inited_player_ = YES;
    
    NSLog(@"InitPlayer--");
    return true;
}
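
playBtn里调用的StartPlayer,本文未贴出完整实现,下面给出一个参考示意(仅演示调用顺序;播放view的创建与SmartPlayerSetPlayView的具体接口签名为假设,请以SmartPlayerSDK.h和demo工程为准):

-(bool)StartPlayer
{
    NSLog(@"StartPlayer++");

    if (_smart_player_sdk == nil)
        return false;

    if (!is_audio_only_ && _glView != nil)
    {
        //假设:将已创建的播放view传给底层SDK,接口签名仅作示意
        [_smart_player_sdk SmartPlayerSetPlayView:(__bridge void *)(_glView)];
    }

    if ([_smart_player_sdk SmartPlayerStart] != DANIULIVE_RETURN_OK)
    {
        NSLog(@"Call SmartPlayerStart failed..");
        return false;
    }

    NSLog(@"StartPlayer--");
    return true;
}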

停止播放StopPlayer实现如下:

-(bool)StopPlayer
{
    NSLog(@"StopPlayer++");
    
    if (_smart_player_sdk != nil)
    {
        [_smart_player_sdk SmartPlayerStop];
    }
    
    if (!is_audio_only_) {
        if (_glView != nil) {
            [_glView removeFromSuperview];
            [SmartPlayerSDK SmartPlayeReleasePlayView:(__bridge void *)(_glView)];
            _glView = nil;
        }
    }
    
    NSLog(@"StopPlayer--");
    return true;
}

UnInitPlayer实现如下:

-(bool)UnInitPlayer
{
    NSLog(@"UnInitPlayer++");
    
    if (_smart_player_sdk != nil)
    {
        [_smart_player_sdk SmartPlayerUnInitPlayer];
        
        if (_smart_player_sdk.delegate != nil)
        {
            _smart_player_sdk.delegate = nil;
        }
        
        _smart_player_sdk = nil;
    }
    
    is_inited_player_ = NO;
    
    NSLog(@"UnInitPlayer--");
    return true;
}

实时录像:

- (void)RecorderBtn:(UIButton *)button {
    
    NSLog(@"record Stream only++");
    
    button.selected = !button.selected;
    
    if (button.selected)
    {
        if(is_recording_)
            return;
        
        [self InitPlayer];
        
        //设置录像目录
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *recorderDir = [paths objectAtIndex:0];
        
        if([_smart_player_sdk SmartPlayerSetRecorderDirectory:recorderDir] != DANIULIVE_RETURN_OK)
        {
            NSLog(@"Call SmartPlayerSetRecorderDirectory failed..");
        }
        
        //每个录像文件大小
        NSInteger size = 200;
        if([_smart_player_sdk SmartPlayerSetRecorderFileMaxSize:size] != DANIULIVE_RETURN_OK)
        {
            NSLog(@"Call SmartPlayerSetRecorderFileMaxSize failed..");
        }
        
        [_smart_player_sdk SmartPlayerStartRecorder];
        [recButton setTitle:@"停止录像" forState:UIControlStateNormal];
        
        is_recording_ = YES;
    }
    else
    {
        [_smart_player_sdk SmartPlayerStopRecorder];
        [recButton setTitle:@"开始录像" forState:UIControlStateNormal];
        
        if(!is_playing_)
        {
            [self UnInitPlayer];
        }
        
        is_recording_ = NO;
    }
}

实时快照:

- (void)SaveImageBtn:(UIButton *)button {
    if ( _smart_player_sdk != nil )
    {
        //设置快照目录
        NSLog(@"[SaveImageBtn] path++");
        
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *saveImageDir = [paths objectAtIndex:0];
        
        NSLog(@"[SaveImageBtn] path: %@", saveImageDir);
        
        NSString* symbol = @"/";
        
        NSString* png = @".png";
        
        // 1.创建时间
        NSDate *datenow = [NSDate date];
        // 2.创建时间格式化
        NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
        // 3.指定格式
        formatter.dateFormat = @"yyyyMMdd_HHmmss";
        // 4.格式化时间
        NSString *timeSp = [formatter stringFromDate:datenow];
        
        NSString* image_name =  [saveImageDir stringByAppendingString:symbol];
        
        image_name = [image_name stringByAppendingString:timeSp];
        
        image_name = [image_name stringByAppendingString:png];
        
        NSLog(@"[SaveImageBtn] image_name: %@", image_name);
        
        [_smart_player_sdk SmartPlayerSaveCurImage:image_name];
    }
}

Event回调处理如下:

- (NSInteger) handleSmartPlayerEvent:(NSInteger)nID param1:(unsigned long long)param1 param2:(unsigned long long)param2 param3:(NSString*)param3 param4:(NSString*)param4 pObj:(void *)pObj
{
    NSString* player_event = @"";
    NSString* lable = @"";
    
    if (nID == EVENT_DANIULIVE_ERC_PLAYER_STARTED) {
        player_event = @"[event]开始播放..";
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_CONNECTING)
    {
        player_event = @"[event]连接中..";
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_CONNECTION_FAILED)
    {
        player_event = @"[event]连接失败..";
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_CONNECTED)
    {
        player_event = @"[event]已连接..";
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_DISCONNECTED)
    {
        player_event = @"[event]断开连接..";
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_STOP)
    {
        player_event = @"[event]停止播放..";
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_RESOLUTION_INFO)
    {
        NSString *str_w = [NSString stringWithFormat:@"%ld", (long)param1];
        NSString *str_h = [NSString stringWithFormat:@"%ld", (long)param2];
        
        lable = @"[event]视频解码分辨率信息: ";
        player_event = [lable stringByAppendingFormat:@"%@*%@", str_w, str_h];
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_NO_MEDIADATA_RECEIVED)
    {
        player_event = @"[event]收不到RTMP数据..";
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_SWITCH_URL)
    {
        player_event = @"[event]快速切换url..";
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_CAPTURE_IMAGE)
    {
        if ((int)param1 == 0)
        {
            NSLog(@"[event]快照成功: %@", param3);
            lable = @"[event]快照成功:";
            player_event = [lable stringByAppendingFormat:@"%@", param3];
            
            tmp_path_ = param3;
            
            image_path_ = [UIImage imageWithContentsOfFile:param3]; //快照为沙盒内的文件路径,需用imageWithContentsOfFile:读取(imageNamed:仅用于bundle资源)
            
            UIImageWriteToSavedPhotosAlbum(image_path_, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);
        }
        else
        {
            lable = @"[event]快照失败";
            player_event = [lable stringByAppendingFormat:@"%@", param3];
        }
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_RECORDER_START_NEW_FILE)
    {
        lable = @"[event]录像写入新文件..文件名:";
        player_event = [lable stringByAppendingFormat:@"%@", param3];
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_ONE_RECORDER_FILE_FINISHED)
    {
        lable = @"一个录像文件完成..文件名:";
        player_event = [lable stringByAppendingFormat:@"%@", param3];
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_START_BUFFERING)
    {
        //NSLog(@"[event]开始buffer..");
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_BUFFERING)
    {
        NSLog(@"[event]buffer百分比: %lld", param1);
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_STOP_BUFFERING)
    {
        //NSLog(@"[event]停止buffer..");
    }
    else if (nID == EVENT_DANIULIVE_ERC_PLAYER_DOWNLOAD_SPEED)
    {
        NSInteger speed_kbps = (NSInteger)param1*8/1000;
        NSInteger speed_KBs = (NSInteger)param1/1024;
        
        lable = @"[event]download speed :";
        player_event = [lable stringByAppendingFormat:@"%ld kbps - %ld KB/s", (long)speed_kbps, (long)speed_KBs];
    }
    else if(nID == EVENT_DANIULIVE_ERC_PLAYER_RTSP_STATUS_CODE)
    {
        lable = @"[event]RTSP status code received:";
        player_event = [lable stringByAppendingFormat:@"%ld", (long)param1];
        
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            dispatch_async(dispatch_get_main_queue(), ^{
                UIAlertController *aleView=[UIAlertController alertControllerWithTitle:@"RTSP错误状态" message:player_event preferredStyle:UIAlertControllerStyleAlert];
                UIAlertAction *action_ok=[UIAlertAction actionWithTitle:@"确定" style:UIAlertActionStyleCancel handler:nil];
                [aleView addAction:action_ok];
                
                [self presentViewController:aleView animated:YES completion:nil];
            });
        });
    }
    else if(nID == EVENT_DANIULIVE_ERC_PLAYER_NEED_KEY)
    {
        player_event = @"[event]RTMP加密流,请设置播放需要的Key..";
    }
    else if(nID == EVENT_DANIULIVE_ERC_PLAYER_KEY_ERROR)
    {
        player_event = @"[event]RTMP加密流,Key错误,请重新设置..";
    }
    else
        NSLog(@"[event]nID:%lx", (long)nID);
    
    NSString* player_event_tag = @"当前状态:";
    NSString* event = [player_event_tag stringByAppendingFormat:@"%@", player_event];
    
    if ( player_event.length != 0)
    {
        NSLog(@"%@", event);
    }
    
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            dispatch_async(dispatch_get_main_queue(), ^{
                self.textPlayerEventLabel.text = event;
            });
    });
    
    return 0;
}

总结

iOS平台RTSP、RTMP直播播放模块,延迟低、资源占用少、性能优异。由于iOS设备和系统相对单一,优先考虑硬解码。除了基础播放外,我们还实现了实时快照、实时录像、实时回调YUV数据、实时音量调节等,实际体验下来,iOS平台RTMP和RTSP播放,延迟可以轻松做到毫秒级。

Android平台实现屏幕录制(屏幕投影)|音频播放采集|麦克风采集并推送RTMP或轻量级RTSP服务

技术背景

好多开发者希望我们能系统地介绍下无纸化同屏的原理和集成步骤。以Android平台为例,无纸化同屏是将Android设备上的屏幕内容实时投射到另一个显示设备(如Windows终端、国产化操作系统或另一台Android设备)上,从而实现多屏互动和内容的无缝共享。

技术考量指标

本文以大牛直播SDK Android同屏采集推送为例,介绍下我们前些年做Android同屏采集推送时的一些注意点(核心采集流程可参考列表后的示例代码):

  1. 声明所需权限:在Android应用的AndroidManifest.xml文件中声明必要的权限;
  2. 获取MediaProjectionManager服务:在你Activity或Service,通过getSystemService方法获取MediaProjectionManager服务;
  3. 创建并启动屏幕捕获Intent:使用MediaProjectionManager的createScreenCaptureIntent方法创建一个Intent,该Intent会启动一个系统对话框,请求用户授权屏幕捕获;
  4. 处理用户授权结果:在onActivityResult回调中,根据用户授权的结果来获取MediaProjection对象;
  5. 创建VirtualDisplay并捕获屏幕:获得了MediaProjection对象,就可以使用它来创建一个VirtualDisplay,这个VirtualDisplay会捕获屏幕内容并将其发送到指定的Surface;
  6. 资源释放:当屏幕捕获不再需要时,确保释放MediaProjection和VirtualDisplay对象,以避免资源泄露;
  7. 视频编码:通过上述步骤捕获到的屏幕内容,需要进行视频编码(如H.264、H.265)以便在网络中传输,同时设置合适的分辨率、帧率、码率,以适应不同的网络环境和接收设备的性能;
  8. 流媒体协议:为了将编码后的视频流实时传输到接收端,Android无纸化同屏技术通常采用RTMP推流模式或轻量级RTSP服务。
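
上述第2~6步对应的系统MediaProjection API核心调用示意如下(类名、请求码等均为示例,具体采集与编码请结合SDK的采集模块实现):

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.view.Surface;

public class ScreenCaptureHelper {
    private static final int REQUEST_MEDIA_PROJECTION = 1001; // 示例请求码

    private MediaProjectionManager projection_manager_;
    private MediaProjection media_projection_;
    private VirtualDisplay virtual_display_;

    // 第2、3步: 获取MediaProjectionManager服务,并发起屏幕捕获授权请求
    public void requestCapture(Activity activity) {
        projection_manager_ = (MediaProjectionManager) activity.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        activity.startActivityForResult(projection_manager_.createScreenCaptureIntent(), REQUEST_MEDIA_PROJECTION);
    }

    // 第4、5步: 处理授权结果,创建VirtualDisplay,把屏幕内容输出到指定Surface(如编码器输入Surface)
    public void onActivityResult(int requestCode, int resultCode, Intent data,
                                 int width, int height, int dpi, Surface encoder_surface) {
        if (requestCode != REQUEST_MEDIA_PROJECTION || resultCode != Activity.RESULT_OK || data == null)
            return;

        media_projection_ = projection_manager_.getMediaProjection(resultCode, data);
        virtual_display_ = media_projection_.createVirtualDisplay("screen-capture",
                width, height, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                encoder_surface, null, null);
    }

    // 第6步: 资源释放
    public void release() {
        if (virtual_display_ != null) {
            virtual_display_.release();
            virtual_display_ = null;
        }
        if (media_projection_ != null) {
            media_projection_.stop();
            media_projection_ = null;
        }
    }
}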

技术实现

本文以大牛直播SDK Android端的SmartServicePublisherV2同屏demo为例:Android端采集屏幕和音频后,编码打包,分别启动RTMP推送和轻量级RTSP服务,Windows端分别拉取RTMP和RTSP流,整体延迟毫秒级:

启动APP后,先选择需要采集的分辨率(如果选原始分辨率,系统不做缩放),然后选择“启动媒体投影”,并按需启动音频播放采集、麦克风采集。如果音频播放采集和麦克风采集都打开,可以通过右侧下拉框,在推送过程中实时切换两路音频。需要注意的是,Android对系统音频播放(audio playback)的采集依赖屏幕投影,屏幕投影关闭后,音频播放也就采集不到了。

编码方面,考虑到屏幕分辨率一般不会太低,我们可以缩放后再推送,默认提供了原始分辨率、标准分辨率、低分辨率三档选项。一般建议标准分辨率即可;如果对画质和分辨率要求比较高,可以选择原始分辨率。如果设备支持硬编码,优先选择H.264硬编;如果用H.265硬编,需要RTMP服务器支持扩展H.265(或Enhanced RTMP)。

都选择好后,设置RTMP推送的URL,点开始RTMP推送按钮即可。

如果需要通过轻量级RTSP服务,发布RTSP流,先点击启动RTSP服务按钮,RTSP服务启动后,再点击启动RTSP流,RTSP流发布成功后,界面会回调上来RTSP拉流的URL。

下面从代码逻辑实现角度,介绍下同屏的具体流程:

启动媒体服务:进入系统后,我们会自动启动媒体服务,对应的实现逻辑如下:

/*
 * MainActivity.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
private void start_media_service() {
	Intent intent = new Intent(getApplicationContext(), StreamMediaDemoService.class);
	if (Build.VERSION.SDK_INT >= 26) {
		Log.i(TAG, "startForegroundService");
		startForegroundService(intent);
	} else
		startService(intent);

	bindService(intent, service_connection_, Context.BIND_AUTO_CREATE);
	button_stop_media_service_.setText("停止媒体服务");
}

private void stop_media_service() {
	if (media_engine_callback_ != null)
		media_engine_callback_.reset(null);

	if (media_engine_ != null) {
		media_engine_.unregister_callback(media_engine_callback_);
		media_engine_ = null;
	}

	media_engine_callback_ = null;

	if (media_binder_ != null) {
		media_binder_ = null;
		unbindService(service_connection_);
	}

	Intent intent = new Intent(getApplicationContext(), StreamMediaDemoService.class);
	stopService(intent);
	button_stop_media_service_.setText("启动媒体服务");
}
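
demo里的StreamMediaDemoService是一个前台Service,具体实现请以工程源码为准。下面给出一个前台服务的最小示意(通知渠道、通知内容等为示例值);另外Android 10及以上使用MediaProjection时,还需要在AndroidManifest.xml中为该Service声明 android:foregroundServiceType="mediaProjection":

import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.Service;
import android.content.Intent;
import android.os.Build;
import android.os.IBinder;

public class StreamMediaDemoService extends Service {
    private static final String CHANNEL_ID = "stream_media_demo"; // 示例渠道ID

    @Override
    public void onCreate() {
        super.onCreate();
        if (Build.VERSION.SDK_INT >= 26) {
            NotificationChannel channel = new NotificationChannel(CHANNEL_ID,
                    "Stream Media", NotificationManager.IMPORTANCE_LOW);
            ((NotificationManager) getSystemService(NOTIFICATION_SERVICE)).createNotificationChannel(channel);
        }
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // 前台服务需尽快调用startForeground, 否则会被系统终止
        Notification.Builder builder = Build.VERSION.SDK_INT >= 26
                ? new Notification.Builder(this, CHANNEL_ID)
                : new Notification.Builder(this);
        Notification notification = builder
                .setContentTitle("屏幕采集推送中")
                .setSmallIcon(android.R.drawable.ic_media_play)
                .build();
        startForeground(1, notification);
        return START_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        // demo中实际会返回可获取media engine的Binder(media_binder_), 此处仅为最小示意
        return null;
    }
}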

需要注意的是,Android 6.0及以上版本,动态获取Audio权限:

/*
 * MainActivity.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
private boolean check_record_audio_permission() {
	//6.0及以上版本,动态获取Audio权限
	if (PackageManager.PERMISSION_GRANTED == checkPermission(android.Manifest.permission.RECORD_AUDIO, Process.myPid(), Process.myUid()))
		return true;

	return false;
}

private void request_audio_permission() {
	if (Build.VERSION.SDK_INT < 23)
		return;

	Log.i(TAG, "requestPermissions RECORD_AUDIO");
	ActivityCompat.requestPermissions(this, new String[] {android.Manifest.permission.RECORD_AUDIO}, REQUEST_AUDIO_CODE);
}

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
	switch(requestCode){
		case REQUEST_AUDIO_CODE:
			if (grantResults != null && grantResults.length > 0 && PackageManager.PERMISSION_GRANTED == grantResults[0]) {
				Log.i(TAG, "RECORD_AUDIO permission has been granted");
			}else {
				Toast.makeText(this, "请开启录音权限!", Toast.LENGTH_SHORT).show();
			}
			break;
	}
}

启动、停止媒体投影:

/*
 * MainActivity.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
private class ButtonStartMediaProjectionListener implements OnClickListener {
	public void onClick(View v) {
		if (null == media_engine_)
			return;

		if (media_engine_.is_video_capture_running()) {
			media_engine_.stop_audio_playback_capture();
			media_engine_.stop_video_capture();
			resolution_selector_.setEnabled(true);
			button_capture_audio_playback_.setText("采集音频播放");
			button_start_media_projection_.setText("启动媒体投影");
			return;
		}

		Intent capture_intent;
		capture_intent = media_projection_manager_.createScreenCaptureIntent();

		startActivityForResult(capture_intent, REQUEST_MEDIA_PROJECTION);
		Log.i(TAG, "startActivityForResult request media projection");
	}
}

启动媒体投影后,选择“采集音频播放”,如果需要采集麦克风,可以点击“采集麦克风”:

/*
 * MainActivity.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
private class ButtonCaptureAudioPlaybackListener implements OnClickListener {
	public void onClick(View v) {
		if (null == media_engine_)
			return;

		if (media_engine_.is_audio_playback_capture_running()) {
			media_engine_.stop_audio_playback_capture();
			button_capture_audio_playback_.setText("采集音频播放");
			return;
		}

		if (!media_engine_.start_audio_playback_capture(44100, 1))
			Log.e(TAG, "start_audio_playback_capture failed");
		else
			button_capture_audio_playback_.setText("停止音频播放采集");
	}
}

private class ButtonStartAudioRecordListener implements OnClickListener {
	public void onClick(View v) {
		if (null == media_engine_)
			return;

		if (media_engine_.is_audio_record_running()) {
			media_engine_.stop_audio_record();
			button_start_audio_record_.setText("采集麦克风");
			return;
		}

		if (!media_engine_.start_audio_record(44100, 1))
			Log.e(TAG, "start_audio_record failed");
		else
			button_start_audio_record_.setText("停止麦克风");
	}
}

启动、停止RTMP推送:

/*
 * MainActivity.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
private class ButtonRTMPPublisherListener implements OnClickListener {
	@Override
	public void onClick(View v) {
		if (null == media_engine_)
			return;

		if (media_engine_.is_rtmp_stream_running()) {
			media_engine_.stop_rtmp_stream();
			button_rtmp_publisher_.setText("开始RTMP推送");
			text_view_rtmp_url_.setText("RTMP URL: ");
			Log.i(TAG, "stop rtmp stream");
			return;
		}

		if (!media_engine_.is_video_capture_running())
			return;

		String rtmp_url;
		if (input_rtmp_url_ != null && input_rtmp_url_.length() > 1) {
			rtmp_url = input_rtmp_url_;
			Log.i(TAG, "start, input rtmp url:" + rtmp_url);
		} else {
			rtmp_url = baseURL + String.valueOf((int) (System.currentTimeMillis() % 1000000));
			Log.i(TAG, "start, generate random url:" + rtmp_url);
		}

		media_engine_.set_fps(fps_);
		media_engine_.set_gop(gop_);
		media_engine_.set_video_encoder_type(video_encoder_type);

		if (!media_engine_.start_rtmp_stream(rtmp_url))
			return;

		button_rtmp_publisher_.setText("停止RTMP推送");
		text_view_rtmp_url_.setText("RTMP URL:" + rtmp_url);
		Log.i(TAG, "RTMP URL:" + rtmp_url);
	}
}

启动RTSP服务:

/*
 * MainActivity.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
private class ButtonRTSPServiceListener implements OnClickListener {
	public void onClick(View v) {
		if (null == media_engine_)
			return;

		if (media_engine_.is_rtsp_server_running()) {
			media_engine_.stop_rtsp_stream();
			media_engine_.stop_rtsp_server();
			button_rtsp_publisher_.setText("启动RTSP流");
			button_rtsp_service_.setText("启动RTSP服务");
			text_view_rtsp_url_.setText("RTSP URL:");
			return;
		}

		if (!media_engine_.start_rtsp_server(rtsp_port_, null, null))
			return;

		button_rtsp_service_.setText("停止RTSP服务");
	}
}

发布RTSP流:

/*
 * MainActivity.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
private class ButtonRtspPublisherListener implements OnClickListener {
	public void onClick(View v) {
		if (null == media_engine_)
			return;

		if (media_engine_.is_rtsp_stream_running()) {
			media_engine_.stop_rtsp_stream();
			button_rtsp_publisher_.setText("启动RTSP流");
			text_view_rtsp_url_.setText("RTSP URL:");
			return;
		}

		if (!media_engine_.is_video_capture_running())
			return;

		media_engine_.set_fps(fps_);
		media_engine_.set_gop(gop_);
		media_engine_.set_video_encoder_type(video_encoder_type);

		if (!media_engine_.start_rtsp_stream("stream1"))
			return;

		button_rtsp_publisher_.setText("停止RTSP流");
	}
}

RTSP流发布成功后,底层会把RTSP拉流的URL回调上来:

/*
 * MainActivity.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
@Override
public void on_nt_rtsp_stream_url(String url) {
	Log.i(TAG, "on_nt_rtsp_stream_url: " + url);

	MainActivity activity = get_activity();
	if (activity != null) {
		activity.runOnUiThread(new Runnable() {
			MainActivity activity_;
			String url_;

			@Override
			public void run() {
			   activity_.text_view_rtsp_url_.setText("RTSP URL:" + url_);
			}

			public Runnable set(MainActivity activity, String url) {
				this.activity_ = activity;
				this.url_ = url;
				return this;
			}
		}.set(activity, url));
	}
}

可以看到,上述操作都是在MainActivity.java中调用的。如果只需要做demo级集成,只需关注MainActivity.java的业务逻辑即可。为了便于开发者对接,我们做了接口的二次封装,除了常规的RTMP推送、轻量级RTSP服务外,如果需要录像,也只要在MainActivity.java调用相应的接口即可(接口定义如下,录像调用示意见接口定义之后),非常方便:

/*
 * NTStreamMediaEngine.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
package com.daniulive.smartpublisher;

public interface NTStreamMediaEngine {
    void register_callback(Callback callback);

    void unregister_callback(Callback callback);

    void set_resolution_level(int level);

    int get_resolution_level();

    /*
    * 启动媒体投影
     */
    boolean start_video_capture(int token_code, android.content.Intent token_data);

    boolean is_video_capture_running();

    void stop_video_capture();

    /*
    * 启动麦克风
     */
    boolean start_audio_record(int sample_rate, int channels);

    boolean is_audio_record_running();

    void stop_audio_record();

    /*
     *  Android 10及以上支持, Android10以下设备调用直接返回false
     *  需要有RECORD_AUDIO权限
     *  要开启媒体投影
     */
    boolean start_audio_playback_capture(int sample_rate, int channels);

    boolean is_audio_playback_capture_running();

    void stop_audio_playback_capture();

    /*
     * 输出的音频类型
     *  0: 不输出音频
     *  1: 输出麦克风
     *  2: 输出audio playback(Android 10及以上支持)
     */
    boolean set_audio_output_type(int type);

    int get_audio_output_type();

    void set_fps(int fps);

    void set_gop(int gop);

    boolean set_video_encoder_type(int video_encoder_type);

    int get_video_encoder_type();

    /*
    * 推送RTMP
     */
    boolean start_rtmp_stream(String url);

    boolean is_rtmp_stream_running();

    String get_rtmp_stream_url();

    void stop_rtmp_stream();

    /*
    * 启动RTSP Server, 需要设置端口,用户名和密码可选
     */
    boolean start_rtsp_server(int port, String user_name, String password);

    boolean is_rtsp_server_running();

    void stop_rtsp_server();

    /*
    * 发布RTSP流
     */
    boolean start_rtsp_stream(String stream_name);

    boolean is_rtsp_stream_running();

    String get_rtsp_stream_url();

    void stop_rtsp_stream();

    /*
    * 启动本地录像
     */
    boolean start_stream_record(String record_directory, int file_max_size);

    boolean is_stream_recording();

    void stop_stream_record();

    boolean is_stream_running();

    interface Callback {
        void on_nt_video_capture_stop();
        void on_nt_rtsp_stream_url(String url);
    }
}
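
比如录像,一个基于上述接口的调用示意如下(media_engine_为绑定服务后获取的NTStreamMediaEngine实例;录像目录与单个文件大小仅为示例值,文件大小单位请以SDK定义为准):

// MainActivity.java中录像开关的调用示意
private void toggle_stream_record() {
    if (null == media_engine_)
        return;

    if (media_engine_.is_stream_recording()) {
        media_engine_.stop_stream_record();
        return;
    }

    // 录像目录建议使用应用私有目录, 避免额外的存储权限问题
    String record_dir = getExternalFilesDir(null).getAbsolutePath();

    if (!media_engine_.start_stream_record(record_dir, 200))
        Log.e(TAG, "start_stream_record failed");
}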

如果对音视频这块相对了解的开发者,可以继续到NTStreamMediaProjectionEngineImpl.java文件,查看或修改相关的技术实现:

/*
 * NTStreamMediaProjectionEngineImpl.java
 * Created by daniusdk.com on 2017/04/19.
 * WeChat: xinsheng120
 */
package com.daniulive.smartpublisher;

import android.app.Activity;
import android.app.Application;
import android.app.Service;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.Point;
import android.graphics.Rect;
import android.media.Image;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Process;
import android.util.Log;
import android.util.Size;
import android.view.Surface;
import android.view.WindowManager;
import android.view.WindowMetrics;

import com.eventhandle.NTSmartEventCallbackV2;
import com.eventhandle.NTSmartEventID;
import com.voiceengine.NTAudioRecordV2;
import com.voiceengine.NTAudioRecordV2Callback;
import com.videoengine.NTMediaProjectionCapture;
import com.voiceengine.NTAudioPlaybackCapture;

import java.lang.ref.WeakReference;
import java.nio.ByteBuffer;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.atomic.AtomicReference;

public class NTStreamMediaProjectionEngineImpl implements AutoCloseable, NTStreamMediaEngine,
        NTVirtualDisplaySurfaceSinker.Callback, NTMediaProjectionCapture.Callback {
    private static final String TAG = "NTLogProjectionEngine";

    private static final Size DEFAULT_SIZE = new Size(1920, 1080);

    public static final int RESOLUTION_LOW = 0;
    public static final int RESOLUTION_MEDIUM = 1;
    public static final int RESOLUTION_HIGH = 2;

    private final Application application_;

    private final long image_thread_id_;
    private final long running_thread_id_;

    private final Handler image_handler_;
    private final Handler running_handler_;

    private final WindowManager window_manager_;
    private final MediaProjectionManager projection_manager_;
    private int screen_density_dpi_ = android.util.DisplayMetrics.DENSITY_DEFAULT;

    private final SmartPublisherJniV2 lib_publisher_;
    private final LibPublisherWrapper.RTSPServer rtsp_server_;
    private final LibPublisherWrapper stream_publisher_;

    private final CopyOnWriteArrayList<NTStreamMediaEngine.Callback> callbacks_ = new CopyOnWriteArrayList<>();

    private final AtomicReference<VideoSinkerCapturePair> video_capture_pair_ = new AtomicReference<>();

    private final AudioRecordCallbackImpl audio_record_callback_;
    private final AudioPlaybackCaptureCallbackImpl audio_playback_capture_callback_;

    private final AtomicReference<NTAudioRecordV2> audio_record_ = new AtomicReference<>();
    private final AtomicReference<NTAudioPlaybackCapture> audio_playback_capture_ = new AtomicReference<>();
	
	...
}

以Android平台RTMP推送模块为例,我们主要实现了如下功能:

  • 音频编码:AAC/SPEEX;
  • 视频编码:H.264、H.265;
  • 推流协议:RTMP;
  • [音视频]支持纯音频/纯视频/音视频推送;
  • [摄像头]支持采集过程中,前后摄像头实时切换;
  • 支持帧率、关键帧间隔(GOP)、码率(bit-rate)设置;
  • 支持RTMP推送 live|record模式设置;
  • 支持前置摄像头镜像设置;
  • 支持软编码、特定机型硬编码;
  • 支持横屏、竖屏推送;
  • 支持Android屏幕采集推送;
  • 支持自建标准RTMP服务器或CDN;
  • 支持断网自动重连、网络状态回调;
  • 支持实时动态水印;
  • 支持实时快照;
  • 支持降噪处理、自动增益控制;
  • 支持外部编码前音视频数据对接;
  • 支持外部编码后音视频数据对接;
  • 支持RTMP扩展H.265(需设备支持H.265特定机型硬编码)和Enhanced RTMP;
  • 支持实时音量调节;
  • 支持扩展录像模块;
  • 支持Unity接口;
  • 支持H.264扩展SEI发送模块;
  • 支持Android 5.1及以上版本。

轻量级RTSP服务,在上述非RTMP协议依赖的基础上,增加了如下功能:

  •  [音频格式]AAC;
  •  [视频格式]H.264、H.265;
  •  [协议类型]RTSP;
  •  [传输模式]支持单播和组播模式;
  •  [端口设置]支持RTSP端口设置;
  •  [鉴权设置]支持RTSP鉴权用户名、密码设置;
  •  [获取session连接数]支持获取当前RTSP服务会话连接数;
  •  [多服务支持]支持同时创建多个内置RTSP服务;
  •  [RTSP url回调]支持设置后的rtsp url通过event回调到上层。

总结

以上是Android平台屏幕采集、音频播放声音采集、麦克风采集、编码打包并推送到RTMP和轻量级RTSP服务的相关技术实现。要做成高稳定、低延迟的同屏系统,还需要有配套的RTMP、RTSP直播播放器;整体部署时,在内网大并发环境下,还需要考虑如何组网等诸多因素。做demo容易,做成熟的模块还是有一定难度,以上抛砖引玉,感兴趣的开发者,可以单独跟我沟通探讨。

Android平台GB28181接入模块技术接入说明

 技术背景

今天,我们主要讲讲Android平台GB28181接入模块的技术对接。Android平台GB28181接入模块的设计目的,是让不具备国标音视频能力的Android终端,通过注册接入到现有的GB/T 28181—2016服务,可用于如智能监控、智慧零售、智慧教育、远程办公、生产运输、智慧交通、车载或执法记录仪等场景。

Android终端除支持常规的音视频数据接入外,还可以支持移动设备位置(MobilePosition)订阅和通知、语音广播和语音对讲、云台控制回调和预置位查询,支持对接数据类型如下:

  • 编码前数据(目前支持的有YV12/NV21/NV12/I420/RGB24/RGBA32/RGB565等数据类型);
  • 编码后数据(如无人机等H.264/HEVC数据,或者本地解析的MP4音视频数据);
  • 拉取RTSP或RTMP流并接入至GB28181平台(比如其他IPC的RTSP流,可通过Android平台GB28181接入到国标平台)。

功能支持

  •  [视频格式]H.264/H.265(Android H.265硬编码);
  •  [音频格式]G.711 A律、AAC;
  •  [音量调节]Android平台采集端支持实时音量调节;
  •  [H.264硬编码]支持H.264特定机型硬编码;
  •  [H.265硬编码]支持H.265特定机型硬编码;
  •  [软硬编码参数配置]支持gop间隔、帧率、bit-rate设置;
  •  [软编码参数配置]支持软编码profile、软编码速度、可变码率设置;
  • 支持纯视频、音视频PS打包传输;
  • 支持RTP OVER UDP和RTP OVER TCP被动模式;
  • 支持信令通道网络传输协议TCP/UDP设置;
  • 支持注册、注销,支持注册刷新及注册有效期设置;
  • 支持设备目录查询应答;
  • 支持心跳机制,支持心跳间隔、心跳检测次数设置;
  • 支持移动设备位置(MobilePosition)订阅和通知;
  •  适用国家标准:GB/T 28181—2016;
  • 支持语音广播;
  • 支持语音对讲;
  • 支持图像抓拍;
  • 支持历史视音频文件检索;
  • 支持历史视音频文件下载;
  • 支持历史视音频文件回放;
  • 支持云台控制和预置位查询;
  •  [实时水印]支持动态文字水印、png水印;
  •  [镜像]Android平台支持前置摄像头实时镜像功能;
  •  [实时静音]支持实时静音/取消静音;
  •  [实时快照]支持实时快照;
  •  [降噪]支持环境音、手机干扰等引起的噪音降噪处理、自动增益、VAD检测;
  •  [外部编码前视频数据对接]支持YUV数据对接;
  •  [外部编码前音频数据对接]支持PCM对接;
  •  [外部编码后视频数据对接]支持外部H.264数据对接;
  •  [外部编码后音频数据对接]外部AAC数据对接;
  •  [扩展录像功能]支持和录像SDK组合使用,实现录像相关功能。

系统要求

  • SDK支持Android 5.1及以上版本;
  • 支持的CPU架构:armv7, arm64, x86, x86_64。

准备工作

  • 确保SmartPublisherJniV2.java放到com.daniulive.smartpublisher包名下(可在其他包名下调用);
  • 如需集成语音广播、语音对讲功能,确保SmartPlayerJniV2.java放到com.daniulive.smartplayer包名下(可在其他包名下调用);
  • smartavengine.jar和smartgbsipagent.jar加入到工程;
  • 拷贝libSmartPublisher.so和libSmartPlayer.so(如需语音广播或语音对讲)到工程;
  • AndroidManifest.xml添加相关权限:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" ></uses-permission>
<uses-permission android:name="android.permission.INTERNET" ></uses-permission>
<uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEMS" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"></uses-permission>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"></uses-permission>

  • Load相关so:
static {  
    System.loadLibrary("SmartPublisher");
    System.loadLibrary("SmartPlayer");
}

  • build.gradle配置32/64位库:
splits {
    abi {
        enable true
        reset()
        // Specifies a list of ABIs that Gradle should create APKs for
        include 'armeabi-v7a', 'arm64-v8a', 'x86', 'x86_64' //select ABIs to build APKs for
        // Also generate a universal APK that includes all ABIs
        universalApk true
    }
}

  • 如需集成到自己系统测试,请用大牛直播SDK的app name,授权版按照授权app name正常使用即可;
  • 如需修改app name,在strings.xml做以下修改:
<string name="app_name">SmartPublisherSDKDemo</string>

接口详解

以Android平台Camera2对接为例,信令部分需要实现如下接口:

public class MainActivity extends Activity implements ViewTreeObserver.OnGlobalLayoutListener, Camera2Listener,
        GBSIPAgentListener, GBSIPAgentPlayListener, GBSIPAgentAudioBroadcastListener,
        GBSIPAgentDeviceControlListener, GBSIPAgentQueryCommandListener, 
        GBSIPAgentTalkListener, 
        GBSIPAgentQueryRecordInfoListener{
}

媒体数据处理接口,可参照SmartPublisherJniV2.java,如需语音广播或语音对讲,可参照SmartPlayerJniV2.java。

信令处理

GBSIPAgentListener主要系GB28181注册、心跳、DevicePosition等,如注册成功、注册超时、注册网络传输层错误、心跳异常、设备位置请求处理:

public interface GBSIPAgentListener
{
    /*注册成功
    * @param dateString: 服务器日期,用来校准设备端时间,用户自行决定是否校准设备时间
    */
    void ntsRegisterOK(String dateString);

    /*
    *注册超时
    */
    void ntsRegisterTimeout();

    /*
    *注册网络传输层异常
    */
    void ntsRegisterTransportError(String errorInfo);

    /*
    *心跳达到异常次数
    */
    void ntsOnHeartBeatException(int exceptionCount, String lastExceptionInfo);

    /*
     * 设备位置请求, 这个主要用在移动设备位置订阅上
     * @param interval 请求间隔, 单位是毫秒
     */
    void ntsOnDevicePositionRequest(String deviceId, int interval);
}
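
在MainActivity中实现GBSIPAgentListener时,一个最小的处理示意如下(仅做日志打印,注册成功后的业务逻辑与位置上报请参考demo实现):

@Override
public void ntsRegisterOK(String dateString) {
    Log.i(TAG, "GB28181 register OK, server date: " + dateString);
}

@Override
public void ntsRegisterTimeout() {
    Log.e(TAG, "GB28181 register timeout");
}

@Override
public void ntsRegisterTransportError(String errorInfo) {
    Log.e(TAG, "GB28181 register transport error: " + errorInfo);
}

@Override
public void ntsOnHeartBeatException(int exceptionCount, String lastExceptionInfo) {
    Log.e(TAG, "GB28181 heartbeat exception, count=" + exceptionCount + ", info=" + lastExceptionInfo);
}

@Override
public void ntsOnDevicePositionRequest(String deviceId, int interval) {
    // 移动位置订阅: 需按interval(毫秒)周期上报设备位置, 上报调用请参考demo中的实现
    Log.i(TAG, "device position request, deviceId=" + deviceId + ", interval=" + interval);
}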

GBSIPAgentPlayListener主要系GB28181的Invite、Ack、Bye等处理:

public interface GBSIPAgentPlayListener {

    /*
     *收到s=Play的实时视音频点播
     */
    void ntsOnInvitePlay(String deviceId, SessionDescription sessionDescription);

    /*
     *发送play invite response 异常
     */
    void ntsOnPlayInviteResponseException(String deviceId, int statusCode, String errorInfo);

    /*
     * 收到CANCEL play INVITE请求
     */
    void ntsOnCancelPlay(String deviceId);

    /*
     * 收到Ack
     */
    void ntsOnAckPlay(String deviceId);

    /*
     * 收到Bye
     */
    void ntsOnByePlay(String deviceId);

    /*
     * 不是在收到BYE Message情况下, 终止Play
     */
    void ntsOnTerminatePlay(String deviceId);

    /*
     * Play会话对应的对话终止, 一般不会触发这个回调,目前只有在响应了200 OK, 但在64*T1时间后还没收到ACK,才可能会触发
     * 收到这个回调, 请做相关清理处理
    */
    void ntsOnPlayDialogTerminated(String deviceId);
}

GBSIPAgentAudioBroadcastListener主要系GB28181语音广播处理相关,如有语音广播相关需求,可参照demo实例实现:

public interface GBSIPAgentAudioBroadcastListener {

    /*
     *收到语音广播通知
     */
    void ntsOnNotifyBroadcastCommand(String fromUserName, String fromUserNameAtDomain, String sn, String sourceID, String targetID);

    /*
     *需要准备接受语音广播的SDP内容
     */
    void ntsOnAudioBroadcast(String commandFromUserName, String commandFromUserNameAtDomain, String sourceID, String targetID);

    /*
     *音频广播, 发送Invite请求异常
     */
    void ntsOnInviteAudioBroadcastException(String sourceID, String targetID, String errorInfo);

    /*
     *音频广播, 等待Invite响应超时
     */
    void ntsOnInviteAudioBroadcastTimeout(String sourceID, String targetID);

    /*
     *音频广播, 收到Invite消息最终响应
     */
    void ntsOnInviteAudioBroadcastResponse(String sourceID, String targetID, int statusCode, SessionDescription sessionDescription);

    /*
     * 音频广播, 收到BYE Message
     */
    void ntsOnByeAudioBroadcast(String sourceID, String targetID);


    /*
     * 不是在收到BYE Message情况下, 终止音频广播
     */
    void ntsOnTerminateAudioBroadcast(String sourceID, String targetID);
}

GBSIPAgentDeviceControlListener主要系GB28181设备控制相关,比如远程启动、云台控制:

public interface GBSIPAgentDeviceControlListener {

    /*
     * 收到远程启动控制命令
     */
    void ntsOnDeviceControlTeleBootCommand(String deviceId, String teleBootValue);

    /*
    * 云台控制
     */
    void ntsOnDeviceControlPTZCmd(String deviceId, String typeValue);
}

GBSIPAgentQueryCommandListener主要系GB28181查询命令,如预置位查询:

public interface GBSIPAgentQueryCommandListener {

    /*
     * 设备预置位查询
     */
    void ntsOnDevicePresetQueryCommand(String fromUserName, String fromUserNameAtDomain, String sn, String deviceId);
}

GBSIPAgentTalkListener主要系GB28181语音对讲相关处理:

public interface GBSIPAgentTalkListener {
    /*
     *收到s=Talk 语音对讲
     */
    void ntsOnInviteTalk(String deviceId, SessionDescription sessionDescription);

    /*
     *发送talk invite response 异常
     */
    void ntsOnTalkInviteResponseException(String deviceId, int statusCode, String errorInfo);

    /*
     * 收到CANCEL Talk INVITE请求
     */
    void ntsOnCancelTalk(String deviceId);

    /*
     * 收到Ack
     */
    void ntsOnAckTalk(String deviceId);

    /*
     * 收到Bye
     */
    void ntsOnByeTalk(String deviceId);

    /*
     * 不是在收到BYE Message情况下, 终止Talk
     */
    void ntsOnTerminateTalk(String deviceId);

    /*
     * Talk会话对应的对话终止, 一般不会触发这个回调,目前只有在响应了200 OK, 但在64*T1时间后还没收到ACK,才可能会触发
     * 收到这个回调, 请做相关清理处理
    */
    void ntsOnTalkDialogTerminated(String deviceId);
}

GBSIPAgentPlaybackListener系历史视音频回放相关:

public interface GBSIPAgentPlaybackListener {
    void ntsOnInvitePlayback(long var1, String var3, SessionDescription var4);

    void ntsOnPlaybackInviteResponseException(long var1, String var3, int var4, String var5);

    void ntsOnCancelPlayback(long var1, String var3);

    void ntsOnAckPlayback(long var1, String var3);

    void ntsOnPlaybackMANSRTSPPlayCommand(long var1, String var3);

    void ntsOnPlaybackMANSRTSPPauseCommand(long var1, String var3);

    void ntsOnPlaybackMANSRTSPScaleCommand(long var1, String var3, double var4);

    void ntsOnPlaybackMANSRTSPSeekCommand(long var1, String var3, double var4);

    void ntsOnPlaybackMANSRTSPTeardownCommand(long var1, String var3);

    void ntsOnByePlayback(long var1, String var3);

    void ntsOnTerminatePlayback(long var1, String var3);

    void ntsOnPlaybackDialogTerminated(long var1, String var3);
}

GBSIPAgentDownloadListener系历史视音频下载相关:

public interface GBSIPAgentDownloadListener {
    void ntsOnInviteDownload(long var1, String var3, SessionDescription var4);

    void ntsOnDownloadInviteResponseException(long var1, String var3, int var4, String var5);

    void ntsOnCancelDownload(long var1, String var3);

    void ntsOnAckDownload(long var1, String var3);

    void ntsOnDownloadMANSRTSPScaleCommand(long var1, String var3, double var4);

    void ntsOnByeDownload(long var1, String var3);

    void ntsOnTerminateDownload(long var1, String var3);

    void ntsOnDownloadDialogTerminated(long var1, String var3);
}

媒体数据处理

RTP数据发送

RTP Sender(SmartPublisherJniV2.java)相关接口设计:

/*
 * SmartPublisherJniV2.java
 * Author: https://daniusdk.com
 */
/*
 * 创建RTP Sender实例
 *
 * @param reserve:保留参数传0
 *
 * @return RTP Sender 句柄,0表示失败
 */
public native long CreateRTPSender(int reserve);

/**
 *设置 RTP Sender传输协议
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param transport_protocol, 0:UDP, 1:TCP, 默认是UDP
 *
 * @return {0} if successful
 */
public native int SetRTPSenderTransportProtocol(long rtp_sender_handle, int transport_protocol);

/**
 *设置 RTP Sender IP地址类型
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param ip_address_type, 0:IPV4, 1:IPV6, 默认是IPV4, 当前仅支持IPV4
 *
 * @return {0} if successful
 */
public native int SetRTPSenderIPAddressType(long rtp_sender_handle, int ip_address_type);

/**
 *设置 RTP Sender RTP Socket本地端口
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param port, 必须是偶数,设置0的话SDK会自动分配, 默认值是0
 *
 * @return {0} if successful
 */
public native int SetRTPSenderLocalPort(long rtp_sender_handle, int port);

/**
 *设置 RTP Sender SSRC
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param ssrc, 如果设置的话,这个字符串要能转换成uint32类型, 否则设置失败
 *
 * @return {0} if successful
 */
public native int SetRTPSenderSSRC(long rtp_sender_handle, String ssrc);

/**
 *设置 RTP Sender RTP socket 发送Buffer大小
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param buffer_size, 必须大于0, 默认是512*1024, 当前仅对UDP socket有效, 根据视频码率考虑设置合适的值
 *
 * @return {0} if successful
 */
public native int SetRTPSenderSocketSendBuffer(long rtp_sender_handle, int buffer_size);

/**
 *设置 RTP Sender RTP时间戳时钟频率
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param clock_rate, 必须大于0, 对于GB28181 PS规定是90kHz, 也就是90000
 *
 * @return {0} if successful
 */
public native int SetRTPSenderClockRate(long rtp_sender_handle, int clock_rate);

/**
 *设置 RTP Sender 目的IP地址, 注意当前用在GB2818推送上,只设置一个地址,将来扩展如果用在其他地方,可能要设置多个目的地址,到时候接口可能会调整
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param address, IP地址
 * @param port, 端口
 *
 * @return {0} if successful
 */
public native int SetRTPSenderDestination(long rtp_sender_handle, String address, int port);

/**
 * 设置是否开启 RTP Receiver
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param is_enable, 0表示不收RTP包, 1表示收RTP包, SDK默认值为0.
 * @return
 */
public native int EnableRTPSenderReceive(long rtp_sender_handle, int is_enable);

/**
 *设置RTP Receiver SSRC
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param ssrc, 如果设置的话,这个字符串要能转换成uint32类型, 否则设置失败
 *
 * @return {0} if successful
 */
public native int SetRTPSenderReceiveSSRC(long rtp_sender_handle, String ssrc);

/**
 *设置RTP Receiver Payload 相关信息
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 *
 * @param payload_type, 请参考 RFC 3551
 *
 * @param encoding_name, 编码名, 请参考 RFC 3551, 如果payload_type不是动态的, 可能传null就好
 *
 * @param media_type, 媒体类型, 请参考 RFC 3551, 1 是视频, 2是音频
 *
 * @param clock_rate, 请参考 RFC 3551
 *
 * @return {0} if successful
 */
public native int SetRTPSenderReceivePayloadType(long rtp_sender_handle, int payload_type, String encoding_name, int media_type, int clock_rate);

/**
 *设置RTP Receiver PS的pts和dts clock frequency
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 *
 * @param ps_clock_frequency, 默认是90000, 一些特殊场景需要设置
 *
 * @return {0} if successful
 */
public native int SetRTPSenderReceivePSClockFrequency(long rtp_sender_handle, int ps_clock_frequency);

/**
 *设置 RTP Receiver 音频采样率
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param sampling_rate, 音频采样率
 *
 * @return {0} if successful
 */
public native int SetRTPSenderReceiveAudioSamplingRate(long rtp_sender_handle, int sampling_rate);

/**
 *设置 RTP Receiver 音频通道数
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param channels, 音频通道数
 *
 * @return {0} if successful
 */
public native int SetRTPSenderReceiveAudioChannels(long rtp_sender_handle, int channels);

/**
 *初始化RTP Sender, 初始化之前先调用上面的接口配置相关参数
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 *
 * @return {0} if successful
 */
public native int InitRTPSender(long rtp_sender_handle);

/**
 *获取RTP Sender RTP Socket本地端口
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 *
 * @return 失败返回0, 成功的话返回响应的端口, 请在InitRTPSender返回成功之后调用
 */
public native int GetRTPSenderLocalPort(long rtp_sender_handle);

/**
 * UnInit RTP Sender
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 *
 * @return {0} if successful
 */
public native int UnInitRTPSender(long rtp_sender_handle);

/**
 * 释放RTP Sender, 释放之后rtp_sender_handle就无效了,请不要再使用
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 *
 * @return {0} if successful
 */
public native int DestoryRTPSender(long rtp_sender_handle);
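
一个典型的RTP Sender初始化调用顺序示意如下(lib_publisher_为SmartPublisherJniV2实例;SSRC、目的地址、端口等取值应以SIP信令协商结果为准,这里仅为示例值):

long rtp_sender_handle = lib_publisher_.CreateRTPSender(0);
if (0 != rtp_sender_handle) {
    lib_publisher_.SetRTPSenderTransportProtocol(rtp_sender_handle, 0);    // 0:UDP, 1:TCP
    lib_publisher_.SetRTPSenderIPAddressType(rtp_sender_handle, 0);        // 当前仅支持IPV4
    lib_publisher_.SetRTPSenderLocalPort(rtp_sender_handle, 0);            // 0: 由SDK自动分配偶数端口
    lib_publisher_.SetRTPSenderSSRC(rtp_sender_handle, "123456789");       // 示例值, 以信令协商为准
    lib_publisher_.SetRTPSenderSocketSendBuffer(rtp_sender_handle, 512 * 1024);
    lib_publisher_.SetRTPSenderClockRate(rtp_sender_handle, 90000);        // GB28181 PS固定90kHz
    lib_publisher_.SetRTPSenderDestination(rtp_sender_handle, "192.168.0.100", 30000); // 来自Invite SDP, 示例值

    if (0 == lib_publisher_.InitRTPSender(rtp_sender_handle)) {
        int local_port = lib_publisher_.GetRTPSenderLocalPort(rtp_sender_handle); // 用于SDP应答中的媒体端口
        Log.i(TAG, "rtp sender local port: " + local_port);
    }
}

// 会话结束时释放
lib_publisher_.UnInitRTPSender(rtp_sender_handle);
lib_publisher_.DestoryRTPSender(rtp_sender_handle);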

RTP数据接收

对应RTP Receiver(SmartPlayerJniV2.java)相关接口设计,如无语音广播或语音对讲相关技术需求,这部分可忽略:

/*
 * SmartPlayerJniV2.java
 * Author: https://daniusdk.com
 */
/*
 * 创建RTP Receiver
 *
 * @param reserve:保留参数传0
 *
 * @return RTP Receiver 句柄,0表示失败
 */
public native long CreateRTPReceiver(int reserve);


/**
 *设置 RTP Receiver传输协议
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 * @param transport_protocol, 0:UDP, 1:TCP, 默认是UDP
 *
 * @return {0} if successful
 */
public native int SetRTPReceiverTransportProtocol(long rtp_receiver_handle, int transport_protocol);


/**
 *设置 RTP Receiver IP地址类型
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 * @param ip_address_type, 0:IPV4, 1:IPV6, 默认是IPV4
 *
 * @return {0} if successful
 */
public native int SetRTPReceiverIPAddressType(long rtp_receiver_handle, int ip_address_type);


/**
 *设置 RTP Receiver RTP Socket本地端口
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 * @param port, 必须是偶数,设置0的话SDK会自动分配, 默认值是0
 *
 * @return {0} if successful
 */
public native int SetRTPReceiverLocalPort(long rtp_receiver_handle, int port);


/**
 *设置 RTP Receiver SSRC
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 * @param ssrc, 如果设置的话,这个字符串要能转换成uint32类型, 否则设置失败
 *
 * @return {0} if successful
 */
public native int SetRTPReceiverSSRC(long rtp_receiver_handle, String ssrc);


/**
 *创建 RTP Receiver 会话
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 * @param reserve, 保留值,目前传0
 *
 * @return {0} if successful
 */
public native int CreateRTPReceiverSession(long rtp_receiver_handle, int reserve);


/**
 *获取 RTP Receiver RTP Socket本地端口
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 *
 * @return 失败返回0, 成功的话返回响应的端口, 请在CreateRTPReceiverSession返回成功之后调用
 */
public native int GetRTPReceiverLocalPort(long rtp_receiver_handle);


/**
 *设置 RTP Receiver Payload 相关信息
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 *
 * @param payload_type, 请参考 RFC 3551
 *
 * @param encoding_name, 编码名, 请参考 RFC 3551, 如果payload_type不是动态的, 可能传null就好
 *
 * @param media_type, 媒体类型, 请参考 RFC 3551, 1 是视频, 2是音频
 *
 * @param clock_rate, 请参考 RFC 3551
 *
 * @return {0} if successful
 */
public native int SetRTPReceiverPayloadType(long rtp_receiver_handle, int payload_type, String encoding_name, int media_type, int clock_rate);


/**
 *设置 RTP Receiver 音频采样率
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 * @param sampling_rate, 音频采样率
 *
 * @return {0} if successful
 */
public native int SetRTPReceiverAudioSamplingRate(long rtp_receiver_handle, int sampling_rate);

/**
 *设置 RTP Receiver 音频通道数
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 * @param channels, 音频通道数
 *
 * @return {0} if successful
 */
public native int SetRTPReceiverAudioChannels(long rtp_receiver_handle, int channels);


/**
 *设置 RTP Receiver 远端地址
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 * @param address, IP地址
 * @param port, 端口
 *
 * @return {0} if successful
 */
public native int SetRTPReceiverRemoteAddress(long rtp_receiver_handle, String address, int port);

/**
 *初始化 RTP Receiver
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 *
 * @return {0} if successful
 */
public native int InitRTPReceiver(long rtp_receiver_handle);

/**
 *UnInit RTP Receiver
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 *
 * @return {0} if successful
 */
public native int UnInitRTPReceiver(long rtp_receiver_handle);


/**
 *Destory RTP Receiver Session
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 *
 * @return {0} if successful
 */
public native int DestoryRTPReceiverSession(long rtp_receiver_handle);


/**
 *Destory RTP Receiver
 *
 * @param rtp_receiver_handle, CreateRTPReceiver
 *
 * @return {0} if successful
 */
public native int DestoryRTPReceiver(long rtp_receiver_handle);

PostAudioPacket(SmartPlayerJniV2.java),投递音频包给外部Live source,目前仅用于语音对讲:

/*
 * SmartPlayerJniV2.java
 * Author: https://daniusdk.com
 */
/**
 * 投递音频包给外部Live source, 注意ByteBuffer对象必须是DirectBuffer
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @return {0} if successful
 */
public native int PostAudioPacket(long handle, int codec_id,
                          java.nio.ByteBuffer packet, int offset, int size, long pts, boolean is_pts_discontinuity,
                          java.nio.ByteBuffer extra_data, int extra_data_offset, int extra_data_size, int sample_rate, int channels);
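
调用时注意packet必须是DirectBuffer,一个准备direct buffer并投递的示意如下(player_jni_为SmartPlayerJniV2实例,player_handle_为SmartPlayerOpen()返回值;audio_frame、audio_frame_size、pts、codec_id等均为假设变量,取值请以实际对讲音频流为准):

// 分配direct buffer(可复用), 大小按单帧音频包上限预估, 这里取4KB仅为示例
java.nio.ByteBuffer packet_buffer = java.nio.ByteBuffer.allocateDirect(4 * 1024);

// 将一帧编码后的音频数据写入direct buffer
packet_buffer.clear();
packet_buffer.put(audio_frame, 0, audio_frame_size);
packet_buffer.flip();

// 投递给底层Live source(extra_data为空, 8000Hz单声道仅为示例)
player_jni_.PostAudioPacket(player_handle_, codec_id,
        packet_buffer, 0, audio_frame_size, pts, false,
        null, 0, 0, 8000, 1);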

GB28181接口调用

对应GB28181相关接口调用相关设计如下:

/*
 * SmartPublisherJniV2.java
 * Author: https://daniusdk.com
 */
/**
 * 设置GB28181 RTP Sender
 *
 * @param rtp_sender_handle, CreateRTPSender返回值
 * @param rtp_payload_type, 对于GB28181 PS, 协议定义是96, 具体以SDP为准,  RFC 3551有定义
 * @param encoding_name, 编码名, 请参考 RFC 3551, 当前仅支持: "PS", 其他值返回失败
 * @return {0} if successful
 */
public native int SetGB28181RTPSender(long handle, long rtp_sender_handle, int rtp_payload_type, String encoding_name);

/**
 * 设置GB28181 RTP 收到的音频包回调
 * @param handle
 * @param audio_packet_callback
 * @return
 */
public native int SetGB28181ReceiveAudioPacketCallback(long handle, NTAudioPacketCallback audio_packet_callback);

/**
 * 启动 GB28181 媒体流
 *
 * @return {0} if successful
 */
public native int StartGB28181MediaStream(long handle);

/**
 * 停止 GB28181 媒体流
 *
 * @return {0} if successful
 */
public native int StopGB28181MediaStream(long handle);
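
一个启动、停止GB28181媒体流的调用示意如下(publisher_handle_假设为已初始化的推送实例句柄,rtp_sender_handle为前文InitRTPSender成功后的RTP Sender句柄):

// 收到Invite并完成RTP Sender初始化后, 绑定RTP Sender(GB28181 PS的payload type一般为96, 具体以SDP为准)并启动媒体流
if (0 == lib_publisher_.SetGB28181RTPSender(publisher_handle_, rtp_sender_handle, 96, "PS")) {
    lib_publisher_.StartGB28181MediaStream(publisher_handle_);
}

// 收到Bye或会话终止时, 停止媒体流
lib_publisher_.StopGB28181MediaStream(publisher_handle_);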

总结

以上是大牛直播SDK发布的Android平台GB28181设备接入模块的相关说明,除了上述接口设计外,模块还可以扩展实现实时静音、实时快照、按需录像、实时音量调节等,可扩展性非常好。

基于arm64架构国产操作系统|Linux下的RTMP|RTSP低延时直播播放器开发探究

技术背景

2014年4月8日起,美国微软公司停止了对Windows XP SP3操作系统的服务支持,这引起了社会和广大用户的广泛关注以及对信息安全的担忧。而2020年对Windows 7服务支持的终止,再一次推动了国产系统的发展。工信部对此表示,将继续加大力度,支持基于Linux的国产操作系统的研发和应用,并希望用户使用国产操作系统。

为什么要发展国产操作系统?

  1. 技术自主性和信息安全:国家和政府机构通常认为拥有自主研发和掌握核心技术是保障国家信息安全和加强自主创新能力的重要举措。通过开发和商业化国产操作系统,可以降低对外国技术和软件的依赖,并减少潜在的信息泄露和安全风险。
  2. 降低技术依赖风险:依赖外国操作系统和软件可能面临技术封锁、供应中断或不稳定的问题。商业化国产操作系统可以减少对外部技术供应链的依赖,为国内企业和用户提供更可靠和稳定的软件解决方案,降低技术依赖风险。
  3. 满足国内市场需求:随着国内科技产业的快速发展,对于操作系统的需求也日益增长。国产操作系统可以更好地满足国内用户的需求,提供更贴近本地文化和习惯的用户界面和功能,提高用户的使用体验。
  4. 促进产业发展:国产操作系统的开发和应用可以带动相关产业的发展,包括硬件制造、软件开发、系统集成等。这有助于提升整个产业链的技术水平和竞争力,促进国内科技产业的升级和转型。
  5. 生态系统建设:发展国产操作系统有助于构建和完善自主可控的软件生态系统。通过鼓励和支持国内软件开发者在国产操作系统上进行应用开发,可以丰富应用生态,提高系统的可用性和易用性。

此外,随着云计算、大数据、人工智能等新兴技术的发展,操作系统作为基础设施的重要性日益凸显。发展国产操作系统可以为这些新兴技术提供更安全、更可靠的运行环境,推动相关产业的发展和创新。

综上所述,发展国产操作系统对于保障国家信息安全、降低技术依赖风险、满足国内市场需求、促进产业发展以及构建完善的生态系统等方面都具有重要意义。

技术实现

顺势而为,在发布arm64架构国产操作系统|Linux平台的RTMP|RTSP直播播放SDK之前,大牛直播SDK的直播播放模块已经在行业内处于领先位置,我们更是在前几年就发布了Linux x86_64架构的播放器,并得到了广泛的应用。

本次发布的可用于国产操作系统和Linux上的RTMP|RTSP直播播放SDK,video输出基于X11协议,audio输出采用PulseAudio和ALSA Lib实现。除了实时静音、快照、buffer time设定、网络自动重连等常规功能外,RTMP支持扩展H.265播放(支持Enhanced RTMP H.265),RTSP也支持H.265播放。

大牛直播SDK发布的Linux平台播放器SDK支持多实例播放,相关代码如下:

/*
 * multi_player_demo.cpp
 * 
 * Author: daniusdk.com
 *
 * Copyright © 2017~2024 DaniuLive. All rights reserved.
 */
#include <stdio.h>
#include <string.h>
#include <assert.h>
#include <poll.h>
#include <errno.h>

#include <string>
#include <sstream>

#include <X11/Xlib.h>
#include <X11/keysym.h>

#include "nt_sdk_linux_smart_log.h"
#include "nt_linux_smart_player_sdk.h"
#include "nt_player_sdk_wrapper.h"

....

const char* players_url_[]
{
	"rtsp://admin:daniulive12345@192.168.0.120:554/h264/ch1/main/av_stream",
	"rtsp://admin:daniulive12345@192.168.0.120:554/h264/ch1/main/av_stream",
	"rtsp://admin:admin123456@192.168.0.121:554/cam/realmonitor?channel=1&subtype=0",
	"rtsp://admin:admin123456@192.168.0.121:554/cam/realmonitor?channel=1&subtype=0",
};

int main(int argc, char *argv[])
{
	XInitThreads(); // X支持多线程, 必须调用

	NT_SDKLogInit();

	// SDK初始化
	SmartPlayerSDKAPI player_api;
	if (!NT_PlayerSDKInit(player_api))
	{
		fprintf(stderr, "SDK init failed.\n");
		return 0;
	}

	auto display = XOpenDisplay(nullptr);
	if (!display)
	{
		fprintf(stderr, "Cannot connect to X server\n");
		player_api.UnInit();
		return 0;
	}

	auto screen = DefaultScreen(display);
	auto root = XRootWindow(display, screen);

	XWindowAttributes root_win_att;
	if (!XGetWindowAttributes(display, root, &root_win_att))
	{
		fprintf(stderr, "Get Root window attri failed\n");
		player_api.UnInit();
		XCloseDisplay(display);
		return 0;
	}

	if (root_win_att.width < 100 || root_win_att.height < 100)
	{
		fprintf(stderr, "Root window size error.\n");
		player_api.UnInit();
		XCloseDisplay(display);
		return 0;
	}

	fprintf(stdout, "Root Window Size:%d*%d\n", root_win_att.width, root_win_att.height);

	int main_w = root_win_att.width / 2, main_h = root_win_att.height/2;

	auto black_pixel = BlackPixel(display, screen);
	auto white_pixel = WhitePixel(display, screen);

	auto main_wid = XCreateSimpleWindow(display, root, 0, 0, main_w, main_h, 0, white_pixel, black_pixel);
	if (!main_wid)
	{
		player_api.UnInit();
		XCloseDisplay(display);
		fprintf(stderr, "Cannot create main windows\n");
		return 0;
	}

	XSelectInput(display, main_wid, StructureNotifyMask | KeyPressMask);

	XMapWindow(display, main_wid);
	XStoreName(display, main_wid, win_base_title);

	std::vector<std::shared_ptr<NT_PlayerSDKWrapper> > players;

	for (auto url: players_url_)
	{
		auto i = std::make_shared<NT_PlayerSDKWrapper>(&player_api);
		i->SetDisplay(display);
		i->SetScreen(screen);
		i->SetURL(url);
		players.push_back(i);

		if ( players.size() > 3 )
			break;
	}

	auto border_w = 2;

	std::vector<NT_LayoutRect> layout_rects;
	SubWindowsLayout(main_w, main_h, border_w, static_cast<int>(players.size()), layout_rects);

	for (auto i = 0; i < static_cast<int>(players.size()); ++i)
	{
		assert(players[i]);
		players[i]->SetWindow(CreateSubWindow(display, screen, main_wid, layout_rects[i], border_w));
	}

	for (const auto& i : players)
	{
		assert(i);
		if (i->GetWindow())
			XMapWindow(display, i->GetWindow());
	}

	for (auto i = 0; i < static_cast<int>(players.size()); ++i)
	{
		assert(players[i]);
		// 第一路不静音, 其他全部静音
		players[i]->Start(0, i!=0, 1, false);
		//players[i]->Start(0, false, 1, false);
	}

	while (true)
	{
		while (MY_X11_Pending(display, 10))
		{
			XEvent xev;
			memset(&xev, 0, sizeof(xev));
			XNextEvent(display, &xev);

			if (xev.type == ConfigureNotify)
			{
				if (xev.xconfigure.window == main_wid)
				{
					if (xev.xconfigure.width != main_w || xev.xconfigure.height != main_h)
					{
						main_w = xev.xconfigure.width;
						main_h = xev.xconfigure.height;

						SubWindowsLayout(main_w, main_h, border_w, static_cast<int>(players.size()), layout_rects);

						for (auto i = 0; i < static_cast<int>(players.size()); ++i)
						{
							if (players[i]->GetWindow())
							{
								XMoveResizeWindow(display, players[i]->GetWindow(), layout_rects[i].x_, layout_rects[i].y_, layout_rects[i].w_, layout_rects[i].h_);
							}
						}
					}
				}
				else
				{
					for (const auto& i: players)
					{
						assert(i);
						if (i->GetWindow() && i->GetWindow() == xev.xconfigure.window)
						{
							i->OnWindowSize(xev.xconfigure.width, xev.xconfigure.height);
						}
					}
				}
			}
			else if (xev.type == KeyPress)
			{
				if (xev.xkey.keycode == XKeysymToKeycode(display, XK_Escape))
				{
					fprintf(stdout, "ESC Key Press\n");

					for (const auto& i : players)
					{
						i->Stop();

						if (i->GetWindow())
						{
							XDestroyWindow(display, i->GetWindow());
							i->SetWindow(None);
						}
					}

					players.clear();
					
					XDestroyWindow(display, main_wid);
					XCloseDisplay(display);

					player_api.UnInit();

					fprintf(stdout, "Close Players....\n");
					return 0;
				}
			}
		}
	}
}
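
上面事件循环里用到的MY_X11_Pending,可以理解为带超时等待的XPending封装,下面给出一个基于select的参考实现示意(函数名取自上文,具体实现为笔者假设,仅供参考):

#include <sys/select.h>
#include <X11/Xlib.h>

// 带超时等待的XPending封装: 队列里已有事件则直接返回, 否则在X连接的fd上最多等待timeout_ms毫秒
static bool MY_X11_Pending(Display* display, int timeout_ms)
{
	if (XPending(display) > 0)
		return true;

	int fd = ConnectionNumber(display); // X连接对应的socket描述符

	fd_set fds;
	FD_ZERO(&fds);
	FD_SET(fd, &fds);

	timeval tv;
	tv.tv_sec = timeout_ms / 1000;
	tv.tv_usec = (timeout_ms % 1000) * 1000;

	if (select(fd + 1, &fds, nullptr, nullptr, &tv) > 0)
		return XPending(display) > 0;

	return false;
}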

开始播放、停止播放封装实现

bool NT_PlayerSDKWrapper::Start(int buffer, bool is_mute, int render_scale_mode, bool is_only_dec_key_frame)
{
	if (is_playing_)
		return false;

	if (url_.empty())
		return false;

	if (!OpenHandle(url_, buffer))
		return false;

	assert(handle_ && handle_->Handle());

	// 音频参数
	player_api_->SetMute(handle_->Handle(), is_mute ? 1 : 0);
	player_api_->SetIsOutputAudioDevice(handle_->Handle(), 1);
	player_api_->SetAudioOutputLayer(handle_->Handle(), 0); // 使用PulseAudio或者ALSA播放, 两者可选其一

	// 视频参数
	player_api_->SetVideoSizeCallBack(handle_->Handle(), this, &NT_Player_SDK_WRAPPER_OnVideoSizeHandle);
	//player_api_->SetXDisplayName(handle_->Handle(), NULL);
	player_api_->SetXScreenNumber(handle_->Handle(),screen_);
	player_api_->SetRenderXWindow(handle_->Handle(), window_);
	player_api_->SetRenderScaleMode(handle_->Handle(), render_scale_mode);
	player_api_->SetRenderTextureScaleFilterMode(handle_->Handle(), 3);

	player_api_->SetOnlyDecodeVideoKeyFrame(handle_->Handle(), is_only_dec_key_frame ? 1 : 0);

	auto ret = player_api_->StartPlay(handle_->Handle());
	if (NT_ERC_OK != ret)
	{
		ResetHandle();
		return false;
	}

	is_playing_ = true;

	return true;
}

void NT_PlayerSDKWrapper::Stop()
{
	if (!is_playing_)
		return;

	assert(handle_);
	player_api_->StopPlay(handle_->Handle());

	video_width_ = 0;
	video_height_ = 0;

	ResetHandle();

	is_playing_ = false;
}

Event回调

bool NT_PlayerSDKWrapper::AttachHandle(const std::shared_ptr<NT_SDK_HandleWrapper>& handle)
{
	if (is_playing_)
		return false;

	handle_ = handle;

	if (handle_)
	{
		handle_->AddEventHandler(shared_from_this());
	}

	return true;
}

视频分辨率回调

extern "C" NT_VOID NT_CALLBACK NT_Player_SDK_WRAPPER_OnVideoSizeHandle(NT_HANDLE handle, NT_PVOID user_data,
	NT_INT32 width, NT_INT32 height)
{
	auto sdk_wrapper = reinterpret_cast<NT_PlayerSDKWrapper*>(user_data);
	if (nullptr == sdk_wrapper)
		return;

	sdk_wrapper->VideoSizeHandle(handle, width, height);
}
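
VideoSizeHandle的一个极简实现示意如下(签名与上面回调转发的参数保持一致,处理逻辑为假设,实际可按需做子窗口布局等处理):

void NT_PlayerSDKWrapper::VideoSizeHandle(NT_HANDLE handle, NT_INT32 width, NT_INT32 height)
{
	// 记录最新的视频分辨率, 供上层做布局调整或日志输出
	video_width_ = width;
	video_height_ = height;

	fprintf(stdout, "video size: %d*%d\n", width, height);
}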

实时快照回调

extern "C" NT_VOID NT_CALLBACK NT_Player_SDK_WRAPPER_OnCaptureImageCallBack(NT_HANDLE handle, NT_PVOID user_data, NT_UINT32 result, NT_PCSTR file_name)
{
	auto sdk_wrapper = reinterpret_cast<NT_PlayerSDKWrapper*>(user_data);
	if (nullptr == sdk_wrapper)
		return;

	sdk_wrapper->CaptureImageHandle(handle, result, file_name);
}
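
CaptureImageHandle的一个极简实现示意如下(签名与上面回调转发的参数保持一致,处理逻辑为假设,仅打印快照结果):

void NT_PlayerSDKWrapper::CaptureImageHandle(NT_HANDLE handle, NT_UINT32 result, NT_PCSTR file_name)
{
	// result为0表示快照成功, file_name为生成的图片文件路径
	if (0 == result)
		fprintf(stdout, "capture image ok, file:%s\n", file_name ? file_name : "");
	else
		fprintf(stderr, "capture image failed, result:%u\n", result);
}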

总结

arm64架构的国产操作系统|Linux下的RTMP、RTSP直播播放,延迟依然是毫秒级,随着国产操作系统在传统行业的推进,越来越多的场景需要高稳定、低延迟的RTMP|RTSP播放器,本文抛砖引玉,感兴趣的开发者可以跟我单独探讨。

Android平台轻量级RTSP服务模块技术接入说明

技术背景

为满足内网无纸化/电子教室等内网超低延迟需求,避免让用户配置单独的服务器,大牛直播SDK在推送端发布了轻量级RTSP服务SDK。

轻量级RTSP服务解决的核心痛点,是避免用户或开发者单独部署RTSP或RTMP服务:本地的音视频数据(如摄像头、麦克风)编码后,汇聚到内置RTSP服务,对外提供可供拉流的RTSP URL。轻量级RTSP服务适用于内网环境下、对并发要求不高的场景,支持H.264/H.265,支持RTSP鉴权、单播、组播模式。考虑到单个服务的承载能力,我们支持同时创建多个RTSP服务,并支持获取当前RTSP服务会话连接数。

轻量级RTSP服务数据源,支持编码前、编码后数据对接:

  • 编码前数据(目前支持的有YV12/NV21/NV12/I420/RGB24/RGBA32/RGB565等数据类型);
  • 编码后数据(如无人机等264/HEVC数据,或者本地解析的MP4音视频数据);
  • 拉取RTSP或RTMP流并注入轻量级RTSP服务模块,组合形成内置RTSP网关模块。

技术对接

 系统要求

  • SDK支持Android 5.1及以上版本;
  • 支持的CPU架构:armv7, arm64, x86, x86_64。

准备工作

  • 确保SmartPublisherJniV2.java放到com.daniulive.smartpublisher包名下(可在其他包名下调用);
  • smartavengine.jar加入到工程;
  • 拷贝libSmartPublisher.so到工程;
  • AndroidManifest.xml添加相关权限:
<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEMS"/>
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.CHANGE_WIFI_MULTICAST_STATE" />
<uses-permission android:name="android.permission.VIBRATE" />

  • Load相关so:
static {  
    System.loadLibrary("SmartPublisher");
}

  • build.gradle配置32/64位库:
splits {
    abi {
        enable true
        reset()
        // Specifies a list of ABIs that Gradle should create APKs for
        include 'armeabi-v7a', 'arm64-v8a', 'x86', 'x86_64' //select ABIs to build APKs for
        // Also generate a universal APK that includes all ABIs
        universalApk true
    }
}

  • 如需集成到自己系统测试,请用大牛直播SDK的app name,授权版按照授权app name正常使用即可;
  • 如需修改app name,在strings.xml中做以下修改:
<string name="app_name">SmartPublisherSDKDemo</string>

接口设计

Android内置轻量级RTSP服务SDK接口详解
调用描述 接口 接口描述
SmartRTSPServerSDK
初始化RTSP Server InitRtspServer Init rtsp server(和UnInitRtspServer配对使用,即便是启动多个RTSP服务,也只需调用一次InitRtspServer,请确保在OpenRtspServer之前调用)
创建一个rtsp server OpenRtspServer 创建一个rtsp server,返回rtsp server句柄
设置端口 SetRtspServerPort 设置rtsp server 监听端口, 在StartRtspServer之前必须要设置端口
设置鉴权用户名、密码 SetRtspServerUserNamePassword 设置rtsp server 鉴权用户名和密码, 这个可以不设置,只有需要鉴权的再设置
获取rtsp server当前会话数 GetRtspServerClientSessionNumbers 获取rtsp server当前的客户会话数, 这个接口必须在StartRtspServer之后再调用
启动rtsp server StartRtspServer 启动rtsp server
停止rtsp server StopRtspServer 停止rtsp server
关闭rtsp server CloseRtspServer 关闭rtsp server
UnInit rtsp server UnInitRtspServer UnInit rtsp server(和InitRtspServer配对使用,即便是启动多个RTSP服务,也只需调用一次UnInitRtspServer)
SmartRTSPServerSDK供Publisher调用的接口
设置rtsp的流名称 SetRtspStreamName 设置rtsp的流名称
给要发布的rtsp流设置rtsp server AddRtspStreamServer 给要发布的rtsp流设置rtsp server, 一个流可以发布到多个rtsp server上,rtsp server的创建启动请参考OpenRtspServer和StartRtspServer接口
清除设置的rtsp server ClearRtspStreamServer 清除设置的rtsp server
启动rtsp流 StartRtspStream 启动rtsp流
停止rtsp流 StopRtspStream 停止rtsp流
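
为便于理解上表接口之间的配合,下面给出一个最小化的调用顺序示意(端口、流名均为示例值,错误处理从略,实际使用请参考后文完整demo代码):

// 1. 初始化(整个进程只需一次), 创建并启动rtsp server
libPublisher.InitRtspServer(context_);
long rtsp_handle = libPublisher.OpenRtspServer(0);
libPublisher.SetRtspServerPort(rtsp_handle, 8554);
libPublisher.StartRtspServer(rtsp_handle, 0);

// 2. 给推送实例设置流名称, 关联到上面创建的rtsp server, 然后发布RTSP流
stream_publisher_.SetRtspStreamName("stream1");
stream_publisher_.ClearRtspStreamServer();
stream_publisher_.AddRtspStreamServer(rtsp_handle);
stream_publisher_.StartRtspStream();

// 3. 结束时: 先停流, 再停止并关闭rtsp server, 最后反初始化
stream_publisher_.StopRtspStream();
libPublisher.StopRtspServer(rtsp_handle);
libPublisher.CloseRtspServer(rtsp_handle);
libPublisher.UnInitRtspServer();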

功能支持

  • [视频格式]H.264/H.265(Android H.265硬编码);
  • [音频格式]G.711 A律、AAC;
  • [协议]RTSP;
  • [音量调节]Android平台采集端支持实时音量调节;
  • [H.264硬编码]支持H.264特定机型硬编码;
  • [H.265硬编码]支持H.265特定机型硬编码;
  • [音视频]支持纯音频/纯视频/音视频;
  • [摄像头]支持采集过程中,前后摄像头实时切换;
  • 支持帧率、关键帧间隔(GOP)、码率(bit-rate)设置;
  • [实时水印]支持动态文字水印、png水印;
  • [实时快照]支持实时快照;
  • [降噪]支持环境音、手机干扰等引起的噪音降噪处理、自动增益、VAD检测;
  • [外部编码前视频数据对接]支持YUV数据对接;
  • [外部编码前音频数据对接]支持PCM对接;
  • [外部编码后视频数据对接]支持外部H.264、H.265数据对接;
  • [外部编码后音频数据对接]外部AAC数据对接;
  • [扩展录像功能]支持和录像SDK组合使用,实现录像相关功能;
  • 支持RTSP端口设置;
  • 支持RTSP鉴权用户名、密码设置;
  • 支持获取当前RTSP服务会话连接数;
  • 支持Android 5.1及以上版本。

接口调用详解

本文以大牛直播SDK Android平台Camera2Demo为例,启动RTSP服务、发布RTSP流之前,可以先选择视频分辨率、软编还是硬编码,音频是PCMA还是AAC编码等基础设置,其他参数的设置,可以参考下面InitAndSetConfig()。

以Android平台Camera2对接为例,先初始化RTSP Server:

/*
 * MainActivity.java
 * Author: daniusdk.com
 */
@Override
protected void onCreate(Bundle savedInstanceState) {
	super.onCreate(savedInstanceState);
	setContentView(R.layout.activity_main);
	
	...

	context_ = this.getApplicationContext();
	
	libPublisher = new SmartPublisherJniV2();

	libPublisher.InitRtspServer(context_);      //和UnInitRtspServer配对使用,即便是启动多个RTSP服务,也只需调用一次InitRtspServer,请确保在OpenRtspServer之前调用
}

启动、停止RTSP服务:

//启动/停止RTSP服务
class ButtonRtspServiceListener implements View.OnClickListener {
	public void onClick(View v) {
		if (isRTSPServiceRunning) {
			stopRtspService();

			btnRtspService.setText("启动RTSP服务");
			btnRtspPublisher.setEnabled(false);

			isRTSPServiceRunning = false;
			return;
		}

		Log.i(TAG, "onClick start rtsp service..");

		rtsp_handle_ = libPublisher.OpenRtspServer(0);

		if (rtsp_handle_ == 0) {
			Log.e(TAG, "创建rtsp server实例失败! 请检查SDK有效性");
		} else {
			int port = 8554;
			if (libPublisher.SetRtspServerPort(rtsp_handle_, port) != 0) {
				libPublisher.CloseRtspServer(rtsp_handle_);
				rtsp_handle_ = 0;
				Log.e(TAG, "设置rtsp server端口失败! 请检查端口是否重复或者端口不在范围内!");
				return;
			}

			if (libPublisher.StartRtspServer(rtsp_handle_, 0) == 0) {
				Log.i(TAG, "启动rtsp server 成功!");
			} else {
				libPublisher.CloseRtspServer(rtsp_handle_);
				rtsp_handle_ = 0;
				Log.e(TAG, "启动rtsp server失败! 请检查设置的端口是否被占用!");
				return;
			}

			btnRtspService.setText("停止RTSP服务");
			btnRtspPublisher.setEnabled(true);

			isRTSPServiceRunning = true;
		}
	}
}

stopRtspService()实现如下:

//停止RTSP服务
private void stopRtspService() {
	if(!isRTSPServiceRunning)
	{
		return;
	}
	if (libPublisher != null && rtsp_handle_ != 0) {
		libPublisher.StopRtspServer(rtsp_handle_);
		libPublisher.CloseRtspServer(rtsp_handle_);
		rtsp_handle_ = 0;
	}
}

发布、停止RTSP流:

//发布/停止RTSP流
class ButtonRtspPublisherListener implements View.OnClickListener {
	public void onClick(View v) {
		if (stream_publisher_.is_rtsp_publishing()) {
			stopRtspPublisher();

			btnRtspPublisher.setText("发布RTSP流");
			btnGetRtspSessionNumbers.setEnabled(false);
			btnRtspService.setEnabled(true);
			return;
		}

		Log.i(TAG, "onClick start rtsp publisher..");

		InitAndSetConfig();

		String rtsp_stream_name = "stream1";
		stream_publisher_.SetRtspStreamName(rtsp_stream_name);
		stream_publisher_.ClearRtspStreamServer();

		stream_publisher_.AddRtspStreamServer(rtsp_handle_);

		if (!stream_publisher_.StartRtspStream()) {
			stream_publisher_.try_release();
			Log.e(TAG, "调用发布rtsp流接口失败!");
			return;
		}

		startAudioRecorder();
		startLayerPostThread();

		btnRtspPublisher.setText("停止RTSP流");
		btnGetRtspSessionNumbers.setEnabled(true);
		btnRtspService.setEnabled(false);
	}
}

stopRtspPublisher()实现如下:

//停止发布RTSP流
private void stopRtspPublisher() {
	stream_publisher_.StopRtspStream();
	stream_publisher_.try_release();

	if (!stream_publisher_.is_publishing())
		stopAudioRecorder();
}

其中,InitAndSetConfig()实现如下,通过调用SmartPublisherOpen()接口,生成推送实例句柄。

/*
 * MainActivity.java
 * Author: daniusdk.com
 */
private void InitAndSetConfig() {
	if (null == libPublisher)
		return;

	if (!stream_publisher_.empty())
		return;

	Log.i(TAG, "InitAndSetConfig video width: " + video_width_ + ", height" + video_height_ + " imageRotationDegree:" + cameraImageRotationDegree_);

	int audio_opt = 1;
	long handle = libPublisher.SmartPublisherOpen(context_, audio_opt, 3,  video_width_, video_height_);
	if (0==handle) {
		Log.e(TAG, "sdk open failed!");
		return;
	}

	Log.i(TAG, "publisherHandle=" + handle);

	int fps = 25;
	int gop = fps * 3;

	initialize_publisher(libPublisher, handle, video_width_, video_height_, fps, gop);

	stream_publisher_.set(libPublisher, handle);
}

对应的initialize_publisher()实现如下,设置软硬编码、帧率、关键帧间隔等。

private boolean initialize_publisher(SmartPublisherJniV2 lib_publisher, long handle, int width, int height, int fps, int gop) {
	if (null == lib_publisher) {
		Log.e(TAG, "initialize_publisher lib_publisher is null");
		return false;
	}

	if (0 == handle) {
		Log.e(TAG, "initialize_publisher handle is 0");
		return false;
	}

	if (videoEncodeType == 1) {
		int kbps = LibPublisherWrapper.estimate_video_hardware_kbps(width, height, fps, true);
		Log.i(TAG, "h264HWKbps: " + kbps);
		int isSupportH264HWEncoder = lib_publisher.SetSmartPublisherVideoHWEncoder(handle, kbps);
		if (isSupportH264HWEncoder == 0) {
			lib_publisher.SetNativeMediaNDK(handle, 0);
			lib_publisher.SetVideoHWEncoderBitrateMode(handle, 1); // 0:CQ, 1:VBR, 2:CBR
			lib_publisher.SetVideoHWEncoderQuality(handle, 39);
			lib_publisher.SetAVCHWEncoderProfile(handle, 0x08); // 0x01: Baseline, 0x02: Main, 0x08: High

			// lib_publisher.SetAVCHWEncoderLevel(handle, 0x200); // Level 3.1
			// lib_publisher.SetAVCHWEncoderLevel(handle, 0x400); // Level 3.2
			// lib_publisher.SetAVCHWEncoderLevel(handle, 0x800); // Level 4
			lib_publisher.SetAVCHWEncoderLevel(handle, 0x1000); // Level 4.1 多数情况下,这个够用了
			//lib_publisher.SetAVCHWEncoderLevel(handle, 0x2000); // Level 4.2

			// lib_publisher.SetVideoHWEncoderMaxBitrate(handle, ((long)h264HWKbps)*1300);

			Log.i(TAG, "Great, it supports h.264 hardware encoder!");
		}
	} else if (videoEncodeType == 2) {
		int kbps = LibPublisherWrapper.estimate_video_hardware_kbps(width, height, fps, false);
		Log.i(TAG, "hevcHWKbps: " + kbps);
		int isSupportHevcHWEncoder = lib_publisher.SetSmartPublisherVideoHevcHWEncoder(handle, kbps);
		if (isSupportHevcHWEncoder == 0) {
			lib_publisher.SetNativeMediaNDK(handle, 0);
			lib_publisher.SetVideoHWEncoderBitrateMode(handle, 1); // 0:CQ, 1:VBR, 2:CBR
			lib_publisher.SetVideoHWEncoderQuality(handle, 39);

			// libPublisher.SetVideoHWEncoderMaxBitrate(handle, ((long)hevcHWKbps)*1200);

			Log.i(TAG, "Great, it supports hevc hardware encoder!");
		}
	}

	boolean is_sw_vbr_mode = true;
	//H.264 software encoder
	if (is_sw_vbr_mode) {
		int is_enable_vbr = 1;
		int video_quality = LibPublisherWrapper.estimate_video_software_quality(width, height, true);
		int vbr_max_kbps = LibPublisherWrapper.estimate_video_vbr_max_kbps(width, height, fps);
		lib_publisher.SmartPublisherSetSwVBRMode(handle, is_enable_vbr, video_quality, vbr_max_kbps);
	}

	if (is_pcma_) {
		lib_publisher.SmartPublisherSetAudioCodecType(handle, 3);
	} else {
		lib_publisher.SmartPublisherSetAudioCodecType(handle, 1);
	}

	lib_publisher.SetSmartPublisherEventCallbackV2(handle, new EventHandlerPublisherV2().set(handler_, record_executor_));

	lib_publisher.SmartPublisherSetSWVideoEncoderProfile(handle, 3);

	lib_publisher.SmartPublisherSetSWVideoEncoderSpeed(handle, 2);

	lib_publisher.SmartPublisherSetGopInterval(handle, gop);

	lib_publisher.SmartPublisherSetFPS(handle, fps);

	// lib_publisher.SmartPublisherSetSWVideoBitRate(handle, 600, 1200);

	boolean is_noise_suppression = true;
	lib_publisher.SmartPublisherSetNoiseSuppression(handle, is_noise_suppression ? 1 : 0);

	boolean is_agc = false;
	lib_publisher.SmartPublisherSetAGC(handle, is_agc ? 1 : 0);

	int echo_cancel_delay = 0;
	lib_publisher.SmartPublisherSetEchoCancellation(handle, 1, echo_cancel_delay);

	return true;
}

发布RTSP流成功后,会回调上来可供拉流的RTSP URL:

private static class EventHandlerPublisherV2 implements NTSmartEventCallbackV2 {
	@Override
	public void onNTSmartEventCallbackV2(long handle, int id, long param1, long param2, String param3, String param4, Object param5) {
		String publisher_event = "";

		switch (id) {
			...
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_RTSP_URL:
				publisher_event = "RTSP服务URL: " + param3;
				break;
		}
	}
}

获取RTSP Session会话数:

//获取RTSP会话数
class ButtonGetRtspSessionNumbersListener implements View.OnClickListener {
	public void onClick(View v) {
		if (libPublisher != null && rtsp_handle_ != 0) {
			int session_numbers = libPublisher.GetRtspServerClientSessionNumbers(rtsp_handle_);

			Log.i(TAG, "GetRtspSessionNumbers: " + session_numbers);

			PopRtspSessionNumberDialog(session_numbers);
		}
	}
}

//当前RTSP会话数弹出框
private void PopRtspSessionNumberDialog(int session_numbers) {
	final EditText inputUrlTxt = new EditText(this);
	inputUrlTxt.setFocusable(true);
	inputUrlTxt.setEnabled(false);

	String session_numbers_tag = "RTSP服务当前客户会话数: " + session_numbers;
	inputUrlTxt.setText(session_numbers_tag);

	AlertDialog.Builder builderUrl = new AlertDialog.Builder(this);
	builderUrl
			.setTitle("内置RTSP服务")
			.setView(inputUrlTxt).setNegativeButton("确定", null);
	builderUrl.show();
}

数据投递如下(以Camera2采集为例,如果是其他视频格式,也可以正常对接):

@Override
public void onCameraImageData(Image image) {
	....
	for (LibPublisherWrapper i : publisher_array_)
		i.PostLayerImageYUV420888ByteBuffer(0, 0, 0,
			planes[0].getBuffer(), y_offset, planes[0].getRowStride(),
			planes[1].getBuffer(), u_offset, planes[1].getRowStride(),
			planes[2].getBuffer(), v_offset, planes[2].getRowStride(), planes[1].getPixelStride(),
			w, h, 0, 0,
			scale_w, scale_h, scale_filter_mode, rotation_degree);

}

音频采集投递设计如下:

void startAudioRecorder() {
	if (audio_recorder_ != null)
		return;

	audio_recorder_ = new NTAudioRecordV2(this);

	Log.i(TAG, "startAudioRecorder call audio_recorder_.start()+++...");

	audio_recorder_callback_ = new NTAudioRecordV2CallbackImpl(stream_publisher_, null);

	audio_recorder_.AddCallback(audio_recorder_callback_);

	if (!audio_recorder_.Start(is_pcma_ ? 8000 : 44100, 1) ) {
		audio_recorder_.RemoveCallback(audio_recorder_callback_);
		audio_recorder_callback_ = null;

		audio_recorder_ = null;

		Log.e(TAG, "startAudioRecorder start failed.");
	}
	else {
		Log.i(TAG, "startAudioRecorder call audio_recorder_.start() OK---...");
	}
}

void stopAudioRecorder() {
	if (null == audio_recorder_)
		return;

	Log.i(TAG, "stopAudioRecorder+++");

	audio_recorder_.Stop();

	if (audio_recorder_callback_ != null) {
		audio_recorder_.RemoveCallback(audio_recorder_callback_);
		audio_recorder_callback_ = null;
	}

	audio_recorder_ = null;

	Log.i(TAG, "stopAudioRecorder---");
}

回调Audio数据的地方,直接投递出去:

private static class NTAudioRecordV2CallbackImpl implements NTAudioRecordV2Callback {
	private WeakReference<LibPublisherWrapper> publisher_0_;
	private WeakReference<LibPublisherWrapper> publisher_1_;

	public NTAudioRecordV2CallbackImpl(LibPublisherWrapper publisher_0, LibPublisherWrapper publisher_1) {
		if (publisher_0 != null)
			publisher_0_ = new WeakReference<>(publisher_0);

		if (publisher_1 != null)
			publisher_1_ = new WeakReference<>(publisher_1);
	}

	private final LibPublisherWrapper get_publisher_0() {
		if (publisher_0_ !=null)
			return publisher_0_.get();

		return null;
	}

	@Override
	public void onNTAudioRecordV2Frame(ByteBuffer data, int size, int sampleRate, int channel, int per_channel_sample_number) {

		LibPublisherWrapper publisher_0 = get_publisher_0();
		if (publisher_0 != null)
			publisher_0.OnPCMData(data, size, sampleRate, channel, per_channel_sample_number);
	}
}

onDestroy()的时候,调用UnInitRtspServer()即可:

@Override
protected void onDestroy() {
	Log.i(TAG, "activity destory!");

	stopAudioRecorder();

	stopRtspPublisher();
	stopRtspService();
	isRTSPServiceRunning = false;

	stream_publisher_.release();

	if (libPublisher != null)
		libPublisher.UnInitRtspServer();      //如已启用内置服务功能(InitRtspServer),调用UnInitRtspServer, 注意,即便是启动多个RTSP服务,也只需调用UnInitRtspServer一次

	stopLayerPostThread();

	if (camera2Helper != null) {
		camera2Helper.release();
	}

	super.onDestroy();
}

总结

以上是Android平台轻量级RTSP服务模块详细的对接说明,除了可以对接编码前音视频数据外,模块还支持对接编码后音视频数据,并实现本地录像、快照等功能组合使用。感兴趣的开发者,可以单独跟我们探讨。

Android平台RTSP|RTMP直播播放器技术接入说明

技术背景

大牛直播SDK自2015年发布RTSP、RTMP直播播放模块,迭代从未停止,SmartPlayer功能强大、性能强劲、高稳定、超低延迟、超低资源占用。无需赘述,全自研内核,行业内一致认可的跨平台RTSP、RTMP直播播放器。本文以Android平台为例,介绍下如何集成RTSP、RTMP播放模块。

技术对接

 系统要求

  • SDK支持Android 5.1及以上版本;
  • 支持的CPU架构:armv7, arm64, x86, x86_64。

准备工作

  • 确保SmartPlayerJniV2.java放到com.daniulive.smartplayer包名下(可在其他包名下调用);
  • smartavengine.jar加入到工程;
  • 拷贝SmartPlayerV2\app\src\main\jniLibs\armeabi-v7a、 SmartPlayerV2\app\src\main\jniLibs\arm64-v8a、SmartPlayerV2\app\src\main\jniLibs\x86和SmartPlayerV2\app\src\main\jniLibs\x86_64 下 libSmartPlayer.so到工程;
  • AndroidManifest.xml添加相关权限:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" >
</uses-permission>
<uses-permission android:name="android.permission.INTERNET" >
</uses-permission>
<uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEMS" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />

  • Load相关so:
static {  
    System.loadLibrary("SmartPlayer");
}

  • build.gradle配置32/64位库:
splits {
    abi {
        enable true
        reset()
        // Specifies a list of ABIs that Gradle should create APKs for
        include 'armeabi-v7a', 'arm64-v8a', 'x86', 'x86_64' //select ABIs to build APKs for
        // Also generate a universal APK that includes all ABIs
        universalApk true
    }
}

  • 如需集成到自己系统测试,请用大牛直播SDK的app name,授权版按照授权app name正常使用即可;
  • 如需修改app name,在strings.xml中做以下修改:
<string name="app_name">SmartPlayerSDKDemo</string>

接口设计

Android RTSP|RTMP播放端SDK接口详解
调用描述 接口 接口描述
最先调用,如成功返回播放实例 SmartPlayerOpen player初始化,设置上下文信息,返回player句柄
Event回调 SetSmartPlayerEventCallbackV2 设置event callback
硬解码设置H.264 SetSmartPlayerVideoHWDecoder 设置是否用H.264硬解码播放,如硬解码不支持,自动适配到软解码
硬解码设置H.265 SetSmartPlayerVideoHevcHWDecoder 设置是否用H.265硬解码播放,如硬解码不支持,自动适配到软解码
视频画面填充模式 SmartPlayerSetRenderScaleMode 设置视频画面的填充模式,如填充整个view、等比例填充view,如不设置,默认填充整个view
设置SurfaceView模式下render类型 SmartPlayerSetSurfaceRenderFormat 设置SurfaceView模式下(NTRenderer.CreateRenderer第二个参数传false的情况),render类型

0: RGB565格式,如不设置,默认此模式; 1: ARGB8888格式

设置SurfaceView模式下抗锯齿效果 SmartPlayerSetSurfaceAntiAlias 设置SurfaceView模式下(NTRenderer.CreateRenderer第二个参数传false的情况),抗锯齿效果,注意:抗锯齿模式开启后,可能会影响性能,请慎用
设置播放的surface SmartPlayerSetSurface 设置播放的surface,如果为null,则播放纯音频
设置视频硬解码下Mediacodec自行绘制模式 SmartPlayerSetHWRenderMode 此种模式下,硬解码兼容性和效率更好,回调YUV/RGB快照和图像等比例缩放功能将不可用
更新硬解码surface SmartPlayerUpdateHWRenderSurface 设置更新硬解码surface
音视频回调 YUV/RGB SmartPlayerSetExternalRender 提供解码后YUV/RGB数据接口,供用户自己render或进一步处理(如视频分析)
Audio SmartPlayerSetExternalAudioOutput 回调audio数据到上层(供二次处理之用)
audio输出类型 SmartPlayerSetAudioOutputType 如果use_audiotrack设置为0,将会自动选择输出设备,如果设置为1,使用audiotrack模式,一对一回音消除模式下,请选用audiotrack模式
Video输出类型 NTRenderer.CreateRenderer(上层demo内) 第二个参数,如果是true,用openGLES绘制,false则用默认surfaceView
播放模式 缓冲时间设置 SmartPlayerSetBuffer 设置播放端缓存数据buffer,单位:毫秒,如不需buffer,设置为0
首屏秒开 SmartPlayerSetFastStartup 设置快速启动后,如果CDN缓存GOP,实现首屏秒开
低延迟模式 SmartPlayerSetLowLatencyMode 针对类似于直播娃娃机等期待超低延迟的使用场景,超低延迟播放模式下,延迟可达到200~400ms
快速切换URL SmartPlayerSwitchPlaybackUrl 快速切换播放url,快速切换时,只换播放source部分,适用于不同数据流之间,快速切换(如娃娃机双摄像头切换或高低分辨率流切换)
RTSP TCP/UDP模式设置 SmartPlayerSetRTSPTcpMode 设置RTSP TCP/UDP模式,如不设置,默认UDP模式
RTSP超时时间设置 SmartPlayerSetRTSPTimeout 设置RTSP超时时间,timeout单位为秒,必须大于0
设置RTSP TCP/UDP自动切换 SmartPlayerSetRTSPAutoSwitchTcpUdp 对于RTSP来说,有些可能支持rtp over udp方式,有些可能支持使用rtp over tcp方式

为了方便使用,有些场景下可以开启自动尝试切换开关, 打开后如果udp无法播放,sdk会自动尝试tcp, 如果tcp方式播放不了,sdk会自动尝试udp.

设置RTSP用户名和密码 SetRTSPAuthenticationInfo 如果RTSP URL已包含用户名和密码, 此接口设置的用户名和密码将无效. 就是说要用这个接口设置的用户名和密码去做认证, RTSP URL不能包含用户名和密码.
实时静音 SmartPlayerSetMute 实时静音
设置播放音量 SmartPlayerSetAudioVolume 播放端音量实时调节,范围[0,100],0时为静音,100为原始流数据最大音量
设置是否禁用Enhanced RTMP DisableEnhancedRTMP disable enhanced RTMP, SDK默认是开启enhanced RTMP的
实时截图 CaptureImage 支持JPEG和PNG两种格式
视频镜像旋转 旋转 SmartPlayerSetRotation 设置顺时针旋转, 注意除了0度之外, 其他角度都会额外消耗性能,当前支持 0度,90度, 180度, 270度 旋转
水平反转 SmartPlayerSetFlipHorizontal 设置视频水平反转
垂直反转 SmartPlayerSetFlipVertical 设置视频垂直反转
设置URL SmartPlayerSetUrl 设置需要播放或录像的RTMP/RTSP url
开始播放 SmartPlayerStartPlay 开始播放RTSP/RTMP流
停止播放 SmartPlayerStopPlay 停止播放RTSP/RTMP流
关闭播放实例 SmartPlayerClose 结束时必须调用close接口释放资源
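
结合上表,一个最简化的播放调用顺序示意如下(URL为示例值,错误处理从略,完整的参数设置请参考后文demo代码):

// 1. 创建播放实例并设置回调、基础参数
long playerHandle = libPlayer.SmartPlayerOpen(myContext);
libPlayer.SetSmartPlayerEventCallbackV2(playerHandle, new EventHandeV2());
libPlayer.SmartPlayerSetBuffer(playerHandle, 100);           // 缓冲时间, 单位毫秒
libPlayer.SmartPlayerSetSurface(playerHandle, sSurfaceView); // 传null则播放纯音频

// 2. 设置URL并开始播放
libPlayer.SmartPlayerSetUrl(playerHandle, "rtsp://192.168.0.1:554/stream1");
libPlayer.SmartPlayerStartPlay(playerHandle);

// 3. 结束时停止播放并关闭实例
libPlayer.SmartPlayerStopPlay(playerHandle);
libPlayer.SmartPlayerClose(playerHandle);
playerHandle = 0;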

功能支持

  • 音频:AAC/Speex(RTMP)/PCMA/PCMU;
  • 视频:H.264、H.265;
  • 播放协议:RTSP|RTMP;
  • 支持纯音频、纯视频、音视频播放;
  • 支持多实例播放;
  • 支持软解码,特定机型硬解码;
  • 支持RTSP TCP、UDP模式设置;
  • 支持RTSP TCP、UDP模式自动切换;
  • 支持RTSP超时时间设置,单位:秒;
  • 支持buffer时间设置,单位:毫秒;
  • 支持超低延迟模式;
  • 支持断网自动重连、视频追赶,支持buffer状态等回调;
  • 支持视频view实时旋转(0° 90° 180° 270°);
  • 支持视频view水平反转、垂直反转;
  • 支持Surfaceview/OpenGL ES/TextureView绘制;
  • 支持视频画面填充模式设置;
  • 音频支持AudioTrack、OpenSL ES模式;
  • 支持jpeg、png实时截图;
  • 支持实时音量调节;
  • 支持解码前音视频数据回调;
  • 支持解码后YUV/RGB数据回调;
  • 支持Enhanced RTMP;
  • 支持扩展录像功能;
  • 支持Android 5.1及以上版本。

接口调用详解

本文以大牛直播SDK Android平台SmartPlayerV2为例,播放之前,设置初始化参数配置(软解还是硬解、buffer time等)和需要播放的RTSP或RTMP URL,点开始播放即可。

onCreate()时,先new SmartPlayerJniV2():

/*
 * SmartPlayer.java
 * Author: daniusdk.com
 */
@Override
protected void onCreate(Bundle savedInstanceState) {
	super.onCreate(savedInstanceState);
	setContentView(R.layout.activity_smart_player);
	
	...

    libPlayer = new SmartPlayerJniV2();
    myContext = this.getApplicationContext();
}

开始播放、停止播放实现:开始播放的时候,先调用InitAndSetConfig()完成常规参数初始化,然后再调用仅与播放相关的其他接口。

btnStartStopPlayback.setOnClickListener(new Button.OnClickListener() {

	// @Override
	public void onClick(View v) {

		if (isPlaying) {
			Log.i(TAG, "Stop playback stream++");

			int iRet = libPlayer.SmartPlayerStopPlay(playerHandle);

			if (iRet != 0) {
				Log.e(TAG, "Call SmartPlayerStopPlay failed..");
				return;
			}

			btnHardwareDecoder.setEnabled(true);
			btnLowLatency.setEnabled(true);

			if (!isRecording) {
				btnPopInputUrl.setEnabled(true);
				btnSetPlayBuffer.setEnabled(true);
				btnFastStartup.setEnabled(true);

				btnRecoderMgr.setEnabled(true);
				libPlayer.SmartPlayerClose(playerHandle);
				playerHandle = 0;
			}

			isPlaying = false;
			btnStartStopPlayback.setText("开始播放 ");

			if (is_enable_hardware_render_mode && sSurfaceView != null) {
				sSurfaceView.setVisibility(View.GONE);
				sSurfaceView.setVisibility(View.VISIBLE);
			}

			Log.i(TAG, "Stop playback stream--");
		} else {
			Log.i(TAG, "Start playback stream++");

			if (!isRecording) {
				InitAndSetConfig();
			}

			// 如果第二个参数设置为null,则播放纯音频
			libPlayer.SmartPlayerSetSurface(playerHandle, sSurfaceView);

			libPlayer.SmartPlayerSetRenderScaleMode(playerHandle, 1);

			//int render_format = 1;
			//libPlayer.SmartPlayerSetSurfaceRenderFormat(playerHandle, render_format);

			//int is_enable_anti_alias = 1;
			//libPlayer.SmartPlayerSetSurfaceAntiAlias(playerHandle, is_enable_anti_alias);

			if (isHardwareDecoder && is_enable_hardware_render_mode) {
				libPlayer.SmartPlayerSetHWRenderMode(playerHandle, 1);
			}

			// External Render test
			//libPlayer.SmartPlayerSetExternalRender(playerHandle, new RGBAExternalRender(imageSavePath));
			//libPlayer.SmartPlayerSetExternalRender(playerHandle, new I420ExternalRender(imageSavePath));

			libPlayer.SmartPlayerSetUserDataCallback(playerHandle, new UserDataCallback());
			//libPlayer.SmartPlayerSetSEIDataCallback(playerHandle, new SEIDataCallback());

			libPlayer.SmartPlayerSetAudioOutputType(playerHandle, 1);

			if (isMute) {
				libPlayer.SmartPlayerSetMute(playerHandle, isMute ? 1
						: 0);
			}

			if (isHardwareDecoder) {
				int isSupportHevcHwDecoder = libPlayer.SetSmartPlayerVideoHevcHWDecoder(playerHandle, 1);

				int isSupportH264HwDecoder = libPlayer
						.SetSmartPlayerVideoHWDecoder(playerHandle, 1);

				Log.i(TAG, "isSupportH264HwDecoder: " + isSupportH264HwDecoder + ", isSupportHevcHwDecoder: " + isSupportHevcHwDecoder);
			}

			libPlayer.SmartPlayerSetLowLatencyMode(playerHandle, isLowLatency ? 1
					: 0);

			libPlayer.SmartPlayerSetFlipVertical(playerHandle, is_flip_vertical ? 1 : 0);

			libPlayer.SmartPlayerSetFlipHorizontal(playerHandle, is_flip_horizontal ? 1 : 0);

			libPlayer.SmartPlayerSetRotation(playerHandle, rotate_degrees);

			libPlayer.SmartPlayerSetAudioVolume(playerHandle, curAudioVolume);

			int iPlaybackRet = libPlayer
					.SmartPlayerStartPlay(playerHandle);

			if (iPlaybackRet != 0) {
				Log.e(TAG, "Call SmartPlayerStartPlay failed..");
				return;
			}

			btnStartStopPlayback.setText("停止播放 ");

			btnPopInputUrl.setEnabled(false);
			btnPopInputKey.setEnabled(false);
			btnSetPlayBuffer.setEnabled(false);
			btnLowLatency.setEnabled(false);
			btnFastStartup.setEnabled(false);
			btnRecoderMgr.setEnabled(false);

			isPlaying = true;
			Log.i(TAG, "Start playback stream--");
		}
	}
});

由于RTSP、RTMP播放模块除了常规的直播播放外,还可能需要录像、或者把实时拉取的流转发到RTMP服务器或轻量级RTSP服务,所以,和录像、转发相关的播放端基础参数配置,统一放到InitAndSetConfig()里实现:

private void InitAndSetConfig() {
	playerHandle = libPlayer.SmartPlayerOpen(myContext);

	if (playerHandle == 0) {
		Log.e(TAG, "surfaceHandle with nil..");
		return;
	}

	libPlayer.SetSmartPlayerEventCallbackV2(playerHandle,
			new EventHandeV2());

	libPlayer.SmartPlayerSetBuffer(playerHandle, playBuffer);

	// set report download speed(默认2秒一次回调 用户可自行调整report间隔)
	libPlayer.SmartPlayerSetReportDownloadSpeed(playerHandle, 1, 2);

	libPlayer.SmartPlayerSetFastStartup(playerHandle, isFastStartup ? 1 : 0);

	//设置RTSP超时时间
	int rtsp_timeout = 10;
	libPlayer.SmartPlayerSetRTSPTimeout(playerHandle, rtsp_timeout);

	//设置RTSP TCP/UDP模式自动切换
	int is_auto_switch_tcp_udp = 1;
	libPlayer.SmartPlayerSetRTSPAutoSwitchTcpUdp(playerHandle, is_auto_switch_tcp_udp);

	libPlayer.SmartPlayerSaveImageFlag(playerHandle, 1);

	// It only used when playback RTSP stream..
	// libPlayer.SmartPlayerSetRTSPTcpMode(playerHandle, 1);

	// playbackUrl = "rtmp://localhost:1935/live/stream1";

	if (playbackUrl == null) {
		Log.e(TAG, "playback URL with NULL...");
		return;
	}

	libPlayer.SmartPlayerSetUrl(playerHandle, playbackUrl);
	// try_set_rtsp_url(playbackUrl);
}

EventHandle是播放端事件回调处理的入口,也是底层状态反馈非常重要的媒介,除了网络状态、buffering状态回调外,还有录像状态、快照状态等回调:

class EventHandeV2 implements NTSmartEventCallbackV2 {
	@Override
	public void onNTSmartEventCallbackV2(long handle, int id, long param1,
										 long param2, String param3, String param4, Object param5) {

		String player_event = "";

		switch (id) {
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_STARTED:
				player_event = "开始..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTING:
				player_event = "连接中..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTION_FAILED:
				player_event = "连接失败..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTED:
				player_event = "连接成功..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_DISCONNECTED:
				player_event = "连接断开..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_STOP:
				player_event = "停止播放..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_RESOLUTION_INFO:
				player_event = "分辨率信息: width: " + param1 + ", height: " + param2;
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_NO_MEDIADATA_RECEIVED:
				player_event = "收不到媒体数据,可能是url错误..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_SWITCH_URL:
				player_event = "切换播放URL..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_CAPTURE_IMAGE:
				player_event = "快照: " + param1 + " 路径:" + param3;

				if (param1 == 0)
					player_event = player_event + ", 截取快照成功";
				 else
					player_event = player_event + ", 截取快照失败";

				if (param4 != null && !param4.isEmpty())
					player_event += (", user data:" + param4);

				break;

			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_RECORDER_START_NEW_FILE:
				player_event = "[record]开始一个新的录像文件 : " + param3;
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_ONE_RECORDER_FILE_FINISHED:
				player_event = "[record]已生成一个录像文件 : " + param3;
				break;

			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_START_BUFFERING:
				Log.i(TAG, "Start Buffering");
				break;

			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_BUFFERING:
				Log.i(TAG, "Buffering:" + param1 + "%");
				break;

			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_STOP_BUFFERING:
				Log.i(TAG, "Stop Buffering");
				break;

			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_DOWNLOAD_SPEED:
				player_event = "download_speed:" + param1 + "Byte/s" + ", "
						+ (param1 * 8 / 1000) + "kbps" + ", " + (param1 / 1024)
						+ "KB/s";
				break;

			case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_RTSP_STATUS_CODE:
				Log.e(TAG, "RTSP error code received, please make sure username/password is correct, error code:" + param1);
				player_event = "RTSP error code:" + param1;
				break;
		}

		if (player_event.length() > 0) {
			Log.i(TAG, player_event);
			Message message = new Message();
			message.what = PLAYER_EVENT_MSG;
			message.obj = player_event;
			handler.sendMessage(message);
		}
	}
}
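
上面的回调里通过handler把事件消息抛到UI线程,一个极简的Handler处理示意如下(PLAYER_EVENT_MSG取自上文,txtEventInfo为假设的展示控件,具体展示方式可自行决定):

private final Handler handler = new Handler(Looper.getMainLooper()) {
	@Override
	public void handleMessage(Message msg) {
		if (msg.what == PLAYER_EVENT_MSG) {
			// 在UI线程展示底层回调上来的事件信息
			String player_event = (String) msg.obj;
			txtEventInfo.setText(player_event);
		}
	}
};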

如果RTSP、RTMP流需要录像:

btnStartStopRecorder.setOnClickListener(new Button.OnClickListener() {

	// @Override
	public void onClick(View v) {

		if (isRecording) {

			int iRet = libPlayer.SmartPlayerStopRecorder(playerHandle);

			if (iRet != 0) {
				Log.e(TAG, "Call SmartPlayerStopRecorder failed..");
				return;
			}

			if (!isPlaying) {
				btnPopInputUrl.setEnabled(true);
				btnSetPlayBuffer.setEnabled(true);
				btnFastStartup.setEnabled(true);
				btnRecoderMgr.setEnabled(true);

				libPlayer.SmartPlayerClose(playerHandle);
				playerHandle = 0;
			}

			btnStartStopRecorder.setText(" 开始录像");

			isRecording = false;
		} else {
			Log.i(TAG, "onClick start recorder..");

			if (!isPlaying) {
				InitAndSetConfig();
			}

			ConfigRecorderFunction();

			int startRet = libPlayer.SmartPlayerStartRecorder(playerHandle);

			if (startRet != 0) {
				Log.e(TAG, "Failed to start recorder.");
				return;
			}

			btnPopInputUrl.setEnabled(false);
			btnSetPlayBuffer.setEnabled(false);
			btnFastStartup.setEnabled(false);
			btnRecoderMgr.setEnabled(false);

			isRecording = true;
			btnStartStopRecorder.setText("停止录像");
		}
	}
});

其中,录像参数配置选项设置如下,除了下面演示接口外,还可以设置仅录视频或音频:

void ConfigRecorderFunction() {
	if (libPlayer != null) {
		int is_rec_trans_code = 1;
		libPlayer.SmartPlayerSetRecorderAudioTranscodeAAC(playerHandle, is_rec_trans_code);

		if (recDir != null && !recDir.isEmpty()) {
			int ret = libPlayer.SmartPlayerCreateFileDirectory(recDir);
			if (0 == ret) {
				if (0 != libPlayer.SmartPlayerSetRecorderDirectory(
						playerHandle, recDir)) {
					Log.e(TAG, "Set recoder dir failed , path:" + recDir);
					return;
				}

				if (0 != libPlayer.SmartPlayerSetRecorderFileMaxSize(
						playerHandle, 200)) {
					Log.e(TAG, "SmartPlayerSetRecorderFileMaxSize failed.");
					return;
				}

			} else {
				Log.e(TAG, "Create recorder dir failed, path:" + recDir);
			}
		}
	}
}
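
上文提到的仅录视频或仅录音频,可参考如下设置示意(1表示录制,0表示不录制,接口用法与后文LibPlayerWrapper封装中一致,具体取值按实际需求调整):

// 仅录制视频、不录制音频的示意
libPlayer.SmartPlayerSetRecorderVideo(playerHandle, 1);
libPlayer.SmartPlayerSetRecorderAudio(playerHandle, 0);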

如需播放过程中实时截图:

btnCaptureImage.setOnClickListener(new Button.OnClickListener() {
	@SuppressLint("SimpleDateFormat")
	public void onClick(View v) {
		if (0 == playerHandle)
			return;

		if (null == capture_image_date_format_)
			capture_image_date_format_ = new SimpleDateFormat("yyyyMMdd_HHmmss_SSS");

		String timestamp = capture_image_date_format_.format(new Date());
		String imageFileName = timestamp;

		String image_path = imageSavePath + "/" + imageFileName;

		int quality;
		boolean is_jpeg = true;
		if (is_jpeg) {
			image_path += ".jpeg";
			quality = 100;
		}
		else {
			image_path += ".png";
			quality = 100;
		}

		int capture_ret = libPlayer.CaptureImage(playerHandle,is_jpeg?0:1, quality, image_path, "test cix");
		Log.i(TAG, "capture image ret:" + capture_ret + ", file:" + image_path);
	}
});

如需对视频view做水平、垂直翻转或旋转:

btnFlipVertical.setOnClickListener(new Button.OnClickListener() {
	public void onClick(View v) {
		is_flip_vertical = !is_flip_vertical;

		if (is_flip_vertical) {
			btnFlipVertical.setText("取消反转");
		} else {
			btnFlipVertical.setText("垂直反转");
		}

		if (playerHandle != 0) {
			libPlayer.SmartPlayerSetFlipVertical(playerHandle,
					is_flip_vertical ? 1 : 0);
		}
	}
});

btnFlipHorizontal.setOnClickListener(new Button.OnClickListener() {
	public void onClick(View v) {
		is_flip_horizontal = !is_flip_horizontal;

		if (is_flip_horizontal) {
			btnFlipHorizontal.setText("取消反转");
		} else {
			btnFlipHorizontal.setText("水平反转");
		}

		if (playerHandle != 0) {
			libPlayer.SmartPlayerSetFlipHorizontal(playerHandle,
					is_flip_horizontal ? 1 : 0);
		}
	}
});

btnRotation.setOnClickListener(new Button.OnClickListener() {
	public void onClick(View v) {

		rotate_degrees += 90;
		rotate_degrees = rotate_degrees % 360;

		if (0 == rotate_degrees) {
			btnRotation.setText("旋转90度");
		} else if (90 == rotate_degrees) {
			btnRotation.setText("旋转180度");
		} else if (180 == rotate_degrees) {
			btnRotation.setText("旋转270度");
		} else if (270 == rotate_degrees) {
			btnRotation.setText("不旋转");
		}

		if (playerHandle != 0) {
			libPlayer.SmartPlayerSetRotation(playerHandle,
					rotate_degrees);
		}
	}
});

onDestroy() 的时候,停掉播放、录像、释放播放端实例句柄:

@Override
protected void onDestroy() {
	Log.i(TAG, "Run into activity destory++");

	if (playerHandle != 0) {
		if (isPlaying) {
			libPlayer.SmartPlayerStopPlay(playerHandle);
		}

		if (isRecording) {
			libPlayer.SmartPlayerStopRecorder(playerHandle);
		}

		libPlayer.SmartPlayerClose(playerHandle);
		playerHandle = 0;
	}
	super.onDestroy();
	finish();
	System.exit(0);
}

以上是大概的流程,如果需要播放多实例,可以做个简单的封装。


LibPlayerWrapper.java参考封装代码如下,如需额外功能,只要按照设计框架,添加进去即可:

/*
 * LibPlayerWrapper.java
 * Author: daniusdk.com
 */
package com.daniulive.smartplayer;

import android.content.Context;
import android.util.Log;
import android.view.Surface;
import android.view.SurfaceView;
import android.view.View;

import com.eventhandle.NTSmartEventCallbackV2;
import java.lang.ref.WeakReference;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class LibPlayerWrapper {
    private static String TAG = "NTLogLibPlayerW";
    private static final int OK = 0;

    private WeakReference<Context> context_;
    private final ReadWriteLock rw_lock_ = new ReentrantReadWriteLock(true);
    private final java.util.concurrent.locks.Lock write_lock_ = rw_lock_.writeLock();
    private final java.util.concurrent.locks.Lock read_lock_ = rw_lock_.readLock();

    private SmartPlayerJniV2 lib_player_;
    private volatile long native_handle_;
    private View view_;

    private volatile boolean is_playing_;
    private volatile boolean is_recording_;

    private WeakReference<EventListener> event_listener_;

    public LibPlayerWrapper(SmartPlayerJniV2 lib_player, Context context, EventListener listener) {
        if (!empty())
            throw new IllegalStateException("it is not empty");

        if (null == lib_player)
            throw new NullPointerException("lib_player is null");

        this.lib_player_ = lib_player;

        if (context != null)
            this.context_ = new WeakReference<>(context);

        if (listener == null ) {
            this.event_listener_ = null;
        }
        else {
            this.event_listener_ = new WeakReference<>(listener);
        }
    }

    private void clear_all_playing_flags() {
        this.is_playing_ = false;
        this.is_recording_ = false;
    }

    public void set(long handle) {
        if (!empty())
            throw new IllegalStateException("it is not empty");

        write_lock_.lock();
        try {
            clear_all_playing_flags();
            this.native_handle_ = handle;
        } finally {
            write_lock_.unlock();
        }

        Log.i(TAG, "set native_handle:" + handle);
    }

    public void SetView(View view) {
        Log.i(TAG, "SetView: " + view);
        this.view_ = view;
    }

    @Override
    protected void finalize() throws Throwable {
        try {
            if (check_native_handle()) {
                if(is_playing()) {
                    lib_player_.SmartPlayerStopPlay(get());
                    this.is_playing_ = false;
                }

                if(is_recording()) {
                    lib_player_.SmartPlayerStopRecorder(get());
                    this.is_recording_ = false;
                }

                lib_player_.SmartPlayerClose(this.native_handle_);
                Log.i(TAG, "finalize close handle:" + this.native_handle_);
                this.native_handle_ = 0;
            }
        }catch (Exception e) {

        }

        super.finalize();
    }

    public void release() {
        if (empty())
            return;

        if(is_playing())
            StopPlayer();

        if (is_recording())
            StopRecorder();

        long handle;
        write_lock_.lock();
        try {
            handle = this.native_handle_;
            this.native_handle_ = 0;
            clear_all_playing_flags();
        } finally {
            write_lock_.unlock();
        }

        if (lib_player_ != null && handle != 0)
            lib_player_.SmartPlayerClose(handle);
    }

    public boolean try_release() {
        if (empty())
            return false;

        if (is_player_running()) {
            Log.i(TAG, "try_release it is running, native_handle:" + get());
            return false;
        }

        long handle;
        write_lock_.lock();
        try {
            if (is_player_running())
                return false;

            handle = this.native_handle_;
            this.native_handle_ = 0;
        } finally {
            write_lock_.unlock();
        }

        if (lib_player_ != null && handle != 0)
            lib_player_.SmartPlayerClose(handle);

        return true;
    }

    public final boolean empty() { return 0 == this.native_handle_; }

    public final long get() { return this.native_handle_; }

    public View get_view() {return this.view_;}

    public final boolean check_native_handle() {
        return this.lib_player_ != null && this.native_handle_ != 0;
    }

    public final boolean is_playing() { return is_playing_; }

    public final boolean is_recording() { return is_recording_; }

    public final boolean is_player_running() { return is_playing_ || is_recording_; }

    private boolean isValidRtspOrRtmpUrl(String url) {
        if (url == null || url.isEmpty()) {
            return false;
        }
        String u = url.trim();
        return u.startsWith("rtsp://") || u.startsWith("rtmp://");
    }

    private EventListener getListener() {
        if ( this.event_listener_ == null )
            return null;

        return this.event_listener_.get();
    }

    protected final Context application_context() {
        if (null == context_)
            return null;

        return context_.get();
    }

    public boolean OpenPlayerHandle(String playback_url, int play_buffer, int is_using_tcp) {

        if (check_native_handle())
            return true;

        if(!isValidRtspOrRtmpUrl(playback_url))
            return false;

        long handle = lib_player_.SmartPlayerOpen(application_context());
        if (0==handle) {
            Log.e(TAG, "sdk open failed!");
            return false;
        }

        lib_player_.SetSmartPlayerEventCallbackV2(handle, new EventHandleV2());

        lib_player_.SmartPlayerSetBuffer(handle, play_buffer);

        // set report download speed(默认2秒一次回调 用户可自行调整report间隔)
        lib_player_.SmartPlayerSetReportDownloadSpeed(handle, 1, 4);

        boolean isFastStartup = true;
        lib_player_.SmartPlayerSetFastStartup(handle, isFastStartup ? 1 : 0);

        //设置RTSP超时时间
        int rtsp_timeout = 10;
        lib_player_.SmartPlayerSetRTSPTimeout(handle, rtsp_timeout);

        //设置RTSP TCP/UDP模式自动切换
        int is_auto_switch_tcp_udp = 1;
        lib_player_.SmartPlayerSetRTSPAutoSwitchTcpUdp(handle, is_auto_switch_tcp_udp);

        lib_player_.SmartPlayerSaveImageFlag(handle, 1);

        // It only used when playback RTSP stream..
        lib_player_.SmartPlayerSetRTSPTcpMode(handle, is_using_tcp);

        lib_player_.DisableEnhancedRTMP(handle, 0);

        lib_player_.SmartPlayerSetUrl(handle, playback_url);

        set(handle);

        return true;
    }

    private void SetPlayerParam(boolean is_hardware_decoder, boolean is_enable_hardware_render_mode, boolean is_mute)
    {
         Surface surface = null;
         int surface_codec_media_color_format = 0;

         if (view_ != null && view_ instanceof SurfaceView && ((SurfaceView) view_).getHolder() != null)
             surface = ((SurfaceView) view_).getHolder().getSurface();

         lib_player_.SetSurface(get(), surface, surface_codec_media_color_format, 0, 0);

        lib_player_.SmartPlayerSetRenderScaleMode(get(), 1);

        //int render_format = 1;
        //lib_player.SmartPlayerSetSurfaceRenderFormat(handle, render_format);

        //int is_enable_anti_alias = 1;
        //lib_player.SmartPlayerSetSurfaceAntiAlias(handle, is_enable_anti_alias);

        if (is_hardware_decoder && is_enable_hardware_render_mode) {
            lib_player_.SmartPlayerSetHWRenderMode(get(), 1);
        }

        lib_player_.SmartPlayerSetAudioOutputType(get(), 1);

        lib_player_.SmartPlayerSetMute(get(), is_mute ? 1 : 0);

        if (is_hardware_decoder) {
            int isSupportHevcHwDecoder = lib_player_.SetSmartPlayerVideoHevcHWDecoder(get(), 1);

            int isSupportH264HwDecoder = lib_player_.SetSmartPlayerVideoHWDecoder(get(), 1);

            Log.i(TAG, "isSupportH264HwDecoder: " + isSupportH264HwDecoder + ", isSupportHevcHwDecoder: " + isSupportHevcHwDecoder);
        }

        boolean isLowLatency = true;
        lib_player_.SmartPlayerSetLowLatencyMode(get(), isLowLatency ? 1 : 0);

        boolean is_flip_vertical = false;
        lib_player_.SmartPlayerSetFlipVertical(get(), is_flip_vertical ? 1 : 0);

        boolean is_flip_horizontal = false;
        lib_player_.SmartPlayerSetFlipHorizontal(get(), is_flip_horizontal ? 1 : 0);

        int rotate_degrees = 0;
        lib_player_.SmartPlayerSetRotation(get(), rotate_degrees);

        int curAudioVolume = 100;
        lib_player_.SmartPlayerSetAudioVolume(get(), curAudioVolume);
    }

    class EventHandleV2 implements NTSmartEventCallbackV2 {
        @Override
        public void onNTSmartEventCallbackV2(long handle, int id, long param1,
                                             long param2, String param3, String param4, Object param5) {

            EventListener listener = getListener();
            if (listener != null)
            {
                listener.onPlayerEventCallback(handle, id, param1, param2, param3, param4, param5);
            }
        }
    }

    public boolean SetMute(boolean is_mute) {
        if (!check_native_handle())
            return false;

        return OK == lib_player_.SmartPlayerSetMute(get(), is_mute? 1 : 0);
    }

    public boolean SetInputAudioVolume(int volume) {
        if (!check_native_handle())
            return false;

        return OK == lib_player_.SmartPlayerSetAudioVolume(get(), volume);
    }

    public boolean CaptureImage(int compress_format, int quality, String file_name, String user_data_string) {
        if (!check_native_handle())
            return false;

        return OK == lib_player_.CaptureImage(get(), compress_format, quality, file_name, user_data_string);
    }

    public boolean StartPlayer(boolean is_hardware_decoder, boolean is_enable_hardware_render_mode, boolean is_mute) {
        if (is_playing()) {
            Log.e(TAG, "already playing, native_handle:" + get());
            return false;
        }

        SetPlayerParam(is_hardware_decoder, is_enable_hardware_render_mode, is_mute);

        int ret = lib_player_.SmartPlayerStartPlay(get());
        if (ret != OK) {
            Log.e(TAG, "call StartPlay failed, native_handle:" + get() + ", ret:" + ret);
            return false;
        }

        write_lock_.lock();
        try {
            this.is_playing_ = true;
        } finally {
            write_lock_.unlock();
        }

        Log.i(TAG, "call StartPlayer OK, native_handle:" + get());
        return true;
    }

    public boolean StopPlayer() {
        if (!check_native_handle())
            return false;

        if (!is_playing()) {
            Log.w(TAG, "it's not playing, native_handle:" + get());
            return false;
        }

        boolean is_need_call = false;
        write_lock_.lock();
        try {
            if (this.is_playing_) {
                this.is_playing_ = false;
                is_need_call = true;
            }
        } finally {
            write_lock_.unlock();
        }

        if (is_need_call)
            lib_player_.SmartPlayerStopPlay(get());

        return true;
    }

    public boolean ConfigRecorderParam(String rec_dir, int file_max_size, int is_transcode_aac,
                                       int is_record_video, int is_record_audio) {

        if(!check_native_handle())
            return false;

        if (null == rec_dir || rec_dir.isEmpty())
            return false;

        int ret = lib_player_.SmartPlayerCreateFileDirectory(rec_dir);
        if (ret != 0) {
            Log.e(TAG, "Create record dir failed, path:" + rec_dir);
            return false;
        }

        if (lib_player_.SmartPlayerSetRecorderDirectory(get(), rec_dir) != 0) {
            Log.e(TAG, "Set record dir failed , path:" + rec_dir);
            return false;
        }

        if (lib_player_.SmartPlayerSetRecorderFileMaxSize(get(),file_max_size) != 0) {
            Log.e(TAG, "SmartPlayerSetRecorderFileMaxSize failed.");
            return false;
        }

        lib_player_.SmartPlayerSetRecorderAudioTranscodeAAC(get(), is_transcode_aac);

        // 更细粒度控制录像的, 一般情况无需调用
        lib_player_.SmartPlayerSetRecorderVideo(get(), is_record_video);
        lib_player_.SmartPlayerSetRecorderAudio(get(), is_record_audio);
        return true;
    }

    public boolean StartRecorder() {

        if (is_recording()) {
            Log.e(TAG, "already recording, native_handle:" + get());
            return false;
        }

        int ret = lib_player_.SmartPlayerStartRecorder(get());
        if (ret != OK) {
            Log.e(TAG, "call SmartPlayerStartRecorder failed, native_handle:" + get() + ", ret:" + ret);
            return false;
        }

        write_lock_.lock();
        try {
            this.is_recording_ = true;
        } finally {
            write_lock_.unlock();
        }

        Log.i(TAG, "call SmartPlayerStartRecorder OK, native_handle:" + get());
        return true;
    }

    public boolean StopRecorder() {
        if (!check_native_handle())
            return false;

        if (!is_recording()) {
            Log.w(TAG, "it's not recording, native_handle:" + get());
            return false;
        }

        boolean is_need_call = false;
        write_lock_.lock();
        try {
            if (this.is_recording_) {
                this.is_recording_ = false;
                is_need_call = true;
            }
        } finally {
            write_lock_.unlock();
        }

        if (is_need_call)
            lib_player_.SmartPlayerStopRecorder(get());

        return true;
    }

    private static boolean is_null_or_empty(String val) {
        return null == val || val.isEmpty();
    }
}
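
LibPlayerWrapper的一个简单使用示意如下(URL、view等均为示例值,事件监听传null时不做回调分发,具体可按需实现EventListener):

LibPlayerWrapper player = new LibPlayerWrapper(libPlayer, myContext, null);
player.SetView(sSurfaceView);

// 打开播放实例: 参数依次为url、buffer time(毫秒)、RTSP是否走TCP(1是0否)
if (player.OpenPlayerHandle("rtsp://192.168.0.1:554/stream1", 100, 1)) {
	// 参数依次为: 是否硬解码、是否硬解自绘模式、是否静音
	player.StartPlayer(true, false, false);
}

// 结束时停止并释放
player.StopPlayer();
player.release();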

总结

以上是Android平台RTSP、RTMP直播播放模块对接说明,在此之前,我们针对SmartPlayer做过一些技术方面的探讨,从低延迟、音视频同步处理、多实例实现、解码效率、性能占用、解码后数据对接、实时截图、录像、网络抖动处理等各个维度,做过相关的技术分享。感兴趣的开发者,可以单独跟我们探讨。

 

Android平台RTMP直播推送模块技术接入说明

技术背景

大牛直播SDK跨平台RTMP直播推送模块,始于2015年,支持Windows、Linux(x86_64架构|aarch64)、Android、iOS平台,支持采集推送摄像头、屏幕、麦克风、扬声器、编码前、编码后数据对接,功能强大,性能优异,配合大牛直播SDK的SmartPlayer播放器,轻松实现毫秒级的延迟体验,满足大多数行业的使用场景。

RTMP直播推送模块数据源,支持编码前、编码后数据对接:

  • 编码前数据(目前支持的有YV12/NV21/NV12/I420/RGB24/RGBA32/RGB565等数据类型);
  • 编码后数据(如无人机等264/HEVC数据,或者本地解析的MP4音视频数据)。

技术对接

 系统要求

  • SDK支持Android 5.1及以上版本;
  • 支持的CPU架构:armv7, arm64, x86, x86_64。

准备工作

  • 确保SmartPublisherJniV2.java放到com.daniulive.smartpublisher包名下(可在其他包名下调用);
  • smartavengine.jar加入到工程;
  • 拷贝libSmartPublisher.so到工程;
  • AndroidManifest.xml添加相关权限:
<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEMS"/>
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.CHANGE_WIFI_MULTICAST_STATE" />
<uses-permission android:name="android.permission.VIBRATE" />

  • Load相关so:
static {  
    System.loadLibrary("SmartPublisher");
}

  • build.gradle配置32/64位库:
splits {
    abi {
        enable true
        reset()
        // Specifies a list of ABIs that Gradle should create APKs for
        include 'armeabi-v7a', 'arm64-v8a', 'x86', 'x86_64' //select ABIs to build APKs for
        // Also generate a universal APK that includes all ABIs
        universalApk true
    }
}

  • 如需集成到自己系统测试,请用大牛直播SDK的app name,授权版按照授权app name正常使用即可;
  • 如需修改app name,在strings.xml中做以下修改:
<string name="app_name">SmartPublisherSDKDemo</string>

接口设计

Android 推送端SDK接口详解
调用描述 接口 接口描述
最先调用,如成功返回推送实例 SmartPublisherOpen ctx:上下文信息; audio_opt: 0:不推送音频; 1:推送编码前音频(PCM); 2:对接外部编码后的audio数据(AAC/PCMA/PCMU/SPEEX); video_opt: 0:不推送视频; 1:推送编码前视频(YUV420SP/YUV420P/RGBA/ARGB); 2:推送编码后视频(H.264); 3:层叠加模式; width|height:宽高信息。

Event回调 SetSmartPublisherEventCallbackV2 设置event callback
硬编码设置 SetSmartPublisherVideoHWEncoder 检测是否支持H.264硬编码,如果返回0,则支持,否则自动采用软编码
SetSmartPublisherVideoHevcHWEncoder 检测是否支持H.265(HEVC)硬编码,如果返回0,则支持,否则自动采用软编码
SetNativeMediaNDK 设置视频硬编码是否使用 Native Media NDK, 默认是不使用, 安卓5.0以下设备不支持
SetVideoHWEncoderBitrateMode 设置视频硬编码码率控制模式

hw_bitrate_mode: -1表示使用默认值, 不设置也会使用默认值, 0:CQ, 1:VBR, 2:CBR, 3:CBR_FD

SetVideoHWEncoderComplexity 设置视频硬编码复杂度, 安卓5.0及以上支持
SetVideoHWEncoderQuality 设置视频硬编码质量, 安卓9及以上支持, 仅当硬编码器码率控制模式(BitrateMode)是CQ(constant-quality mode)时才有效
SetAVCHWEncoderProfile 设置H.264硬编码Profile, 安卓7及以上支持
SetAVCHWEncoderLevel 设置H.264硬编码Level, 这个只有在设置了Profile的情况下才有效, 安卓7及以上支持
SetVideoHWEncoderMaxBitrate 设置视频硬编码最大码率, 安卓没有相关文档说明, 所以不建议设置
水印 文字、png水印 PostLayerBitmap 通过层模式设置水印,投递层Bitmap.Config.ARGB_8888图像

视频参数配置 软编码可变码率 SmartPublisherSetSwVBRMode 设置软编码可变码率,可变码率下,相邻帧之间变化不大时码率更低
GOP间隔(关键帧) SmartPublisherSetGopInterval 设置推送端GOP间隔,一般建议在帧率的1~3倍,如不设置,用底层默认值
软编码码率设置 SmartPublisherSetSWVideoBitRate 设置软编码视频 bit-rate,最大码流一般是平均码流的2倍,如不设置,用底层计算的默认值
帧率 SmartPublisherSetFPS 设置fps,如不设置,用底层默认值
软编码视频Profile SmartPublisherSetSWVideoEncoderProfile 设置软编码模式下的video encoder profile,默认baseline profile
软编码编码速度 SmartPublisherSetSWVideoEncoderSpeed 设置软编码编码速度,设置范围(1,6),1最快,6最慢,默认是6
视频设置 视频镜像 SmartPublisherSetMirror 镜像模式: 播放端和推送端本地回显方向显示一致(前置摄像头)
视频截图 实时快照 CaptureImage 截图接口, 支持JPEG和PNG两种格式
音频配置 音频编码类型 SmartPublisherSetAudioCodecType 设置编码类型,默认AAC编码,type设置为2时,启用speex编码(码率更低)
AAC编码码率 SmartPublisherSetAudioBitRate 设置音频编码码率, 当前只对AAC编码有效
SPEEX编码质量 SmartPublisherSetSpeexEncoderQuality 设置speex编码质量,数值越大,质量越高,范围(0,10),默认8
音频处理 噪音抑制 SmartPublisherSetNoiseSuppression 噪音抑制开启后,可去除采集端背景杂音
增益控制 SmartPublisherSetAGC 设置自动增益控制,保持声音稳定
回声消除 SmartPublisherSetEchoCancellation 设置音频回音消除
实时静音 SmartPublisherSetMute 设置实时静音、取消静音
设置输入音量 SmartPublisherSetInputAudioVolume 设置输入音量,默认是1.0,范围是[0.0, 5.0], 设置成0静音, 1音量不变
RTMP推送模式 SetRtmpPublishingType 设置rtmp publisher类型,0:live,1:record,需服务器支持
Enhanced RTMP设置 DisableEnhancedRTMP disable enhanced RTMP, SDK默认是开启enhanced RTMP的
RTMP推送URL设置 SmartPublisherSetURL 设置RTMP推送url
编码前实时视频数据 camera数据 SmartPublisherOnCaptureVideoData 对接camera回调的数据
YV12数据 SmartPublisherOnYV12Data YV12数据接口
NV21数据 SmartPublisherOnNV21Data NV21数据接口
转换接口 SmartPublisherNV21ToI420Rotate NV21转换到I420并旋转
YUV(I420) SmartPublisherOnCaptureVideoI420Data 第三方YUV(I420)接口
RGB24数据 SmartPublisherOnCaptureVideoRGB24Data RGB24接口
RGBA32数据 SmartPublisherOnCaptureVideoRGBA32Data RGBA32接口
YUV420888数据 SmartPublisherOnImageYUV420888 YUV420888接口
RGBA数据 SmartPublisherOnCaptureVideoRGBAData 第三方RGBA数据
ABGR垂直翻转数据 SmartPublisherOnCaptureVideoABGRFlipVerticalData ABGR flip vertical(垂直翻转)数据(Demo中用于传递屏幕数据)
RGBA8888图像 PostLayerImageRGBA8888ByteBuffer 投递层RGBA8888图像,如果不需要Alpha通道的话, 请使用RGBX8888接口
RGBX8888图像 PostLayerImageRGBX8888ByteBuffer 投递层RGBX8888图像
I420图像 PostLayerImageI420ByteBuffer 投递层I420图像
RGB565数据 SmartPublisherOnCaptureVideoRGB565Data RGB565 data
裁剪过的RGBA数据 SmartPublisherOnCaptureVideoClipedRGBAData 投递裁剪过的RGBA数据
PCM数据 SmartPublisherOnPCMData 实时PCM数据
远端PCM数据(用于回音消除) SmartPublisherOnFarEndPCMData 实时传递远端PCM数据(可用于互动级的回音消除处理)
音频 混音 混音数据 SmartPublisherOnMixPCMData 传递PCM混音音频数据给SDK, 每10ms音频数据传入一次
编码后数据对接 编码后视频数据 SmartPublisherPostVideoEncodedData 设置编码后视频数据
编码后音频数据 SmartPublisherPostAudioEncodedData 编码后音频数据
编码后音视频数据回调 编码后音频数据回调 SmartPublisherSetAudioEncodedDataCallback 设置编码后音频数据回调
编码后视频数据回调 SmartPublisherSetVideoEncodedDataCallback 设置编码后视频数据回调
层结构设置 启用|停用视频层 EnableLayer video_opt为3时,启用或者停用视频层, 这个接口必须在StartXXX之后调用.
移除视频层 RemoveLayer 移除视频层, 这个接口必须在StartXXX之后调用.
RTMP推送 开始推送RTMP SmartPublisherStartPublisher 启动RTMP推送
停止推送RTMP SmartPublisherStopPublisher 停止RTMP推送
关闭推送实例 关闭实例 SmartPublisherClose 关闭推送实例,结束时必须调用close接口释放资源
设置授权 授权license设置 SmartPublisherSetSDKClientKey 设置授权Key,如需设置授权Key, 请确保在SmartPublisherOpen之前调用!
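
结合上表与后文demo,RTMP推送的主线流程可以概括为如下示意(URL为示例值,采集与数据投递从略,详见后文完整代码):

// 1. 初始化推送实例并完成参数配置(内部通过SmartPublisherOpen生成句柄, 见后文InitAndSetConfig)
InitAndSetConfig();

// 2. 设置RTMP推送URL并启动推送
stream_publisher_.SetURL("rtmp://192.168.0.101:1935/hls/stream123");
stream_publisher_.StartPublisher();

// 3. 结束时停止推送, 并视情况释放推送实例
stream_publisher_.StopPublisher();
stream_publisher_.try_release();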

功能支持

  • 音频编码:AAC/SPEEX;
  • 视频编码:H.264、H.265;
  • 推流协议:RTMP;
  • [音视频]支持纯音频/纯视频/音视频推送;
  • [摄像头]支持采集过程中,前后摄像头实时切换;
  • 支持帧率、关键帧间隔(GOP)、码率(bit-rate)设置;
  • 支持RTMP推送 live|record模式设置;
  • 支持前置摄像头镜像设置;
  • 支持软编码、特定机型硬编码;
  • 支持横屏、竖屏推送;
  • 支持Android屏幕采集推送;
  • 支持自建标准RTMP服务器或CDN;
  • 支持断网自动重连、网络状态回调;
  • 支持实时动态水印;
  • 支持实时快照;
  • 支持降噪处理、自动增益控制;
  • 支持外部编码前音视频数据对接;
  • 支持外部编码后音视频数据对接;
  • 支持RTMP扩展H.265(需设备支持H.265特定机型硬编码)和Enhanced RTMP;
  • 支持实时音量调节;
  • 支持扩展录像模块;
  • 支持Unity接口;
  • 支持H.264扩展SEI发送模块;
  • 支持Android 5.1及以上版本。

接口调用详解

本文以大牛直播SDK Android平台Camera2Demo为例,推送RTMP之前,可以先选择视频分辨率、软编还是硬编码,音频是AAC、SPEEX还是PCMA编码等基础设置,其他参数的设置,可以参考下面InitAndSetConfig()。

以Android平台Camera2对接为例,onCreate()时,先new SmartPublisherJniV2():

/*
 * MainActivity.java
 * Author: daniusdk.com
 */
@Override
protected void onCreate(Bundle savedInstanceState) {
	super.onCreate(savedInstanceState);
	setContentView(R.layout.activity_main);
	
	...

	context_ = this.getApplicationContext();
	
	libPublisher = new SmartPublisherJniV2();
}

推送RTMP:

class ButtonStartPushListener implements View.OnClickListener {
	public void onClick(View v) {
		if (stream_publisher_.is_rtmp_publishing()) {
			stopPush();

			btnRTMPPusher.setText("推送RTMP");
			return;
		}

		Log.i(TAG, "onClick start push rtmp..");
		
		InitAndSetConfig();

		String rtmp_pusher_url = "rtmp://192.168.0.101:1935/hls/stream123";


		if (!stream_publisher_.SetURL(rtmp_pusher_url))
			Log.e(TAG, "Failed to set publish stream URL..");

		boolean start_ret = stream_publisher_.StartPublisher();
		if (!start_ret) {
			stream_publisher_.try_release();
			Log.e(TAG, "Failed to start push stream..");
			return;
		}

		startAudioRecorder();
		startLayerPostThread();

		btnRTMPPusher.setText("停止推送 ");

	}
}

stopPush()实现如下:

//停止rtmp推送
private void stopPush() {
	stream_publisher_.StopPublisher();
	stream_publisher_.try_release();

	if (!stream_publisher_.is_publishing())
		stopAudioRecorder();
}

其中,InitAndSetConfig()实现如下,通过调用SmartPublisherOpen()接口,生成推送实例句柄。

/*
 * MainActivity.java
 * Author: daniusdk.com
 */
private void InitAndSetConfig() {
	if (null == libPublisher)
		return;

	if (!stream_publisher_.empty())
		return;

	Log.i(TAG, "InitAndSetConfig video width: " + video_width_ + ", height" + video_height_ + " imageRotationDegree:" + cameraImageRotationDegree_);

	int audio_opt = 1;
	long handle = libPublisher.SmartPublisherOpen(context_, audio_opt, 3,  video_width_, video_height_);
	if (0==handle) {
		Log.e(TAG, "sdk open failed!");
		return;
	}

	Log.i(TAG, "publisherHandle=" + handle);

	int fps = 25;
	int gop = fps * 3;

	initialize_publisher(libPublisher, handle, video_width_, video_height_, fps, gop);

	stream_publisher_.set(libPublisher, handle);
}

对应的initialize_publisher()实现如下,设置软硬编码、帧率、关键帧间隔等。

private boolean initialize_publisher(SmartPublisherJniV2 lib_publisher, long handle, int width, int height, int fps, int gop) {
	if (null == lib_publisher) {
		Log.e(TAG, "initialize_publisher lib_publisher is null");
		return false;
	}

	if (0 == handle) {
		Log.e(TAG, "initialize_publisher handle is 0");
		return false;
	}

	if (videoEncodeType == 1) {
		int kbps = LibPublisherWrapper.estimate_video_hardware_kbps(width, height, fps, true);
		Log.i(TAG, "h264HWKbps: " + kbps);
		int isSupportH264HWEncoder = lib_publisher.SetSmartPublisherVideoHWEncoder(handle, kbps);
		if (isSupportH264HWEncoder == 0) {
			lib_publisher.SetNativeMediaNDK(handle, 0);
			lib_publisher.SetVideoHWEncoderBitrateMode(handle, 1); // 0:CQ, 1:VBR, 2:CBR
			lib_publisher.SetVideoHWEncoderQuality(handle, 39);
			lib_publisher.SetAVCHWEncoderProfile(handle, 0x08); // 0x01: Baseline, 0x02: Main, 0x08: High

			// lib_publisher.SetAVCHWEncoderLevel(handle, 0x200); // Level 3.1
			// lib_publisher.SetAVCHWEncoderLevel(handle, 0x400); // Level 3.2
			// lib_publisher.SetAVCHWEncoderLevel(handle, 0x800); // Level 4
			lib_publisher.SetAVCHWEncoderLevel(handle, 0x1000); // Level 4.1 多数情况下,这个够用了
			//lib_publisher.SetAVCHWEncoderLevel(handle, 0x2000); // Level 4.2

			// lib_publisher.SetVideoHWEncoderMaxBitrate(handle, ((long)h264HWKbps)*1300);

			Log.i(TAG, "Great, it supports h.264 hardware encoder!");
		}
	} else if (videoEncodeType == 2) {
		int kbps = LibPublisherWrapper.estimate_video_hardware_kbps(width, height, fps, false);
		Log.i(TAG, "hevcHWKbps: " + kbps);
		int isSupportHevcHWEncoder = lib_publisher.SetSmartPublisherVideoHevcHWEncoder(handle, kbps);
		if (isSupportHevcHWEncoder == 0) {
			lib_publisher.SetNativeMediaNDK(handle, 0);
			lib_publisher.SetVideoHWEncoderBitrateMode(handle, 1); // 0:CQ, 1:VBR, 2:CBR
			lib_publisher.SetVideoHWEncoderQuality(handle, 39);

			// libPublisher.SetVideoHWEncoderMaxBitrate(handle, ((long)hevcHWKbps)*1200);

			Log.i(TAG, "Great, it supports hevc hardware encoder!");
		}
	}

	boolean is_sw_vbr_mode = true;
	//H.264 software encoder
	if (is_sw_vbr_mode) {
		int is_enable_vbr = 1;
		int video_quality = LibPublisherWrapper.estimate_video_software_quality(width, height, true);
		int vbr_max_kbps = LibPublisherWrapper.estimate_video_vbr_max_kbps(width, height, fps);
		lib_publisher.SmartPublisherSetSwVBRMode(handle, is_enable_vbr, video_quality, vbr_max_kbps);
	}

	if (is_pcma_) {
		lib_publisher.SmartPublisherSetAudioCodecType(handle, 3);
	} else {
		lib_publisher.SmartPublisherSetAudioCodecType(handle, 1);
	}

	lib_publisher.SetSmartPublisherEventCallbackV2(handle, new EventHandlerPublisherV2().set(handler_, record_executor_));

	lib_publisher.SmartPublisherSetSWVideoEncoderProfile(handle, 3);

	lib_publisher.SmartPublisherSetSWVideoEncoderSpeed(handle, 2);

	lib_publisher.SmartPublisherSetGopInterval(handle, gop);

	lib_publisher.SmartPublisherSetFPS(handle, fps);

	// lib_publisher.SmartPublisherSetSWVideoBitRate(handle, 600, 1200);

	boolean is_noise_suppression = true;
	lib_publisher.SmartPublisherSetNoiseSuppression(handle, is_noise_suppression ? 1 : 0);

	boolean is_agc = false;
	lib_publisher.SmartPublisherSetAGC(handle, is_agc ? 1 : 0);

	int echo_cancel_delay = 0;
	lib_publisher.SmartPublisherSetEchoCancellation(handle, 1, echo_cancel_delay);

	return true;
}

数据投递如下(以Camera2采集为例,如果是其他视频格式,也可以正常对接):

@Override
public void onCameraImageData(Image image) {
	....
	for (LibPublisherWrapper i : publisher_array_)
		i.PostLayerImageYUV420888ByteBuffer(0, 0, 0,
			planes[0].getBuffer(), y_offset, planes[0].getRowStride(),
			planes[1].getBuffer(), u_offset, planes[1].getRowStride(),
			planes[2].getBuffer(), v_offset, planes[2].getRowStride(), planes[1].getPixelStride(),
			w, h, 0, 0,
			scale_w, scale_h, scale_filter_mode, rotation_degree);

}

音频采集投递设计如下:

void startAudioRecorder() {
	if (audio_recorder_ != null)
		return;

	audio_recorder_ = new NTAudioRecordV2(this);

	Log.i(TAG, "startAudioRecorder call audio_recorder_.start()+++...");

	audio_recorder_callback_ = new NTAudioRecordV2CallbackImpl(stream_publisher_);

	audio_recorder_.AddCallback(audio_recorder_callback_);

	if (!audio_recorder_.Start(is_pcma_ ? 8000 : 44100, 1) ) {
		audio_recorder_.RemoveCallback(audio_recorder_callback_);
		audio_recorder_callback_ = null;

		audio_recorder_ = null;

		Log.e(TAG, "startAudioRecorder start failed.");
	}
	else {
		Log.i(TAG, "startAudioRecorder call audio_recorder_.start() OK---...");
	}
}

void stopAudioRecorder() {
	if (null == audio_recorder_)
		return;

	Log.i(TAG, "stopAudioRecorder+++");

	audio_recorder_.Stop();

	if (audio_recorder_callback_ != null) {
		audio_recorder_.RemoveCallback(audio_recorder_callback_);
		audio_recorder_callback_ = null;
	}

	audio_recorder_ = null;

	Log.i(TAG, "stopAudioRecorder---");
}

回调Audio数据的地方,直接投递出去:

private static class NTAudioRecordV2CallbackImpl implements NTAudioRecordV2Callback {
	private WeakReference<LibPublisherWrapper> publisher_0_;

	public NTAudioRecordV2CallbackImpl(LibPublisherWrapper publisher_0) {
		if (publisher_0 != null)
			publisher_0_ = new WeakReference<>(publisher_0);
	}

	private final LibPublisherWrapper get_publisher_0() {
		if (publisher_0_ !=null)
			return publisher_0_.get();

		return null;
	}

	@Override
	public void onNTAudioRecordV2Frame(ByteBuffer data, int size, int sampleRate, int channel, int per_channel_sample_number) {

		LibPublisherWrapper publisher_0 = get_publisher_0();
		if (publisher_0 != null)
			publisher_0.OnPCMData(data, size, sampleRate, channel, per_channel_sample_number);
	}
}

图层投递设计如下,图层投递的时候,可设置是否添加文字、图片动态水印:

private void startLayerPostThread() {
	if (layer_post_thread_ != null)
		return;

	layer_post_thread_ = new LayerPostThread(this.context_, publisher_array_);
	layer_post_thread_.start_post();
	update_layer_post_video_size();
	layer_post_thread_.enableText(isHasTextWatermark());
	layer_post_thread_.enablePicture(isHasPictureWatermark());
}

private void update_layer_post_video_size() {
	if (null == layer_post_thread_)
		return;

	int w, h;
	int degree = cameraImageRotationDegree_;
	if (degree < 0 ) {
		w = 0;
		h = 0;
	} else if (90 == degree || 270 == degree) {
		w = video_height_;
		h = video_width_;
	}else {
		w = video_width_;
		h = video_height_;
	}

	layer_post_thread_.update_video_size(w, h);
}

private void stopLayerPostThread() {
	if (layer_post_thread_ != null) {
		layer_post_thread_.stop_post();
		layer_post_thread_ = null;
	}
}

如需摄像头快照,调用以下逻辑实现即可:

class ButtonCaptureImageListener implements View.OnClickListener {
	public void onClick(View v) {
		if (null == snap_shot_impl_) {
			snap_shot_impl_ = new SnapShotImpl(image_path_, context_, handler_, libPublisher, snap_shot_publisher_);
			snap_shot_impl_.start();
		}

		startLayerPostThread();
		snap_shot_impl_.set_layer_post_thread(layer_post_thread_);

		snap_shot_impl_.capture();
	}
}

如需集成录像模块,开始录像、停止录像设计如下:

class ButtonStartRecorderListener implements View.OnClickListener {
	public void onClick(View v) {
		if (layer_post_thread_ != null)
			layer_post_thread_.update_layers();

		if (stream_publisher_.is_recording()) {
			stopRecorder();

			if (stream_publisher_.empty())
				ConfigControlEnable(true);

			btnStartRecorder.setText("实时录像");
			btnPauseRecorder.setText("暂停录像");
			btnPauseRecorder.setEnabled(false);
			isPauseRecording = true;
			return;
		}

		Log.i(TAG, "onClick start recorder..");

		InitAndSetConfig();

		ConfigRecorderParam();

		boolean start_ret = stream_publisher_.StartRecorder();
		if (!start_ret) {
			stream_publisher_.try_release();
			Log.e(TAG, "Failed to start recorder.");
			return;
		}

		startAudioRecorder();
		ConfigControlEnable(false);

		startLayerPostThread();

		btnStartRecorder.setText("停止录像");
		btnPauseRecorder.setEnabled(true);
		isPauseRecording = true;
	}
}

录像参数配置实现如下:

void ConfigRecorderParam() {
	if (null == libPublisher)
		return;

	if (null == recDir || recDir.isEmpty())
		return;

	int ret = libPublisher.SmartPublisherCreateFileDirectory(recDir);
	if (ret != 0) {
		Log.e(TAG, "Create record dir failed, path:" + recDir);
		return;
	}

	if (!stream_publisher_.SetRecorderDirectory(recDir)) {
		Log.e(TAG, "Set record dir failed , path:" + recDir);
		return;
	}

	// 更细粒度控制录像的, 一般情况无需调用
	//libPublisher.SmartPublisherSetRecorderAudio(publisherHandle, 0);
	//libPublisher.SmartPublisherSetRecorderVideo(publisherHandle, 0);

	if (!stream_publisher_.SetRecorderFileMaxSize(200)) {
		Log.e(TAG, "SmartPublisherSetRecorderFileMaxSize failed.");
		return;
	}
}

暂停录像、恢复录像设计如下:

class ButtonPauseRecorderListener implements View.OnClickListener {
	public void onClick(View v) {
		if (stream_publisher_.is_recording()) {
			if (isPauseRecording) {
				boolean ret = stream_publisher_.PauseRecorder(true);
				if (ret) {
					isPauseRecording = false;
					btnPauseRecorder.setText("恢复录像");
				} else {
					Log.e(TAG, "Pause recorder failed..");
				}
			} else {
				boolean ret = stream_publisher_.PauseRecorder(false);
				if (ret) {
					isPauseRecording = true;
					btnPauseRecorder.setText("暂停录像");
				} else {
					Log.e(TAG, "Resume recorder failed..");
				}
			}
		}
	}
}

Event回调实现如下:

private static class EventHandlerPublisherV2 implements NTSmartEventCallbackV2 {
	@Override
	public void onNTSmartEventCallbackV2(long handle, int id, long param1, long param2, String param3, String param4, Object param5) {

		Log.i(TAG, "EventHandeV2: handle=" + handle + " id:" + id);

		String publisher_event = "";

		switch (id) {
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_STARTED:
				publisher_event = "开始..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CONNECTING:
				publisher_event = "连接中..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CONNECTION_FAILED:
				publisher_event = "连接失败..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CONNECTED:
				publisher_event = "连接成功..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_DISCONNECTED:
				publisher_event = "连接断开..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_STOP:
				publisher_event = "关闭..";
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_RECORDER_START_NEW_FILE:
				publisher_event = "开始一个新的录像文件 : " + param3;
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_ONE_RECORDER_FILE_FINISHED:
				if (record_executor_ != null) {
					RecordExecutorService executor = record_executor_.get();
					if (executor != null) {
						RecordFileFinishedHandler file_finished_handler = new RecordFileFinishedHandler().set(handle, param3, param1);
						if (param2 > 0)
							file_finished_handler.set_begin_time(param2);

						executor.execute(file_finished_handler);
					}
				}
				publisher_event = "已生成一个录像文件 : " + param3;
				break;

			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_SEND_DELAY:
				publisher_event = "发送时延: " + param1 + " 帧数:" + param2;
				break;

			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CAPTURE_IMAGE:
				publisher_event = "快照: " + param1 + " 路径:" + param3;
				if (0 == param1)
					publisher_event = publisher_event + "截取快照成功.." + ", 用户数据:" + param4;
				 else
					publisher_event = publisher_event + "截取快照失败..";

				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_RTSP_URL:
				publisher_event = "RTSP服务URL: " + param3;
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUSH_RTSP_SERVER_RESPONSE_STATUS_CODE:
				publisher_event ="RTSP status code received, codeID: " + param1 + ", RTSP URL: " + param3;
				break;
			case NTSmartEventID.EVENT_DANIULIVE_ERC_PUSH_RTSP_SERVER_NOT_SUPPORT:
				publisher_event ="服务器不支持RTSP推送, 推送的RTSP URL: " + param3;
				break;
		}

		String str = "当前回调状态:" + publisher_event;

		Log.i(TAG, str);

		if (handler_ != null) {
			android.os.Handler handler = handler_.get();
			if (handler != null) {
				Message message = new Message();
				message.what = PUBLISHER_EVENT_MSG;
				message.obj = publisher_event;
				handler.sendMessage(message);
			}
		}
	}

	public NTSmartEventCallbackV2 set(android.os.Handler handler, RecordExecutorService record_executor) {
		this.handler_ = new WeakReference<>(handler);
		this.record_executor_ = new WeakReference<>(record_executor);
		return this;
	}

	private WeakReference<android.os.Handler> handler_;
	private WeakReference<RecordExecutorService> record_executor_;
}

onDestroy()的时候,调用stopPush()即可;如果有录像和快照,也一并停掉;此外,停掉图层投递线程,并关闭camera:

@Override
protected void onDestroy() {
	Log.i(TAG, "activity destory!");

	record_executor_.cancel_tasks();

	stopAudioRecorder();

	if (snap_shot_impl_ != null) {
		snap_shot_impl_.stop();
		snap_shot_impl_ = null;
	}

	snap_shot_publisher_.release();

	stopPush();
	stopRecorder();

	stream_publisher_.release();

	stopLayerPostThread();

	if (camera2Helper != null) {
		camera2Helper.release();
	}

	if (!record_executor_.shutdown(60, TimeUnit.SECONDS))
		Log.w(TAG, "call record_executor_.shutdown failed");

	super.onDestroy();
}

总结

以上是大牛直播SDK的Android平台RTMP直播推送模块详细的对接说明。除了可以对接编码前各种类型的音视频数据外,模块还支持对接编码后音视频数据,并实现本地录像、快照等功能;除支持H.264外,RTMP推送模块还支持扩展H.265和Enhanced RTMP。感兴趣的开发者,可以单独跟我们探讨。

Android平台外部编码数据实时预览播放SDK

Android平台除RTMP、RTSP直播播放外,有些场景可输出编码后(视频:H.264/H.265,音频:AAC/PCMA/PCMU)的数据,比如无人机或类似智能硬件设备回调出来的H.264/H.265数据,这些编码后的数据除了正常转推到RTMP、轻量级RTSP服务或GB28181外,还需要本地预览,甚至对数据做二次处理(视频分析、实时水印字符叠加等)。

基于这样的场景诉求,我们开发了Android平台外部编码数据实时预览播放模块(以RTSP拉流,然后回调上来编码后数据,再投递到Android平台编码数据实时预览播放SDK为例)。

Android平台外部编码数据实时预览模块可实现本地低延迟的预览播放,支持软解码和特定机型硬解码,支持等比例或铺满显示,感兴趣的开发者可参看:Android平台如何实现第三方模块编码后(H.264/H.265/AAC/PCMA/PCMU)数据实时预览播放。
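
为便于理解上面的数据走向,下面给出一个最简化的流程示意(Java)。其中RtspPuller、ExternalDataPreviewPlayer等接口和方法名均为假设,并非SDK实际接口,仅用于说明"拉流 → 回调编码后数据 → 投递预览播放"这一链路,实际对接请以SDK头文件与Demo为准。

// 流程示意(非SDK实际接口):以下接口/类名均为假设,仅说明数据走向
interface EncodedVideoCallback {
	void onEncodedVideoFrame(byte[] data, int size, boolean isKeyFrame, long timestampMs);
}

interface RtspPuller {	// 假设:RTSP拉流器,负责回调编码后的H.264/H.265数据
	void setEncodedVideoCallback(EncodedVideoCallback cb);
	void start(String rtspUrl);
	void stop();
}

interface ExternalDataPreviewPlayer {	// 假设:外部编码数据预览播放器
	void open();
	void postEncodedVideoData(byte[] data, int size, boolean isKeyFrame, long timestampMs);
	void close();
}

class ExternalEncodedPreviewFlow {
	private final RtspPuller puller;
	private final ExternalDataPreviewPlayer player;

	ExternalEncodedPreviewFlow(RtspPuller puller, ExternalDataPreviewPlayer player) {
		this.puller = puller;
		this.player = player;
	}

	void start(String rtspUrl) {
		player.open();
		// 回调上来的H.264/H.265数据,可先做二次处理(如视频分析、叠加水印),再投递给预览播放模块
		puller.setEncodedVideoCallback((data, size, isKeyFrame, ts) ->
				player.postEncodedVideoData(data, size, isKeyFrame, ts));
		puller.start(rtspUrl);
	}

	void stop() {
		puller.stop();
		player.close();
	}
}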

Android平台国网B接口接入SDK

电网视频监控系统是智能电网的一个重要组成部分,广泛应用于电网的建设、生产、运行、经营等方面。由于视频监控系统在不同的建设时期选用了不同的技术和不同厂家的产品,导致了标准不统一、技术路线不一致。目前国家电网公司智能电网建设,对视频监控系统提出了新的要求,因此实现统一监控、统一存储、分级控制、分域管理,使不同的视频监视系统能够互联互通,满足视频监控系统全局化、整体化的发展需求,已成为亟待解决的问题。

为此,大牛直播SDK推出的Android平台国网B接口接入SDK,可实现不具备国网B接口音视频能力的 Android终端,通过平台注册接入到现有的平台。

Android终端除支持常规的音视频数据接入外,还可以支持移动设备位置(MobilePosition)订阅和通知、语音广播,支持对接数据类型如下:

  1. 编码前数据(目前支持的有YV12/NV21/NV12/I420/RGB24/RGBA32/RGB565等数据类型),其中,Android平台前后摄像头数据,或者屏幕数据,或者Unity拿到的数据,均属编码前数据;
  2. 编码后数据(如无人机等264/HEVC数据,或者本地解析的MP4音视频数据);
  3. 拉取RTSP或RTMP流并接入至GB28181平台(比如其他IPC的RTSP流,可通过Android平台GB28181接入到国标平台)。

技术特点和优势:

  1. 全自研框架,易于扩展,自适应算法让延迟更低、采集编码传输效率更高;
  2. 所有功能以SDK接口形式提供,支持状态反馈;
  3. 可同时运行RTMP直播推送SDK、轻量级RTSP服务SDK和录像SDK;
  4. 支持外部YUV/RGB/H.264/H.265/AAC数据源接入;
  5. 所有参数均可通过SDK接口单独设置,亦可通过默认参数,傻瓜式设置。

功能支持:

  •  [视频格式]H.264/H.265(Android H.265硬编码);
  •  [音频格式]G.711 A律、AAC;
  •  [音量调节]Android平台采集端支持实时音量调节;
  •  [H.264硬编码]支持H.264特定机型硬编码;
  •  [H.265硬编码]支持H.265特定机型硬编码;
  •  [软硬编码参数配置]支持gop间隔、帧率、bit-rate设置;
  •  [软编码参数配置]支持软编码profile、软编码速度、可变码率设置;
  •  支持横屏、竖屏推流;
  •  Android平台支持后台service推送屏幕(推送屏幕需要5.0+版本);
  • 支持RTP OVER UDP和RTP OVER TCP被动模式;
  • 支持信令通道网络传输协议TCP/UDP设置;
  • 支持注册、注销,支持注册刷新及注册有效期设置;
  • 支持前端资源上报(Push_Resource);
  • 支持请求获取资源(Request_Resource)应答;
  • 支持移动设备位置(MobilePosition)订阅和通知;
  • 支持语音广播;
  •  [实时水印]支持动态文字水印、png水印;
  •  [镜像]Android平台支持前置摄像头实时镜像功能;
  •  [实时静音]支持实时静音/取消静音;
  •  [实时快照]支持实时快照;
  •  [降噪]支持环境音、手机干扰等引起的噪音降噪处理、自动增益、VAD检测;
  •  [外部编码前视频数据对接]支持YUV数据对接;
  •  [外部编码前音频数据对接]支持PCM对接;
  •  [外部编码后视频数据对接]支持外部H.264数据对接;
  •  [外部编码后音频数据对接]支持外部AAC数据对接;
  •  [扩展录像功能]支持和录像SDK组合使用,录像相关功能。

对应Demo:

  •  Android工程:SmartPublisherV2、Camera2Demo

Android平台GB28181接入SDK

大牛直播SDK推出的Android平台GB28181接入SDK(SmartGBD),可实现不具备国标音视频能力的 Android终端,通过平台注册接入到现有的GB/T28181—2016或GB/T28181—2022服务,可用于如执法记录仪、智能安全帽、智能监控、智慧零售、智慧教育、远程办公、明厨亮灶、智慧交通、智慧工地、雪亮工程、平安乡村、生产运输、车载终端等场景,可能是业内为数不多功能齐全性能优异的商业级水准GB28181接入SDK。

Android终端除支持常规的音视频数据接入外,还可以支持移动设备位置(MobilePosition)订阅和通知、图像抓拍、语音广播和语音对讲、历史视音频下载和回放,支持对接数据类型如下:

  1. 编码前数据(目前支持的有YV12/NV21/NV12/I420/RGB24/RGBA32/RGB565等数据类型),其中,Android平台前后摄像头数据,或者屏幕数据,或者Unity拿到的数据,均属编码前数据;
  2. 编码后数据(如无人机等264/HEVC数据,或者本地解析的MP4音视频数据);
  3. 拉取RTSP或RTMP流并接入至GB28181平台(比如其他IPC的RTSP流,可通过Android平台GB28181接入到国标平台)。

技术特点和优势:

  1. 全自研框架,易于扩展,自适应算法让延迟更低、采集编码传输效率更高;
  2. 所有功能以SDK接口形式提供,支持状态反馈;
  3. 可同时运行RTMP直播推送SDK、轻量级RTSP服务SDK和录像SDK;
  4. 支持外部YUV/RGB/H.264/H.265/AAC数据源接入;
  5. 所有参数均可通过SDK接口单独设置,亦可通过默认参数,傻瓜式设置。

功能支持:

  •  [视频格式]H.264/H.265(Android H.265硬编码);
  •  [音频格式]G.711 A律、AAC;
  •  [音量调节]Android平台采集端支持实时音量调节;
  •  [H.264硬编码]支持H.264特定机型硬编码;
  •  [H.265硬编码]支持H.265特定机型硬编码;
  •  [软硬编码参数配置]支持gop间隔、帧率、bit-rate设置;
  •  [软编码参数配置]支持软编码profile、软编码速度、可变码率设置;
  •  支持横屏、竖屏推流;
  •  Android平台支持后台service推送屏幕(推送屏幕需要5.0+版本);
  • 支持纯视频、音视频PS打包传输;
  • 支持RTP OVER UDP和RTP OVER TCP被动模式(TCP媒体流传输客户端);
  • 支持信令通道网络传输协议TCP/UDP设置;
  • 支持注册、注销,支持注册刷新及注册有效期设置;
  • 支持设备目录查询应答;
  • 支持心跳机制,支持心跳间隔、心跳检测次数设置;
  • 支持移动设备位置(MobilePosition)订阅和通知;
  •  适用国家标准:GB/T 28181—2016、GB/T28181—2022;
  • 支持语音广播;
  • 支持语音对讲;
  • 支持图像抓拍;
  • 支持历史视音频文件检索;
  • 支持历史视音频文件下载;
  • 支持历史视音频文件回放;
  • 支持云台控制和预置位查询;
  •  [实时水印]支持动态文字水印、png水印;
  •  [镜像]Android平台支持前置摄像头实时镜像功能;
  •  [实时静音]支持实时静音/取消静音;
  •  [实时快照]支持实时快照;
  •  [降噪]支持环境音、手机干扰等引起的噪音降噪处理、自动增益、VAD检测;
  •  [外部编码前视频数据对接]支持YUV数据对接;
  •  [外部编码前音频数据对接]支持PCM对接;
  •  [外部编码后视频数据对接]支持外部H.264数据对接;
  •  [外部编码后音频数据对接]支持外部AAC数据对接;
  •  [扩展录像功能]支持和录像SDK组合使用,录像相关功能。

对应Demo:

  •  Android工程:SmartPublisherV2、Camera2Demo;

Unity环境下RTMP推流|轻量级RTSP服务+RTMP|RTSP播放低延迟解决方案

除了Windows/Linux/Android/iOS Native SDK,大牛直播SDK发布了Unity环境下的RTMP推流|轻量级RTSP服务(Windows平台+Linux平台+Android平台)和RTMP|RTSP直播播放(Windows、Linux、Android和iOS平台全覆盖)低延迟的解决方案。

目前,大牛直播SDK的Unity3D环境下,已覆盖以下SDK:

  •  Windows平台RTMP直播推送SDK(采集Unity窗体、摄像头或屏幕);
  •  Windows平台轻量级RTSP服务SDK(采集Unity窗体、摄像头或屏幕);
  •  Windows平台RTMP|RTSP直播播放SDK;
  •  Linux平台RTMP直播推送SDK(采集Unity窗体、Unity声音);
  •  Linux平台RTMP|RTSP直播播放SDK;
  •  Android平台RTMP直播推送SDK(采集Unity窗体、摄像头、麦克风或Unity声音);
  •  Android平台轻量级RTSP服务SDK(采集Unity窗体、摄像头、麦克风或Unity声音);
  •  Android平台RTMP|RTSP直播播放SDK;
  •  iOS平台RTMP|RTSP直播播放SDK。

平台覆盖和架构支持

支持平台 支持架构
Windows平台 x86 debug/release, x64 debug/release
Linux(含麒麟操作系统) x86_64、aarch64
Android平台 armeabi-v7a, arm64-v8a, x86, x86_64
iOS平台 arm64

1. Unity环境下RTMP推流、轻量级RTSP服务模块

Unity环境下,不管是camera数据还是窗体数据,关键是高效率地拿到原始数据(采集端可用的数据格式是RGB的),拿到之后,通过高效率的数据传递,发给封装后的原生SDK,完成数据编码和RTMP推送。

需要注意的地方有几点:

1. 数据采集投递,确保高效率;

2. 屏幕分辨率发生变化,可实时适配;

3. Unity和原生SDK之间通信,比如event回调等;

4. 屏幕数据如有水平或垂直翻转,需要有一定的矫正。
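
针对第4点,下面给出一个RGBA缓冲区垂直翻转的通用示意(Java,与具体SDK接口无关,类名与函数名均为示例,仅说明矫正思路):

public final class RgbaFlipUtil {
	// 将宽width、高height的RGBA图像做垂直翻转(上下颠倒);stride为每行字节数,通常为width * 4
	public static byte[] flipVertical(byte[] src, int width, int height, int stride) {
		byte[] dst = new byte[height * stride];
		for (int row = 0; row < height; row++) {
			// 第row行拷贝到第(height - 1 - row)行
			System.arraycopy(src, row * stride, dst, (height - 1 - row) * stride, width * 4);
		}
		return dst;
	}
}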

2. Unity环境下RTMP|RTSP播放器

Unity环境下RTMP或RTSP直播播放我们前几年就有发布,并已应用在诸多传统行业领域,比如教育、工业仿真或一些低延迟的控制场景。

相关实现逻辑如下:

1. Native RTMP或RTSP直播播放SDK回调RGB/YUV420/NV12等其中一种未压缩的图像格式;

2. Unity3D创建相应的RGB/YUV420等Shader;

3. Unity3D从各个平台获取图像数据来填充纹理即可。

需要注意的有几点:

1. 多实例支持:播放端和推送不一样,比如智慧城市场景,播放端往往需要同时播放多路,所以多实例支持是必备功能;多实例环境下,需要能很好地区分各实例的event状态回调等(按实例句柄分发事件的思路可参考本节列表后的示意代码);

2. 尽可能高效率的数据传递,确保资源占用最小化;

3. 视频分辨率变化后,能自动适配;

4. Unity和原生SDK之间通信,比如event回调等;

5. 长时间运行稳定性。
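
关于第1点中多实例下event状态回调的区分,一种常见做法是按实例句柄分发事件。下面给出一个与具体SDK无关的简单示意(Java,类名与回调形式均为假设,非SDK实际接口):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// 按播放实例句柄分发事件的通用示意(非SDK实际接口)
public final class PlayerEventDispatcher {
	public interface Listener {
		void onEvent(long handle, int eventId, String message);
	}

	private final Map<Long, Listener> listeners = new ConcurrentHashMap<>();

	public void register(long handle, Listener listener) { listeners.put(handle, listener); }

	public void unregister(long handle) { listeners.remove(handle); }

	// 底层统一回调入口:根据handle找到对应实例的监听者,再转发事件
	public void dispatch(long handle, int eventId, String message) {
		Listener l = listeners.get(handle);
		if (l != null)
			l.onEvent(handle, eventId, message);
	}
}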

相关SDK文档及视频

大牛直播SDK Unity3D接口调用SDK说明

Unity3d RTSP/RTMP直播播放端SDK视频演示1

Unity3d RTSP/RTMP直播播放端SDK视频演示2

相关博客

Windows平台Unity3d下如何同时播放多路RTSP或RTMP流

如何在Unity3d平台下低延迟播放RTMP或RTSP流

Windows平台实现Unity下窗体|摄像头|屏幕采集推送

Android平台实现Unity3D下RTMP推送

Unity3D平台实现全景实时RTMP|RTSP流渲染

Unity3D下Linux平台播放RTSP或RTMP流

Android平台实现VR头显Unity下音视频数据RTMP推送

Unity实现Camera和Audio数据的低延迟RTMP推送技术探讨

Android平台Unity下如何通过WebCamTexture采集摄像头数据并推送至RTMP服务器或轻量级RTSP服务

Windows平台实现Unity下窗体|摄像头|屏幕采集推送

技术背景

随着Unity3D的应用范围越来越广,越来越多的行业开始基于Unity3D开发产品,如传统行业中虚拟仿真教育、航空工业、室内设计、城市规划、工业仿真等领域。

基于此,不少开发者苦于在Unity环境下没有低延迟的推拉流解决方案。前几年,我们在Unity环境下推出了跨平台低延迟的RTMP|RTSP直播播放器,很好地满足了诸多对延迟要求苛刻的使用场景。

随着时间的推移,越来越多的开发者联系我们,希望我们能推出Unity环境下的RTMP推送模块,获取到unity的实时数据,更低延迟更高效率的实现数据传输推送,基于此,我们发布了Unity环境下的RTMP推送模块。

本文以Windows平台为例,数据源分别为Unity的窗口、摄像头或整个屏幕,编码传输模块还是调用大牛直播SDK(官方)的原生接口。

技术实现

1. 基础初始化


        private bool InitSDK()
        {
            if (!is_pusher_sdk_init_)
            {
                // 设置日志路径(请确保目录存在)
                String log_path = "D:\\pulisherlog";
                NTSmartLog.NT_SL_SetPath(log_path);

                UInt32 isInited = NTSmartPublisherSDK.NT_PB_Init(0, IntPtr.Zero);

                if (isInited != 0)
                {
                    Debug.Log("调用NT_PB_Init失败..");
                    return false;
                }

                is_pusher_sdk_init_ = true;
            }

            return true;
        }

2. 调用Open()接口,获取推送实例

       public bool OpenPublisherHandle(uint video_option, uint audio_option)
        {
            if (publisher_handle_ != IntPtr.Zero)
            {
                return true;
            }

            publisher_handle_count_ = 0;

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_Open(out publisher_handle_,
                video_option, audio_option, 0, IntPtr.Zero))
            {
                return false;
            }

            if (publisher_handle_ != IntPtr.Zero)
            {
                pb_event_call_back_ = new NT_PB_SDKEventCallBack(PbEventCallBack);

                NTSmartPublisherSDK.NT_PB_SetEventCallBack(publisher_handle_, IntPtr.Zero, pb_event_call_back_);

                return true;
            }
            else
            {
                return false;
            }
        }

3. 初始化参数配置

这里需要注意下,如果要采集unity窗口,需要设置图层模式,先填充一层RGBA黑色背景,然后再添加一层,用于叠加外部数据。

       private void SetCommonOptionToPublisherSDK()
        {
            if (!IsPublisherHandleAvailable())
            {
                Debug.Log("SetCommonOptionToPublisherSDK, publisher handle with null..");
                return;
            }

            NTSmartPublisherSDK.NT_PB_ClearLayersConfig(publisher_handle_, 0,
                            0, IntPtr.Zero);

            if (video_option == NTSmartPublisherDefine.NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_LAYER)
            {
                // 第0层填充RGBA矩形, 目的是保证帧率, 颜色就填充全黑
                int red = 0;
                int green = 0;
                int blue = 0;
                int alpha = 255;

                NT_PB_RGBARectangleLayerConfig rgba_layer_c0 = new NT_PB_RGBARectangleLayerConfig();

                rgba_layer_c0.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE;
                rgba_layer_c0.base_.index_ = 0;
                rgba_layer_c0.base_.enable_ = 1;
                rgba_layer_c0.base_.region_.x_ = 0;
                rgba_layer_c0.base_.region_.y_ = 0;
                rgba_layer_c0.base_.region_.width_ = video_width_;
                rgba_layer_c0.base_.region_.height_ = video_height_;

                rgba_layer_c0.base_.offset_ = Marshal.OffsetOf(rgba_layer_c0.GetType(), "base_").ToInt32();
                rgba_layer_c0.base_.cb_size_ = (uint)Marshal.SizeOf(rgba_layer_c0);

                rgba_layer_c0.red_ = System.BitConverter.GetBytes(red)[0];
                rgba_layer_c0.green_ = System.BitConverter.GetBytes(green)[0];
                rgba_layer_c0.blue_ = System.BitConverter.GetBytes(blue)[0];
                rgba_layer_c0.alpha_ = System.BitConverter.GetBytes(alpha)[0];

                IntPtr rgba_conf = Marshal.AllocHGlobal(Marshal.SizeOf(rgba_layer_c0));

                Marshal.StructureToPtr(rgba_layer_c0, rgba_conf, true);

                UInt32 rgba_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                rgba_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE,
                                0, IntPtr.Zero);

                Marshal.FreeHGlobal(rgba_conf);

                NT_PB_ExternalVideoFrameLayerConfig external_layer_c1 = new NT_PB_ExternalVideoFrameLayerConfig();

                external_layer_c1.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_EXTERNAL_VIDEO_FRAME;
                external_layer_c1.base_.index_ = 1;
                external_layer_c1.base_.enable_ = 1;
                external_layer_c1.base_.region_.x_ = 0;
                external_layer_c1.base_.region_.y_ = 0;
                external_layer_c1.base_.region_.width_ = video_width_;
                external_layer_c1.base_.region_.height_ = video_height_;

                external_layer_c1.base_.offset_ = Marshal.OffsetOf(external_layer_c1.GetType(), "base_").ToInt32();
                external_layer_c1.base_.cb_size_ = (uint)Marshal.SizeOf(external_layer_c1);

                IntPtr external_layer_conf = Marshal.AllocHGlobal(Marshal.SizeOf(external_layer_c1));

                Marshal.StructureToPtr(external_layer_c1, external_layer_conf, true);

                UInt32 external_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                external_layer_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_EXTERNAL_VIDEO_FRAME,
                                0, IntPtr.Zero);

                Marshal.FreeHGlobal(external_layer_conf);

            }
            else if (video_option == NTSmartPublisherDefine.NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_CAMERA)
            {
                CameraInfo camera = cameras_[cur_sel_camera_index_];
                NT_PB_VideoCaptureCapability cap = camera.capabilities_[cur_sel_camera_resolutions_index_];

                SetVideoCaptureDeviceBaseParameter(camera.id_.ToString(), (UInt32)cap.width_, (UInt32)cap.height_);
            }

            SetFrameRate((UInt32)CalBitRate(edit_key_frame_, video_width_, video_height_));

            Int32 type = 0;   //软编码
            Int32 encoder_id = 1;
            UInt32 codec_id = (UInt32)NTCommonMediaDefine.NT_MEDIA_CODEC_ID.NT_MEDIA_CODEC_ID_H264;
            Int32 param1 = 0;

            SetVideoEncoder(type, encoder_id, codec_id, param1);

            SetVideoQualityV2(CalVideoQuality(video_width_, video_height_, is_h264_encoder));

            SetVideoMaxBitRate((CalMaxKBitRate(edit_key_frame_, video_width_, video_height_, false)));

            SetVideoKeyFrameInterval((edit_key_frame_));

            if (is_h264_encoder)
            {
                SetVideoEncoderProfile(1);
            }

            SetVideoEncoderSpeed(CalVideoEncoderSpeed(video_width_, video_height_, is_h264_encoder));

            // 音频相关设置

            SetAuidoInputDeviceId(0);
            SetPublisherAudioCodecType(1);
            SetPublisherMute(is_mute);
            SetEchoCancellation(0, 0);
            SetNoiseSuppression(0);
            SetAGC(0);
            SetVAD(0);
            SetInputAudioVolume(Convert.ToSingle(edit_audio_input_volume_));
        }

4. 数据采集

摄像头和屏幕的数据采集,还是调用原生的SDK接口,本文不再赘述,如果需要采集Unity窗体的数据,可以参考以下代码:

        if ( texture_ == null || video_width_ != Screen.width || video_height_ != Screen.height)
        {
            Debug.Log("OnPostRender screen changed++ scr_width: " + Screen.width + " scr_height: " + Screen.height);

            if (screen_image_ != IntPtr.Zero)
            {
                Marshal.FreeHGlobal(screen_image_);
                screen_image_ = IntPtr.Zero;
            }

            if (texture_ !=  null)
            {
                UnityEngine.Object.Destroy(texture_);
                texture_ = null;
            }

            video_width_ = Screen.width;
            video_height_ = Screen.height;

            texture_ = new Texture2D(video_width_, video_height_, TextureFormat.BGRA32, false);

            screen_image_ = Marshal.AllocHGlobal(video_width_ * 4 * video_height_);

            Debug.Log("OnPostRender screen changed--");

            return;
        }

        texture_.ReadPixels(new Rect(0, 0, video_width_, video_height_), 0, 0, false);
        texture_.Apply();

从 texture里面,通过调用 GetRawTextureData(),获取到原始数据。

5. 数据对接

数据对接,通过调用以下接口:

       public void OnPostRGBAData(IntPtr data, int length, int stride, int width, int height)
        {
            NT_PB_Image pb_image = new NT_PB_Image();
            
            pb_image.format_ = (int)NTSmartPublisherDefine.NT_PB_E_IMAGE_FORMAT.NT_PB_E_IMAGE_FORMAT_RGB32;
            pb_image.width_ = width;
            pb_image.height_ = height;
            pb_image.timestamp_ = 0;
            pb_image.cb_size_ = (UInt32)Marshal.SizeOf(pb_image);   

            pb_image.stride_ = new Int32[16];
            pb_image.stride_[0] = stride;

            pb_image.plane_size_ = new Int32[16];
            pb_image.plane_size_[0] = pb_image.stride_[0] * pb_image.height_;         

            pb_image.plane_ = new IntPtr[16];
            pb_image.plane_[0] = data;

            IntPtr image_data = Marshal.AllocHGlobal(Marshal.SizeOf(pb_image));

            Marshal.StructureToPtr(pb_image, image_data, true);

            NTSmartPublisherSDK.NT_PB_PostLayerImage(publisher_handle_, 0,
                            1, image_data, 0, IntPtr.Zero);

            Marshal.FreeHGlobal(image_data);
        }

6. 本地数据预览

        public bool StartPreview()
        {
            if(CheckPublisherHandleAvailable() == false)
                return false;

            video_preview_image_callback_ = new NT_PB_SDKVideoPreviewImageCallBack(SDKVideoPreviewImageCallBack);

            NTSmartPublisherSDK.NT_PB_SetVideoPreviewImageCallBack(publisher_handle_, (int)NTSmartPublisherDefine.NT_PB_E_IMAGE_FORMAT.NT_PB_E_IMAGE_FORMAT_RGB32, IntPtr.Zero, video_preview_image_callback_);

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_StartPreview(publisher_handle_, 0, IntPtr.Zero))
            {
                if (0 == publisher_handle_count_)
                {
                    NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                    publisher_handle_ = IntPtr.Zero;
                }

                return false;
            }

            publisher_handle_count_++;

            is_previewing_ = true;

            return true;
        }

设置preview后,处理preview的数据回调

        //预览数据回调
        public void SDKVideoPreviewImageCallBack(IntPtr handle, IntPtr user_data, IntPtr image)
        {
            NT_PB_Image pb_image = (NT_PB_Image)Marshal.PtrToStructure(image, typeof(NT_PB_Image));

            NT_VideoFrame pVideoFrame = new NT_VideoFrame();

            pVideoFrame.width_ = pb_image.width_;
            pVideoFrame.height_ = pb_image.height_;

            pVideoFrame.stride_ = pb_image.stride_[0];

            Int32 argb_size = pb_image.stride_[0] * pb_image.height_;

            pVideoFrame.plane_data_ = new byte[argb_size];
            
            if (argb_size > 0)
            {
                Marshal.Copy(pb_image.plane_[0],pVideoFrame.plane_data_,0, argb_size);
            }

            {
                cur_image_ = pVideoFrame;
            }
        }      

7. 相关event回调处理

        private void PbEventCallBack(IntPtr handle, IntPtr user_data, 
            UInt32 event_id,
            Int64 param1,
            Int64 param2,
            UInt64 param3,
            UInt64 param4,
            [MarshalAs(UnmanagedType.LPStr)] String param5,
            [MarshalAs(UnmanagedType.LPStr)] String param6,
            IntPtr param7)
        {
            String event_log = "";

            switch (event_id)
            {
                case (uint)NTSmartPublisherDefine.NT_PB_E_EVENT_ID.NT_PB_E_EVENT_ID_CONNECTING:
                    event_log = "连接中";
                    if (!String.IsNullOrEmpty(param5))
                    {
                        event_log = event_log + " url:" + param5;
                    }
                    break;

                case (uint)NTSmartPublisherDefine.NT_PB_E_EVENT_ID.NT_PB_E_EVENT_ID_CONNECTION_FAILED:
                    event_log = "连接失败";
                    if (!String.IsNullOrEmpty(param5))
                    {
                        event_log = event_log + " url:" + param5;
                    }
                    break;

                case (uint)NTSmartPublisherDefine.NT_PB_E_EVENT_ID.NT_PB_E_EVENT_ID_CONNECTED:
                    event_log = "已连接";
                    if (!String.IsNullOrEmpty(param5))
                    {
                        event_log = event_log + " url:" + param5;
                    }
                    break;

                case (uint)NTSmartPublisherDefine.NT_PB_E_EVENT_ID.NT_PB_E_EVENT_ID_DISCONNECTED:
                    event_log = "断开连接";
                    if (!String.IsNullOrEmpty(param5))
                    {
                        event_log = event_log + " url:" + param5;
                    }
                    break;

                default:
                    break;
            }

            if(OnLogEventMsg != null) OnLogEventMsg.Invoke(event_id, event_log);
        }

8. 开始推送、停止推送

       public bool StartPublisher(String url)
        {
            if (CheckPublisherHandleAvailable() == false) return false;

            if (publisher_handle_ == IntPtr.Zero)
            {
                return false;
            }
            if (!String.IsNullOrEmpty(url))
            {
                NTSmartPublisherSDK.NT_PB_SetURL(publisher_handle_, url, IntPtr.Zero);
            }

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_StartPublisher(publisher_handle_, IntPtr.Zero))
            {
                if (0 == publisher_handle_count_)
                {
                    NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                    publisher_handle_ = IntPtr.Zero;
                }

                is_publishing_ = false;

                return false;
            }

            publisher_handle_count_++;

            is_publishing_ = true;

            return true;
        }

        public void StopPublisher()
        {
            if (is_publishing_ == false) return;

            publisher_handle_count_--;
            NTSmartPublisherSDK.NT_PB_StopPublisher(publisher_handle_);

            if (0 == publisher_handle_count_)
            {
                NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                publisher_handle_ = IntPtr.Zero;
            }

            is_publishing_ = false;
        }

9. 关闭实例

        public void Close()
        {
            if (0 == publisher_handle_count_)
            {
                NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                publisher_handle_ = IntPtr.Zero;
            }
        }

总结

经测试,Unity环境下,通过高效率的数据采集、编码和推送,配合SmartPlayer播放器播放,整体延迟可控制在毫秒级,可适用于大多数Unity环境下对延迟和稳定性要求苛刻的场景。

Unity3D RTMP直播推流SDK

不少开发者苦于很难在Unity3D下实现RTMP直播推送,本次以大牛直播SDK的Windows平台RTMP推送模块为例(以推摄像头为例,如需推屏幕数据,设置相关参数即可),介绍下Unity3D的RTMP推送集成。

简单来说,Unity3D环境下,可以直接调用C#的接口封装,针对此,我们先做了一层封装 (nt_publisher_wrapper.cs),核心代码如下:

初始化和基础参数设置:

       private bool InitSDK()
        {
            if (!is_pusher_sdk_init_)
            {
                // 设置日志路径(请确保目录存在)
                String log_path = "D:\\pulisherlog";
                NTSmartLog.NT_SL_SetPath(log_path);

                UInt32 isInited = NTSmartPublisherSDK.NT_PB_Init(0, IntPtr.Zero);

                if (isInited != 0)
                {
                    Debug.Log("调用NT_PB_Init失败..");
                    return false;
                }

                is_pusher_sdk_init_ = true;
            }

            return true;
        }

        public bool OpenPublisherHandle(uint video_option, uint audio_option)
        {
            if (publisher_handle_ != IntPtr.Zero)
            {
                return true;
            }

            publisher_handle_count_ = 0;

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_Open(out publisher_handle_,
                video_option, audio_option, 0, IntPtr.Zero))
            {
                return false;
            }

            if (publisher_handle_ != IntPtr.Zero)
            {
                pb_event_call_back_ = new NT_PB_SDKEventCallBack(PbEventCallBack);

                NTSmartPublisherSDK.NT_PB_SetEventCallBack(publisher_handle_, IntPtr.Zero, pb_event_call_back_);

                return true;
            }
            else
            {
                return false;
            }
        }

        private void SetCommonOptionToPublisherSDK()
        {
            if (!IsPublisherHandleAvailable())
            {
                Debug.Log("SetCommonOptionToPublisherSDK, publisher handle with null..");
                return;
            }

            CameraInfo camera = cameras_[cur_sel_camera_index_];
            NT_PB_VideoCaptureCapability cap = camera.capabilities_[cur_sel_camera_resolutions_index_];

            SetVideoCaptureDeviceBaseParameter(camera.id_.ToString(), (UInt32)cap.width_, (UInt32)cap.height_);

            SetFrameRate((UInt32)CalBitRate(edit_key_frame_, cap.width_, cap.height_));

            SetVideoEncoderType(is_h264_encoder ? 1 : 2);

            SetVideoQualityV2(CalVideoQuality(cap.width_, cap.height_, is_h264_encoder));

            SetVideoMaxBitRate((CalMaxKBitRate(edit_key_frame_, cap.width_, cap.height_, false)));

            SetVideoKeyFrameInterval((edit_key_frame_));

            if (is_h264_encoder)
            {
                SetVideoEncoderProfile(1);

            }

            SetVideoEncoderSpeed(CalVideoEncoderSpeed(cap.width_, cap.height_, is_h264_encoder));

            // 音频相关设置

            SetAuidoInputDeviceId(0);

            SetPublisherAudioCodecType(1);

            SetPublisherMute(is_mute);

            SetInputAudioVolume(Convert.ToSingle(edit_audio_input_volume_));
        }

预览、停止预览:

       public bool StartPreview()
        {
            if(CheckPublisherHandleAvailable() == false)
                return false;

            video_preview_image_callback_ = new NT_PB_SDKVideoPreviewImageCallBack(SDKVideoPreviewImageCallBack);

            NTSmartPublisherSDK.NT_PB_SetVideoPreviewImageCallBack(publisher_handle_, (int)NTSmartPublisherDefine.NT_PB_E_IMAGE_FORMAT.NT_PB_E_IMAGE_FORMAT_RGB32, IntPtr.Zero, video_preview_image_callback_);

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_StartPreview(publisher_handle_, 0, IntPtr.Zero))
            {
                if (0 == publisher_handle_count_)
                {
                    NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                    publisher_handle_ = IntPtr.Zero;
                }

                return false;
            }

            publisher_handle_count_++;

            is_previewing_ = true;

            return true;
        }

        public void StopPreview()
        {
            if (is_previewing_ == false) return;

            is_previewing_ = false;

            publisher_handle_count_--;
            NTSmartPublisherSDK.NT_PB_StopPreview(publisher_handle_);

            if (0 == publisher_handle_count_)
            {
                NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                publisher_handle_ = IntPtr.Zero;
            }
        }

开始推送、停止推送:

        public bool StartPublisher(String url)
        {
            if (CheckPublisherHandleAvailable() == false) return false;

            if (publisher_handle_ == IntPtr.Zero)
            {
                return false;
            }
            if (!String.IsNullOrEmpty(url))
            {
                NTSmartPublisherSDK.NT_PB_SetURL(publisher_handle_, url, IntPtr.Zero);
            }

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_StartPublisher(publisher_handle_, IntPtr.Zero))
            {
                if (0 == publisher_handle_count_)
                {
                    NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                    publisher_handle_ = IntPtr.Zero;
                }

                is_publishing_ = false;

                return false;
            }

            publisher_handle_count_++;

            is_publishing_ = true;

            return true;
        }

        public void StopPublisher()
        {
            if (is_publishing_ == false) return;

            publisher_handle_count_--;
            NTSmartPublisherSDK.NT_PB_StopPublisher(publisher_handle_);

            if (0 == publisher_handle_count_)
            {
                NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                publisher_handle_ = IntPtr.Zero;
            }

            is_publishing_ = false;
        }

相关event事件回调:

        private void PbEventCallBack(IntPtr handle, IntPtr user_data, 
            UInt32 event_id,
            Int64 param1,
            Int64 param2,
            UInt64 param3,
            UInt64 param4,
            [MarshalAs(UnmanagedType.LPStr)] String param5,
            [MarshalAs(UnmanagedType.LPStr)] String param6,
            IntPtr param7)
        {
            String event_log = "";

            switch (event_id)
            {
                case (uint)NTSmartPublisherDefine.NT_PB_E_EVENT_ID.NT_PB_E_EVENT_ID_CONNECTING:
                    event_log = "连接中";
                    if (!String.IsNullOrEmpty(param5))
                    {
                        event_log = event_log + " url:" + param5;
                    }
                    break;

                case (uint)NTSmartPublisherDefine.NT_PB_E_EVENT_ID.NT_PB_E_EVENT_ID_CONNECTION_FAILED:
                    event_log = "连接失败";
                    if (!String.IsNullOrEmpty(param5))
                    {
                        event_log = event_log + " url:" + param5;
                    }
                    break;

                case (uint)NTSmartPublisherDefine.NT_PB_E_EVENT_ID.NT_PB_E_EVENT_ID_CONNECTED:
                    event_log = "已连接";
                    if (!String.IsNullOrEmpty(param5))
                    {
                        event_log = event_log + " url:" + param5;
                    }
                    break;

                case (uint)NTSmartPublisherDefine.NT_PB_E_EVENT_ID.NT_PB_E_EVENT_ID_DISCONNECTED:
                    event_log = "断开连接";
                    if (!String.IsNullOrEmpty(param5))
                    {
                        event_log = event_log + " url:" + param5;
                    }
                    break;

                default:
                    break;
            }

            if(OnLogEventMsg != null) OnLogEventMsg.Invoke(event_id, event_log);
        }

SmartPublishWinMono.cs 调用上述封装的代码即可;本地预览的话,拿到回调的RGB数据,在Unity3D上层刷新纹理显示即可。

经测试,unity3d下,RTMP推送,配合RTMP播放端,依然可以实现毫秒级延迟的推拉流体验。

Windows平台RTMP直播推送集成简要说明

好多开发者在集成大牛直播SDK (官方)的Windows平台RTMP推送模块时吓一跳,怎么这么多接口?本文做个简单的拆分:

初始化

初始化之前,如需设置日志路径,可调用NTSmartLog.NT_SL_SetPath(log_path)设置日志存放路径。

设置过后,调用NT_PB_Init()接口完成SDK初始化动作。注意:哪怕是多实例推送,Init()接口也仅需调用一次,UnInit()接口同理。

然后,代码会判断系统是否支持WR模式采集窗口,WR方式只有Win10较高版本才支持,如果不需要采集窗口,这个接口可忽略。

        /*
		 * 检查是否支持WR方式采集窗口
         * is_supported: 输出参数, 输出1表示支持, 0表示不支持
         * 注意:这个需要win10较高版本才支持
         * 成功返回 NT_ERC_OK
		 */
        [DllImport(@"SmartPublisherSDK.dll")]
		public static extern UInt32 NT_PB_IsWRCaptureWindowSupported(ref Int32 is_supported);

再往下,是遍历系统支持的硬编码器、摄像头等信息,比如LoadHWVideoEncoderInfos():

        private void LoadHWVideoEncoderInfos()
        {            
	        hw_video_encoder_infos_.Clear();

	        Int32 count = 0;
            UInt32 ret = NTSmartPublisherSDK.NT_PB_GetHWVideoEncoderInfoCount(ref count);
	        
            if (NTBaseCodeDefine.NT_ERC_OK == ret && count > 0)
	        {
                IntPtr ptr_hw_video_encoder_infos = Marshal.AllocHGlobal(Marshal.SizeOf(typeof(NT_PB_HWVideoEncoderInfo)) * count);
                
                Int32 out_count = 0;

                ret = NTSmartPublisherSDK.NT_PB_GetHWVideoEncoderInfos(ptr_hw_video_encoder_infos, count, ref out_count);

                if (ret != NTBaseCodeDefine.NT_ERC_OK || out_count < 1)
                {
                    hw_video_encoder_infos_.Clear();
                }
                else
                {
                    for (int i = 0; i < out_count; i++)
                    {
                        NT_PB_HWVideoEncoderInfo hw_video_encoder_info = (NT_PB_HWVideoEncoderInfo)Marshal.PtrToStructure(ptr_hw_video_encoder_infos + i * Marshal.SizeOf(typeof(NT_PB_HWVideoEncoderInfo)), typeof(NT_PB_HWVideoEncoderInfo));
                        
                        hw_video_encoder_infos_.Add(hw_video_encoder_info);
                    }
                }

               Marshal.FreeHGlobal(ptr_hw_video_encoder_infos);
	        }
        }

            // 以下为调用处的代码片段:拿到硬编码器信息后,更新相关UI控件
            if (hw_video_encoder_infos_.Count > 0)
            {
                EnableHWVideoEncoderControls(true);
                FillVideoEncodersControl((uint)NTCommonMediaDefine.NT_MEDIA_CODEC_ID.NT_MEDIA_CODEC_ID_H264);
            }

紧接着是Audio和camera相关:

            int auido_devices = 0;

            if (NTBaseCodeDefine.NT_ERC_OK == NTSmartPublisherSDK.NT_PB_GetAuidoInputDeviceNumber(ref auido_devices))
            {
                if (auido_devices > 0)
                {
                    btn_check_auido_mic_input_.Enabled = true;

                    for (int i = 0; i < auido_devices; ++i)
                    {
                        byte[] deviceNameBuffer = new byte[512];

                        string name = "";

                        if (NTBaseCodeDefine.NT_ERC_OK == NTSmartPublisherSDK.NT_PB_GetAuidoInputDeviceName((uint)i, deviceNameBuffer, 512))
                        {
                            int count = 0;
                            for (int j = 0; j < deviceNameBuffer.Length; ++j )
                            {
                                if ( deviceNameBuffer[j] != 0 )
                                {
                                    count++;
                                }
                                else
                                {
                                    break;
                                }
                            }

                            if ( count > 0 )
                            {
                                name = Encoding.UTF8.GetString(deviceNameBuffer, 0, count);
                            }                    
                        }

                        var audio_name = "";

                        if (name.Length == 0)
                        {
                            audio_name = "音频采集设备-";
                        }
                        else
                        {
                            audio_name = name + "-";
                        }

                        audio_name = audio_name + (i + 1);

                        combox_auido_input_devices_.Items.Add(audio_name);
                    }
                    combox_auido_input_devices_.SelectedIndex = 0;
                }
            }

            publisher_handle_ = new IntPtr();

            region_choose_tool_handle_ = new IntPtr();

            win_form_wnd_ = GetForegroundWindow();

            cameras_ = new List<CameraInfo>();

            btn_check_video_bitrate_.CheckState = CheckState.Checked;

            if (IsCanCaptureSpeaker())
            {
                btn_check_auido_speaker_input_.Enabled = true;
            }
            else
            {
                btn_check_auido_speaker_input_.Enabled = false;
            }

            if (btn_check_auido_mic_input_.Checked
                  || btn_check_auido_speaker_input_.Checked)
            {
                btn_check_speex_encoder_.Enabled = true;
                edit_speex_quality_.Enabled = true;
                btn_check_noise_suppression_.Enabled = true;
                btn_check_agc_.Enabled = true;
                btn_check_vad_.Enabled = true;
            }

            if ( btn_check_auido_mic_input_.Checked
                && btn_check_auido_speaker_input_.Checked)
            {
                btn_check_echo_cancel_.Enabled = false;
                edit_echo_delay_.Enabled = false;
            }

            edit_audio_input_volume_.Text = "1.0";
            edit_audio_speaker_input_volume_.Text = "1.0";

            FillCameraInfo();
            InitCameraControl();

OpenPublisherHandle()

OpenPublisherHandle()主要是确认选择数据源类型,然后获取推送句柄,等待做下一步的操作。

选择video option和 audio option

            // 视频
            UInt32 video_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_NO_VIDEO;

            if (btn_desktop_camera_switch.Checked
                || btn_camera_overlay_to_desktop.Checked
                || btn_desktop_overlay_to_camera.Checked)
            {
                // 使用叠加模式
                video_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_LAYER;
            }
            else if (btn_check_window_input_.Checked)
            {
                video_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_WINDOW;
            }
            else if (btn_check_desktop_input_.Checked && btn_check_scale_desktop_.Checked)
            {
                // 使用叠加模式来实现缩放
                video_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_LAYER;
            }
            else if (btn_check_desktop_input_.Checked && !btn_check_scale_desktop_.Checked)
            {
                //屏幕模式
                video_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_SCREEN;
            }
            else if (btn_check_camera_input_.Checked)
            {
                //摄像头模式
                video_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_CAMERA;
            }

            // 音频
            UInt32 audio_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_AUDIO_OPTION.NT_PB_E_AUDIO_OPTION_NO_AUDIO;

            if (btn_check_auido_mic_input_.Checked
                && btn_check_auido_speaker_input_.Checked)
            {
                audio_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_AUDIO_OPTION.NT_PB_E_AUDIO_OPTION_CAPTURE_MIC_SPEAKER_MIXER;
            }
            else if (btn_check_auido_mic_input_.Checked)
            {
                //麦克风模式
                audio_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_AUDIO_OPTION.NT_PB_E_AUDIO_OPTION_CAPTURE_MIC;
            }
            else if (btn_check_auido_speaker_input_.Checked)
            {
                //扬声器模式
                audio_option = (UInt32)NTSmartPublisherDefine.NT_PB_E_AUDIO_OPTION.NT_PB_E_AUDIO_OPTION_CAPTURE_SPEAKER;
            }

调用Open接口获取publisher handle,然后设置event callback

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_Open(out publisher_handle_,
                video_option, audio_option, 0, IntPtr.Zero))
            {
                MessageBox.Show("Call open failed!");
                return false;
            }

            if (publisher_handle_ != IntPtr.Zero)
            {
                pb_event_call_back_ = new NT_PB_SDKEventCallBack(PbSDKEventCallBack);

                NTSmartPublisherSDK.NT_PB_SetEventCallBack(publisher_handle_, win_form_wnd_, pb_event_call_back_);
                return true;
            }
            else
            {
                return false;
            }

event callback相关ID

        /*事件ID*/
        public enum NT_PB_E_EVENT_ID : uint
        {
            NT_PB_E_EVENT_ID_BASE = NTBaseCodeDefine.NT_EVENT_ID_SMART_PUBLISHER_SDK,

	        NT_PB_E_EVENT_ID_CONNECTING			= NT_PB_E_EVENT_ID_BASE | 0x2,	/*连接中, param5表示推送URL */
	        NT_PB_E_EVENT_ID_CONNECTION_FAILED	= NT_PB_E_EVENT_ID_BASE | 0x3,	/*连接失败, param5表示推送URL*/
	        NT_PB_E_EVENT_ID_CONNECTED			= NT_PB_E_EVENT_ID_BASE | 0x4,	/*已连接, param5表示推送URL*/
	        NT_PB_E_EVENT_ID_DISCONNECTED		= NT_PB_E_EVENT_ID_BASE | 0x5,	/*断开连接, param5表示推送URL*/
	
	        NT_PB_E_EVENT_ID_RECORDER_START_NEW_FILE    = NT_PB_E_EVENT_ID_BASE | 0x7,	/*录像写入新文件, param5表示录像文件名*/
	        NT_PB_E_EVENT_ID_ONE_RECORDER_FILE_FINISHED = NT_PB_E_EVENT_ID_BASE | 0x8,	/*一个录像文件完成, param5表示录像文件名*/

            NT_PB_E_EVENT_ID_CAPTURE_WINDOW_INVALID = NT_PB_E_EVENT_ID_BASE | 0xd, /*捕获窗口时,如果窗口句柄无效则通知用户, param1为窗口句柄*/

            NT_PB_E_EVENT_ID_RTSP_URL = NT_PB_E_EVENT_ID_BASE | 0xe, /* 通知rtsp url, param1表示rtsp server handle, param5 表示rtsp url */
            NT_PB_E_EVENT_ID_PUSH_RTSP_SERVER_RESPONSE_STATUS_CODE = NT_PB_E_EVENT_ID_BASE | 0xf,  /* 推送rtsp时服务端相应的status code上报,目前只上报401, param1表示status code,  param5表示推送URL */
            NT_PB_E_EVENT_ID_PUSH_RTSP_SERVER_NOT_SUPPORT = NT_PB_E_EVENT_ID_BASE | 0x10,  /* 推送rtsp时服务器不支持rtsp推送,  param5表示推送URL */
        }

SetCommonOptionToPublisherSDK()

SetCommonOptionToPublisherSDK()主要是指定具体采集的音视频数据类型,比如摄像头数据、屏幕数据、摄像头和屏幕叠加后的数据(以层级模式实现)、窗口等,这块比较复杂,好在作为SDK调用者,你只要搞清楚你需要采集的类型,直接移植就可以了。

           // 视频相关设置
            if (btn_desktop_camera_switch.Checked
                || btn_camera_overlay_to_desktop.Checked
                || btn_desktop_overlay_to_camera.Checked
                || btn_check_desktop_input_.Checked
                || btn_check_window_input_.Checked
                || btn_check_camera_input_.Checked)
            {
                if (btn_desktop_camera_switch.Checked)
                {
                    //摄像头和屏幕相互切换
                    int left = Int32.Parse(edit_clip_left_.Text);
                    int top = Int32.Parse(edit_clip_top_.Text);
                    int w = Int32.Parse(edit_clip_width_.Text);
                    int h = Int32.Parse(edit_clip_height_.Text);

                    // 有一个是0, 就使用全屏
                    if (w == 0 || h == 0)
                    {
                        left = 0;
                        top = 0;
                        w = screenArea_.Width;
                        h = screenArea_.Height;
                    }
                    else
                    {
                        // 保证4字节对齐
                        w = NT_ByteAlign(w, 4);
                        h = NT_ByteAlign(h, 4);
                    }


                    NTSmartPublisherSDK.NT_PB_ClearLayersConfig(publisher_handle_, 0,
                                    0, IntPtr.Zero);

                    // 第0层填充RGBA矩形, 目的是保证帧率, 颜色就填充全黑
                    int red = 0;
                    int green = 0;
                    int blue = 0;
                    int alpha = 255;

                    NT_PB_RGBARectangleLayerConfig rgba_layer_c0 = new NT_PB_RGBARectangleLayerConfig();

                    rgba_layer_c0.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE;
                    rgba_layer_c0.base_.index_ = 0;
                    rgba_layer_c0.base_.enable_ = 1;
                    rgba_layer_c0.base_.region_.x_ = left;
                    rgba_layer_c0.base_.region_.y_ = top;
                    rgba_layer_c0.base_.region_.width_ = w;
                    rgba_layer_c0.base_.region_.height_ = h;

                    rgba_layer_c0.base_.offset_ = Marshal.OffsetOf(rgba_layer_c0.GetType(), "base_").ToInt32();
                    rgba_layer_c0.base_.cb_size_ = (uint)Marshal.SizeOf(rgba_layer_c0);

                    rgba_layer_c0.red_ = System.BitConverter.GetBytes(red)[0];
                    rgba_layer_c0.green_ = System.BitConverter.GetBytes(green)[0];
                    rgba_layer_c0.blue_ = System.BitConverter.GetBytes(blue)[0];
                    rgba_layer_c0.alpha_ = System.BitConverter.GetBytes(alpha)[0];

                    IntPtr rgba_conf = Marshal.AllocHGlobal(Marshal.SizeOf(rgba_layer_c0));

                    Marshal.StructureToPtr(rgba_layer_c0, rgba_conf, true);

                    UInt32 rgba_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                    rgba_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE,
                                    0, IntPtr.Zero);

                    Console.WriteLine("[摄像头和屏幕相互切换] NT_PB_AddLayerConfig, rgba: " + rgba_r + Environment.NewLine);

                    Marshal.FreeHGlobal(rgba_conf);

                    //第一层:摄像头
                    NT_PB_CameraLayerConfigV2 camera_layer_c1 = new NT_PB_CameraLayerConfigV2();

                    CameraInfo camera = cameras_[cur_sel_camera_index_];
                    NT_PB_VideoCaptureCapability cap = camera.capabilities_[cur_sel_camera_resolutions_index_];

                    camera_layer_c1.device_unique_id_utf8_ = camera.id_;

                    camera_layer_c1.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_CAMERA;
                    camera_layer_c1.base_.index_ = 1;
                    camera_layer_index_ = camera_layer_c1.base_.index_;
                    camera_layer_c1.base_.enable_ = 1;
                    camera_layer_c1.base_.region_.x_ = left;
                    camera_layer_c1.base_.region_.y_ = top;
                    camera_layer_c1.base_.region_.width_ = w;
                    camera_layer_c1.base_.region_.height_ = h;

                    if (btn_check_flip_horizontal_camera_.Checked)
                    {
                        camera_layer_c1.is_flip_horizontal_ = 1;
                    }
                    else
                    {
                        camera_layer_c1.is_flip_horizontal_ = 0;
                    }

                    if (btn_check_flip_vertical_camera_.Checked)
                    {
                        camera_layer_c1.is_flip_vertical_ = 1;
                    }
                    else
                    {
                        camera_layer_c1.is_flip_vertical_ = 0;
                    }

                    // 这种叠加模式下不要旋转,否则变形厉害, 要么就定好一个角度,调整宽高,但不要动态旋转
                    camera_layer_c1.rotate_degress_ = 0;

                    camera_layer_c1.base_.offset_ = Marshal.OffsetOf(camera_layer_c1.GetType(), "base_").ToInt32(); //offsetof(T, base_);
                    camera_layer_c1.base_.cb_size_ = (uint)Marshal.SizeOf(camera_layer_c1);

                    IntPtr cmr_conf = Marshal.AllocHGlobal(Marshal.SizeOf(camera_layer_c1));

                    Marshal.StructureToPtr(camera_layer_c1, cmr_conf, true);

                    UInt32 c_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                cmr_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_CAMERA,
                                0, IntPtr.Zero);

                    Marshal.FreeHGlobal(cmr_conf);

                    //第二层
                    NT_PB_ScreenLayerConfig screen_layer_c2 = new NT_PB_ScreenLayerConfig();

                    screen_layer_c2.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_SCREEN;
                    screen_layer_c2.base_.index_ = 2;
                    screen_layer_index_ = screen_layer_c2.base_.index_;
                    screen_layer_c2.base_.enable_ = 1;
                    screen_layer_c2.base_.region_.x_ = left;
                    screen_layer_c2.base_.region_.y_ = top;
                    screen_layer_c2.base_.region_.width_ = w;
                    screen_layer_c2.base_.region_.height_ = h;

                    screen_layer_c2.base_.offset_ = Marshal.OffsetOf(screen_layer_c2.GetType(), "base_").ToInt32(); //offsetof(T, base_);
                    screen_layer_c2.base_.cb_size_ = (uint)Marshal.SizeOf(screen_layer_c2);

                    screen_layer_c2.clip_region_.x_ = left;
                    screen_layer_c2.clip_region_.y_ = top;
                    screen_layer_c2.clip_region_.width_ = w;
                    screen_layer_c2.clip_region_.height_ = h;

                    screen_layer_c2.reserve_ = IntPtr.Zero;

                    IntPtr scr_conf = Marshal.AllocHGlobal(Marshal.SizeOf(screen_layer_c2));

                    Marshal.StructureToPtr(screen_layer_c2, scr_conf, true);

                    UInt32 s_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                scr_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_SCREEN,
                                0, IntPtr.Zero);

                    Marshal.FreeHGlobal(scr_conf);

                    // 第三层填充RGBA矩形, 目的是保证帧率, 颜色就填充全黑
                    red = Int32.Parse(edit_rgba_rect_layer_red_.Text);
                    red = ClipIntValue(red, 0, 255);

                    green = Int32.Parse(edit_rgba_rect_layer_green_.Text);
                    green = ClipIntValue(green, 0, 255);

                    blue = Int32.Parse(edit_rgba_rect_layer_blue_.Text);
                    blue = ClipIntValue(blue, 0, 255);

                    alpha = Int32.Parse(edit_rgba_rect_layer_alpha_.Text);
                    alpha = ClipIntValue(alpha, 0, 255);

                    NT_PB_RGBARectangleLayerConfig rgba_layer_c3 = new NT_PB_RGBARectangleLayerConfig();

                    rgba_layer_c3.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE;
                    rgba_layer_c3.base_.index_ = 3;
                    rgba_layer_index_ = rgba_layer_c3.base_.index_;
                    rgba_layer_c3.base_.enable_ = 1;
                    rgba_layer_c3.base_.region_.x_ = left;     //这个只是demo演示,实际以需要遮盖位置为准
                    rgba_layer_c3.base_.region_.y_ = top;
                    rgba_layer_c3.base_.region_.width_ = 160;
                    rgba_layer_c3.base_.region_.height_ = 160;

                    rgba_layer_c3.base_.offset_ = Marshal.OffsetOf(rgba_layer_c3.GetType(), "base_").ToInt32();
                    rgba_layer_c3.base_.cb_size_ = (uint)Marshal.SizeOf(rgba_layer_c3);

                    rgba_layer_c3.red_ = System.BitConverter.GetBytes(red)[0];
                    rgba_layer_c3.green_ = System.BitConverter.GetBytes(green)[0];
                    rgba_layer_c3.blue_ = System.BitConverter.GetBytes(blue)[0];
                    rgba_layer_c3.alpha_ = System.BitConverter.GetBytes(alpha)[0];

                    IntPtr rgba_conf_3 = Marshal.AllocHGlobal(Marshal.SizeOf(rgba_layer_c3));

                    Marshal.StructureToPtr(rgba_layer_c3, rgba_conf_3, true);

                    UInt32 rgba_r_3 = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                    rgba_conf_3, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE,
                                    0, IntPtr.Zero);

                    Console.WriteLine("NT_PB_AddLayerConfig, rgba: " + rgba_r_3 + Environment.NewLine);

                    Marshal.FreeHGlobal(rgba_conf_3);
                    
                    // 第四层填充png水印(注意,实时开启、关闭水印,是根据图层的index来的,如此demo,png水印的index为4)
                    // 如果有图片,增加图片层
                    if (!String.IsNullOrEmpty(image_layer_file_name_utf8_)
                        && image_layer_width_ > 0
                        && image_layer_height_ > 0)
                    {
                        NT_PB_ImageLayerConfig image_layer_c4 = new NT_PB_ImageLayerConfig();

                        image_layer_c4.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_IMAGE;
                        image_layer_c4.base_.index_ = 4;
                        image_layer_index_ = image_layer_c4.base_.index_;
                        image_layer_c4.base_.enable_ = 1;
                        image_layer_c4.base_.region_.x_ = image_layer_left_;
                        image_layer_c4.base_.region_.y_ = image_layer_top_;
                        image_layer_c4.base_.region_.width_ = image_layer_width_;
                        image_layer_c4.base_.region_.height_ = image_layer_height_;

                        image_layer_c4.base_.offset_ = Marshal.OffsetOf(image_layer_c4.GetType(), "base_").ToInt32();
                        image_layer_c4.base_.cb_size_ = (uint)Marshal.SizeOf(image_layer_c4);

                        byte[] buffer1 = Encoding.Default.GetBytes(image_layer_file_name_utf8_);
                        byte[] buffer2 = Encoding.Convert(Encoding.UTF8, Encoding.Default, buffer1, 0, buffer1.Length);
                        string strBuffer = Encoding.Default.GetString(buffer2, 0, buffer2.Length);

                        image_layer_c4.file_name_utf8_ = strBuffer;

                        image_layer_c4.is_setting_background_ = 0;
                        image_layer_c4.bk_red_ = 0;
                        image_layer_c4.bk_green_ = 0;
                        image_layer_c4.bk_blue_ = 0;
                        image_layer_c4.reserve_ = 0;

                        IntPtr image_conf = Marshal.AllocHGlobal(Marshal.SizeOf(image_layer_c4));

                        Marshal.StructureToPtr(image_layer_c4, image_conf, true);

                        UInt32 image_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                        image_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_IMAGE,
                                        0, IntPtr.Zero);

                        Console.WriteLine("NT_PB_AddLayerConfig, image: " + image_r + Environment.NewLine);

                        Marshal.FreeHGlobal(image_conf);

                        NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, UInt32.Parse(edit_frame_rate_.Text));
                    }
                }
                else if (btn_camera_overlay_to_desktop.Checked)
                {
                    //摄像头overlay到桌面
                    int left = Int32.Parse(edit_clip_left_.Text);
                    int top = Int32.Parse(edit_clip_top_.Text);
                    int w = Int32.Parse(edit_clip_width_.Text);
                    int h = Int32.Parse(edit_clip_height_.Text);

                    // 有一个是0, 就使用全屏
                    if (w == 0 || h == 0)
                    {
                        left = 0;
                        top = 0;
                        w = screenArea_.Width;
                        h = screenArea_.Height;
                    }
                    else
                    {
                        // 保证4字节对齐
                        w = NT_ByteAlign(w, 4);
                        h = NT_ByteAlign(h, 4);
                    }

                    //第一层:屏幕
                    NT_PB_ScreenLayerConfig screen_layer_c0 = new NT_PB_ScreenLayerConfig();

                    screen_layer_c0.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_SCREEN;
                    screen_layer_c0.base_.index_ = 0;
                    screen_layer_index_ = screen_layer_c0.base_.index_;
                    screen_layer_c0.base_.enable_ = 1;
                    screen_layer_c0.base_.region_.x_ = left;
                    screen_layer_c0.base_.region_.y_ = top;
                    screen_layer_c0.base_.region_.width_ = w;
                    screen_layer_c0.base_.region_.height_ = h;

                    screen_layer_c0.base_.offset_ = Marshal.OffsetOf(screen_layer_c0.GetType(), "base_").ToInt32(); //offsetof(T, base_);
                    screen_layer_c0.base_.cb_size_ = (uint)Marshal.SizeOf(screen_layer_c0);

                    screen_layer_c0.clip_region_.x_ = left;
                    screen_layer_c0.clip_region_.y_ = top;
                    screen_layer_c0.clip_region_.width_ = w;
                    screen_layer_c0.clip_region_.height_ = h;

                    screen_layer_c0.reserve_ = IntPtr.Zero;

                    NTSmartPublisherSDK.NT_PB_ClearLayersConfig(publisher_handle_, 0,
                            0, IntPtr.Zero);
                    
                    IntPtr scr_conf = Marshal.AllocHGlobal(Marshal.SizeOf(screen_layer_c0));

                    Marshal.StructureToPtr(screen_layer_c0, scr_conf, true);

                    UInt32 s_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                scr_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_SCREEN,
                                0, IntPtr.Zero);

                    Marshal.FreeHGlobal(scr_conf);

                    //第二层:摄像头
                    if (-1 != cur_sel_camera_index_)
                    {
                        int c_l = Int32.Parse(edit_camera_overlay_left_.Text);
                        int c_t = Int32.Parse(edit_camera_overlay_top_.Text);

                        int c_w = Int32.Parse(edit_camera_overlay_width_.Text);
                        int c_h = Int32.Parse(edit_camera_overlay_height_.Text);

                        if (c_w == 0)
                        {
                            c_w = w / 2;
                        }

                        if (c_h == 0)
                        {
                            c_h = h / 2;
                        }

                        ctos_camera_layer_c1_ = new NT_PB_CameraLayerConfigV2();

                        CameraInfo camera = cameras_[cur_sel_camera_index_];
                        NT_PB_VideoCaptureCapability cap = camera.capabilities_[cur_sel_camera_resolutions_index_];

                        ctos_camera_layer_c1_.device_unique_id_utf8_ = camera.id_;

                        ctos_camera_layer_c1_.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_CAMERA;
                        ctos_camera_layer_c1_.base_.index_ = 1;
                        camera_layer_index_ = ctos_camera_layer_c1_.base_.index_;
                        ctos_camera_layer_c1_.base_.enable_ = 1;
                        ctos_camera_layer_c1_.base_.region_.x_ = c_l;
                        ctos_camera_layer_c1_.base_.region_.y_ = c_t;
                        ctos_camera_layer_c1_.base_.region_.width_ = c_w;
                        ctos_camera_layer_c1_.base_.region_.height_ = c_h;

                        if (btn_check_flip_horizontal_camera_.Checked)
                        {
                            ctos_camera_layer_c1_.is_flip_horizontal_ = 1;
                        }
                        else
                        {
                            ctos_camera_layer_c1_.is_flip_horizontal_ = 0;
                        }

                        if (btn_check_flip_vertical_camera_.Checked)
                        {
                            ctos_camera_layer_c1_.is_flip_vertical_ = 1;
                        }
                        else
                        {
                            ctos_camera_layer_c1_.is_flip_vertical_ = 0;
                        }

                        ctos_camera_layer_c1_.rotate_degress_ = GetCameraRotateDegress();

                        ctos_camera_layer_c1_.base_.offset_ = Marshal.OffsetOf(ctos_camera_layer_c1_.GetType(), "base_").ToInt32(); //offsetof(T, base_);
                        ctos_camera_layer_c1_.base_.cb_size_ = (uint)Marshal.SizeOf(ctos_camera_layer_c1_);

                        IntPtr cmr_conf = Marshal.AllocHGlobal(Marshal.SizeOf(ctos_camera_layer_c1_));

                        Marshal.StructureToPtr(ctos_camera_layer_c1_, cmr_conf, true);

                        UInt32 c_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                    cmr_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_CAMERA,
                                    0, IntPtr.Zero);

                        Marshal.FreeHGlobal(cmr_conf);
                    }
                   
                    NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, UInt32.Parse(edit_frame_rate_.Text));
                }
                else if (btn_desktop_overlay_to_camera.Checked)
                {
                    //桌面overlay到摄像头

                    //第一层:摄像头
                    if (-1 != cur_sel_camera_index_
                           && -1 != cur_sel_camera_resolutions_index_
                           && -1 != cur_sel_camera_frame_rate_index_)
                    {
                        NT_PB_CameraLayerConfigV2 camera_layer_c0 = new NT_PB_CameraLayerConfigV2();

                        CameraInfo camera = cameras_[cur_sel_camera_index_];
                        NT_PB_VideoCaptureCapability cap = camera.capabilities_[cur_sel_camera_resolutions_index_];

                        camera_layer_c0.device_unique_id_utf8_ = camera.id_;

                        camera_layer_c0.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_CAMERA;
                        camera_layer_c0.base_.index_ = 0;
                        camera_layer_index_ = camera_layer_c0.base_.index_;
                        camera_layer_c0.base_.enable_ = 1;
                        camera_layer_c0.base_.region_.x_ = 0;
                        camera_layer_c0.base_.region_.y_ = 0;
                        camera_layer_c0.base_.region_.width_ = cap.width_;
                        camera_layer_c0.base_.region_.height_ = cap.height_;

                        if (btn_check_flip_horizontal_camera_.Checked)
                        {
                            camera_layer_c0.is_flip_horizontal_ = 1;
                        }
                        else
                        {
                            camera_layer_c0.is_flip_horizontal_ = 0;
                        }

                        if (btn_check_flip_vertical_camera_.Checked)
                        {
                            camera_layer_c0.is_flip_vertical_ = 1;
                        }
                        else
                        {
                            camera_layer_c0.is_flip_vertical_ = 0;
                        }

                        // 这种叠加模式下不要旋转,否则变形厉害, 要么就定好一个角度,调整宽高,但不要动态旋转
                        camera_layer_c0.rotate_degress_ = 0;

                        camera_layer_c0.base_.offset_ = Marshal.OffsetOf(camera_layer_c0.GetType(), "base_").ToInt32(); //offsetof(T, base_);
                        camera_layer_c0.base_.cb_size_ = (uint)Marshal.SizeOf(camera_layer_c0);

                        NTSmartPublisherSDK.NT_PB_ClearLayersConfig(publisher_handle_, 0,
                                        0, IntPtr.Zero);

                        IntPtr cmr_conf = Marshal.AllocHGlobal(Marshal.SizeOf(camera_layer_c0));

                        Marshal.StructureToPtr(camera_layer_c0, cmr_conf, true);

                        UInt32 r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                    cmr_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_CAMERA,
                                    0, IntPtr.Zero);

                        Marshal.FreeHGlobal(cmr_conf);

                        //第二层:屏幕
                        NT_PB_ScreenLayerConfig screen_layer_c1 = new NT_PB_ScreenLayerConfig();

                        screen_layer_c1.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_SCREEN;
                        screen_layer_c1.base_.index_ = 1;
                        screen_layer_index_ = screen_layer_c1.base_.index_;
                        screen_layer_c1.base_.enable_ = 1;
                        screen_layer_c1.base_.region_.x_ = 0;
                        screen_layer_c1.base_.region_.y_ = 0;
                        screen_layer_c1.base_.region_.width_ = cap.width_ / 2;
                        screen_layer_c1.base_.region_.height_ = cap.height_ / 2;

                        screen_layer_c1.base_.offset_ = Marshal.OffsetOf(screen_layer_c1.GetType(), "base_").ToInt32(); //offsetof(T, base_);
                        screen_layer_c1.base_.cb_size_ = (uint)Marshal.SizeOf(screen_layer_c1);

                        screen_layer_c1.clip_region_.x_ = 0;
                        screen_layer_c1.clip_region_.y_ = 0;
                        screen_layer_c1.clip_region_.width_ = cap.width_ / 2;
                        screen_layer_c1.clip_region_.height_ = cap.height_ / 2;

                        screen_layer_c1.reserve_ = IntPtr.Zero;

                        IntPtr scr_conf = Marshal.AllocHGlobal(Marshal.SizeOf(screen_layer_c1));

                        Marshal.StructureToPtr(screen_layer_c1, scr_conf, true);

                        UInt32 s_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                    scr_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_SCREEN,
                                    0, IntPtr.Zero);

                        Marshal.FreeHGlobal(scr_conf);
                    }

                    NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, (uint)(cur_sel_camera_frame_rate_index_ + 1));
                }
                else if (btn_check_desktop_input_.Checked && btn_check_scale_desktop_.Checked)
                {
                    int left = 0;
                    int top = 0;
                    int w = 0;
                    int h = 0;
                    int scale_w = 0;
                    int scale_h = 0;

                    GetScreenScaleConfigInfo(ref left, ref top, ref w, ref h, ref scale_w, ref scale_h);

                    NTSmartPublisherSDK.NT_PB_ClearLayersConfig(publisher_handle_, 0,
                    0, IntPtr.Zero);

                    // 第0层填充RGBA矩形, 目的是保证帧率, 颜色就填充全黑
                    int red = 0;
                    int green = 0;
                    int blue = 0;   
                    int alpha = 255;

                    NT_PB_RGBARectangleLayerConfig rgba_layer_c0 = new NT_PB_RGBARectangleLayerConfig();

                    rgba_layer_c0.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE;
                    rgba_layer_c0.base_.index_ = 0;
                    rgba_layer_index_ = rgba_layer_c0.base_.index_;
                    rgba_layer_c0.base_.enable_ = 1;
                    rgba_layer_c0.base_.region_.x_ = 0;
                    rgba_layer_c0.base_.region_.y_ = 0;
                    rgba_layer_c0.base_.region_.width_ = scale_w;
                    rgba_layer_c0.base_.region_.height_ = scale_h;

                    rgba_layer_c0.base_.offset_ = Marshal.OffsetOf(rgba_layer_c0.GetType(), "base_").ToInt32();
                    rgba_layer_c0.base_.cb_size_ = (uint)Marshal.SizeOf(rgba_layer_c0);

                    rgba_layer_c0.red_   = 0;
                    rgba_layer_c0.green_ = 0;
                    rgba_layer_c0.blue_  = 0;
                    rgba_layer_c0.alpha_ = 255;

                    IntPtr rgba_conf_0 = Marshal.AllocHGlobal(Marshal.SizeOf(rgba_layer_c0));

                    Marshal.StructureToPtr(rgba_layer_c0, rgba_conf_0, true);

                    UInt32 rgba_r_0 = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                    rgba_conf_0, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE,
                                    0, IntPtr.Zero);

                    Console.WriteLine("NT_PB_AddLayerConfig, rgba: " + rgba_r_0 + Environment.NewLine);

                    Marshal.FreeHGlobal(rgba_conf_0);

                    //第1层
                    NT_PB_ScreenLayerConfigV2 screen_layer_c1 = new NT_PB_ScreenLayerConfigV2();

                    screen_layer_c1.base_.type_ = (Int32)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_SCREEN;
                    screen_layer_c1.base_.index_ = 1;
                    screen_layer_index_ = screen_layer_c1.base_.index_;
                    screen_layer_c1.base_.enable_ = checkbox_black_screen_.Checked?0:1;
                    screen_layer_c1.base_.region_.x_ = left;
                    screen_layer_c1.base_.region_.y_ = top;
                    screen_layer_c1.base_.region_.width_ = scale_w;
                    screen_layer_c1.base_.region_.height_ = scale_h;

                    screen_layer_c1.base_.offset_ = Marshal.OffsetOf(screen_layer_c1.GetType(), "base_").ToInt32(); //offsetof(T, base_);
                    screen_layer_c1.base_.cb_size_ = (uint)Marshal.SizeOf(screen_layer_c1);

                    screen_layer_c1.clip_region_.x_ = left;
                    screen_layer_c1.clip_region_.y_ = top;
                    screen_layer_c1.clip_region_.width_ = w;
                    screen_layer_c1.clip_region_.height_ = h;

                    screen_layer_c1.reserve1_ = IntPtr.Zero;
                    screen_layer_c1.reserve2_ = 0;
                    screen_layer_c1.scale_filter_mode_ = 3;
                    
                    IntPtr scr_conf = Marshal.AllocHGlobal(Marshal.SizeOf(screen_layer_c1));

                    Marshal.StructureToPtr(screen_layer_c1, scr_conf, true);

                    UInt32 s_r = NTSmartPublisherSDK.NT_PB_AddLayerConfig(publisher_handle_, 0,
                                scr_conf, (int)NTSmartPublisherDefine.NT_PB_E_LAYER_TYPE.NT_PB_E_LAYER_TYPE_SCREEN,
                                0, IntPtr.Zero);

                    Marshal.FreeHGlobal(scr_conf);

                    NTSmartPublisherSDK.NT_PB_SetSleepMode(publisher_handle_, checkbox_black_screen_.Checked ? 1 : 0, 0);

                    NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, UInt32.Parse(edit_frame_rate_.Text));
                }
                else if (btn_check_desktop_input_.Checked && !btn_check_scale_desktop_.Checked)
                {
                    //桌面
                    NTSmartPublisherSDK.NT_PB_SetScreenClip(publisher_handle_,
                    UInt32.Parse(edit_clip_left_.Text),
                    UInt32.Parse(edit_clip_top_.Text),
                    UInt32.Parse(edit_clip_width_.Text),
                    UInt32.Parse(edit_clip_height_.Text));

                    NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, UInt32.Parse(edit_frame_rate_.Text));
                }
                else if (btn_check_window_input_.Checked)
                {
                    if (IntPtr.Zero != cur_sel_capture_window_)
                    {
                        NTSmartPublisherSDK.NT_PB_SetCaptureWindow(publisher_handle_, cur_sel_capture_window_);

                        NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, UInt32.Parse(edit_frame_rate_.Text));

                        NTSmartPublisherSDK.NT_PB_ClearLayersConfig(publisher_handle_, 0, 0, IntPtr.Zero);
                    }
                }
                else if (btn_check_camera_input_.Checked)
                {
                    //摄像头
                    if (-1 != cur_sel_camera_index_
                        && -1 != cur_sel_camera_resolutions_index_
                        && -1 != cur_sel_camera_frame_rate_index_)
                    {
                        CameraInfo camera = cameras_[cur_sel_camera_index_];
                        NT_PB_VideoCaptureCapability cap = camera.capabilities_[cur_sel_camera_resolutions_index_];

                        NTSmartPublisherSDK.NT_PB_SetVideoCaptureDeviceBaseParameter(publisher_handle_,
                            camera.id_.ToString(), (UInt32)cap.width_, (UInt32)cap.height_);

                        NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, (UInt32)(cur_sel_camera_frame_rate_index_ + 1));

                        if (btn_check_flip_vertical_camera_.Checked)
                        {
                            NTSmartPublisherSDK.NT_PB_FlipVerticalCamera(publisher_handle_, 1);
                        }
                        else
                        {
                            NTSmartPublisherSDK.NT_PB_FlipVerticalCamera(publisher_handle_, 0);
                        }

                        if (btn_check_flip_horizontal_camera_.Checked)
                        {
                            NTSmartPublisherSDK.NT_PB_FlipHorizontalCamera(publisher_handle_, 1);
                        }
                        else
                        {
                            NTSmartPublisherSDK.NT_PB_FlipHorizontalCamera(publisher_handle_, 0);
                        }

                        Int32 degress = GetCameraRotateDegress();
                        NTSmartPublisherSDK.NT_PB_RotateCamera(publisher_handle_, degress);
                    }
                }

音视频参数设定

其他音视频相关接口的参数设定,比如是否启用DXGI采集、是否停用Aero、软硬编码模式,以及帧率、关键帧间隔、码率等。

                if (btn_check_dxgi_screen_capturer_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_EnableDXGIScreenCapturer(publisher_handle_, 1);
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_EnableDXGIScreenCapturer(publisher_handle_, 0);
                }

                if (check_capture_layered_window_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_EnableScreenCaptureLayeredWindow(publisher_handle_, 1);
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_EnableScreenCaptureLayeredWindow(publisher_handle_, 0);
                }

                if (btn_check_capturer_disable_aero_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_DisableAeroScreenCapturer(publisher_handle_, 1);
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_DisableAeroScreenCapturer(publisher_handle_, 0);
                }

                if (btn_check_wr_way_capture_window_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetCaptureWindowWay(publisher_handle_, 2);
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_SetCaptureWindowWay(publisher_handle_, 1);
                }

                int cur_video_codec_id = (int)NTCommonMediaDefine.NT_MEDIA_CODEC_ID.NT_MEDIA_CODEC_ID_H264;

                if (btn_check_h265_encoder_.Checked)
                {
                    cur_video_codec_id = (int)NTCommonMediaDefine.NT_MEDIA_CODEC_ID.NT_MEDIA_CODEC_ID_H265;
                }

                bool is_hw_encoder = false;

                if ( btn_check_video_hardware_encoder_.Checked)
                {
                    is_hw_encoder = true;
                }

                Int32 cur_sel_encoder_id = 0;
                Int32 cur_sel_gpu = 0;


                if (is_hw_encoder)
                {
                    int cur_sel_hw = combobox_video_encoders_.SelectedIndex;
                    if (cur_sel_hw >= 0)
                    {
                        cur_sel_encoder_id = Convert.ToInt32(combobox_video_encoders_.SelectedValue);
                        cur_sel_gpu = -1;

                        int cur_sel_hw_dev = combobox_video_hardware_encoder_devices_.SelectedIndex;
                        if (cur_sel_hw_dev >= 0)
                        {
                            cur_sel_gpu = Convert.ToInt32(combobox_video_hardware_encoder_devices_.SelectedValue);
                        }
                    }
                    else
                    {
                        is_hw_encoder = false;
                    }
                }

                if (!is_hw_encoder)
                {
                    if ((int)NTCommonMediaDefine.NT_MEDIA_CODEC_ID.NT_MEDIA_CODEC_ID_H264 == cur_video_codec_id)
                    {
                        cur_sel_encoder_id = btn_check_openh264_encoder_.Checked ? 1 : 0;
                    }
                }

                NTSmartPublisherSDK.NT_PB_SetVideoEncoder(publisher_handle_, (int)(is_hw_encoder ? 1 : 0), (int)cur_sel_encoder_id, (uint)cur_video_codec_id, (int)cur_sel_gpu);

                if (!btn_check_window_input_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetVideoBitRate(publisher_handle_, Int32.Parse(edit_bit_rate_.Text));
                }
                else
                {
                    // 窗口的分辨率会变, 所以设置一组码率下去
                    Int32 bit_rate = Int32.Parse(edit_bit_rate_.Text);
                    SetBitrateGroup(publisher_handle_, bit_rate);
                }

                NTSmartPublisherSDK.NT_PB_SetVideoQualityV2(publisher_handle_, Int32.Parse(edit_video_quality_.Text));

                NTSmartPublisherSDK.NT_PB_SetVideoMaxBitRate(publisher_handle_, Int32.Parse(edit_video_max_bitrate_.Text));

                NTSmartPublisherSDK.NT_PB_SetVideoKeyFrameInterval(publisher_handle_, Int32.Parse(edit_key_frame_.Text));

                if (cur_video_codec_id == (int)NTCommonMediaDefine.NT_MEDIA_CODEC_ID.NT_MEDIA_CODEC_ID_H264)
                {
                    int profile_sel = combox_h264_profile_.SelectedIndex;

                    if (profile_sel != -1)
                    {
                        NTSmartPublisherSDK.NT_PB_SetVideoEncoderProfile(publisher_handle_, profile_sel + 1);
                    }
                }

                NTSmartPublisherSDK.NT_PB_SetVideoEncoderSpeed(publisher_handle_, Int32.Parse(edit_video_encode_speed_.Text));

                // 清除编码器所有的特定的参数
                NTSmartPublisherSDK.NT_PB_ClearVideoEncoderSpecialOptions(publisher_handle_);

                if (cur_sel_encoder_id == 1)
                {
                    // qp_max 和 qp_min 当前只对openh264有效, 这里也就只在openh264使用的场景下设置配置值
                    NTSmartPublisherSDK.NT_PB_SetVideoEncoderQPMax(publisher_handle_, Int32.Parse(edit_qp_max_.Text));
                    NTSmartPublisherSDK.NT_PB_SetVideoEncoderQPMin(publisher_handle_, Int32.Parse(edit_qp_min_.Text));

                    // openh264 配置特定参数
                    NTSmartPublisherSDK.NT_PB_SetVideoEncoderSpecialInt32Option(publisher_handle_, "usage_type", btn_check_openh264_ppt_usage_type_.Checked ? 1 : 0);
                    NTSmartPublisherSDK.NT_PB_SetVideoEncoderSpecialInt32Option(publisher_handle_, "rc_mode", btn_check_openh264_rc_bitrate_mode_.Checked ? 1 : 0);
                    NTSmartPublisherSDK.NT_PB_SetVideoEncoderSpecialInt32Option(publisher_handle_, "enable_frame_skip", btn_check_openh264_frame_skip_.Checked ? 1 : 0);
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_SetVideoEncoderQPMax(publisher_handle_, -1);
                    NTSmartPublisherSDK.NT_PB_SetVideoEncoderQPMin(publisher_handle_, -1);
                }

                // 音频相关设置
                if (btn_check_auido_mic_input_.Checked)
                {
                    int count = combox_auido_input_devices_.Items.Count;
                    if (count > 0)
                    {
                        int cur_sel = combox_auido_input_devices_.SelectedIndex;
                        if (cur_sel != -1)
                        {
                            NTSmartPublisherSDK.NT_PB_SetAuidoInputDeviceId(publisher_handle_, (uint)cur_sel);
                        }
                    }
                }

                // 只采集扬声器时做静音补偿
                if (!btn_check_auido_mic_input_.Checked
                    && btn_check_auido_speaker_input_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetCaptureSpeakerCompensateMute(publisher_handle_, 1);
                }

                if (btn_check_speex_encoder_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetPublisherAudioCodecType(publisher_handle_, 2);

                    NTSmartPublisherSDK.NT_PB_SetPublisherSpeexEncoderQuality(publisher_handle_, Int32.Parse(edit_speex_quality_.Text));
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_SetPublisherAudioCodecType(publisher_handle_, 1);
                }

                if (btn_check_auido_mic_input_.Checked
                    || btn_check_auido_speaker_input_.Checked)
                {
                    if (btn_check_set_mute_.Checked)
                    {
                        NTSmartPublisherSDK.NT_PB_SetMute(publisher_handle_, 1);
                    }
                }

                if (btn_check_echo_cancel_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetEchoCancellation(publisher_handle_, 1, Int32.Parse(edit_echo_delay_.Text));
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_SetEchoCancellation(publisher_handle_, 0, 0);
                }

                if (btn_check_noise_suppression_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetNoiseSuppression(publisher_handle_, 1);
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_SetNoiseSuppression(publisher_handle_, 0);
                }

                if (btn_check_agc_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetAGC(publisher_handle_, 1);
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_SetAGC(publisher_handle_, 0);
                }

                if (btn_check_vad_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetVAD(publisher_handle_, 1);
                }
                else
                {
                    NTSmartPublisherSDK.NT_PB_SetVAD(publisher_handle_, 0);
                }

                if (btn_check_auido_mic_input_.Checked
                    && btn_check_auido_speaker_input_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetInputAudioVolume(publisher_handle_, 0, Convert.ToSingle(edit_audio_input_volume_.Text));
                    NTSmartPublisherSDK.NT_PB_SetInputAudioVolume(publisher_handle_, 1, Convert.ToSingle(edit_audio_speaker_input_volume_.Text));
                }
                else if (btn_check_auido_mic_input_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetInputAudioVolume(publisher_handle_, 0, Convert.ToSingle(edit_audio_input_volume_.Text));
                }
                else if (btn_check_auido_speaker_input_.Checked)
                {
                    NTSmartPublisherSDK.NT_PB_SetInputAudioVolume(publisher_handle_, 0, Convert.ToSingle(edit_audio_speaker_input_volume_.Text));
                }

获取视频码率默认值:不是每个开发者都有音视频开发背景,如果不想自行设置码率等参数,可以参考我们推荐的默认码率设定,相关逻辑如下。

       private void FillBitrateControlDefValue()
        {
            int w = 640, h = 480;
            int frame_rate = 5;
            bool is_var_bitrate = false;

            GetVideoConfigInfo(ref w, ref h, ref frame_rate, ref is_var_bitrate);

            if (btn_check_openh264_encoder_.Checked)
            {
                is_var_bitrate = false;
            }

            int kbit_rate = CalBitRate(frame_rate, w, h);
            int max_kbit_rate = CalMaxKBitRate(frame_rate, w, h, is_var_bitrate);

            if (is_var_bitrate)
            {
                btn_check_video_bitrate_.CheckState = CheckState.Unchecked;
            }
            else
            {
                btn_check_video_bitrate_.CheckState = CheckState.Checked;
            }

            if (is_var_bitrate)
            {
                edit_bit_rate_.Enabled = false;
                edit_video_quality_.Enabled = true;
            }
            else
            {
                edit_bit_rate_.Enabled = true;
                edit_video_quality_.Enabled = false;
            }

            if (btn_check_video_bitrate_.Checked)
            {
                edit_bit_rate_.Text = kbit_rate.ToString();
                edit_video_max_bitrate_.Text = max_kbit_rate.ToString();
            }
            else
            {
                edit_bit_rate_.Text = "0";
                edit_video_max_bitrate_.Text = max_kbit_rate.ToString();
            }

            bool is_h264 = false;

            if (btn_check_h265_encoder_.Checked)
            {
                is_h264 = false;
            }
            else
            {
                is_h264 = true;
            }

            edit_video_quality_.Text = CalVideoQuality(w, h, is_h264).ToString();

            combox_h264_profile_.SelectedIndex = 2;

            edit_video_encode_speed_.Text = CalVideoEncoderSpeed(w, h, is_h264).ToString();

            // 默认关键帧间隔设置为帧率的2倍
            edit_key_frame_.Text = (frame_rate * 2).ToString();
        }

开始推送

设置推送URL后,调用StartPublisher接口开始推流;如需发送扩展SEI用户数据,推流之前需先设置用户数据发送队列大小。

            if (publisher_handle_ == IntPtr.Zero)
            {
                MessageBox.Show("[publish] handle with null");
                return;
            }
            if (!String.IsNullOrEmpty(url))
	        {
                NTSmartPublisherSDK.NT_PB_SetURL(publisher_handle_, url, IntPtr.Zero);
	        }

            //设置用户数据发送队列大小
            NTSmartPublisherSDK.NT_PB_SetPostUserDataQueueMaxSize(publisher_handle_, 3, 0);

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_StartPublisher(publisher_handle_, IntPtr.Zero))
            {
                if (0 == publisher_handle_count_)
                {
                    NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                    publisher_handle_ = IntPtr.Zero;
                }

                is_publishing_ = false;

                MessageBox.Show("调用推流接口失败");

                return;
            }

停止推送

调用NT_PB_StopPublisher()即可,停止推送后,如果没有录像等,可调用NT_PB_Close()接口,关掉实例,并把handle置 IntPtr.Zero。

        private void btn_stop_publish_Click(object sender, EventArgs e)
        {
            publisher_handle_count_--;
            NTSmartPublisherSDK.NT_PB_StopPublisher(publisher_handle_);

            rtmp_play_urls_.Clear();
            UpdateDisplayURLs();

            if (0 == publisher_handle_count_)
            {
                NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                publisher_handle_ = IntPtr.Zero;
            }

            btn_publish.Enabled = true;
            btn_stop_publish.Enabled = false;

            is_publishing_ = false;

            if (0 == publisher_handle_count_)
            {
                if (btn_check_desktop_input_.Checked)
                {
                    btn_choose_screen_region_.Text = "选择屏幕区域";
                }

                btn_check_dxgi_screen_capturer_.Enabled = true;
                check_capture_layered_window_.Enabled = true;

                btn_check_wr_way_capture_window_.Enabled = true;

                btn_desktop_camera_switch_.Text = "切换到摄像头";
                btn_disable_image_watermark_.Text = "停止水印";
                btn_disable_camera_overlay_.Text = "停止叠加摄像头";
                btn_disable_desktop_overlay_.Text = "停止叠加屏幕";

                btn_desktop_camera_switch.Enabled = true;
                btn_camera_overlay_to_desktop.Enabled = true;
                btn_desktop_overlay_to_camera.Enabled = true;
                btn_desktop_camera_switch.Enabled = true;

                btn_check_desktop_input_.Enabled = true;
                btn_check_scale_desktop_.Enabled = true;
                edit_desktop_scale_.Enabled = true;
                btn_check_camera_input_.Enabled = true;

                btn_add_image_watermark_.Enabled = true;

                timer_clock_.Enabled = false;

                if (btn_desktop_camera_switch.Checked
                    || btn_camera_overlay_to_desktop.Checked
                    || btn_desktop_overlay_to_camera.Checked)
                {
                    btn_check_desktop_input_.CheckState = CheckState.Checked;
                    btn_check_camera_input_.CheckState = CheckState.Checked;
                }
                else
                {
                }

                EnableAuidoInputControl();
            }
        }

预览推送数据

设置NT_PB_SetVideoPreviewImageCallBack(),调用NT_PB_StartPreview()接口即可。

        private void btn_preview_Click(object sender, EventArgs e)
        {
            if (btn_check_window_input_.Checked)
            {
                if (IntPtr.Zero == cur_sel_capture_window_)
                {
                    MessageBox.Show("请先下拉选择采集窗口");
                    return;
                }
            }

            if (publisher_handle_ == IntPtr.Zero)
            {
                if (!OpenPublisherHandle())
                {
                    return;
                }
            }

            if (publisher_handle_count_ < 1)
            {
                SetCommonOptionToPublisherSDK();
            }

            video_preview_image_callback_ = new NT_PB_SDKVideoPreviewImageCallBack(SDKVideoPreviewImageCallBack);

            NTSmartPublisherSDK.NT_PB_SetVideoPreviewImageCallBack(publisher_handle_, (int)NTSmartPublisherDefine.NT_PB_E_IMAGE_FORMAT.NT_PB_E_IMAGE_FORMAT_RGB32, IntPtr.Zero, video_preview_image_callback_);

            if (NTBaseCodeDefine.NT_ERC_OK != NTSmartPublisherSDK.NT_PB_StartPreview(publisher_handle_, 0, IntPtr.Zero))
            {
                if (0 == publisher_handle_count_)
                {
                    NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                    publisher_handle_ = IntPtr.Zero;
                }

                 MessageBox.Show("预览失败, 请确保选择了视频采集选项");
                return;
            }

            publisher_handle_count_++;

            btn_preview.Enabled = false;
            btn_stop_preview.Enabled = true;

            if (1 == publisher_handle_count_)
            {
                if (btn_check_desktop_input_.Checked)
                {
                    btn_choose_screen_region_.Text = "移动屏幕区域";
                }

                btn_check_dxgi_screen_capturer_.Enabled = false;
                check_capture_layered_window_.Enabled = false;

                btn_check_wr_way_capture_window_.Enabled = false;

                btn_desktop_camera_switch.Enabled = false;
                btn_camera_overlay_to_desktop.Enabled = false;
                btn_desktop_overlay_to_camera.Enabled = false;

                btn_add_image_watermark_.Enabled = false;

                if (btn_desktop_camera_switch.Checked
                    || btn_camera_overlay_to_desktop.Checked
                    || btn_desktop_overlay_to_camera.Checked)
                {

                }
                else
                {
                    btn_check_desktop_input_.Enabled = false;
                    btn_check_camera_input_.Enabled = false;
                }

                DisableAuidoInputControl();
            }

            if (ui_preview_wnd_ == null)
            {
                ui_preview_wnd_ = new nt_pb_ui_preview_wnd();
            }

            ui_preview_wnd_.Show();
        }

        public void VideoPreviewImageCallBack(NT_VideoFrame frame)
        {
            if (cur_image_.plane_ != IntPtr.Zero)
            {
                Marshal.FreeHGlobal(cur_image_.plane_);
                cur_image_.plane_ = IntPtr.Zero;
            }

            cur_image_ = frame;

            if (ui_preview_wnd_ != null)
            {
                ui_preview_wnd_.OnRGBXImage(cur_image_);
            }
        }

        public void SDKVideoPreviewImageCallBack(IntPtr handle, IntPtr user_data, IntPtr image)
        {
            NT_PB_Image pb_image = (NT_PB_Image)Marshal.PtrToStructure(image, typeof(NT_PB_Image));

            NT_VideoFrame pVideoFrame = new NT_VideoFrame();

            pVideoFrame.width_ = pb_image.width_;
            pVideoFrame.height_ = pb_image.height_;

            pVideoFrame.stride_ = pb_image.stride_[0];

            Int32 argb_size = pb_image.stride_[0] * pb_image.height_;
            
            pVideoFrame.plane_ = Marshal.AllocHGlobal(argb_size);

            CopyMemory(pVideoFrame.plane_, pb_image.plane_[0], (UInt32)argb_size);

            if (InvokeRequired)
            {
                BeginInvoke(set_video_preview_image_callback_, pVideoFrame);
            }
            else
            {
                set_video_preview_image_callback_(pVideoFrame);
            }
        }

停止预览更简单,调用NT_PB_StopPreview()。

        private void btn_stop_preview_Click(object sender, EventArgs e)
        {
            publisher_handle_count_--;
            NTSmartPublisherSDK.NT_PB_StopPreview(publisher_handle_);

            if (0 == publisher_handle_count_)
            {
                NTSmartPublisherSDK.NT_PB_Close(publisher_handle_);
                publisher_handle_ = IntPtr.Zero;
            }

            btn_preview.Enabled = true;
            btn_stop_preview.Enabled = false;

            if (0 == publisher_handle_count_)
            {
                if (btn_check_desktop_input_.Checked)
                {
                    btn_choose_screen_region_.Text = "选择屏幕区域";
                }

                btn_check_dxgi_screen_capturer_.Enabled = true;
                check_capture_layered_window_.Enabled = true;

                btn_check_wr_way_capture_window_.Enabled = true;

                btn_desktop_camera_switch_.Text = "切换到摄像头";
                btn_disable_image_watermark_.Text = "停止水印";
                btn_disable_camera_overlay_.Text = "停止叠加摄像头";
                btn_disable_desktop_overlay_.Text = "停止叠加屏幕";

                btn_desktop_camera_switch.Enabled = true;
                btn_camera_overlay_to_desktop.Enabled = true;
                btn_desktop_overlay_to_camera.Enabled = true;
                btn_desktop_camera_switch.Enabled = true;

                btn_check_desktop_input_.Enabled = true;
                btn_check_camera_input_.Enabled = true;

                btn_add_image_watermark_.Enabled = true;

                timer_clock_.Enabled = false;

                if (btn_desktop_camera_switch.Checked
                    || btn_camera_overlay_to_desktop.Checked
                    || btn_desktop_overlay_to_camera.Checked)
                {
                    btn_check_desktop_input_.CheckState = CheckState.Checked;
                    btn_check_camera_input_.CheckState = CheckState.Checked;
                }
                else
                {
                }

                EnableAuidoInputControl();
            }

            ui_preview_wnd_.Hide();
            ui_preview_wnd_ = null;
        }

实时截图

先设置截图文件保存目录,然后调用NT_PB_CaptureImage()发起截图请求,截图结果通过回调异步返回。

            if (String.IsNullOrEmpty(capture_image_path_))
            {
                MessageBox.Show("请先设置保存截图文件的目录! 点击截图左边的按钮设置!");
                return;
            }

            if (publisher_handle_ == IntPtr.Zero)
            {
                return;
            }

            String name = capture_image_path_ + "\\" + DateTime.Now.ToString("HH-mm-ss") + ".png";    // 24小时制,避免上下午时间重复导致文件名冲突

            byte[] buffer1 = Encoding.Default.GetBytes(name);
            byte[] buffer2 = Encoding.Convert(Encoding.Default, Encoding.UTF8, buffer1, 0, buffer1.Length);

            byte[] buffer3 = new byte[buffer2.Length + 1];
            buffer3[buffer2.Length] = 0;

            Array.Copy(buffer2, buffer3, buffer2.Length);

            IntPtr file_name_ptr = Marshal.AllocHGlobal(buffer3.Length);
            Marshal.Copy(buffer3, 0, file_name_ptr, buffer3.Length);

            capture_image_call_back_ = new NT_PB_SDKCaptureImageCallBack(SDKCaptureImageCallBack);

            UInt32 ret = NTSmartPublisherSDK.NT_PB_CaptureImage(publisher_handle_, file_name_ptr, IntPtr.Zero, capture_image_call_back_);

            Marshal.FreeHGlobal(file_name_ptr);

            if (NT.NTBaseCodeDefine.NT_ERC_OK == ret)
            {
                // 发送截图请求成功
            }
            else if ((UInt32)NT.NTSmartPublisherDefine.NT_PB_E_ERROR_CODE.NT_ERC_PB_TOO_MANY_CAPTURE_IMAGE_REQUESTS == ret)
            {
                // 通知用户延时
                MessageBox.Show("Too many capture image requests!");
            }
            else
            {
                // 其他失败
            }

       private void ImageCallBack(UInt32 result, String file_name)
        {
            if (String.IsNullOrEmpty(file_name))
                return;

            MessageBox.Show(file_name);
        }

        public void SDKCaptureImageCallBack(IntPtr handle, IntPtr userData, UInt32 result, IntPtr file_name)
        {
            if (file_name == IntPtr.Zero)
                return;

            int index = 0;

            while (true)
            {
                if (0 == Marshal.ReadByte(file_name, index))
                    break;

                index++;
            }

            byte[] file_name_buffer = new byte[index];

            Marshal.Copy(file_name, file_name_buffer, 0, index);

            byte[] dst_buffer = Encoding.Convert(Encoding.UTF8, Encoding.Default, file_name_buffer, 0, file_name_buffer.Length);
            String image_name = Encoding.Default.GetString(dst_buffer, 0, dst_buffer.Length);

            if (InvokeRequired)
            {
                BeginInvoke(set_capture_image_call_back_, result, image_name);
            }
            else
            {
                set_capture_image_call_back_(result, image_name);
            }
        }

问答式参考

1 视频采集设置

说明:

1. 屏幕和摄像头相互切换:用于在线教育或者无纸化等场景,推送或录像过程中,随时切换屏幕或摄像头数据(切换数据源),如需实时切换,点击页面“切换到摄像头”按钮即可;

2. 设置遮盖层,用于设定一个长方形或正方形区域(可自指定区域大小),遮盖不想给用户展示的部分;

3. 水印:添加PNG水印,支持推送或录像过程中,随时添加、取消水印;

4. 摄像头叠加到屏幕:意在用于同屏过程中,主讲人摄像头悬浮于屏幕之上(可指定叠加坐标),实现双画面展示,推送或录像过程中,可以随时取消摄像头叠加;

5. 屏幕叠加到摄像头:同4,效果展示,实际根据需求实现;

6. 采集桌面:可以通过点击“选择屏幕区域”获取采集区域,并可在采集过程中,随时切换区域位置,如不设定,默认全屏采集;

7. 使用DXGI采集屏幕,采集时停用Aero;

8. 采集窗口:可设定需要采集的窗口,窗口放大或缩小,推送端会自适应码率和分辨率;

9. 采集帧率(帧/秒):默认屏幕采集12帧,可根据实际场景需求设定到期望帧率(设置示例见本列表后的代码);

10. 缩放屏幕大小缩放比:用于高清或超高清屏,通过设定一定的比例因子,缩放屏幕采集分辨率;

11. 采集摄像头:可选择需要采集的摄像头、采集分辨率、帧率、是否需要水平或者垂直反转、是否需要旋转;
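
针对上面第6、9条,给出一个最小调用示意(C#接口,仅为示意,假设 publisher_handle_ 已按前文流程创建,采集区域与帧率数值均为演示值):

// 示例(仅为示意):设置屏幕采集区域与采集帧率
UInt32 clip_left = 0, clip_top = 0, clip_width = 1280, clip_height = 720;

// 指定屏幕采集区域,如需全屏采集,可传入全屏宽高
NTSmartPublisherSDK.NT_PB_SetScreenClip(publisher_handle_, clip_left, clip_top, clip_width, clip_height);

// 屏幕采集默认12帧/秒,可按实际场景调整
NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, 12);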

追加提问:

问题[确认数据源]:采集桌面还是摄像头?如果桌面,全屏还是部分区域?

回答:

如果是摄像头:可以选择摄像头列表,然后分辨率、帧率。

如果是屏幕:默认帧率是12帧,可以根据实际场景调整,选取屏幕区域,可以实时拉取选择需要采集或录像区域;

如果是叠加模式:可选择摄像头叠加到屏幕,还是屏幕叠加到摄像头;

更高需求的用户,可以设置水印或应用层遮盖。

问题:如果是摄像头,采集到的摄像头角度不对怎么办?

回答:我们支持摄像头镜像和翻转设置,摄像头可通过SDK接口轻松实现水平/垂直翻转、镜像效果。
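
对应的最小调用示意如下(C#接口,假设 publisher_handle_ 已创建,旋转角度在demo中通过GetCameraRotateDegress()获取):

// 示例(仅为示意):摄像头画面翻转与旋转
NTSmartPublisherSDK.NT_PB_FlipHorizontalCamera(publisher_handle_, 1);   // 1:水平翻转, 0:不翻转
NTSmartPublisherSDK.NT_PB_FlipVerticalCamera(publisher_handle_, 0);     // 1:垂直翻转, 0:不翻转

// 顺时针旋转角度
NTSmartPublisherSDK.NT_PB_RotateCamera(publisher_handle_, 90);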

2 视频码率控制

如何选择适合我的码率

回答:如果不是有音视频背景的开发人员,可点击“获取视频码率默认值”,参考我们默认的码率推荐,如果觉得推荐码率过高或不够,可根据实际情况酌情调整。

H.265编码还是H.264编码?

回答:Windows平台支持特定机型的H.265硬编码。如果推RTMP流,需要服务器支持RTMP H.265扩展,播放器SDK也需要同步支持RTMP H.265扩展播放。

如果是轻量级RTSP服务SDK对接的话,只需要播放器支持RTSP H.265即可。

如果推摄像头数据,建议采用可变码率+H.265编码。
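
一个选择H.265编码的最小示意如下(C#接口,假设 publisher_handle_ 已创建;这里以软编码为例,硬编码请参考前文编码器与显卡的枚举逻辑):

// 示例(仅为示意):视频编码器选择为H.265
int codec_id = (int)NTCommonMediaDefine.NT_MEDIA_CODEC_ID.NT_MEDIA_CODEC_ID_H265;

// 参数依次为:是否硬编码(0软编/1硬编)、编码器id、codec id、GPU序号(demo软编默认传0)
NTSmartPublisherSDK.NT_PB_SetVideoEncoder(publisher_handle_, 0, 0, (uint)codec_id, 0);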

如何设置码率参数更合理?

回答:

关键帧间隔:一般来说,设置到帧率的2-4倍,比如帧率20,关键帧间隔可以设置到40-80;

平均码率:可以点击“获取视频码率默认值”,最大码率是平均码率的2倍;

视频质量:如果使用可变码率,建议采用大牛直播SDK默认推荐视频质量值;

编码速度:如高分辨率,建议1-3,值越小,编码速度越快;

H.264 Profile:默认baseline profile,可根据需要,酌情设置High profile;

NOTE:点击“推送”或“录像”或启动内置RTSP服务SDK之前,请务必设置视频码率,如不想手动设置,请点击“获取视频码率默认值”!!!
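
按上述建议,给出一个参数设置的最小示意(C#接口,假设 publisher_handle_ 已创建,数值仅为演示,建议优先使用“获取视频码率默认值”):

// 示例(仅为示意):码率相关参数设置
int frame_rate = 20;

NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, (uint)frame_rate);
NTSmartPublisherSDK.NT_PB_SetVideoKeyFrameInterval(publisher_handle_, frame_rate * 2);   // 关键帧间隔:帧率的2-4倍

NTSmartPublisherSDK.NT_PB_SetVideoBitRate(publisher_handle_, 2000);        // 平均码率(kbps),演示值
NTSmartPublisherSDK.NT_PB_SetVideoMaxBitRate(publisher_handle_, 4000);     // 最大码率:平均码率的2倍

NTSmartPublisherSDK.NT_PB_SetVideoQualityV2(publisher_handle_, 26);        // 可变码率下的视频质量,建议用默认推荐值
NTSmartPublisherSDK.NT_PB_SetVideoEncoderSpeed(publisher_handle_, 3);      // 编码速度,高分辨率建议1-3,值越小编码越快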

3 音频采集设置

问答式:采集音频吗?如果采集,采集麦克风还是扬声器的,亦或混音?

回答:

如果想采集电脑输出的音频(比如音乐之类),可以选择“采集扬声器”;

如果想采集麦克风音频,可以选择“采集麦克风”,并选择相关设备;

如果两个都想采集,可以两个都选择,混音输出。

4 实时音量调节

问答式:采集过程中可以改变麦克风或扬声器采集音量吗?

回答:可以,如果二者都选中,处于混音模式,也可单独调整麦克风或扬声器音量。
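
对应的调节示意如下(C#接口,假设 publisher_handle_ 已创建;音量为浮点数,示例数值仅为演示,具体取值范围以SDK接口说明为准):

// 示例(仅为示意):混音模式下分别调节麦克风与扬声器采集音量
NTSmartPublisherSDK.NT_PB_SetInputAudioVolume(publisher_handle_, 0, 1.0f);   // 索引0:麦克风
NTSmartPublisherSDK.NT_PB_SetInputAudioVolume(publisher_handle_, 1, 0.8f);   // 索引1:扬声器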

5 音频编码

问题:是AAC还是SPEEX?

回答:我们默认是AAC编码模式,如果需要码率更低,可以选择SPEEX编码模式,当然我们的AAC编码码率也不高,如果没有太高要求,考虑到通用性,建议使用AAC。
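
对应的设置示意如下(C#接口,假设 publisher_handle_ 已创建;Speex质量值仅为演示):

// 示例(仅为示意):音频编码类型选择,1:AAC(默认建议), 2:Speex
NTSmartPublisherSDK.NT_PB_SetPublisherAudioCodecType(publisher_handle_, 1);

// 如选择Speex,可同时设置Speex编码质量
// NTSmartPublisherSDK.NT_PB_SetPublisherAudioCodecType(publisher_handle_, 2);
// NTSmartPublisherSDK.NT_PB_SetPublisherSpeexEncoderQuality(publisher_handle_, 8);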

6 音频处理

问题:我想过滤背景噪音怎么办?

回答:选中“噪音抑制”;“噪音抑制”请和“自动增益控制”组合使用,“端点检测(VAD)”可选设置,组合设置示例见下文代码。

问题:我想做一对一互动怎么办?

回答:选中“回音消除”,可以和“噪音抑制”、“自动增益控制”组合使用,具体可参看回音消除的demo工程:WIN-EchoCancellation-CSharp-Demo。
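
回音消除、噪音抑制等组合设置的最小示意如下(C#接口,假设 publisher_handle_ 已创建;回音时延在demo中从界面输入获取,这里以0为演示值):

// 示例(仅为示意):回音消除 + 噪音抑制 + 自动增益控制 + VAD
NTSmartPublisherSDK.NT_PB_SetEchoCancellation(publisher_handle_, 1, 0);   // 参数依次为:是否开启(1开/0关)、回音时延(毫秒)
NTSmartPublisherSDK.NT_PB_SetNoiseSuppression(publisher_handle_, 1);      // 噪音抑制
NTSmartPublisherSDK.NT_PB_SetAGC(publisher_handle_, 1);                   // 自动增益控制,建议与噪音抑制组合使用
NTSmartPublisherSDK.NT_PB_SetVAD(publisher_handle_, 0);                   // 端点检测(VAD),可选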

问题:我推送或者录像过程中,随时静音怎么办?

回答:推送过程中,随时选择或取消选择“静音”功能。
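
对应的调用示意如下(C#接口,假设 publisher_handle_ 已创建):

// 示例(仅为示意):推送/录像过程中实时静音与取消静音
NTSmartPublisherSDK.NT_PB_SetMute(publisher_handle_, 1);   // 1:静音
NTSmartPublisherSDK.NT_PB_SetMute(publisher_handle_, 0);   // 0:取消静音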

7 多路推送

问题:我想同时推送到多个url怎么办(比如一个内网服务器,一个外网服务器)?

回答:同时填写多个url(最多3个),然后点推送即可。

8 截图(快照)

问题:我想推送或者录像过程中,截取当前图像怎么办?

回答:那就设置好截图路径,推送或录像过程中,随时点击“截图”。

9 录像

问题:我还想录像,怎么办?

回答:设置录像文件存放目录,文件前缀、单个文件大小,是否加日期、时间,随时录制即可,此外,我们的SDK还支持录像过程中,暂停录像,恢复录像。

10 实时预览

问题:我还想看看推出去视频特别是合成后的效果,怎么办?

回答:点击页面的“预览”按钮,就可以看到。

 

如何用轻量级RTSP服务本地生成RTSP测试URL

最近发现好多开发者都在搜索可用的RTSP测试URL,目前公网实际可测试的RTSP URL非常少,即便是可用,分辨率和网络也非常差,不适合长期测试。

针对此,我们的建议是最好直接网上买个海康或大华的摄像头,一般来说,海康大华的RTSP URL格式如下:

海康摄像头RTSP URL规则

主码流:rtsp://admin:daniulive12345@192.168.0.120:554/h265/ch1/main/av_stream

子码流:rtsp://admin:daniulive12345@192.168.0.120:554/h264/ch1/sub/av_stream

rtsp://[username]:[password]@[ip]:[port]/[codectype]/[channel]/[subtype]/av_stream
URL组成说明:
username: 用户名;
password: 密码;
ip: 网络摄像机IP地址;
port: 端口号,默认554;
codectype: 编码类型,h264/h265/mjpeg;
channel: 通道号;
subtype: 码流类型,主码流:main,子码流:sub。

大华摄像头RTSP URL规则

主码流:rtsp://admin:admin123456@192.168.0.121:554/cam/realmonitor?channel=1&subtype=0

子码流:rtsp://admin:admin123456@192.168.0.121:554/cam/realmonitor?channel=1&subtype=1

rtsp://[username]:[password]@[ip]:[port]/cam/realmonitor?channel=[channel]&subtype=[subtype]
URL组成说明:
username: 用户名;
password: 密码;
ip: 网络摄像机IP地址;
port: 端口号,默认554;
channel: 通道号;
subtype: 码流类型,主码流:0,子码流:1。

如何自己生成个本地RTSP测试URL

如果想采集PC摄像头或者屏幕,也可以用轻量级RTSP服务,在本地生成一个RTSP测试URL。常用的方式,比如VLC串流,或者用大牛直播SDK的Windows平台SmartPublisherDemo生成即可。

本文就以SmartPublisherDemo轻量级RTSP服务采集本地摄像头为例,说明下如何创建个本地测试的RTSP地址。

1. 选中采集摄像头,并选择需要测试的分辨率、帧率,点击“获取视频码率默认值”,得到系统推荐的码率(高级用户也可自行配置),如需要采集audio,看采集麦克风还是扬声器的,如果二者均需采集,同时选中即可(混音模式)。

2. 点击“配置查看Rtsp服务”按钮,在弹出框点击“启动服务”即可,可启动一组也可启动多组,每个服务对应一个RTSP URL。

3. 确定后,点击“发布RTSP流”按钮即可,发布后,可本地生成个RTSP URL,以本机为例,生成的URL是“rtsp://192.168.0.211:8554/stream1”。

4. 启动播放端,输入生成的RTSP URL,测试即可。

5. 服务器负载查看:再次点击“配置查看RTSP服务”,即可看到每个服务连接的会话数;

6. 如需停止服务,点击页面的“停止RTSP流”即可;

是不是非常方便?

为什么要做轻量级RTSP服务?

轻量级RTSP服务解决的核心痛点,是避免用户或开发者单独部署RTSP或RTMP服务:本地的音视频数据(如摄像头、麦克风)编码后,汇聚到内置RTSP服务,对外提供可供拉流的RTSP URL。轻量级RTSP服务适用于内网环境下、对并发要求不高的场景,支持H.264/H.265,支持RTSP鉴权、单播、组播模式;考虑到单个服务的承载能力,我们支持同时创建多个RTSP服务,并支持获取当前RTSP服务的会话连接数。

设计功能:

  •  [基础功能]采集摄像头、屏幕、窗口或外部自定义音视频数据;
  •  [音频格式]AAC;
  •  [视频格式]H.264、H.265;
  •  [协议类型]RTSP;
  •  [传输模式]支持单播和组播模式;
  •  [端口设置]支持RTSP端口设置;
  •  [鉴权设置]支持RTSP鉴权用户名、密码设置;
  •  [获取session连接数]支持获取当前RTSP服务会话连接数;
  •  [多服务支持]支持同时创建多个内置RTSP服务;
  •  [H.265支持]Windows内置rtsp server支持发布H.265视频;
  •  [RTSP url回调]支持设置后的rtsp url通过event回调到上层。

感兴趣的开发者,可以自行尝试。

大牛直播SDK试用、测试服务协议

欢迎使用上海视沃信息科技有限公司(以下简称“视沃科技”)旗下“大牛直播SDK”,试用测试前,请您仔细阅读视沃科技官方网站公布的相关规范和使用流程及本协议的全部内容,如您不同意前述任意内容,请不要进行后续操作。如您实际使用了“大牛直播SDK”,我方将视为您已完全理解并认同规范、流程及服务协议的全部内容。

  • 协议主体

本服务协议是因您使用大牛直播SDK与视沃科技所订立的有效合约。

  • 协议的订立与生效

一旦您选择试用、测试大牛直播SDK并进行后续操作,即表示您同意遵循本服务协议之所有约定,本协议即成为双方之间就大牛直播SDK软件包服务达成的有效合约。

  • 大牛直播SDK软件包服务的使用

3.1 在试用测试大牛直播SDK软件包服务前,您应知悉阅读视沃科技官网页面上的相关规范、使用流程,并理解相关内容及可能发生的后果,在使用大牛直播SDK软件包服务的过程中,您应依照相关操作指引进行操作,请您自行把握风险谨慎操作。

3.2您理解并同意,使用大牛直播SDK软件服务包是您自行独立审慎判断的结果,您将自行对此负责,包括但不限于:

3.2.1在使用过程中,您将对自行操作的行为及产生的结果负责;

3.2.2在试用、测试大牛直播SDK软件服务包阶段可免费使用,但正式授权版需签订购买合同并支付相关费用;

3.2.3在使用大牛直播SDK软件包过程中,您不应进行任何破坏或试图破坏网络安全的行为,您承诺SDK试用、测试及授权后的使用场景保持一致且视频内容合法合规,不得含有我国法律、行政法规禁止发布或传输的信息,为履行法律赋予的安全管理义务,请您下载填写附件1并将签章后的附件1发至指定邮箱;

3.2.4除视沃科技明示许可外,不得修改、翻译、改编、转许可、转让大牛直播SDK软件服务包,也不得逆向工程、反编译或试图以其他方式发现大牛直播SDK软件服务包或软件源代码;

3.2.5您不应以任何将违反国家、地方法律法规、行业管理和社会公共道德、及影响、损害或可能影响、损害视沃科技利益的方式或目的使用大牛直播SDK。

  • 责任限制

您理解并同意,在免费测试、试用期间,视沃科技虽然对大牛直播SDK软件包服务提供可用性支撑,但不对其中任何错误或漏洞提供任何担保,并不对您使用大牛直播SDK软件服务包的工作和结果承担任何责任。

  • 变更和终止

5.1 您理解并认可,视沃科技保留随时修改、取消、增强大牛直播SDK软件包服务一项或多项功能的权利;

5.2如您有任何违反本服务协议的情形,或根据视沃科技自己的独立判断认为您对大牛直播SDK的使用行为不符合我司要求,我司有权随时中断您使用大牛直播SDK而无需通知您,并将相关情况向有关主管部门报告;同时,如给我司造成损失,我司有权要求赔偿。

  • 保密

您及视沃科技都应对对方的保密信息承担保密责任,除非经国家行政、司法等有权机关要求披露或该信息已进入公有领域。

  • 其他

7.1视沃科技有权随时根据有关法律、法规的变化及公司经营状况和经营策略的调整等修改本服务协议。修改后的服务协议会在视沃科技官网公布。如果不同意修改的内容,您应停止使用大牛直播SDK软件包服务。如果继续使用大牛直播SDK软件包,则视为您接受本服务协议的变动。

7.2如果本服务协议中的任何条款无论因何种原因完全或部分无效或不具有执行力,或违反任何适用的法律,则该条款被视为删除,但本服务协议的其余条款仍应有效并且有约束力。

7.3本服务协议受中华人民共和国法律管辖。在执行本服务协议过程中如发生纠纷,双方应及时协商解决。协商不成时,任何一方可直接向上海市长宁区人民法院提起诉讼。

附件1:

  • 甲方(测试、试用方公司名称):
  • 测试、试用大牛直播SDK实际使用场景描述:(附文字说明、产品截图、视频画面)
  • 在使用大牛直播SDK软件包过程中,您不应进行任何破坏或试图破坏网络安全的行为,您承诺SDK试用、测试及授权后的使用场景保持一致且视频内容合法合规,不得含有我国法律、行政法规禁止发布或传输的信息

附件1请发送至 1130758427@qq.com

 

甲方签章

xxxx年xx月xx日

 

视沃科技(大牛直播SDK)官方测试版获取流程

视沃科技官方测试版本暂不提供网络下载,如因产品需求,需要测试版,可按照以下流程:

  1. 联系视沃科技官方商务|技术人员电话、QQ或微信,手机:130-7210-2209 或 135-6452-9354  QQ:89030985  或 517631076 微信:xinsheng120 或 ldxevt
  2. 查看“大牛直播SDK试用、测试服务协议.pdf”(可右键另存为);
  3. 填写“上海视沃信息科技有限公司SDK使用场景调查表”(右键另存为)并签章,主要是写清楚使用的行业和需要试用的模块。场景调查表仅作为试用企业场景合法合规审核留底,不作其他任何用途,也不产生任何费用。如果公司/学校签章流程复杂,也可以填写后打印出来自己签字,并附上联系方式和工牌(或其他可证明隶属于本公司的材料);
  4. 公司审核通过后,获取试用版和前期技术支持。

Windows平台RTSP|RTMP播放端SDK集成说明

2.1 demo说明

  • 大牛直播SDK提供C++/C#两套接口,对外提供32/64位debug/release库,C++和C#接口一一对应,C#接口比C++接口增加前缀NT_SP_;
  • WIN-PlayerSDK-CPP-Demo:播放端SDK对应的C++接口的demo;
  • WIN-PlayerSDK-CSharp-Demo:播放端SDK对应的C#接口的demo;
  • 播放端SDK支持Win7及以上系统;
  • 本demo基于VS2013开发。

2.2 界面UI展示

2.3集成说明

C++头文件:

  • [类型定义]nt_type_define.h
  • [Log定义]smart_log.h
  • [Log定义]smart_log_define.h
  • [base code定义]nt_base_code_define.h
  • [player参数定义]smart_player_define.h
  • [player接口]smart_player_sdk.h

C#头文件:

  • [base code定义]nt_base_code_define.cs
  • [player参数定义]smart_player_define.cs
  • [player接口]smart_player_sdk.cs

相关Lib:

  • SmartLog.dll
  • SmartLog.lib
  • SmartPlayerSDK.dll
  • SmartPlayerSDK.lib
  • avcodec-56.dll
  • avdevice-56.dll
  • avfilter-5.dll
  • avformat-56.dll
  • avutil-54.dll
  • postproc-53.dll
  • swresample-1.dll
  • swscale-3.dll

集成步骤:

  1. 把lib目录下debug/release库拷贝到需要集成的工程对应的debug或release目录下(确保32位/64位库debug/release目录一一对应);

lib目录如下:

    1. 32位debug库:debug
    2. 32位release库:release
    3. 64位debug库:x64\debug
    4. 64位release库:x64\release

2. 相关cs头文件,加入需要集成的工程;

3. 在需要集成的工程,右键->Properties->Application->Assembly name,大牛直播SDK按照APP名称授权,未授权版本,此处请改成“SmartPlayer”,如需授权,可直接联系商务;

4. 正式授权版,需要在Init()接口调用之前添加设置license的代码(相关Key和CID请根据正式授权版邮件说明填写):

2.4 接口调用时序(以C#为例)

2.4.1 设置授权license

C#的SDK,请在NT.NTSmartPlayerSDK.NT_SP_Init之前添加下面的代码:

NT.NTSmartPlayerSDK.NT_SP_SetSDKClientKey("xxxxxxxxxx", "xxxxxxxxxx", 0, IntPtr.Zero);

UInt32 isInited = NT.NTSmartPlayerSDK.NT_SP_Init(0, IntPtr.Zero);
if (isInited != 0)
{
    MessageBox.Show("调用NT_SP_Init失败..");
    return;
}

C++的SDK,请在player_api_.Init之前添加下面的代码:

NT_SP_SetSDKClientKey("xxxxxxxxxx", "xxxxxxxxxx", 0, nullptr);

if ( NT_ERC_OK != player_api_.Init(0, NULL) )
{
    return FALSE;
}

2.4.2 设置日志存放路径

需要在player_api_.Init之前添加下面的代码:

// 设置日志路径(请确保目录存在)
String log_path = "D:\\playerlog";
NTSmartLog.NT_SL_SetPath(log_path);

如目录存在,并具备文件写入权限,关闭应用程序后,相关文件夹下会有smart_sdk.log生成。

2.4.3 初始化SDK

NT_SP_Init:SDK初始化,多实例播放,此接口仅需调用一次即可。

2.4.4 特定机型硬解码检测

如系统用于特定机型环境下,特别是多路播放场景,需用到硬解码的话,可以用以下两组接口检测系统是否支持硬解。

注:在软解性能满足系统需求的前提下,一般建议优先使用软解。

/*
 * 检查是否支持H264硬解码
 * 如果支持的话返回NT_ERC_OK
 */
[DllImport(@"SmartPlayerSDK.dll")]
public static extern UInt32 NT_SP_IsSupportH264HardwareDecoder();

/*
  * 检查是否支持H265硬解码
  * 如果支持的话返回NT_ERC_OK
  */
[DllImport(@"SmartPlayerSDK.dll")]
public static extern UInt32 NT_SP_IsSupportH265HardwareDecoder();

如需使用硬解码,调用如下接口即可:

NTSmartPlayerSDK.NT_SP_SetH264HardwareDecoder(player_handle_, is_support_h264_hardware_decoder_ ? 1 : 0, 0);
NTSmartPlayerSDK.NT_SP_SetH265HardwareDecoder(player_handle_, is_support_h265_hardware_decoder_ ? 1 : 0, 0);

2.4.5 Open生成播放实例

NT_SP_Open:每调用一次Open接口,对应一个播放实例,如需播放多实例,对应多个player handler。

if (player_handle_ == IntPtr.Zero)
{
    player_handle_ = new IntPtr();

    UInt32 ret_open = NTSmartPlayerSDK.NT_SP_Open(out player_handle_, IntPtr.Zero, 0, IntPtr.Zero);

    if (ret_open != 0)
    {
        player_handle_ = IntPtr.Zero;
        MessageBox.Show("调用NT_SP_Open失败..");
        return;
    }
}

2.4.6 设置回调事件

  1. NT_SP_SetEventCallBack:用于回调网络链接状态、buffer状态(开始、buffer比例、结束)、实时带宽等,对应EventID如下:
/*事件ID*/
public enum NT_SP_E_EVENT_ID : uint
{
        NT_SP_E_EVENT_ID_BASE = NTBaseCodeDefine.NT_EVENT_ID_SMART_PLAYER_SDK,

        NT_SP_E_EVENT_ID_CONNECTING          = NT_SP_E_EVENT_ID_BASE | 0x2, /*连接中*/
        NT_SP_E_EVENT_ID_CONNECTION_FAILED = NT_SP_E_EVENT_ID_BASE | 0x3, /*连接失败*/
        NT_SP_E_EVENT_ID_CONNECTED       = NT_SP_E_EVENT_ID_BASE | 0x4, /*已连接*/
        NT_SP_E_EVENT_ID_DISCONNECTED     = NT_SP_E_EVENT_ID_BASE | 0x5, /*断开连接*/
        NT_SP_E_EVENT_ID_NO_MEDIADATA_RECEIVED = NT_SP_E_EVENT_ID_BASE | 0x8,  /*收不到RTMP数据*/
        NT_SP_E_EVENT_ID_RTSP_STATUS_CODE   = NT_SP_E_EVENT_ID_BASE | 0xB,  /*rtsp status code上报, 目前只上报401, param1表示status code*/

        /* 接下来请从0x81开始*/
        NT_SP_E_EVENT_ID_START_BUFFERING = NT_SP_E_EVENT_ID_BASE | 0x81, /*开始缓冲*/
        NT_SP_E_EVENT_ID_BUFFERING     = NT_SP_E_EVENT_ID_BASE | 0x82, /*缓冲中, param1 表示百分比进度*/
        NT_SP_E_EVENT_ID_STOP_BUFFERING  = NT_SP_E_EVENT_ID_BASE | 0x83, /*停止缓冲*/

        NT_SP_E_EVENT_ID_DOWNLOAD_SPEED  = NT_SP_E_EVENT_ID_BASE | 0x91, /*下载速度, param1表示下载速度,单位是(Byte/s)*/

        NT_SP_E_EVENT_ID_PLAYBACK_REACH_EOS = NT_SP_E_EVENT_ID_BASE | 0xa1,     /*播放结束, 直播流没有这个事件,点播流才有*/
        NT_SP_E_EVENT_ID_RECORDER_REACH_EOS = NT_SP_E_EVENT_ID_BASE | 0xa2,     /*录像结束, 直播流没有这个事件, 点播流才有*/
        NT_SP_E_EVENT_ID_PULLSTREAM_REACH_EOS = NT_SP_E_EVENT_ID_BASE | 0xa3,   /*拉流结束, 直播流没有这个事件,点播流才有*/

        NT_SP_E_EVENT_ID_DURATION = NT_SP_E_EVENT_ID_BASE | 0xa8, /*视频时长,如果是直播,则不上报,如果是点播的话, 若能从视频源获取视频时长的话,则上报, param1表示视频时长,单位是毫秒(ms)*/
}
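
下面给出一个处理上述EventID的简单示意(仅为草图):假设上层已按demo方式注册了事件回调,并在回调中拿到event_id与param1,再转发到如下辅助方法(HandlePlayerEvent为演示用的假设方法名,非SDK接口):

// 演示用:根据event_id做简单分发(假设方法,非SDK接口)
private void HandlePlayerEvent(uint event_id, long param1)
{
    switch ((NT_SP_E_EVENT_ID)event_id)
    {
        case NT_SP_E_EVENT_ID.NT_SP_E_EVENT_ID_CONNECTING:
            Console.WriteLine("连接中..");
            break;
        case NT_SP_E_EVENT_ID.NT_SP_E_EVENT_ID_CONNECTION_FAILED:
            Console.WriteLine("连接失败..");
            break;
        case NT_SP_E_EVENT_ID.NT_SP_E_EVENT_ID_CONNECTED:
            Console.WriteLine("已连接..");
            break;
        case NT_SP_E_EVENT_ID.NT_SP_E_EVENT_ID_DISCONNECTED:
            Console.WriteLine("断开连接..");
            break;
        case NT_SP_E_EVENT_ID.NT_SP_E_EVENT_ID_BUFFERING:
            Console.WriteLine("缓冲中: " + param1 + "%");
            break;
        case NT_SP_E_EVENT_ID.NT_SP_E_EVENT_ID_DOWNLOAD_SPEED:
            Console.WriteLine("下载速度: " + (param1 * 8 / 1000) + " kbps");
            break;
        case NT_SP_E_EVENT_ID.NT_SP_E_EVENT_ID_RTSP_STATUS_CODE:
            Console.WriteLine("RTSP status code: " + param1);
            break;
    }
}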

  2. NT_SP_SetVideoSizeCallBack:设置视频分辨率回调,如流数据携带视频数据,SDK会回上来视频宽高信息:
//video resolution callback
video_size_call_back_ = new SP_SDKVideoSizeCallBack(SP_SDKVideoSizeHandle);
NTSmartPlayerSDK.NT_SP_SetVideoSizeCallBack(player_handle_, IntPtr.Zero, video_size_call_back_);

注意:视频宽高回上来或绘制窗口发生变化时,记得调用NT_SP_OnWindowSize()更新,如不调用可能会引起视频模糊。

private void PlaybackWindowResized(Int32 width, Int32 height)
{
    width_ = width;
    height_ = height;

    int left = playWnd.Left;
    int top = playWnd.Top;

    textBox_resolution.Text = width + "*" + height;

    if (player_handle_ == IntPtr.Zero)
    {
        return;
    }

    NTSmartPlayerSDK.NT_SP_OnWindowSize(player_handle_, playWnd.Width, playWnd.Height);
}
  3. NT_SP_SetVideoFrameCallBack:设置YUV/RGB32数据回调,可用于对接第三方视频分析,或自行绘制等,如系统不支持D3D绘制,可设置回调数据,上层GDI模式绘制:
/*定义视频帧图像格式*/
public enum NT_SP_E_VIDEO_FRAME_FORMAT : uint
{
      NT_SP_E_VIDEO_FRAME_FORMAT_RGB32 = 1, // 32位的rgb格式, r, g, b各占8, 另外一个字节保留, 内存字节格式为: bb gg rr xx, 主要是和windows位图匹配, 在小端模式下,按DWORD类型操作,最高位是xx, 依次是rr, gg, bb
      NT_SP_E_VIDEO_FRAME_FORMAT_ARGB = 2, // 32位的argb格式,内存字节格式是: bb gg rr aa 这种类型,和windows位图匹配
      NT_SP_E_VIDEO_FRAME_FROMAT_I420 = 3, // YUV420格式, 三个分量保存在三个面上
}

  4. NT_SP_SetVideoFrameCallBackV2:设置YUV/RGB32数据回调,与NT_SP_SetVideoFrameCallBack接口的不同在于,吐出来的视频数据, 可以指定宽高;
  5. NT_SP_SetRenderVideoFrameTimestampCallBack:设置绘制视频帧时,视频帧时间戳回调,一般播放器无时间戳回调需求的话,无需设置:
//video timestamp callback
video_frame_ts_callback_ = new SP_SDKRenderVideoFrameTimestampCallBack(SP_SDKRenderVideoFrameTimestampCallBack);
NTSmartPlayerSDK.NT_SP_SetRenderVideoFrameTimestampCallBack(player_handle_, IntPtr.Zero, video_frame_ts_callback_);
  6. NT_SP_SetAudioPCMFrameCallBack:设置音频PCM帧回调, 吐PCM数据出来,目前每帧大小是10ms,一般播放器无使用需求的话,无需设置;
  7. NT_SP_SetUserDataCallBack:设置用户数据回调,此接口需要和推送端SDK配套使用,用于返回推送端设定的实时用户数据(如时间戳、经纬度等各种扩展指令或信息),如只是单纯使用播放SDK,无需设置;
  8. NT_SP_SetSEIDataCallBack:设置视频SEI数据回调,如只是单纯使用播放SDK,不需要额外处理扩展SEI数据的话,无需设置。

2.4.7 D3DRender检测

目前,很少存在不支持D3D绘制的情况,但考虑到系统通用性,我们在播放之前先做检测,具体调用接口如下:

/*
 * handle: 播放句柄
 * hwnd: 这个要传入真正用来绘制的窗口句柄
 * is_support: 如果支持的话 *is_support 为1, 不支持的话为0
 * 接口调用成功返回NT_ERC_OK
 */
[DllImport(@"SmartPlayerSDK.dll")]
public static extern UInt32 NT_SP_IsSupportD3DRender(IntPtr handle, IntPtr hwnd, ref Int32 is_support);

在不支持D3D绘制的情况下,可设置回调YUV/RGB数据,上层直接用GDI模式绘制。注意:GDI绘制效率偏低。

Int32 in_support_d3d_render = 0;

if (NT.NTBaseCodeDefine.NT_ERC_OK == NTSmartPlayerSDK.NT_SP_IsSupportD3DRender(player_handle_, playWnd.Handle, ref in_support_d3d_render))
{
    if (1 == in_support_d3d_render)
    {
        is_support_d3d_render = true;
    }
}

if (is_support_d3d_render)
{
    is_gdi_render_ = false;

    // 支持d3d绘制的话,就用D3D绘制
    NTSmartPlayerSDK.NT_SP_SetRenderWindow(player_handle_, playWnd.Handle);

    if (btn_check_render_scale_mode.Checked)
    {
        NTSmartPlayerSDK.NT_SP_SetRenderScaleMode(player_handle_, 1);
    }
    else
    {
        NTSmartPlayerSDK.NT_SP_SetRenderScaleMode(player_handle_, 0);
    }
}
else
{
    is_gdi_render_ = true;
    playWnd.Visible = false;

    // 不支持D3D就让播放器吐出数据来,用GDI绘制

    //video frame callback (YUV/RGB)
    //format请参见 NT_SP_E_VIDEO_FRAME_FORMAT,如需回调YUV,请设置为 NT_SP_E_VIDEO_FRAME_FROMAT_I420
    video_frame_call_back_ = new SP_SDKVideoFrameCallBack(SetVideoFrameCallBack);
    NTSmartPlayerSDK.NT_SP_SetVideoFrameCallBack(player_handle_, (Int32)NT.NTSmartPlayerDefine.NT_SP_E_VIDEO_FRAME_FORMAT.NT_SP_E_VIDEO_FRAME_FORMAT_RGB32, IntPtr.Zero, video_frame_call_back_);
}

2.4.8 设置播放URL

NT_SP_SetURL:支持rtsp/rtmp/本地FLV文件(全路径)。

2.4.9 设置是否输出声音到设备(配合PCM回调使用)

NT_SP_SetIsOutputAudioDevice:设置是否播放出声音,这个和静音接口是有区别的,这个接口的主要目的是为了用户设置了外部PCM回调接口后,又不想让SDK播放出声音时使用。

2.4.10 RTMP/RTSP播放参数设置

具体可参照Demo源码里面InitCommonSDKParam():

2.4.10.1 播放前可选设置接口

  1. NT_SP_SetBuffer:设置视频播放缓冲buffer大小,单位:毫秒;
  2. NT_SP_SetRTSPTcpMode:设置RTSP TCP 模式, 1为TCP, 0为UDP, 此接口仅RTSP有效;
  3. NT_SP_SetRtspTimeout:设置RTSP超时时间, timeout单位为秒,必须大于0;
  4. NT_SP_SetRtspAutoSwitchTcpUdp:对于RTSP来说,有些可能支持rtp over udp方式,有些可能支持使用rtp over tcp方式. 为了方便使用,有些场景下可以开启自动尝试切换开关, 打开后如果udp无法播放,sdk会自动尝试tcp, 如果tcp方式播放不了,sdk会自动尝试udp, is_auto_switch_tcp_udp: 如果设置1的话, sdk将在tcp和udp之间尝试切换播放,如果设置为0,则不尝试切换;
  5. NT_SP_SetFastStartup:设置秒开, 1为秒开, 0为不秒开,此接口用于如RTMP服务器缓存GOP时,酌情使用;
  6. NT_SP_SetLowLatencyMode:设置低延时播放模式,默认是正常播放模式,mode: 1为低延时模式, 0为正常模式,低延迟模式下,可能会导致音视频不同步,或视频帧不均匀;
  7. NT_SP_SetReportDownloadSpeed:设置下载速度上报, 默认不上报下载速度;

* is_report: 上报开关, 1: 表上报. 0: 表示不上报. 其他值无效.

* report_interval: 上报时间间隔(上报频率),单位是秒,最小值是1秒1次. 如果小于1且设置了上报,将调用失败

* 注意:如果设置上报的话,请设置SetEventCallBack, 然后在回调函数里面处理这个事件.

* 上报事件是:NT_SP_E_EVENT_ID_DOWNLOAD_SPEED

  8. NT_SP_GetDownloadSpeed:主动获取下载速度,speed: 返回下载速度,单位是Byte/s;
  9. NT_SP_SetParam:万能接口,用于设置各种扩展参数,大多数特殊需求可通过此接口配置;
  10. NT_SP_GetParam:万能接口,用于获取各种扩展参数;
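
以下是一个播放前可选参数设置的组合示意(仅为草图,各接口的参数个数与类型以smart_player_sdk.cs中的实际声明为准,这里按“句柄+整型值”的常见形式书写):

// 示意:播放前的可选参数设置(接口签名以实际声明为准)
NTSmartPlayerSDK.NT_SP_SetBuffer(player_handle_, 100);              // 缓冲100毫秒,追求更低延迟可设为0
NTSmartPlayerSDK.NT_SP_SetRTSPTcpMode(player_handle_, 1);           // RTSP走TCP模式(1:TCP, 0:UDP)
NTSmartPlayerSDK.NT_SP_SetRtspTimeout(player_handle_, 10);          // RTSP超时10秒
NTSmartPlayerSDK.NT_SP_SetRtspAutoSwitchTcpUdp(player_handle_, 1);  // TCP/UDP自动切换
NTSmartPlayerSDK.NT_SP_SetFastStartup(player_handle_, 1);           // 服务器缓存GOP时可开启秒开
NTSmartPlayerSDK.NT_SP_SetLowLatencyMode(player_handle_, 0);        // 常规场景建议正常模式(0)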

2.4.10.2 播放前后可实时调用的接口

  1. NT_SP_SetMute:播放过程中,实时静音、取消静音,可播放之前调用,亦或播放过程中实时调用;
  2. NT_SP_SetAudioVolume:不同于实时静音接口,此接口可以更细粒度的控制音量,默认范围[0,100],其中0是静音,100是最大音量, 默认是100;
  3. NT_SP_SetOnlyDecodeVideoKeyFrame:多窗口播放场景下,部分窗口可能只需要播放关键帧,如有类似场景需求,可用此接口;
  4. NT_SP_SetRotation:设置视频View旋转,顺时针旋转,degress: 设置0, 90, 180, 270度有效,其他值无效,注意:除了0度,其他角度播放会耗费更多CPU;
  5. NT_SP_SetFlipVertical:设置视频View上下反转(垂直反转);
  6. NT_SP_SetFlipHorizontal:设置视频View水平反转;
  7. NT_SP_SetRenderScaleMode:设置视频画面的填充模式,如填充整个绘制窗口、等比例填充绘制窗口,如不设置,默认填充整个绘制窗口;
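
以下给出播放前后均可实时调用的几个接口的调用示意(仅为草图,参数个数与取值以实际声明为准):

// 示意:播放过程中可实时调用的接口(接口签名以实际声明为准)
NTSmartPlayerSDK.NT_SP_SetMute(player_handle_, 1);             // 1:静音, 0:取消静音
NTSmartPlayerSDK.NT_SP_SetAudioVolume(player_handle_, 80);     // 音量80,范围[0,100]
NTSmartPlayerSDK.NT_SP_SetRotation(player_handle_, 90);        // 顺时针旋转90度
NTSmartPlayerSDK.NT_SP_SetFlipHorizontal(player_handle_, 1);   // 水平反转
NTSmartPlayerSDK.NT_SP_SetRenderScaleMode(player_handle_, 1);  // 等比例绘制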

2.4.11 开始播放

NT_SP_StartPlay

开始播放RTMP或RTSP流数据。
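
下面是一个设置URL并启动播放的最小示意(仅为草图,NT_SP_SetURL、NT_SP_StartPlay的C#声明以smart_player_sdk.cs中的实际定义为准):

// 示意:设置URL并启动播放(接口签名以实际声明为准)
NTSmartPlayerSDK.NT_SP_SetURL(player_handle_, "rtsp://192.168.0.211:8554/stream1");

UInt32 start_ret = NTSmartPlayerSDK.NT_SP_StartPlay(player_handle_);
if (start_ret != NT.NTBaseCodeDefine.NT_ERC_OK)
{
    MessageBox.Show("调用NT_SP_StartPlay失败..");
    return;
}

// 停止播放时调用NT_SP_StopPlay,销毁实例时再调用NT_SP_Close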

2.4.12 RTMP/RTSP拉流端录像

  1. NT_SP_SetRecorderDirectory:设置录像目录
  2. NT_SP_SetRecorderFileMaxSize:设置单个文件最大大小
  3. NT_SP_SetRecorderFileNameRuler:设置录像文件名生成规则
  4. NT_SP_SetRecorderCallBack:设置录像回调接口
  5. NT_SP_SetRecorderAudioTranscodeAAC:设置录像时音频转AAC编码的开关, aac比较通用,sdk增加其他音频编码(比如speex, pcmu, pcma等)转aac的功能
  6. NT_SP_SetRecorderVideo:设置是否录视频,默认的话,如果视频源有视频就录,没有就没得录, 但有些场景下可能不想录制视频,只想录音频,所以增加个开关
  7. NT_SP_SetRecorderAudio:设置是否录音频,默认的话,如果视频源有音频就录,没有就没得录, 但有些场景下可能不想录制音频,只想录视频,所以增加个开关
  8. NT_SP_StartRecorder:启动录像
  9. NT_SP_StopRecorder:停止录像
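
以下是一个拉流端录像调用顺序的示意(仅为草图,录像目录参数的具体类型(String还是UTF8编码的IntPtr)以smart_player_sdk.cs中的实际声明为准,如为IntPtr,可参照2.4.13截图示例的方式转换):

// 示意:拉流端录像基本流程(接口签名以实际声明为准)
NTSmartPlayerSDK.NT_SP_SetRecorderDirectory(player_handle_, "D:\\playerrecord");  // 录像目录(请确保存在)
NTSmartPlayerSDK.NT_SP_SetRecorderFileMaxSize(player_handle_, 200);               // 单个录像文件最大200M
NTSmartPlayerSDK.NT_SP_SetRecorderAudioTranscodeAAC(player_handle_, 1);           // 非AAC音频转AAC后录制

NTSmartPlayerSDK.NT_SP_StartRecorder(player_handle_);

// ...... 录像中 ......

NTSmartPlayerSDK.NT_SP_StopRecorder(player_handle_);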

2.4.13 实时快照

NT_SP_CaptureImage

用于播放端实时截取当前播放图片,图片以PNG形式保存至本地。

// 以当前时间生成快照文件名,快照以PNG形式保存
String name = capture_image_path_ + "\\" + DateTime.Now.ToString("hh-mm-ss") + ".png";

// 将文件名转为UTF8编码字节
byte[] buffer1 = Encoding.Default.GetBytes(name);
byte[] buffer2 = Encoding.Convert(Encoding.Default, Encoding.UTF8, buffer1, 0, buffer1.Length);

// 追加'\0'结束符
byte[] buffer3 = new byte[buffer2.Length + 1];
buffer3[buffer2.Length] = 0;
Array.Copy(buffer2, buffer3, buffer2.Length);

// 拷贝到非托管内存,以IntPtr形式传给SDK
IntPtr file_name_ptr = Marshal.AllocHGlobal(buffer3.Length);
Marshal.Copy(buffer3, 0, file_name_ptr, buffer3.Length);

// 设置快照结果回调,并发起截图请求
capture_image_call_back_ = new SP_SDKCaptureImageCallBack(SDKCaptureImageCallBack);

UInt32 ret = NTSmartPlayerSDK.NT_SP_CaptureImage(player_handle_, file_name_ptr, IntPtr.Zero, capture_image_call_back_);

// 请求已发出,可释放非托管内存
Marshal.FreeHGlobal(file_name_ptr);

if (NT.NTBaseCodeDefine.NT_ERC_OK == ret)
{
    // 发送截图请求成功
}
else if ((UInt32)NT.NTSmartPlayerDefine.SP_E_ERROR_CODE.NT_ERC_SP_TOO_MANY_CAPTURE_IMAGE_REQUESTS == ret)
{
    // 通知用户延时
    MessageBox.Show("Too many capture image requests!");
}
else
{
    // 其他失败
}

2.4.14 快速切换URL

NT_SP_SwitchURL

快速切换URL,用于在不析构整个player实例的前提下,实时切换播放的URL。
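
调用示意如下(仅为草图,NT_SP_SwitchURL的C#声明以实际头文件为准):

// 示意:实时切换播放URL(接口签名以实际声明为准)
NTSmartPlayerSDK.NT_SP_SwitchURL(player_handle_, "rtmp://192.168.0.211:1935/live/stream2");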

2.4.15 用户数据回调

NT_SP_SetUserDataCallBack

设置用户数据回调,用于接收扩展SEI模块发送的用户数据信息,如不是配合我们的扩展SEI发送SDK使用,此接口无需调用。

2.4.16 SEI数据回调

NT_SP_SetSEIDataCallBack

设置视频sei数据回调,用于接收SEI数据回调,如流数据不存在SEI或不准备处理SEI数据,此接口无需调用。

2.4.17 停止播放

NT_SP_StopPlay

停止播放RTMP或RTSP流数据。

2.4.18 关闭播放实例

NT_SP_Close

调用Close接口后,player handler置空。

if ( player_handle_ != IntPtr.Zero)
{
     NTSmartPlayerSDK.NT_SP_Close(player_handle_);
     player_handle_ = IntPtr.Zero;
}

2.4.19 Uninit

NT_SP_UnInit

UnInit() 是SDK最后一个调用的接口,多实例环境下,只需要调用一次即可。

Windows平台RTMP/RTSP直播推送模块设计和使用说明

开发背景

不少开发者反馈,在Windows平台做推屏或推摄像头、通过RTMP或RTSP推出去时,不清楚哪些功能是必须的、哪些设计可有可无,也不知道如何选择技术方案。以下是我们设计的Windows平台RTSP、RTMP直播推送模块的设计和使用说明,供大家参考。

整体方案架构

Windows平台RTMP或RTSP推送属于采集端模块,主要完成屏幕或摄像头数据、麦克风或扬声器数据的采集与编码,然后按照特定格式打包,通过RTMP或者RTSP传输出去,实现直播目的。

对应设计架构图的“发布端”,编码后的音视频数据,按照协议打包后,推送到流媒体服务器(如RTMP服务器,自建服务,可以考虑SRS或者nginx服务器,如果是RTSP服务器,可以考虑苹果官方的darwin streaming server)。

这种方案的设计,一般是一对多设计模型,接收端接收RTMP或RTSP流,然后解析音视频数据,解码、同步音视频数据,并绘制,实现整体的直播解决方案。

以下是设计架构图:

模块设计

  • 自有框架,易于扩展,自适应算法让延迟更低、采集编码传输效率更高;
  • 所有功能以接口形式提供,所有状态,均有event回调,支持断网自动重连;
  • 模块化设计,可和大牛直播RTSP或RTMP直播播放模块组合实现流媒体数据转发、连麦、一对一互动等场景;
  • 推送叠加以层级模式提供,开发者可以自行组合数据源(如多摄像头/屏幕/水印叠加);
  • 支持外部YUV/RGB/H.264/AAC/SPEEX/PCMA/PCMU数据源接入;
  • 所有参数均可通过SDK接口单独设置,亦可通过默认参数,傻瓜式设置;
  • 推送、录像、内置轻量级RTSP服务模块完全分离,可单独使用亦可组合使用。

功能设计

  • [本地预览]支持摄像头/屏幕/合成数据实时预览功能;
  • [摄像头反转/旋转]支持摄像头水平反转、垂直反转、0°/90°/180°/270°旋转;
  • [摄像头采集]除常规YUV格式外,还支持MJPEG格式的摄像头采集;
  • [RTMP推流]超低延时的RTMP协议直播推流SDK(Windows 64位库支持RTMP扩展H.265推送);
  • [视频格式]Windows支持H.264/H.265编码;
  • [音频格式]支持AAC编码和Speex编码;
  • [音频编码]支持Speex推送、Speex编码质量设置;
  • [软硬编码参数配置]支持gop间隔、帧率、bit-rate设置;
  • [软编码参数配置]支持软编码profile、软编码速度、可变码率设置;
  • [多实例推送]支持多实例推送(如同时推送屏幕/摄像头和外部数据);
  • [RTMP扩展H.265]Windows/Android推送SDK支持RTMP扩展H.265推送,Windows针对摄像头采集软编码,使用H.265可变码率,带宽大幅节省,效果直逼传统H.265编码摄像头;
  • [多分辨率支持]支持摄像头或屏幕多种分辨率设置;
  • [Windows推屏]支持屏幕裁剪、窗口采集、屏幕/摄像头数据合成等多种模式推送;
  • [事件回调]支持各种状态实时回调;
  • [水印]Windows平台支持文字水印、png水印、实时遮挡;
  • [复杂网络处理]支持断网重连等各种网络环境自动适配;
  • [动态码率]支持根据网络情况自动调整推流码率;
  • [实时静音]支持推送过程中,实时静音/取消静音;
  • [实时快照]支持推流过程中,实时快照;
  • [纯音频推流]支持仅采集音频流并发起推流功能;
  • [纯视频推流]支持特殊场景下的纯视频推流功能;
  • [降噪]支持环境音、手机干扰等引起的噪音降噪处理、自动增益、VAD检测;
  • [外部编码前视频数据对接]支持YUV数据对接;
  • [外部编码前音频数据对接]支持PCM对接;
  • [外部编码后视频数据对接]支持外部H.264数据对接;
  • [外部编码后音频数据对接]外部AAC/PCMA/PCMU/SPEEX数据对接;
  • [扩展录像功能]完美支持和录像SDK组合使用;
  • [服务器兼容]支持自建服务器(如Nginx、SRS)或CDN。

集成和使用说明

demo说明

  • Windows平台RTMP/RTSP直播推送模块对外提供C++/C#两套接口,对外提供32/64位库,C++和C#接口一一对应,C#接口比C++接口增加前缀NT_PB_。
  • WIN-PublisherSDK-CPP-Demo:推送端SDK对应的C++接口的demo;
  • WIN-PublisherSDK-CSharp-Demo:推送端SDK对应的C#接口的demo;
  • 推送端模块支持Win7及以上系统。
  • 本demo基于VS2013开发。

C++头文件:

  • [类型定义]nt_type_define.h
  • [Log定义]smart_log.h
  • [Log定义]smart_log_define.h
  • [音视频类型定义]nt_common_media_define.h
  • [base code定义]nt_base_code_define.h
  • [publisher参数定义]nt_smart_publisher_define.h
  • [publisher接口]nt_smart_publisher_sdk.h

C#头文件:

  • [Log定义]smart_log.cs
  • [Log定义]smart_log_define.cs
  • [base code定义]nt_base_code_define.cs
  • [publisher参数定义]nt_smart_publisher_define.cs
  • [publisher接口]nt_smart_publisher_sdk.cs

相关Lib:

  • SmartLog.dll
  • SmartLog.lib
  • SmartPublisherSDK.dll
  • SmartPublisherSDK.lib
  • avcodec-56.dll
  • avdevice-56.dll
  • avfilter-5.dll
  • avformat-56.dll
  • avutil-54.dll
  • postproc-53.dll
  • swresample-1.dll
  • swscale-3.dll

集成步骤

  1. 把lib目录下debug/release库拷贝到需要集成的工程对应的debug或release目录下(确保32位/64位库debug/release目录一一对应);

lib目录如下:

    1. 32位debug库:debug
    2. 32位release库:release
    3. 64位debug库:x64\debug
    4. 64位release库:x64\release

2. 相关cs头文件,加入需要集成的工程;

3. 在需要集成的工程,右键->Properties->Application->Assembly name,写入“SmartPublisherDemo”。

功能详解

考虑到Windows平台推送端SDK功能相对复杂,以下以问答形式说明常用功能:

1 视频采集设置

1. 屏幕和摄像头相互切换:用于在线教育或者无纸化等场景,推送或录像过程中,随时切换屏幕或摄像头数据(切换数据源),如需实时切换,点击页面“切换到摄像头”按钮即可;

2. 设置遮盖层,用于设定一个长方形或正方形区域(可自指定区域大小),遮盖不想给用户展示的部分;

3. 水印:添加PNG水印,支持推送或录像过程中,随时添加、取消水印;

4. 摄像头叠加到屏幕:意在用于同屏过程中,主讲人摄像头悬浮于屏幕之上(可指定叠加坐标),实现双画面展示,推送或录像过程中,可以随时取消摄像头叠加;

5. 屏幕叠加到摄像头:同4,效果展示,实际根据需求实现;

6. 采集桌面:可以通过点击“选择屏幕区域”获取采集区域,并可在采集过程中,随时切换区域位置,如不设定,默认全屏采集;

7. 使用DXGI采集屏幕,采集时停用Aero;

8. 采集窗口:可设定需要采集的窗口,窗口放大或缩小,推送端会自适应码率和分辨率;

9. 采集帧率(帧/秒):默认屏幕采集8帧,可根据实际场景需求设定到期望帧率;

10. 缩放屏幕大小缩放比:用于高清或超高清屏,通过设定一定的比例因子,缩放屏幕采集分辨率;

11. 采集摄像头:可选择需要采集的摄像头、采集分辨率、帧率、是否需要水平或者垂直反转、是否需要旋转;

追加提问:

问题[确认数据源]:采集桌面还是摄像头?如果桌面,全屏还是部分区域?

回答:

如果是摄像头:可以选择摄像头列表,然后分辨率、帧率。

如果是屏幕:默认帧率是5帧,可以根据实际场景调整;选取屏幕区域后,可以实时拖动调整需要采集或录像的区域;

如果是叠加模式:可选择摄像头叠加到屏幕,还是屏幕叠加到摄像头;

更高需求的用户,可以设置水印或应用层遮盖。

问题:如果是摄像头,采集到的摄像头角度不对怎么办?

回答:我们支持摄像头镜像和翻转设置,摄像头可通过SDK接口轻松实现水平/垂直翻转、镜像效果。

2 视频码率控制

选可变码率还是平均码率?

回答:可变码率的优势在于,如果屏幕或摄像头画面变化不大,码率可以非常低,特别是H.265编码;平均码率则码率比较均匀,需要设置平均码率+最大码率。一般摄像头采集建议选择可变码率,屏幕采集建议选择平均码率。如需采用可变码率,请取消“使用平均码率”选项。

H.265编码还是H.264编码?

回答:Windows 64位库支持H.265编码,如果推RTMP流,需要服务器支持RTMP H.265扩展,播放器SDK,也需要同步支持RTMP H.265扩展播放。

如果是轻量级RTSP服务SDK对接的话,只需要播放器支持RTSP H.265即可。

如果推摄像头数据,建议采用可变码率+H.265编码。

如何设置码率参数更合理?

回答:

关键帧间隔:一般来说,设置到帧率的2-4倍,比如帧率20,关键帧间隔可以设置到40-80;

平均码率:可以点击“获取视频码率默认值”,最大码率是平均码率的2倍;

视频质量:如果使用可变码率,建议采用大牛直播SDK默认推荐视频质量值;

编码速度:如高分辨率,建议1-3,值越小,编码速度越快;

H.264 Profile:默认baseline profile,可根据需要,酌情设置High profile;

NOTE:点击“推送”或“录像”或启动内置RTSP服务SDK之前,请务必设置视频码率,如不想手动设置,请点击“获取视频码率默认值”!!!

3 音频采集设置

问题:采集音频吗?如果采集,采集麦克风还是扬声器的,亦或混音?

回答:

如果想采集电脑输出的音频(比如音乐之类),可以选择“采集扬声器”;

如果想采集麦克风音频,可以选择“采集麦克风”,并选择相关设备;

如果两个都想采集,可以两个都选择,混音输出。

4 音频编码

问题:是AAC还是SPEEX?

回答:我们默认是AAC编码模式,如果需要码率更低,可以选择SPEEX编码模式,当然我们的AAC编码码率也不高。

5 音频处理

问题:我想过滤背景噪音怎么办?

回答:选中“噪音抑制”,“噪音抑制”请和“自动增益控制”组合使用,“端点检测(VAD)”可选设置。

问题:我想做一对一互动怎么办?

回答:选中“回音消除”,可以和“噪音抑制”、“自动增益控制”组合使用。

问题:我推送或者录像过程中,随时静音怎么办?

回答:推送过程中,随时选择或取消选择“静音”功能。

6 多路推送

问题:我想同时推送到多个url怎么办(比如一个内网服务器,一个外网服务器)?

回答:同时填写多个url,然后点推送即可。

7 截图(快照)

问题:我想推送或者录像过程中,截取当前图像怎么办?

回答:那就设置好截图路径,推送或录像过程中,随时点击“截图”。

8 录像

问题:我还想录像,怎么办?

回答:设置录像文件存放目录,文件前缀、单个文件大小,是否加日期、时间,随时录制即可,此外,我们的SDK还支持录像过程中,暂停录像,恢复录像。

9 实时预览

问题:我还想看看视频特别是合成后的效果,怎么办?

回答:点击页面的“预览”按钮,就可以看到。

接口调用时序(以C#为例)

如需下载demo源码工程,可以到 Github 下载 “Windows平台RTMP|RTSP推送SDK、内置RTSP服务SDK、录像SDK”,C++或者C#的都有。

1 初始化

NT_PB_Init

如需配置log路径,请在NT_PB_Init之前,做如下设置(目录可自行指定):

// 设置日志路径(请确保目录存在)
String log_path = "D:\\publisherlog";
NTSmartLog.NT_SL_SetPath(log_path);

2 Open

NT_PB_Open

3 设置回调事件

  • NT_PB_SetEventCallBack:设置事件回调,如果想监听事件的话,建议调用Open成功后,就调用这个接口
  • NT_PB_SetVideoPacketTimestampCallBack:设置视频包时间戳回调
  • NT_PB_SetPublisherStatusCallBack:设置推送状态回调

4 设置屏幕裁剪

  • NT_PB_SetScreenClip:设置屏幕裁剪
  • NT_PB_MoveScreenClipRegion:移动屏幕剪切区域,这个接口只能推送或者录像中调用

5 屏幕选取工具

  • NT_PB_OpenScreenRegionChooseTool:打开一个屏幕选取工具的toolHandle
  • NT_PB_MoveScreenClipRegion:移动屏幕剪切区域,这个接口只能推送或者录像中调用
  • NT_PB_AllocateImage:分配Image, 分配后,SDK内部会初始化这个结构体, 失败的话返回NULL
  • NT_PB_FreeImage:释放Image, 注意一定要调用这个接口释放内存,不要在调用方模块中直接释放,否则在Windows下可能出现跨模块释放堆内存的问题
  • NT_PB_CloneImage:克隆一个Image, 失败返回NULL
  • NT_PB_CopyImage:拷贝Image, 会先释放dst的资源,然后再拷贝
  • NT_PB_SetImagePlane: 给图像一个面设置数据,如果这个面已经有数据,将会释放掉再设置
  • NT_PB_LoadImage:加载PNG图片

6 设置屏幕采集参数

  • NT_PB_EnableDXGIScreenCapturer:允许使用DXGI屏幕采集方式, 这种方式需要win8及以上系统才支持
  • NT_PB_DisableAeroScreenCapturer:采集屏幕时停用Aero, 这个只对win7有影响,win8及以上系统, 微软已经抛弃了Aero Glass效果
  • NT_PB_CheckCapturerWindow:判断顶层窗口能否被捕获, 如果不能被捕获的话返回NT_ERC_FAILED(采集窗口)
  • NT_PB_SetCaptureWindow:设置要捕获的窗口的句柄(采集窗口)

7 设置摄像头采集参数

  • NT_PB_StartGetVideoCaptureDeviceImage:获取句柄,且保存句柄
  • NT_PB_FlipVerticalVideoCaptureDeviceImage:上下反转设备图像
  • NT_PB_FlipHorizontalVideoCaptureDeviceImage:水平反转设备图像
  • NT_PB_RotateVideoCaptureDeviceImage:旋转设备图像, 顺时针旋转
  • NT_PB_GetVideoCaptureDeviceNumber:获取摄像头数量
  • NT_PB_GetVideoCaptureDeviceInfo:返回摄像头设备信息
  • NT_PB_GetVideoCaptureDeviceCapabilityNumber:返回摄像头能力数
  • NT_PB_GetVideoCaptureDeviceCapability:返回摄像头能力
  • NT_PB_DisableVideoCaptureResolutionSetting:

在多个实例推送多路时,对于一个摄像头来说,所有实例只能共享摄像头,那么只有一个实例可以改变摄像头分辨率,其他实例使用这个缩放后的图像;

在使用多实例时,调用这个接口禁止掉实例的分辨率设置能力,只留一个实例能改变分辨率,如果不设置,行为未定义;

这个接口必须在 SetLayersConfig, AddLayerConfig 之前调用。

  • NT_PB_StartVideoCaptureDevicePreview: 启动摄像头预览
  • NT_PB_FlipVerticalCameraPreview:上下反转摄像头预览图像
  • NT_PB_FlipHorizontalCameraPreview:水平反转摄像头预览图像
  • NT_PB_RotateCameraPreview:旋转摄像头预览图像, 顺时针旋转
  • NT_PB_VideoCaptureDevicePreviewWindowSizeChanged:告诉SDK预览窗口大小改变
  • NT_PB_StopVideoCaptureDevicePreview:停止摄像头预览
  • NT_PB_GetVideoCaptureDeviceImage:调用这个接口可以获取摄像头图像
  • NT_PB_StopGetVideoCaptureDeviceImage:停止获取摄像头图像
  • NT_PB_SetVideoCaptureDeviceBaseParameter:设置摄像头信息
  • NT_PB_FlipVerticalCamera:上下反转摄像头图像
  • NT_PB_FlipHorizontalCamera:水平反转摄像头图像
  • NT_PB_RotateCamera:旋转摄像头图像, 顺时针旋转

8 视频合成图层类型

public enum NT_PB_E_LAYER_TYPE : int
{
    NT_PB_E_LAYER_TYPE_SCREEN = 1,                  // 屏幕层
    NT_PB_E_LAYER_TYPE_CAMERA = 2,                  // 摄像头层
    NT_PB_E_LAYER_TYPE_RGBA_RECTANGLE = 3,          // RGBA矩形
    NT_PB_E_LAYER_TYPE_IMAGE = 4,                   // 图片层
    NT_PB_E_LAYER_TYPE_EXTERNAL_VIDEO_FRAME = 5,    // 外部视频数据层
    NT_PB_E_LAYER_TYPE_WINDOW = 6,                  // 窗口层
}

9 音视频源类型

/*定义Video源选项*/
public enum NT_PB_E_VIDEO_OPTION : uint
{
    NT_PB_E_VIDEO_OPTION_NO_VIDEO = 0x0,
    NT_PB_E_VIDEO_OPTION_SCREEN = 0x1,        // 采集屏幕
    NT_PB_E_VIDEO_OPTION_CAMERA = 0x2,        // 摄像头采集
    NT_PB_E_VIDEO_OPTION_LAYER = 0x3,         // 视频合并,比如桌面叠加摄像头等
    NT_PB_E_VIDEO_OPTION_ENCODED_DATA = 0x4,  // 已经编码的视频数据,目前支持H264
    NT_PB_E_VIDEO_OPTION_WINDOW = 0x5,        // 采集窗口
}

/*定义Audio源选项*/
public enum NT_PB_E_AUDIO_OPTION : uint
{
    NT_PB_E_AUDIO_OPTION_NO_AUDIO = 0x0,
    NT_PB_E_AUDIO_OPTION_CAPTURE_MIC = 0x1,                // 采集麦克风音频
    NT_PB_E_AUDIO_OPTION_CAPTURE_SPEAKER = 0x2,            // 采集扬声器
    NT_PB_E_AUDIO_OPTION_CAPTURE_MIC_SPEAKER_MIXER = 0x3,  // 麦克风扬声器混音
    NT_PB_E_AUDIO_OPTION_ENCODED_DATA = 0x4,               // 编码后的音频数据,目前支持AAC, speex宽带(wideband mode)
}
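
上述两个枚举一般用于打开推送实例时指定音视频源。下面是一个按“摄像头采集+麦克风采集”打开推送实例的示意(仅为草图,NT_PB_Open的参数顺序与保留参数以nt_smart_publisher_sdk.cs中的实际声明为准,NTSmartPublisherSDK类名以demo实际代码为准):

// 示意:按摄像头 + 麦克风方式打开推送实例(接口签名以实际声明为准)
uint video_option = (uint)NT_PB_E_VIDEO_OPTION.NT_PB_E_VIDEO_OPTION_CAMERA;
uint audio_option = (uint)NT_PB_E_AUDIO_OPTION.NT_PB_E_AUDIO_OPTION_CAPTURE_MIC;

IntPtr publisher_handle_ = IntPtr.Zero;
UInt32 open_ret = NTSmartPublisherSDK.NT_PB_Open(out publisher_handle_, video_option, audio_option, IntPtr.Zero);
if (open_ret != NT.NTBaseCodeDefine.NT_ERC_OK)
{
    publisher_handle_ = IntPtr.Zero;
    // Open失败,后续接口不可调用
}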

10 视频编码接口

  • NT_PB_SetVideoEncoderType:设置编码类型, 当前支持h264和h265(注意:h265只有64位sdk库支持, 在32位库上设置会失败);
  • NT_PB_SetVideoQuality:设置视频质量, 范围[0-20], 默认是10, 值越小质量越好,但码率会越大
  • NT_PB_SetVideoQualityV2:设置视频质量, 范围[1-50], 值越小视频质量越好,但码率会越大. 请优先考虑默认值;
  • NT_PB_SetFrameRate:设置帧率
  • NT_PB_SetVideoMaxBitRate:设置最大视频码率, 单位kbps
  • NT_PB_AddVideoEncoderBitrateGroupItem:

* 在一些特殊场景下, 视频分辨率会改变, 如果设置一个固定码率的的话,当视频分辨率变大的时候会变的模糊,变小的话又会浪费码率

* 所以提供可以设置一组码率的接口,满足不同分辨率切换的需求

* 规则: 比如设置两组分辨率 640*360, 640*480, 那么当分辨率小于等于640*360时都使用640*360的码率,

* 当分辨率大于640*360且小于等于640*480时,就使用640*480的码率,如果分辨率大于640*480,那就使用640*480的码率

* 为了设置的更准确, 建议多划分几组, 让区间变小

* 调用这个接口每次设置一组,设置多组就调用多次

* item对应 NT_PB_VideoEncoderBitrateGroupItem

  • NT_PB_ClearVideoEncoderBitrateGroup:清除视频码率组
  • NT_PB_SetVideoKeyFrameInterval:设置关键帧间隔, 比如1表示所有帧都是关键帧,10表示每10帧里面一个关键帧,25表示每25帧一个关键帧
  • NT_PB_SetVideoEncoderProfile:设置H264 profile,1: H264 baseline(默认值). 2: H264 main. 3. H264 high
  • NT_PB_SetVideoEncoderSpeed:设置H264编码速度,speed: 范围是 1 到 6,  值越小,速度越快,质量也越差
  • NT_PB_SetVideoCompareSameImage:设置是否对前后帧图像做相同比较,采集桌面时开启一般有一定好处,可能降低码率
  • NT_PB_SetVideoMaxKeyFrameInterval:设置视频最大关键帧间隔, 这个接口一般不使用,这里是用来配合SetVideoCompareSameImage接口的,比如开启图像比较后,SDK发现连续20s图像都是相同的,但播放端需要收到关键帧才能解码播放,所以需要一个限制
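
结合上面的接口,下面是一组常见的视频编码参数设置示意(仅为草图,数值仅作参考,接口签名与NTSmartPublisherSDK类名以nt_smart_publisher_sdk.cs及demo实际代码为准,publisher_handle_为NT_PB_Open返回的推送句柄):

// 示意:常用视频编码参数设置(接口签名与取值以实际声明为准)
NTSmartPublisherSDK.NT_PB_SetVideoEncoderType(publisher_handle_, 1);        // 1:H264(H265仅64位库支持)
NTSmartPublisherSDK.NT_PB_SetFrameRate(publisher_handle_, 25);              // 帧率25
NTSmartPublisherSDK.NT_PB_SetVideoKeyFrameInterval(publisher_handle_, 50);  // 关键帧间隔取帧率的2倍
NTSmartPublisherSDK.NT_PB_SetVideoMaxBitRate(publisher_handle_, 2000);      // 最大码率2000kbps
NTSmartPublisherSDK.NT_PB_SetVideoEncoderProfile(publisher_handle_, 1);     // 1:H264 baseline
NTSmartPublisherSDK.NT_PB_SetVideoEncoderSpeed(publisher_handle_, 3);       // 编码速度,值越小速度越快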

11 音频编码接口

  • NT_PB_GetAuidoInputDeviceNumber:获取系统音频输入设备数
  • NT_PB_GetAuidoInputDeviceName:获取音频输入设备名称
  • NT_PB_SetPublisherAudioCodecType:设置推送音频编码类型,type: 1:使用AAC编码, 2:使用speex编码, 其他值返回错误
  • NT_PB_SetPublisherSpeexEncoderQuality:设置推送Speex编码质量
  • NT_PB_SetAuidoInputDeviceId:设置音频输入设备ID
  • NT_PB_IsCanCaptureSpeaker:检查是否能捕获扬声器音频

12 音频处理接口

  • NT_PB_SetEchoCancellation:设置回音消除
  • NT_PB_SetNoiseSuppression:设置音频噪音抑制
  • NT_PB_SetAGC:设置音频自动增益控制
  • NT_PB_SetVAD:设置端点检测(Voice Activity Detection (VAD))
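
以下是音频前处理开关的组合调用示意(仅为草图,接口参数以实际声明为准,回音消除接口如有额外延迟参数请按声明填写):

// 示意:噪音抑制 + 自动增益 + 端点检测组合使用(接口签名以实际声明为准)
NTSmartPublisherSDK.NT_PB_SetNoiseSuppression(publisher_handle_, 1);  // 1:开启噪音抑制
NTSmartPublisherSDK.NT_PB_SetAGC(publisher_handle_, 1);               // 1:开启自动增益控制
NTSmartPublisherSDK.NT_PB_SetVAD(publisher_handle_, 1);               // 1:开启端点检测(可选)
// 一对一互动场景可再开启回音消除:NT_PB_SetEchoCancellation,参数以实际声明为准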

13 图层合成等接口

  • NT_PB_SetLayersConfig:设置视频合成层, 传入的是一个数组, 请正确填充每一层
  • NT_PB_ClearLayersConfig:清除所有层配置,注意这个接口只能在推送或者录像之前调用,否则结果未定义
  • NT_PB_AddLayerConfig: 增加层配置,注意这个接口只能在推送或者录像之前调用,否则结果未定义
  • NT_PB_EnableLayer:动态禁止或者启用层
  • NT_PB_UpdateLayerConfigV2:更新层相关配置, 注意不是层的所有字段都可以更新,只是部分可以更新,并且有些层没有字段可以更新,传入的参数,SDK只选择能更新的字段更新,不能更新的字段会被忽略
  • NT_PB_UpdateLayerRegion:修改图层
  • NT_PB_PostLayerImage:给index层投递Image数据,目前主要是用来把rgb和yuv视频数据传给相关层
  • NT_PB_SetParam:万能接口,用于设置各种扩展参数,大多数特殊需求可通过此接口配置
  • NT_PB_GetParam:万能接口,用于获取各种扩展参数

15 RTMP推送-设置推送RTMP Url

NT_PB_SetURL:rtmp推送url设置

16 RTMP推送-启动推送RTMP流

NT_PB_StartPublisher

17 RTMP推送-停止推送RTMP流

NT_PB_StopPublisher:注意,此接口和NT_PB_StartPublisher配套使用
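
RTMP推送的基本调用顺序示意如下(仅为草图,NT_PB_SetURL、NT_PB_StartPublisher的保留参数与签名以实际声明为准):

// 示意:设置推送URL并启动/停止RTMP推送(接口签名以实际声明为准)
NTSmartPublisherSDK.NT_PB_SetURL(publisher_handle_, "rtmp://192.168.0.104:1935/hls/stream1", IntPtr.Zero);

UInt32 start_ret = NTSmartPublisherSDK.NT_PB_StartPublisher(publisher_handle_, IntPtr.Zero);
if (start_ret != NT.NTBaseCodeDefine.NT_ERC_OK)
{
    // 推送启动失败
}

// ...... 推送中 ......

NTSmartPublisherSDK.NT_PB_StopPublisher(publisher_handle_);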

18 RTSP推送-设置传输方式(TCP/UDP)

NT_PB_SetPushRtspTransportProtocol:设置推送rtsp传输方式,一般服务器可同时支持RTSP TCP或UDP传输模式,部分服务器只支持TCP或UDP模式。其中,transport_protocol: 1表示UDP传输rtp包; 2表示TCP传输rtp包. 默认是1, UDP传输。

19 RTSP推送-设置推送RTSP Url

NT_PB_SetPushRtspURL:注意,RTSP推送时,确保服务器推送URL可用。

20 RTSP推送-启动推送RTSP流

NT_PB_StartPushRtsp

21 RTSP推送-停止推送RTSP流

NT_PB_StopPushRtsp:注意,此接口和NT_PB_StartPushRtsp配套使用。

22 RTMP/RTSP推送端录像

  • NT_PB_SetRecorderDirectory:设置本地录像目录, 必须是英文目录,否则会失败
  • NT_PB_SetRecorderFileMaxSize:设置单个录像文件最大大小, 当超过这个值的时候,将切割成第二个文件
  • NT_PB_SetRecorderFileNameRuler:设置录像文件名生成规则
  • NT_PB_StartRecorder:启动录像
  • NT_PB_PauseRecorder:暂停录像,is_pause: 1表示暂停, 0表示恢复录像, 输入其他值将调用失败
  • NT_PB_StopRecorder:停止录像
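
推送端录像的基本调用顺序示意如下(仅为草图,录像目录须为英文目录,接口签名以实际声明为准):

// 示意:推送端录像基本流程(接口签名以实际声明为准)
NTSmartPublisherSDK.NT_PB_SetRecorderDirectory(publisher_handle_, "D:\\publisherrecord");  // 录像目录(英文路径)
NTSmartPublisherSDK.NT_PB_SetRecorderFileMaxSize(publisher_handle_, 200);                  // 单个文件最大200M

NTSmartPublisherSDK.NT_PB_StartRecorder(publisher_handle_);

NTSmartPublisherSDK.NT_PB_PauseRecorder(publisher_handle_, 1);  // 1:暂停录像
NTSmartPublisherSDK.NT_PB_PauseRecorder(publisher_handle_, 0);  // 0:恢复录像

NTSmartPublisherSDK.NT_PB_StopRecorder(publisher_handle_);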

23 实时静音(实时调用)

NT_PB_SetMute:设置推送实时静音

24 快照(实时调用)

NT_PB_CaptureImage:推送或者录像过程中,实时快照

25 Close

NT_PB_Close:调用这个接口之后handle失效

26 Uninit

NT_PB_UnInit:这个是最后一个调用的接口

以上是我们的设计模块部分资料,感兴趣的开发者,可以酌情参考。