Azure Media Services live stream: how to implement a stream from an Android app
Question
I am trying to find a code sample for live streaming in Java.
I want to implement it in an Android app using Azure Media Services.
Currently I have a MediaRecorder instance configured like this:
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setCamera(mServiceCamera);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
// "public" is not valid on a local variable, so filePath is a plain local here
String filePath = Environment.getExternalStorageDirectory().getPath() + "/video.mp4";
mMediaRecorder.setOutputFile(filePath);
I searched for a way to integrate Azure Media Services into my app, so that I can broadcast a live stream to the cloud and deliver a link to the stream to another app via push notification.
I came across this link, but the dependency listed there is not valid in an Android Gradle file. How should we work with that?
Also, do I need to use RTMP? Or, because the file content is saved directly into an asset file in Azure Storage, does that already do the job?
As far as I understand it, the locator creates the endpoint. Is that correct?
Can I get this endpoint URL inside the app so I can send it, or does it exist only in the Azure portal?
Does the broadcaster need to be a user in AAD?
Is it possible to give the user of the app the ability to create credentials?
Answer 1
Score: 0
I'm no Java expert, but from what I've read about MediaRecorder, it does not have the ability to stream its output over the RTMP protocol. The phone needs to act as a client and push RTMP to the AMS server endpoint. From what I see in the API docs, I believe the MediaRecorder API can only output MPEG-2 TS.
I've seen projects that re-wrap the output from MediaRecorder into an RTMP stream. See https://github.com/octiplex/Android-RTMP-Muxer as an example (no endorsement by me).
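For context, projects like that one typically intercept the recorder's output by pointing MediaRecorder at a socket file descriptor instead of a file, then re-mux the bytes on a background thread. A minimal sketch of that plumbing, assuming the recorder has already been configured as in the question, with "muxerOut" standing in for whatever hypothetical RTMP muxer you plug in:

import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Pipe MediaRecorder output through a local socket pair so a worker
// thread can read the encoded bytes and hand them to an RTMP muxer.
void startSocketRecording(MediaRecorder recorder, OutputStream muxerOut) throws IOException {
    LocalServerSocket server = new LocalServerSocket("muxer-socket");
    LocalSocket sender = new LocalSocket();
    sender.connect(new LocalSocketAddress("muxer-socket"));
    LocalSocket receiver = server.accept();

    recorder.setOutputFile(sender.getFileDescriptor()); // write to the socket, not a file
    recorder.prepare();
    recorder.start();

    new Thread(() -> {
        try (InputStream in = receiver.getInputStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                muxerOut.write(buffer, 0, read); // feed the (hypothetical) RTMP muxer
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }).start();
}

One caveat: MP4/3GPP containers are not socket-friendly, because MediaRecorder seeks back to finalize the file; MediaRecorder.OutputFormat.MPEG_2_TS (API 26+) is the usual choice when writing to a stream like this.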
For your other questions:
- Never include your Azure credentials in a mobile application! You need to use a middle tier to do the work for your app. That means you would build an API for the app front end to talk to, and that API would call the backend services it needs. In this case a simple Azure Functions app would be sufficient, or you can build and host your API in any framework you like. The credentials for AMS will be an AAD service principal client ID and secret that need to be secured in the middle tier for your API to use. The API would then return information such as the RTMP ingest URL (see the first sketch after this list).
- When you create a live event in the v3 API for AMS, you get back an object that contains the "ingest URL" endpoint. That is an rtmp:// URL with the server's DNS name and port 1935. This is the URL you connect to. Of course, make sure you start the live event before connecting (see the second sketch after this list).
- I recommend testing with OBS Studio first, until you understand the flow of connecting to an RTMP server. Go through the OBS Studio tutorial in our documentation; while working through it, consider that OBS Studio is acting the way your Android app will.
- In your MediaRecorder code I see you set the audio to AMR. You need to use the AAC-LC audio format. That is the only audio format we support for ingest, and it is the most widely used format for HLS and DASH streaming. It will provide the broadest compatibility on delivery to devices and browsers (see the last snippet below).
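To make the first point concrete, here is a minimal sketch of how the Android app could ask a middle tier for the ingest URL instead of holding any Azure credentials itself. Everything app-facing here is an assumption for illustration: the Functions host name, the /api/ingest-url path, the x-functions-key header, and the plain-string response are all placeholders for whatever API you actually build.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public final class IngestUrlClient {
    // Fetch the RTMP ingest URL from a hypothetical middle-tier endpoint.
    // The AMS service principal secret stays on the server side; the app
    // only ever sees the URL it should push to.
    public static String fetchIngestUrl() throws Exception {
        URL url = new URL("https://my-functions-app.azurewebsites.net/api/ingest-url");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try {
            conn.setRequestMethod("GET");
            // Placeholder auth; secure your API however you see fit.
            conn.setRequestProperty("x-functions-key", "<your-function-key>");
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()));
            return reader.readLine(); // e.g. "rtmp://<server-dns-name>:1935/..."
        } finally {
            conn.disconnect();
        }
    }
}

Remember to call this off the main thread; Android throws NetworkOnMainThreadException for network I/O on the UI thread.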
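On the middle-tier side, the live event object that the AMS v3 REST API returns carries its ingest endpoints under properties.input.endpoints, each with a protocol and a url. Here is a sketch of pulling the RTMP URL out of that response with org.json (bundled with Android); the response shape is my reading of the v3 LiveEvent schema, so verify it against the actual payload:

import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

// "liveEventJson" is the body returned by a GET or PUT on
// .../providers/Microsoft.Media/mediaservices/{account}/liveEvents/{name}
static String extractRtmpIngestUrl(String liveEventJson) throws JSONException {
    JSONObject root = new JSONObject(liveEventJson);
    JSONArray endpoints = root.getJSONObject("properties")
                              .getJSONObject("input")
                              .getJSONArray("endpoints");
    for (int i = 0; i < endpoints.length(); i++) {
        JSONObject endpoint = endpoints.getJSONObject(i);
        if ("RTMP".equalsIgnoreCase(endpoint.getString("protocol"))) {
            return endpoint.getString("url"); // rtmp://<server-dns-name>:1935/...
        }
    }
    throw new JSONException("No RTMP ingest endpoint found; is the live event started?");
}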
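Finally, switching the recorder from AMR to AAC-LC is a one-line change to the configuration shown in the question; moving the container to MPEG_4 alongside H.264 is a common pairing, though that format choice is mine, not from the answer:

// Order matters: the output format must be set before the encoders.
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
// MediaRecorder.AudioEncoder.AAC is the low-complexity (AAC-LC) profile.
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);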