How to use a custom video capturer and add to local video track with WebRTC for Android (org.webrtc)
I have this custom video capturer class:
import android.content.Context;
import android.graphics.Bitmap;
import android.os.SystemClock;

import org.webrtc.CapturerObserver;
import org.webrtc.JavaI420Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

import java.util.concurrent.TimeUnit;

public class CustomCapturer implements VideoCapturer {

    private SurfaceTextureHelper surTexture;
    private Context appContext;
    private CapturerObserver capturerObs;

    @Override
    public void initialize(SurfaceTextureHelper surfaceTextureHelper, Context applicationContext,
                           CapturerObserver capturerObserver) {
        surTexture = surfaceTextureHelper;
        appContext = applicationContext;
        capturerObs = capturerObserver;
    }

    // Called from outside (e.g. a USB camera callback) to push a frame into WebRTC.
    public void addFrame(Bitmap frame_asbitmap) {
        try {
            // Timestamp the frame with the capture time in nanoseconds; this is
            // what VideoFrame expects, not the conversion duration.
            final long captureTimeNs =
                    TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
            JavaI420Buffer buffer = JavaI420Buffer.allocate(640, 480);
            bitmapToI420(frame_asbitmap, buffer);
            VideoFrame videoFrame = new VideoFrame(buffer, 0 /* rotation */, captureTimeNs);
            capturerObs.onFrameCaptured(videoFrame);
            // Release our reference; the observer retains the frame if it needs it.
            videoFrame.release();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void startCapture(int width, int height, int fps) {
        // Frames are pushed externally via addFrame(), so no capture loop is
        // needed here; just report to the observer that capturing has started.
        capturerObs.onCapturerStarted(true);
    }

    @Override
    public void stopCapture() {
        capturerObs.onCapturerStopped();
    }

    @Override
    public void changeCaptureFormat(int width, int height, int fps) {
    }

    @Override
    public void dispose() {
    }

    @Override
    public boolean isScreencast() {
        return false;
    }
}
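The `bitmapToI420(...)` helper above is where the RGB-to-I420 conversion happens. For reference, the per-pixel math I am using follows the common BT.601 integer-approximation formulas, sketched here in plain Java (the class name `ArgbToI420` and the plain `int[]` input are just for illustration; the real code reads the pixels from the `Bitmap`):

```java
import java.util.Arrays;

public class ArgbToI420 {
    // Convert ARGB pixels (one int per pixel) to planar I420 {Y, U, V} byte arrays.
    // Uses BT.601 video-range coefficients; width and height are assumed even.
    public static byte[][] convert(int[] argb, int width, int height) {
        byte[] y = new byte[width * height];
        byte[] u = new byte[width * height / 4];
        byte[] v = new byte[width * height / 4];
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int p = argb[row * width + col];
                int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
                // Luma is computed for every pixel.
                y[row * width + col] =
                        (byte) ((66 * r + 129 * g + 25 * b + 128 >> 8) + 16);
                // Chroma is subsampled 2x2: sample the top-left pixel of each block.
                if (row % 2 == 0 && col % 2 == 0) {
                    int ci = (row / 2) * (width / 2) + col / 2;
                    u[ci] = (byte) ((-38 * r - 74 * g + 112 * b + 128 >> 8) + 128);
                    v[ci] = (byte) ((112 * r - 94 * g - 18 * b + 128 >> 8) + 128);
                }
            }
        }
        return new byte[][] {y, u, v};
    }

    public static void main(String[] args) {
        // A 2x2 all-white frame: Y should be 235, U and V should be 128.
        int[] white = new int[4];
        Arrays.fill(white, 0xFFFFFFFF);
        byte[][] planes = convert(white, 2, 2);
        System.out.println((planes[0][0] & 0xFF) + " "
                + (planes[1][0] & 0xFF) + " " + (planes[2][0] & 0xFF)); // prints "235 128 128"
    }
}
```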
How can I use this as the source for the local video track? I have looked for documentation on this but could not find any. My plan is to attach this class to the local video track and then push frames to it from a USB camera by calling the addFrame(Bitmap frame_asbitmap) method from MainActivity.
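For context, this is roughly the wiring I assume is needed, pieced together from the org.webrtc API (untested sketch from inside an Activity; the track id "video0" and the 640x480@30 capture parameters are arbitrary choices of mine):

```java
// One-time global initialization of the WebRTC library.
PeerConnectionFactory.initialize(
        PeerConnectionFactory.InitializationOptions.builder(getApplicationContext())
                .createInitializationOptions());
PeerConnectionFactory factory =
        PeerConnectionFactory.builder().createPeerConnectionFactory();

// A SurfaceTextureHelper is required by VideoCapturer.initialize().
EglBase eglBase = EglBase.create();
SurfaceTextureHelper helper =
        SurfaceTextureHelper.create("CaptureThread", eglBase.getEglBaseContext());

// Wire the custom capturer to a VideoSource via its CapturerObserver.
CustomCapturer capturer = new CustomCapturer();
VideoSource videoSource = factory.createVideoSource(capturer.isScreencast());
capturer.initialize(helper, getApplicationContext(), videoSource.getCapturerObserver());
capturer.startCapture(640, 480, 30);

// The source then backs the local video track.
VideoTrack localVideoTrack = factory.createVideoTrack("video0", videoSource);

// Later, from the USB camera callback:
// capturer.addFrame(bitmapFromUsbCamera);
```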