Android: have a Server thread H.264-encode screen-capture data and send it to a local RtpSocket, have a Client thread read the data from the RtpSocket and decode the H.264, then compute the latency from the Server sending the data to the Client receiving it
Implementing this takes four steps:
1. Screen capture: use Android's MediaProjection API to mirror the screen onto a Surface, then use MediaCodec to encode the frames rendered to that Surface into an H.264 video stream.
2. Send the encoded stream: wrap the H.264 data in RtpPacket objects and send them through an RtpSocket to the target address and port (RtpPacket and RtpSocket are not Android SDK classes; a minimal sketch of them follows this list).
3. Receive and decode: on the client, read RtpPacket objects from the RtpSocket, unpack the H.264 payload, and feed it to a MediaCodec decoder to obtain the decoded frames.
4. Measure the latency: on the client, record when each packet arrives and when its decoded frame becomes available, and compute the difference.
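The RtpPacket and RtpSocket classes used throughout this answer are not part of the Android SDK, so here is a minimal sketch of what they could look like on top of a plain DatagramSocket. The 12-byte header layout follows RFC 3550, but the class shapes, method names, and the shared sequence counter are assumptions made only so the later code is self-contained:
```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Hypothetical RTP helpers (not part of the Android SDK): a fixed 12-byte
// RFC 3550 header in front of the payload, just enough for the demo below.
class RtpPacket {
    private static int sequence = 0;   // shared sequence counter (demo only)
    private final byte[] buffer;

    // Build an outgoing packet from an H.264 payload.
    RtpPacket(byte[] payload, int payloadLength) {
        buffer = new byte[12 + payloadLength];
        buffer[0] = (byte) 0x80;                 // V=2, P=0, X=0, CC=0
        buffer[1] = (byte) 96;                   // M=0, PT=96 (dynamic, H.264)
        int seq = sequence++;
        buffer[2] = (byte) (seq >> 8);           // 16-bit sequence number
        buffer[3] = (byte) seq;
        long ts = System.nanoTime() / 1000;      // 32-bit timestamp (truncated)
        for (int i = 0; i < 4; i++) {
            buffer[4 + i] = (byte) (ts >> (24 - 8 * i));
        }
        System.arraycopy(payload, 0, buffer, 12, payloadLength);
    }

    private RtpPacket(byte[] raw) {
        buffer = raw;
    }

    // Wrap an incoming datagram (header + payload) without re-stamping it.
    static RtpPacket fromDatagram(byte[] raw, int length) {
        byte[] copy = new byte[length];
        System.arraycopy(raw, 0, copy, 0, length);
        return new RtpPacket(copy);
    }

    byte[] getPacket() { return buffer; }
    int getOffset()    { return 12; }                  // payload starts after the header
    int getLength()    { return buffer.length - 12; }  // payload length
}

class RtpSocket {
    private final DatagramSocket socket;
    private final InetAddress address;
    private final int port;

    RtpSocket(String host, int port) throws Exception {
        this.socket = new DatagramSocket(port);  // bound locally so receive() works
        this.address = InetAddress.getByName(host);
        this.port = port;
    }

    void send(RtpPacket packet) throws Exception {
        byte[] data = packet.getPacket();
        socket.send(new DatagramPacket(data, data.length, address, port));
    }

    RtpPacket receive() throws Exception {
        byte[] buf = new byte[65535];
        DatagramPacket datagram = new DatagramPacket(buf, buf.length);
        socket.receive(datagram);
        return RtpPacket.fromDatagram(datagram.getData(), datagram.getLength());
    }
}
```
Because both threads here run on the loopback interface and share one socket, a single DatagramSocket bound to the target port can both send and receive its own packets.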
With those helpers in place, the full implementation looks like this:
Server side:
```java
import android.content.res.Resources;
import android.hardware.display.DisplayManager;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.projection.MediaProjection;
import android.view.Surface;

import java.nio.ByteBuffer;

public class ScreenRecordServer implements Runnable {
    private MediaProjection mediaProjection;
    private MediaCodec mediaCodec;
    private RtpSocket rtpSocket;
    private int width, height, bitRate, frameRate;

    public ScreenRecordServer(MediaProjection mediaProjection, RtpSocket rtpSocket,
                              int width, int height, int bitRate, int frameRate) {
        this.mediaProjection = mediaProjection;
        this.rtpSocket = rtpSocket;
        this.width = width;
        this.height = height;
        this.bitRate = bitRate;
        this.frameRate = frameRate;
    }

    @Override
    public void run() {
        try {
            mediaCodec = MediaCodec.createEncoderByType("video/avc");
            MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
            // Surface input: the virtual display renders directly into the encoder.
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface surface = mediaCodec.createInputSurface();
            // Mirror the screen into the encoder's input Surface.
            mediaProjection.createVirtualDisplay("ScreenRecordServer", width, height,
                    Resources.getSystem().getDisplayMetrics().densityDpi,
                    DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR, surface, null, null);
            mediaCodec.start();

            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            boolean isRunning = true;
            while (isRunning) {
                // With a Surface-input encoder there is no input-buffer handling
                // to do; we only drain the encoder's output.
                int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 10000);
                if (outputBufferIndex >= 0) {
                    ByteBuffer outputBuffer = mediaCodec.getOutputBuffer(outputBufferIndex);
                    byte[] packet = new byte[bufferInfo.size];
                    outputBuffer.get(packet);
                    rtpSocket.send(new RtpPacket(packet, packet.length));
                    mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                }
            }
            mediaCodec.stop();
            mediaCodec.release();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```
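One caveat about this server: it pushes each encoded frame into a single RTP packet, which only works while frames stay under the UDP MTU (roughly 1500 bytes); a keyframe at 720p will be far larger. Real H.264-over-RTP senders split large NAL units into FU-A fragments per RFC 6184. Below is a hedged sketch of that fragmentation, reusing the RtpPacket and RtpSocket helpers above; the H264Packetizer class and its sendNalUnit method are assumptions of this sketch:
```java
// A sketch of RFC 6184 FU-A fragmentation for oversized NAL units.
public final class H264Packetizer {
    static final int MAX_PAYLOAD = 1400;  // stay under a typical 1500-byte MTU

    static void sendNalUnit(byte[] nal, RtpSocket rtpSocket) throws Exception {
        if (nal.length <= MAX_PAYLOAD) {
            rtpSocket.send(new RtpPacket(nal, nal.length));  // single NAL unit packet
            return;
        }
        byte nalHeader = nal[0];
        int offset = 1;               // the original NAL header is not repeated
        boolean first = true;
        while (offset < nal.length) {
            int chunk = Math.min(MAX_PAYLOAD - 2, nal.length - offset);
            byte[] fu = new byte[2 + chunk];
            fu[0] = (byte) ((nalHeader & 0xE0) | 28);        // FU indicator: F/NRI + type 28 (FU-A)
            fu[1] = (byte) (nalHeader & 0x1F);               // FU header: original NAL type
            if (first) fu[1] |= 0x80;                        // S bit on the first fragment
            if (offset + chunk == nal.length) fu[1] |= 0x40; // E bit on the last fragment
            System.arraycopy(nal, offset, fu, 2, chunk);
            rtpSocket.send(new RtpPacket(fu, fu.length));
            offset += chunk;
            first = false;
        }
    }

    private H264Packetizer() {}
}
```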
Client side:
```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.util.Log;

import java.nio.ByteBuffer;

public class ScreenRecordClient implements Runnable {
    private RtpSocket rtpSocket;
    private MediaCodec mediaCodec;
    private int width, height, bitRate, frameRate;

    public ScreenRecordClient(RtpSocket rtpSocket, int width, int height,
                              int bitRate, int frameRate) {
        this.rtpSocket = rtpSocket;
        this.width = width;
        this.height = height;
        this.bitRate = bitRate;
        this.frameRate = frameRate;
    }

    @Override
    public void run() {
        try {
            mediaCodec = MediaCodec.createDecoderByType("video/avc");
            MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
            mediaCodec.configure(mediaFormat, null, null, 0); // no output Surface: decode to buffers
            mediaCodec.start();

            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            boolean isRunning = true;
            while (isRunning) {
                RtpPacket rtpPacket = rtpSocket.receive();
                long receiveTimeMs = System.currentTimeMillis(); // packet arrival time
                byte[] packet = rtpPacket.getPacket();
                int offset = rtpPacket.getOffset();
                int length = rtpPacket.getLength();

                int inputBufferIndex = mediaCodec.dequeueInputBuffer(10000);
                if (inputBufferIndex >= 0) {
                    ByteBuffer inputBuffer = mediaCodec.getInputBuffer(inputBufferIndex);
                    inputBuffer.clear();
                    inputBuffer.put(packet, offset, length);
                    mediaCodec.queueInputBuffer(inputBufferIndex, 0, length,
                            System.nanoTime() / 1000, 0);
                }

                int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
                if (outputBufferIndex >= 0) {
                    ByteBuffer outputBuffer = mediaCodec.getOutputBuffer(outputBufferIndex);
                    byte[] frame = new byte[bufferInfo.size];
                    outputBuffer.get(frame);
                    // false: there is no output Surface to render to.
                    mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                    // Approximate receive-to-decode latency. For the true
                    // send-to-receive delay, see the sketch after this class.
                    Log.d("ScreenRecordClient",
                            "delay: " + (System.currentTimeMillis() - receiveTimeMs) + "ms");
                }
            }
            mediaCodec.stop();
            mediaCodec.release();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```
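The delay logged above is only the client-side receive-to-decode time. To measure what the question actually asks for, the latency from the server sending a packet to the client receiving it, the sender must stamp each payload with its send time; because both threads run in the same process, their clocks agree. The helper class below is an assumption, not part of the classes above: the server would wrap each encoded frame with stampPayload() before building the RtpPacket, and the client would call extractDelayMs() right after receive() and skip the first 8 payload bytes before feeding the decoder.
```java
import java.nio.ByteBuffer;

// Hypothetical helper for end-to-end delay measurement (same-process clocks).
public final class LatencyStamp {
    // Prepend the current wall-clock time (ms since epoch) to the payload.
    public static byte[] stampPayload(byte[] encoded) {
        return ByteBuffer.allocate(8 + encoded.length)
                .putLong(System.currentTimeMillis())
                .put(encoded)
                .array();
    }

    // Read the send time back out and return the send-to-receive delay.
    public static long extractDelayMs(byte[] payload, int offset) {
        long sentAtMs = ByteBuffer.wrap(payload, offset, 8).getLong();
        return System.currentTimeMillis() - sentAtMs;
    }

    private LatencyStamp() {}
}
```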
Start the Server and Client threads from the main thread:
```java
// This code assumes the user has already granted screen-capture permission,
// i.e. it runs inside onActivityResult (see the sketch below).
MediaProjectionManager mediaProjectionManager =
        (MediaProjectionManager) getSystemService(MEDIA_PROJECTION_SERVICE);
MediaProjection mediaProjection =
        mediaProjectionManager.getMediaProjection(resultCode, data);
RtpSocket rtpSocket = new RtpSocket("localhost", 5555);
new Thread(new ScreenRecordServer(mediaProjection, rtpSocket, 1280, 720, 4000000, 30)).start();
new Thread(new ScreenRecordClient(rtpSocket, 1280, 720, 4000000, 30)).start();
```
Here 1280x720 is the capture resolution, 4000000 the bit rate in bits per second, and 30 the frame rate. The Client thread records when each packet arrives and when its decoded frame becomes available, and logs the difference as the latency.
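For completeness, the `data` Intent handed to getMediaProjection() comes from the standard screen-capture permission flow. A sketch of that wiring inside an Activity follows; the request code and the startStreaming() helper are assumptions, and on Android 10+ the projection must additionally be started from a foreground service declared with the mediaProjection type:
```java
import android.app.Activity;
import android.content.Intent;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;

public class CaptureActivity extends Activity {
    private static final int REQUEST_SCREEN_CAPTURE = 1;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        MediaProjectionManager mpm =
                (MediaProjectionManager) getSystemService(MEDIA_PROJECTION_SERVICE);
        // Shows the system "start recording?" consent dialog.
        startActivityForResult(mpm.createScreenCaptureIntent(), REQUEST_SCREEN_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SCREEN_CAPTURE && resultCode == RESULT_OK && data != null) {
            startStreaming(resultCode, data); // hypothetical helper running the snippet above
        }
    }

    private void startStreaming(int resultCode, Intent data) {
        // ...the MediaProjection/RtpSocket/thread setup shown above goes here.
    }
}
```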