SimpleScreenCast: Android Screen Mirroring & Recording
Explore SimpleScreenCast, a powerful Android app for real-time screen mirroring and recording. Initially a hobby project, it evolved into a robust tool leveraging RTP/RTSP streaming with native Android APIs and broad client compatibility.
Technical Foundation: RTP/RTSP Streaming and Protocols
Core Protocols
Implemented RTP (Real-time Transport Protocol), RTSP (Real-time Streaming Protocol), and SDP (Session Description Protocol) directly from RFC standards to enable real-time video and audio transport.
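To make the protocol work concrete, the fixed 12-byte RTP header from RFC 3550 can be packed as follows. This is a stand-alone illustrative sketch (the class name and method are hypothetical, not SimpleScreenCast's actual code):

```java
import java.nio.ByteBuffer;

public class RtpHeaderDemo {
    // Builds a minimal 12-byte RTP header (RFC 3550, Section 5.1):
    // no padding, no extension, no CSRC entries, marker bit clear.
    static byte[] buildHeader(int payloadType, int sequence, long timestamp, long ssrc) {
        ByteBuffer buf = ByteBuffer.allocate(12);
        buf.put((byte) 0x80);                 // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F)); // M=0, 7-bit payload type
        buf.putShort((short) sequence);       // 16-bit sequence number
        buf.putInt((int) timestamp);          // 32-bit RTP timestamp
        buf.putInt((int) ssrc);               // synchronization source identifier
        return buf.array();
    }
}
```

Dynamic payload types (96 and up) are what an H264/AAC session would typically use; the SDP exchange maps them to the actual codecs.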
Encoding/Decoding
Supports H264 video and AAC audio encoding and decoding through Android's native capabilities, ensuring low latency and high compatibility.
Interoperability
The app works seamlessly as client or server with popular media players such as VLC and FFmpeg, enabling flexible viewing and streaming setups. I plan to upgrade the implementation to allow easy mirroring onto a TV.
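As an interoperability sketch, a desktop player can consume the stream directly. The address, port, and path below are illustrative examples, not fixed defaults of the app:

```
# Play the Android device's RTSP stream in VLC (address/port/path are examples)
vlc rtsp://192.168.1.10:8554/screen

# Or with FFmpeg's ffplay, forcing TCP interleaving if UDP is blocked
ffplay -rtsp_transport tcp rtsp://192.168.1.10:8554/screen
```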
Use-case Scenarios for Real-time Screen Mirroring
Baby Sleep Monitoring
Ensure peace of mind by watching live video streams as "baby monitors" on any compatible device.
Drone Flight View
Stream real-time aerial footage directly to/from your Android device for accurate and responsive remote piloting.
IP-Cams & Surveillance
Use as an IP-Cam viewer or streamer to monitor security cameras with low latency and high reliability.
General Screen Mirroring
Mirror your Android screen to external displays (TV) or computers effortlessly for presentations or entertainment.
SimpleScreenCast: Key Features
Screen Recording & Mirroring
Capture device screen activity smoothly with minimal lag, ideal for tutorials and demos.
Permission Handling
Automatically manage the necessary Android permissions for a seamless casting and recording experience.
Getting Started: Development Prerequisites
Development Environment
  • Android Studio IDE for coding, building, and debugging
  • JDK 11 or later for compatibility with Android Gradle Plugin
Test Devices
  • Physical Android device or emulator running compatible Android versions
  • Minimum API level support ensuring wide device coverage
Installation & Setup Guide
1. Clone Repository
Run git clone https://github.com/yourusername/SimpleScreenCast to get the source code.
2. Open in Android Studio
Load the project workspace and sync Gradle files automatically.
3. Build Project
Use Build > Make Project to compile source and resolve dependency issues.
See SimpleScreenCast in Action
Prototypes
  1. Early demonstration of real-time streaming capabilities. Watch here.
  2. Enhanced version showcasing improved latency and user interface. Watch here.
SimpleScreenCast: Core Architecture Components
Server Components
  • RtspServer handles TCP connections
  • MediaStreamManager captures screen content
  • MediaCodecHandlers encode audio/video streams
Protocol Handlers
  • RtspClient processes RTSP commands
  • RtpPacketizer formats media data packets
  • RtpSocket manages UDP transmission
Configuration Options
  • Video: resolution, framerate, bitrate
  • Audio: sample rate, channels
  • Network: TCP/UDP transport selection
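The video, audio, and network options above are what the server advertises to clients via SDP in its DESCRIBE response. A hypothetical session description might look like this (addresses, ports, and payload type numbers are illustrative):

```
v=0
o=- 0 0 IN IP4 192.168.1.10
s=SimpleScreenCast
c=IN IP4 192.168.1.10
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
m=audio 5004 RTP/AVP 97
a=rtpmap:97 MPEG4-GENERIC/44100/2
```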
Implementation Flow
  • Obtain MediaProjection permission
  • Configure and start RTSP server
  • Respond to client commands
  • Stream encoded media via RTP
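To sketch what the packetizing step involves (the class below is a hypothetical stand-alone example, not the app's actual RtpPacketizer): H264 NAL units larger than the network MTU are typically split using FU-A fragmentation as defined in RFC 6184.

```java
import java.util.ArrayList;
import java.util.List;

public class FuaPacketizer {
    static final int MAX_PAYLOAD = 1400; // typical per-packet budget; illustrative

    // Splits one H264 NAL unit (start code already stripped) into RTP
    // payloads, using FU-A fragmentation (RFC 6184) when it exceeds the MTU.
    static List<byte[]> packetize(byte[] nalu) {
        List<byte[]> out = new ArrayList<>();
        if (nalu.length <= MAX_PAYLOAD) {
            out.add(nalu.clone());          // fits in a single NAL unit packet
            return out;
        }
        int nri = nalu[0] & 0x60;           // preserve the original NRI bits
        int type = nalu[0] & 0x1F;
        int offset = 1;                     // skip the original NAL header byte
        while (offset < nalu.length) {
            int chunk = Math.min(MAX_PAYLOAD - 2, nalu.length - offset);
            byte[] pkt = new byte[chunk + 2];
            pkt[0] = (byte) (nri | 28);     // FU indicator: type 28 = FU-A
            pkt[1] = (byte) type;           // FU header carries the real NAL type
            if (offset == 1) pkt[1] |= (byte) 0x80;                   // start bit
            if (offset + chunk == nalu.length) pkt[1] |= (byte) 0x40; // end bit
            System.arraycopy(nalu, offset, pkt, 2, chunk);
            out.add(pkt);
            offset += chunk;
        }
        return out;
    }
}
```

Each resulting payload would then be prefixed with an RTP header and sent over the UDP (or interleaved TCP) transport.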
Client Architecture: RTSP Client Components
SimpleScreenCast's client architecture complements the server components to create a complete streaming solution.
Connection Management
RtspClient establishes TCP socket connections and handles the request-response protocol flow.
Stream Processing
RtpReceiver captures incoming packets while RtpDepacketizer reassembles encoded media frames.
Media Rendering
MediaCodecHandlers decode streams for VideoRenderer and AudioRenderer to display content.
Configuration
Flexible setup for server URL, transport protocols, and rendering targets.
Implementation follows a four-step process: create the RtspClient, initiate playback with RTSP signaling, process incoming data, and release resources on completion.
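The RTSP signaling in these steps is plain text over the TCP connection. A minimal request builder (illustrative only, not the app's actual RtspClient API) shows the RFC 2326 request shape:

```java
public class RtspRequestDemo {
    // Formats a minimal RTSP request: request line, CSeq header,
    // and the blank line that terminates the header block (RFC 2326).
    static String buildRequest(String method, String url, int cseq) {
        return method + " " + url + " RTSP/1.0\r\n"
             + "CSeq: " + cseq + "\r\n"
             + "User-Agent: SimpleScreenCast\r\n"
             + "\r\n";
    }
}
```

A client session issues DESCRIBE, SETUP, and PLAY in order, incrementing CSeq with each request, e.g. `buildRequest("DESCRIBE", "rtsp://192.168.1.10:8554/screen", 1)` (URL illustrative).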
Buffer Management on the Receiver Side
Jitter Buffer
Captures and reorders incoming RTP packets based on sequence numbers.
Compensates for network jitter to maintain smooth playback.
Depacketization
Reassembles encoded media from packet streams.
Ensures complete frames reach the decoder despite packet loss.
Playback Management
Balances buffer levels to prevent underruns and excessive latency.
Synchronizes audio and video streams using RTP timestamps.
Loss Mitigation
Implements error concealment techniques for lost packets.
Uses adaptive buffering based on network conditions.
Effective buffer management is critical for maintaining smooth playback while minimizing latency in SimpleScreenCast's client architecture.
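The reordering step of the jitter buffer can be sketched with a sorted map keyed by sequence number; the serial-number comparison handles 16-bit wraparound in RFC 3550 style. Class and method names here are hypothetical, not the app's actual receiver code:

```java
import java.util.TreeMap;

public class JitterBufferDemo {
    // Orders RTP packets by 16-bit sequence number; the signed 16-bit
    // difference makes 0 sort after 65535 across a wraparound.
    private final TreeMap<Integer, byte[]> pending =
        new TreeMap<>((a, b) -> Integer.compare((short) (a - b), 0));
    private int nextSeq;

    JitterBufferDemo(int firstSeq) { nextSeq = firstSeq & 0xFFFF; }

    void put(int seq, byte[] payload) { pending.put(seq & 0xFFFF, payload); }

    // Returns the next in-order payload, or null if it has not arrived yet.
    byte[] poll() {
        byte[] p = pending.remove(nextSeq);
        if (p != null) nextSeq = (nextSeq + 1) & 0xFFFF;
        return p;
    }
}
```

A production buffer would add a playout delay before the first poll, and a policy for skipping a sequence number that never arrives.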
H264 Video Stream Serialization
Buffer Validation
Checks for end-of-stream flags and validates position. Ensures media buffers conform to expected parameters.
NAL Unit Processing
Extracts and verifies NAL headers (0x00000001). Identifies frame types through NAL unit type masks.
Frame Filtering
Discards non-IDR frames before IDR processing. Ensures proper decoding sequence for optimal playback.
Serialization
Handles buffer positioning and calls internal methods. Provides debugging capabilities through logging and binary dumps.
This critical component transforms encoded video data into transmission-ready packets. The serialize() method preserves frame integrity while managing buffer constraints.

public synchronized void serialize(ByteBuffer sampleBuffer, MediaCodec.BufferInfo sampleBufferInfo) {
    if ((sampleBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
        Log.w("H264Serializer", "[serialize] NOOP - EOS!");
        return; // NOOP -> no action needed; the codec is closing and there is no buffer to process
    }
    ByteBuffer mediaBuffer = sampleBuffer.slice();
    if (mediaBuffer.position() != 0) {
        throw new AssertionError("Expected new buffer position to be zero!");
    }
    // Read the NAL-U header and verify header conformity.
    // NOTE: This implementation assumes the encoded media slice data contains the
    // NAL-U magic boundary only ONCE, at the beginning.
    mediaBuffer.get(naluHeader, 0, H264Defs.NAL_HEADER_SIZE);
    if (mediaBuffer.position() != H264Defs.NAL_HEADER_SIZE) {
        throw new AssertionError("Unable to read buffer with the given constraint");
    }
    if (!ProtocolUtils.startsWithNaluMagicHeader(naluHeader)) {
        throw new AssertionError("NAL units are not preceded by 0x00000001");
    }
    refCountSlice++;
    naluByte = naluHeader[H264Defs.NAL_BYTE_POSITION] & 0xFF;
    nalType = (naluByte & H264Defs.NAL_TYPE_MASK) & 0xFF;
    // Discard non-IDR frames until an IDR frame has been processed
    if ((nalType == H264Defs.NAL_UNIT_TYPE_NON_IDR) && !sentIFrame) {
        Log.w("H264Serializer", "[serialize] *** Discarding non-IDR coded picture before IDR processing!");
        return;
    }
    dumpDebugInfo(mediaBuffer);
    BinaryFileDump.dumpMediaSample(mediaBuffer, mediaBuffer.limit(), sampleBufferInfo.presentationTimeUs);
    if (mediaBuffer.position() != H264Defs.NAL_HEADER_SIZE) {
        mediaBuffer.rewind();
        mediaBuffer.position(H264Defs.NAL_HEADER_SIZE);
    }
    serializeInternal(sampleBufferInfo, mediaBuffer);
}
