The Android system is becoming increasingly popular for airborne audio and video transmission applications, but Android's multimedia streaming framework does not provide good support for real-time transmission. Its lack of support for several real-time transport protocols makes end-to-end real-time video transmission difficult to realize. This paper therefore focuses on solving single-end to multi-end real-time audio and video transmission on the Android platform by implementing RTSP/RTP/RTCP real-time transmission, enabling real-time audio and video communication between flight attendants' Android handheld devices and passenger terminals.

This paper implements real-time audio and video capture and encoding on top of the Android multimedia framework, performs RTP/RTCP packetization of H.264 and AAC in the Java layer according to the relevant RFC standards, and implements an RTSP streaming media server in the Java layer. It designs and implements a C/S-mode real-time transmission framework composed of Android terminals, an RTSP server, and RTSP clients, using the RTSP/RTP/UDP protocols, and a B/S-mode real-time audio and video transmission framework composed of Android terminals, a WEB server, and browsers, using the HTTP Live Streaming protocol. In the B/S model, the paper proposes and validates a new scheme for real-time audio and video transmission: the Android terminal acts as an RTSP server and transports the streams over RTP/UDP to the WEB server, which is responsible for segmenting the TS stream for convenient media streaming. Both frameworks achieve real-time audio and video communication sessions between Android terminals within the onboard airborne LAN.
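The Java-layer RTP packetization described above can be sketched as follows. This is a minimal illustration of single-NAL-unit packetization for H.264 (RFC 6184, packetization-mode 0); the class name, payload type, and SSRC value are illustrative assumptions, not the paper's actual implementation.

```java
import java.nio.ByteBuffer;

// Minimal sketch: wrap one H.264 NAL unit in a 12-byte RTP header
// (RFC 3550 header layout, RFC 6184 single-NAL-unit mode).
public class RtpPacketizer {
    private static final int RTP_VERSION = 2;
    private static final int PT_H264 = 96;   // dynamic payload type (assumption)
    private int sequenceNumber = 0;
    private final int ssrc = 0x12345678;     // arbitrary stream identifier (assumption)

    // Builds one RTP packet; marker bit is set on the last packet of a frame.
    public byte[] packetize(byte[] nalUnit, long timestamp90kHz, boolean lastOfFrame) {
        ByteBuffer buf = ByteBuffer.allocate(12 + nalUnit.length);
        buf.put((byte) (RTP_VERSION << 6));                      // V=2, P=0, X=0, CC=0
        buf.put((byte) ((lastOfFrame ? 0x80 : 0x00) | PT_H264)); // M bit + payload type
        buf.putShort((short) (sequenceNumber++ & 0xFFFF));       // sequence number
        buf.putInt((int) timestamp90kHz);                        // 90 kHz clock for video
        buf.putInt(ssrc);                                        // synchronization source
        buf.put(nalUnit);                                        // payload: raw NAL unit
        return buf.array();
    }
}
```

In a real session the packetizer would also split NAL units larger than the path MTU into FU-A fragments, and a companion RTCP sender would emit periodic sender reports for synchronization.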
In practice, both audio and video transmission solutions provided smooth playback and have strong practical value.
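In the B/S scheme, the WEB server's segmentation step produces TS segments referenced by an HLS media playlist that the browser polls. A representative playlist might look like the following; segment names and durations are illustrative, not taken from the paper:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:10.0,
segment120.ts
#EXTINF:10.0,
segment121.ts
#EXTINF:10.0,
segment122.ts
```

For a live session the playlist carries no `#EXT-X-ENDLIST` tag, so the client keeps re-fetching it as the server appends newly segmented TS files.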