1. We need a C library (intended to be run from the command line in a Linux environment).
2. The application needs to take an RTP audio source and send it to a Flash Media Server (librtmp/librtp can be used).
3. It should also be able to do the reverse: take an RTMP stream from the Flash Media Server and send it as RTP to a specific address.
4. When the original audio source is sending audio to the RTMP server, the application should NOT send that same audio back to the audio source.
1. When the RTP source (we'll call it SOURCE1) has audio playing, that audio should be published to the Flash Media Server (for example, rtmp://localhost/live/mystream), using a library such as librtp and librtmp or similar.
2. If another source (we'll call it SOURCE2) publishes to the same stream, its audio should be sent to SOURCE1 (possibly using librtmp or librtp).
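For the publishing leg, it may help to know that the FLV container used by RTMP defines native SoundFormat values for G.711 (7 = A-law, 8 = μ-law), which, depending on what the Flash Media Server and its subscribers accept, could reduce the transcoding burden. As a minimal sketch (the function name is ours, not part of any library), the first byte of an FLV audio tag packs format, rate, sample size, and channel count:

```c
#include <stdint.h>

/* Build the first byte of an FLV audio tag body:
 * SoundFormat (4 bits) | SoundRate (2 bits) | SoundSize (1 bit) | SoundType (1 bit).
 * For G.711 mu-law, SoundFormat is 8; the rate field is ignored
 * (G.711 in FLV is always 8 kHz), samples are 8-bit, mono. */
uint8_t flv_audio_tag_header(uint8_t sound_format, uint8_t sound_rate,
                             uint8_t sound_size, uint8_t sound_type)
{
    return (uint8_t)((sound_format << 4) |
                     ((sound_rate & 0x3) << 2) |
                     ((sound_size & 0x1) << 1) |
                     (sound_type & 0x1));
}
```

For example, `flv_audio_tag_header(8, 0, 0, 0)` yields the header byte for 8 kHz mono G.711 μ-law audio.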
We can currently accomplish the previously mentioned requirements using FFmpeg. However, if we set one FFmpeg instance to take the input stream (SOURCE1) and send it to the Flash Media Server, and then run a second FFmpeg instance to do the opposite (take the stream from the Flash Media Server and send it to SOURCE1), SOURCE1 will cease to transmit.
The core problem is that if SOURCE1 receives audio, it will stop playing. We do not have the ability to change the way SOURCE1 functions, and we want it to receive any audio from the same stream other than its own.
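One plausible way to suppress the echo is to filter on the RTP SSRC field: every RTP sender carries a 32-bit synchronization source identifier in its packet headers (RFC 3550), so the bridge can drop any packet whose SSRC matches the one SOURCE1 publishes with before relaying audio back. The sketch below (all names are ours, assuming raw packet buffers from a UDP socket) parses the 12-byte fixed RTP header and makes that forwarding decision:

```c
#include <stdint.h>
#include <stddef.h>

/* Minimal view of the RTP fixed header (RFC 3550).
 * This is a sketch: CSRC lists and header extensions are not handled. */
typedef struct {
    uint8_t  version;       /* should be 2 */
    uint8_t  payload_type;  /* 0 = PCMU (G.711 mu-law) */
    uint16_t sequence;
    uint32_t timestamp;
    uint32_t ssrc;          /* sender's synchronization source ID */
} rtp_header_t;

/* Parse the 12-byte fixed RTP header from a raw packet buffer.
 * Returns 0 on success, -1 if the buffer is too short or not RTP v2. */
int rtp_parse_header(const uint8_t *buf, size_t len, rtp_header_t *out)
{
    if (len < 12)
        return -1;
    out->version = buf[0] >> 6;
    if (out->version != 2)
        return -1;
    out->payload_type = buf[1] & 0x7f;
    out->sequence  = (uint16_t)((buf[2] << 8) | buf[3]);
    out->timestamp = ((uint32_t)buf[4] << 24) | ((uint32_t)buf[5] << 16) |
                     ((uint32_t)buf[6] << 8)  |  (uint32_t)buf[7];
    out->ssrc      = ((uint32_t)buf[8] << 24) | ((uint32_t)buf[9] << 16) |
                     ((uint32_t)buf[10] << 8) |  (uint32_t)buf[11];
    return 0;
}

/* Loop-prevention check: only forward audio back to SOURCE1 when it
 * did not originate from SOURCE1's own SSRC. */
int should_forward_to_source1(const rtp_header_t *hdr, uint32_t source1_ssrc)
{
    return hdr->ssrc != source1_ssrc;
}
```

The bridge would record SOURCE1's SSRC from its first inbound packet and apply `should_forward_to_source1()` on the return path; whether the SSRC survives the round trip through the Flash Media Server depends on how the streams are multiplexed, so tagging by stream name or source address may be needed instead.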
One thing to note: the RTP source will send and receive audio in the G.711 μ-law (g711u/PCMU) codec, so some minor transcoding may be required. In addition, we are ONLY working with live audio streams; no recording.
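If transcoding is needed, the G.711 μ-law leg reduces to the standard ITU-T companding between 8-bit μ-law bytes and 16-bit linear PCM. A minimal sketch of that conversion pair (function names are ours; production code would typically use a lookup table for speed):

```c
#include <stdint.h>

/* Decode one G.711 mu-law byte to a 16-bit linear PCM sample (ITU-T G.711). */
int16_t ulaw_to_linear(uint8_t ulaw)
{
    ulaw = (uint8_t)~ulaw;                       /* stored complemented */
    int t = ((ulaw & 0x0f) << 3) + 0x84;         /* mantissa + bias */
    t <<= (ulaw & 0x70) >> 4;                    /* apply exponent */
    return (int16_t)((ulaw & 0x80) ? (0x84 - t) : (t - 0x84));
}

/* Encode one 16-bit linear PCM sample as a G.711 mu-law byte. */
uint8_t linear_to_ulaw(int16_t sample)
{
    const int BIAS = 0x84, CLIP = 32635;
    int v = sample, sign = 0;
    if (v < 0) { v = -v; sign = 0x80; }          /* work on magnitude */
    if (v > CLIP) v = CLIP;                      /* avoid overflow */
    v += BIAS;
    int exponent = 7;                            /* find segment number */
    for (int mask = 0x4000; (v & mask) == 0 && exponent > 0; mask >>= 1)
        exponent--;
    int mantissa = (v >> (exponent + 3)) & 0x0f;
    return (uint8_t)~(sign | (exponent << 4) | mantissa);
}
```

The round trip is lossy by design (μ-law is logarithmic 8-bit), so `ulaw_to_linear(linear_to_ulaw(x))` only approximates `x` to within the segment's quantization step.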