Hi all!

I want to use ffserver to stream a live animation produced by an OpenGL (Mesa-based) program (virtual reality). The idea behind this is to send an animation which can be influenced by the user at the receiving side. I have already had some success with the ffmpeg API.

My problem is: I don't know how to make Windows Media Player play a live stream without sending a "Content-Length:" HTTP header. It seems the ASF data header must contain an appropriate "play endlessly" (or similar) flag, and it seems ffserver can produce such headers, because it is possible to set "FileMaxSize" to zero, meaning "unlimited".

The trouble is that ffserver is not prepared for a live stream created by an OpenGL animation. I tried a named pipe to which the OpenGL process writes ASF data and from which ffserver reads and sends it, but that does not work. So I want to change the source, and I'd like to know your opinion.

Near line 2056 in "ffserver.c" is:

    redo:
        if (av_read_frame(c->fmt_in, &pkt) < 0) {
            ...

It seems this is the location where a frame is read from the webcam (video4linux). My idea is to change this to:

    redo:
        if (read_opengl_frame(c->fmt_in, &pkt) < 0) {
            ...

whereby

    static int read_opengl_frame(AVFormatContext *s, AVPacket *pkt) { ... }

fills an RGB image and converts it to YUV using "sws_scale" (see the sketch below).

Any comments?

--
J.Anders, GERMANY, TU Chemnitz, Fakultaet fuer Informatik
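
A minimal sketch of the grab-and-convert part such a read_opengl_frame() could call. The function name grab_opengl_frame and the WIDTH/HEIGHT constants are placeholders I made up, not part of ffserver; the pixel format names follow the old PIX_FMT_* spelling of FFmpeg from that era.

    /* Hypothetical helper: grab the current OpenGL framebuffer as RGB24
     * and convert it to YUV420P with libswscale. */

    #include <GL/gl.h>
    #include <libavcodec/avcodec.h>
    #include <libswscale/swscale.h>

    #define WIDTH  640          /* assumed size of the GL viewport */
    #define HEIGHT 480

    static struct SwsContext *sws_ctx;              /* created once, reused */
    static uint8_t rgb_buf[WIDTH * HEIGHT * 3];     /* packed RGB24 scratch */

    /* Fill 'yuv' (an AVFrame already allocated as WIDTH x HEIGHT YUV420P)
     * from the current OpenGL framebuffer.  Returns 0 on success. */
    static int grab_opengl_frame(AVFrame *yuv)
    {
        const uint8_t *src[1];
        int src_stride[1];

        if (!sws_ctx) {
            sws_ctx = sws_getContext(WIDTH, HEIGHT, PIX_FMT_RGB24,
                                     WIDTH, HEIGHT, PIX_FMT_YUV420P,
                                     SWS_BICUBIC, NULL, NULL, NULL);
            if (!sws_ctx)
                return -1;
        }

        /* Read the framebuffer; tightly packed rows, no 4-byte padding. */
        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, WIDTH, HEIGHT, GL_RGB, GL_UNSIGNED_BYTE, rgb_buf);

        /* OpenGL returns the image bottom-up; start at the last row and use
         * a negative stride so sws_scale flips it while converting. */
        src[0]        = rgb_buf + (HEIGHT - 1) * WIDTH * 3;
        src_stride[0] = -WIDTH * 3;

        sws_scale(sws_ctx, src, src_stride, 0, HEIGHT,
                  yuv->data, yuv->linesize);
        return 0;
    }

Note that av_read_frame() hands ffserver an already encoded packet, so a replacement read_opengl_frame() would presumably still have to run the YUV frame through a video encoder (avcodec_encode_video() in the API of that time) before storing the result in pkt.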