FFSERVER(1)							   FFSERVER(1)

NAME
       ffserver - ffserver video server

SYNOPSIS
       ffserver [options]

DESCRIPTION
       ffserver is a streaming server for both audio and video. It supports
       several live feeds, streaming from files and time shifting on live
       feeds (you can seek to positions in the past on each live feed,
       provided you specify a big enough feed storage in ffserver.conf).

       ffserver receives prerecorded files or FFM streams from some ffmpeg
       instance as input, then streams them over RTP/RTSP/HTTP.

       An ffserver instance will listen on some port as specified in the
       configuration file. You can launch one or more instances of ffmpeg and
       send one or more FFM streams to the port where ffserver is expecting to
       receive them. Alternately, you can make ffserver launch such ffmpeg
       instances at startup.

       Input streams are called feeds, and each one is specified by a "<Feed>"
       section in the configuration file.

       For each feed you can have different output streams in various formats,
       each one specified by a "<Stream>" section in the configuration file.
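
       For example, a minimal configuration pairing one feed with one stream
       might look like the sketch below. The directive names and values
       follow the doc/ffserver.conf example shipped with FFmpeg; the port,
       feed file path, sizes and encoding parameters are only illustrative,
       so check that example file for the exact spelling your build accepts:

	       HTTPPort 8090
	       HTTPBindAddress 0.0.0.0
	       MaxClients 100

	       <Feed feed1.ffm>
	       File /tmp/feed1.ffm
	       FileMaxSize 200M
	       ACL allow 127.0.0.1
	       </Feed>

	       <Stream test.asf>
	       Feed feed1.ffm
	       Format asf
	       VideoFrameRate 15
	       VideoSize 352x240
	       VideoBitRate 256
	       AudioBitRate 64
	       </Stream>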

   Status stream
       ffserver supports an HTTP interface which exposes the current status of
       the server.

       Simply point your browser to the address of the special status stream
       specified in the configuration file.

       For example if you have:

	       <Stream status.html>
	       Format status

	       # Only allow local people to get the status
	       ACL allow localhost
	       ACL allow 192.168.0.0 192.168.255.255
	       </Stream>

       then the server will post a page with the status information when the
       special stream status.html is requested.
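
       For instance, assuming the server listens on port 8090 (as in the
       example configuration above), the status page can also be fetched
       from the command line with any HTTP client, such as curl:

	       curl http://localhost:8090/status.html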

   What can this do?
       When properly configured and running, you can capture video and audio
       in real time from a suitable capture card, and stream it out over the
       Internet to either Windows Media Player or RealAudio player (with some
       restrictions).

       It can also stream from files, though that is currently broken. Very
       often, a web server can be used to serve up the files just as well.

       It can stream prerecorded video from .ffm files, though it is somewhat
       tricky to make it work correctly.

   How do I make it work?
       First, build the kit. It *really* helps to have installed LAME first.
       Then when you run the ffserver ./configure, make sure that you have the
       "--enable-libmp3lame" flag turned on.

       LAME is important as it allows for streaming audio to Windows Media
       Player.	Don't ask why the other audio types do not work.
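
       A minimal build sequence might look like the following sketch; any
       other configure flags you need are up to you, only the
       "--enable-libmp3lame" flag matters for the MP3 streaming described
       here:

	       ./configure --enable-libmp3lame
	       make
	       make install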

       As a simple test, just run the following two command lines where
       INPUTFILE is some file which you can decode with ffmpeg:

	       ffserver -f doc/ffserver.conf &
	       ffmpeg -i INPUTFILE http://localhost:8090/feed1.ffm

       At this point you should be able to go to your Windows machine and fire
       up Windows Media Player (WMP). Go to Open URL and enter

		   http://<linuxbox>:8090/test.asf

       You should (after a short delay) see video and hear audio.

       WARNING: trying to stream test1.mpg doesn't work with WMP as it tries
       to transfer the entire file before starting to play.  The same is true
       of AVI files.

   What happens next?
       You should edit the ffserver.conf file to suit your needs (in terms of
       frame rates etc). Then install ffserver and ffmpeg, write a script to
       start them up, and off you go.
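
       A very small startup script, reusing the configuration file and feed
       name from the test above, could look like this sketch (adjust the
       configuration path, the input and the feed URL to your own setup):

	       #!/bin/sh
	       # start the server with your edited configuration
	       ffserver -f /etc/ffserver.conf &
	       sleep 2
	       # push an input into the feed declared in the configuration
	       ffmpeg -i INPUTFILE http://localhost:8090/feed1.ffm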

   Troubleshooting
       I don't hear any audio, but video is fine.

       Maybe you didn't install LAME, or got your ./configure statement wrong.
       Check the ffmpeg output to see if a line referring to MP3 is present.
       If not, then your configuration was incorrect. If it is, then maybe
       your wiring is not set up correctly. Maybe the sound card is not
       getting data from the right input source. Maybe you have a really awful
       audio interface (like I do) that only captures in stereo and also
       requires that one channel be flipped.  If you are one of these people,
       then export 'AUDIO_FLIP_LEFT=1' before starting ffmpeg.
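
       For example (the capture options themselves depend on your platform
       and hardware, so only a placeholder is shown; the point is setting
       the environment variable before ffmpeg starts):

	       export AUDIO_FLIP_LEFT=1
	       ffmpeg <your usual capture options> http://localhost:8090/feed1.ffm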

       The audio and video lose sync after a while.

       Yes, they do.

       After a long while, the video update rate goes way down in WMP.

       Yes, it does. Who knows why?

       WMP 6.4 behaves differently to WMP 7.

       Yes, it does. Any thoughts on this would be gratefully received. These
       differences extend to embedding WMP into a web page. [There are two
       object IDs that you can use: The old one, which does not play well, and
       the new one, which does (both tested on the same system). However, I
       suspect that the new one is not available unless you have installed WMP
       7].

   What else can it do?
       You can replay video from .ffm files that was recorded earlier.
       However, there are a number of caveats, including the fact that the
       ffserver parameters must match the original parameters used to record
       the file. If they do not, then ffserver deletes the file before
       recording into it.  (Now that I write this, it seems broken).

       You can fiddle with many of the codec choices and encoding parameters,
       and there are a bunch more parameters that you cannot control. Post a
       message to the mailing list if there are some 'must have' parameters.
       Look in ffserver.conf for a list of the currently available controls.

       It will automatically generate the ASX or RAM files that are often used
       in browsers. These files are actually redirections to the underlying
       ASF or RM file. The reason for this is that the browser often fetches
       the entire file before starting up the external viewer. The redirection
       files are very small and can be transferred quickly. [The stream itself
       is often 'infinite' and thus the browser tries to download it and never
       finishes.]

   Tips
       * When you connect to a live stream, most players (WMP, RA, etc) want
       to buffer a certain number of seconds of material so that they can
       display the signal continuously. However, ffserver (by default) starts
       sending data in realtime. This means that there is a pause of a few
       seconds while the buffering is being done by the player. The good news
       is that this can be cured by adding a '?buffer=5' to the end of the
       URL. This means that the stream should start 5 seconds in the past --
       and so the first 5 seconds of the stream are sent as fast as the
       network will allow. It will then slow down to real time. This
       noticeably improves the startup experience.

       You can also add a 'Preroll 15' statement to ffserver.conf, which adds
       15 seconds of prebuffering to all requests that do not otherwise
       specify a time (see the example after these tips). In addition,
       ffserver will skip frames until a key_frame is found. This further
       reduces the startup delay by not transferring data that will be
       discarded.

       * You may want to adjust the MaxBandwidth in the ffserver.conf to limit
       the amount of bandwidth consumed by live streams.
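
       As an illustration of both tips, using the stream and feed names from
       the earlier example (the numeric values are arbitrary, not defaults),
       the client-side buffering request and the server-side settings could
       look like this:

	       http://<linuxbox>:8090/test.asf?buffer=5

	       # in ffserver.conf
	       MaxBandwidth 10000

	       <Stream test.asf>
	       Feed feed1.ffm
	       Format asf
	       Preroll 15
	       </Stream>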

   Why does the ?buffer / Preroll stop working after a time?
       It turns out that (on my machine at least) the number of frames
       successfully grabbed is marginally less than the number that ought to
       be grabbed, so the timestamp in the encoded data stream falls behind
       realtime.  As a result, if you say 'Preroll 10', then once the stream
       gets 10 or more seconds behind, there is no Preroll left.

       Fixing this requires a change in the internals of how timestamps are
       handled.

   Does the "?date=" stuff work?
       Yes (subject to the limitation outlined above). Also note that whenever
       you start ffserver, it deletes the ffm file (if any parameters have
       changed), thus wiping out what you had recorded before.

       The format of the "?date=xxxxxx" is fairly flexible. You should use one
       of the following formats (the 'T' is literal):

	       * YYYY-MM-DDTHH:MM:SS	 (localtime)
	       * YYYY-MM-DDTHH:MM:SSZ	 (UTC)

       You can omit the YYYY-MM-DD, and then it refers to the current day.
       However note that ?date=16:00:00 refers to 16:00 on the current day --
       this may be in the future and so is unlikely to be useful.

       You use this by adding the ?date= to the end of the URL for the stream.
       For example:   http://localhost:8080/test.asf?date=2002-07-26T23:05:00.

   What are FFM and FFM2?
       FFM and FFM2 are formats used by ffserver. They allow storing a wide
       variety of video and audio streams and encoding options, and can store
       a moving time segment of an infinite movie or a whole movie.

       FFM is version specific, and there is only limited compatibility
       between FFM files generated by one version of ffmpeg/ffserver and
       another. It may work but it is not guaranteed to work.

       FFM2 is extensible while maintaining compatibility and should work
       between differing versions of tools. FFM2 is the default.

OPTIONS
       All numerical options, if not specified otherwise, accept as input a
       string representing a number, which may contain one of the SI unit
       prefixes, for example 'K', 'M', or 'G'.	If 'i' is appended after the
       prefix, binary prefixes are used, which are based on powers of 1024
       instead of powers of 1000.  The 'B' postfix multiplies the value by 8,
       and can be appended after a unit prefix or used alone. This allows
       using, for example, 'KB', 'MiB', 'G' and 'B' as number postfixes.
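
       As a worked illustration of these rules, using 200 as the number:

	       200K	= 200 * 1000	     = 200000
	       200Ki	= 200 * 1024	     = 204800
	       200KB	= 200 * 1000 * 8     = 1600000
	       200KiB	= 200 * 1024 * 8     = 1638400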

       Options which do not take arguments are boolean options, and set the
       corresponding value to true. They can be set to false by prefixing the
       option name with "no": for example, using "-nofoo" on the command line
       will set the boolean option named "foo" to false.

   Stream specifiers
       Some options are applied per-stream, e.g. bitrate or codec. Stream
       specifiers are used to precisely specify which stream(s) a given
       option belongs to.

       A stream specifier is a string generally appended to the option name
       and separated from it by a colon. E.g. the "-codec:a:1 ac3" option
       contains the "a:1" stream specifier, which matches the second audio
       stream. It would therefore select the ac3 codec for the second audio
       stream.

       A stream specifier can match several streams, in which case the option
       is applied to all of them. E.g. the stream specifier in "-b:a 128k"
       matches all audio streams.

       An empty stream specifier matches all streams, for example "-codec
       copy" or "-codec: copy" would copy all the streams without reencoding.

       Possible forms of stream specifiers are:

       stream_index
	   Matches the stream with this index. E.g. "-threads:1 4" would set
	   the thread count for the second stream to 4.

       stream_type[:stream_index]
	   stream_type is one of: 'v' for video, 'a' for audio, 's' for
	   subtitle, 'd' for data and 't' for attachments. If stream_index is
	   given, then matches stream number stream_index of this type.
	   Otherwise matches all streams of this type.

       p:program_id[:stream_index]
	   If stream_index is given, then matches stream number stream_index
	   in program with id program_id. Otherwise matches all streams in
	   this program.

       #stream_id
	   Matches the stream by format-specific ID.
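
       For example, combining several of the forms above in one ffmpeg
       command (the file names are only illustrative, and "-map 0" is used
       here simply to keep all input streams in the output), the following
       would copy all video streams, encode all audio streams at 128k, and
       additionally force the ac3 codec on the second audio stream:

	       ffmpeg -i input.mkv -map 0 -codec:v copy -b:a 128k -codec:a:1 ac3 output.mkv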

   Generic options
       These options are shared amongst the av* tools.

       -L  Show license.

       -h, -?, -help, --help [arg]
	   Show help. An optional parameter may be specified to print help
	   about a specific item.

	   Possible values of arg are:

	   decoder=decoder_name
	       Print detailed information about the decoder named
	       decoder_name. Use the -decoders option to get a list of all
	       decoders.

	   encoder=encoder_name
	       Print detailed information about the encoder named
	       encoder_name. Use the -encoders option to get a list of all
	       encoders.

	   demuxer=demuxer_name
	       Print detailed information about the demuxer named
	       demuxer_name. Use the -formats option to get a list of all
	       demuxers and muxers.

	   muxer=muxer_name
	       Print detailed information about the muxer named muxer_name.
	       Use the -formats option to get a list of all muxers and
	       demuxers.
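
	   For example, to print the private options of two commonly built-in
	   components (the component names are only examples; use the
	   -encoders and -decoders options to see what your build provides):

		   ffmpeg -h encoder=libmp3lame
		   ffmpeg -h decoder=h264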

       -version
	   Show version.

       -formats
	   Show available formats.

	   The fields preceding the format names have the following meanings:

	   D   Decoding available

	   E   Encoding available

       -codecs
	   Show all codecs known to libavcodec.

	   Note that the term 'codec' is used throughout this documentation as
	   a shortcut for what is more correctly called a media bitstream
	   format.

       -decoders
	   Show available decoders.

       -encoders
	   Show all available encoders.

       -bsfs
	   Show available bitstream filters.

       -protocols
	   Show available protocols.

       -filters
	   Show available libavfilter filters.

       -pix_fmts
	   Show available pixel formats.

       -sample_fmts
	   Show available sample formats.

       -layouts
	   Show channel names and standard channel layouts.

       -loglevel loglevel | -v loglevel
	   Set the logging level used by the library.  loglevel is a number or
	   a string containing one of the following values:

	   quiet
	   panic
	   fatal
	   error
	   warning
	   info
	   verbose
	   debug

	   By default the program logs to stderr. If coloring is supported by
	   the terminal, colors are used to mark errors and warnings. Log
	   coloring can be disabled by setting the environment variable
	   AV_LOG_FORCE_NOCOLOR or NO_COLOR, or can be forced by setting the
	   environment variable AV_LOG_FORCE_COLOR.  The use of the
	   environment variable NO_COLOR is deprecated and will be dropped in
	   a following FFmpeg version.

       -report
	   Dump full command line and console output to a file named
	   "program-YYYYMMDD-HHMMSS.log" in the current directory.  This file
	   can be useful for bug reports.  It also implies "-loglevel
	   verbose".

	   Setting the environment variable "FFREPORT" to any value has the
	   same effect. If the value is a ':'-separated key=value sequence,
	   these options will affect the report; option values must be
	   escaped if they contain special characters or the options delimiter
	   ':' (see the ``Quoting and escaping'' section in the ffmpeg-utils
	   manual). The following option is recognized:

	   file
	       set the file name to use for the report; %p is expanded to the
	       name of the program, %t is expanded to a timestamp, "%%" is
	       expanded to a plain "%"

	   Errors in parsing the environment variable are not fatal, and will
	   not appear in the report.
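
	   For example, to write the report to a file whose name is built
	   from the program name and a timestamp (only the "file" key
	   described above is used here, and the configuration path shown is
	   simply the default location):

		   FFREPORT=file=%p-%t.log ffserver -f /etc/ffserver.conf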

       -cpuflags flags (global)
	   Allows setting and clearing cpu flags. This option is intended for
	   testing. Do not use it unless you know what you're doing.

		   ffmpeg -cpuflags -sse+mmx ...
		   ffmpeg -cpuflags mmx ...
		   ffmpeg -cpuflags 0 ...

   AVOptions
       These options are provided directly by the libavformat, libavdevice and
       libavcodec libraries. To see the list of available AVOptions, use the
       -help option. They are separated into two categories:

       generic
	   These options can be set for any container, codec or device.
	   Generic options are listed under AVFormatContext options for
	   containers/devices and under AVCodecContext options for codecs.

       private
	   These options are specific to the given container, device or codec.
	   Private options are listed under their corresponding
	   containers/devices/codecs.

       For example, to write an ID3v2.3 header instead of the default ID3v2.4
       to an MP3 file, use the id3v2_version private option of the MP3 muxer:

	       ffmpeg -i input.flac -id3v2_version 3 out.mp3

       All codec AVOptions are obviously per-stream, so the chapter on stream
       specifiers applies to them.

       Note that the -nooption syntax cannot be used for boolean AVOptions;
       use -option 0 or -option 1 instead.

       Note also that the old, undocumented way of specifying per-stream
       AVOptions by prepending v/a/s to the option name is now obsolete and
       will be removed soon.

   Main options
       -f configfile
	   Use configfile instead of /etc/ffserver.conf.

       -n  Enable no-launch mode. This option disables all the Launch
	   directives within the various <Stream> sections. Since ffserver
	   will not launch any ffmpeg instances, you will have to launch them
	   manually.

       -d  Enable debug mode. This option increases log verbosity and directs
	   log messages to stdout.
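
       For example, to start ffserver with debug output, without launching
       any ffmpeg instances, and with an explicit configuration file (the
       path shown is simply the default location):

	       ffserver -d -n -f /etc/ffserver.conf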

SEE ALSO
       The doc/ffserver.conf example, ffmpeg(1), ffplay(1), ffprobe(1),
       ffmpeg-utils(1), ffmpeg-scaler(1), ffmpeg-resampler(1),
       ffmpeg-codecs(1), ffmpeg-bitstream-filters(1), ffmpeg-formats(1),
       ffmpeg-devices(1), ffmpeg-protocols(1), ffmpeg-filters(1)

AUTHORS
       The FFmpeg developers.

       For details about the authorship, see the Git history of the project
       (git://source.ffmpeg.org/ffmpeg), e.g. by typing the command git log in
       the FFmpeg source directory, or browsing the online repository at
       <http://source.ffmpeg.org>.

       Maintainers for the specific components are listed in the file
       MAINTAINERS in the source code tree.

				  2014-05-14			   FFSERVER(1)