curl(1)			   Curl Manual			  curl(1)

NAME
       curl - transfer a URL

SYNOPSIS
       curl [options] [URL...]

DESCRIPTION
       curl is a client to get documents/files from or send docu-
       ments to a server, using any of	the  supported	protocols
       (HTTP,  HTTPS,  FTP,  GOPHER, DICT, TELNET, LDAP or FILE).
       The command is designed to work without	user  interaction
       or any kind of interactivity.

       curl offers a busload of useful tricks like proxy support,
       user authentication, ftp upload, HTTP post,  SSL	 (https:)
       connections, cookies, file transfer resume and more.

URL
       The  URL	 syntax	 is  protocol  dependent.  You'll  find a
       detailed description in RFC 2396.

       You can specify multiple URLs or parts of URLs by  writing
       part sets within braces as in:

	http://site.{one,two,three}.com

       or  you	can get sequences of alphanumeric series by using
       [] as in:

	ftp://ftp.numericals.com/file[1-100].txt
	ftp://ftp.numericals.com/file[001-100].txt  (with leading zeros)
	ftp://ftp.letters.com/file[a-z].txt

       It  is  possible	 to  specify up to 9 sets or series for a
       URL, but no nesting is supported at the moment:

	http://www.any.org/archive[1996-1999]/volume[1-4]part{a,b,c,index}.html

       You  can	 specify  any amount of URLs on the command line.
       They will be fetched in a sequential manner in the  speci-
       fied order.

       Curl  will attempt to re-use connections for multiple file
       transfers, so that getting many files from the same server
       will  not do multiple connects / handshakes. This improves
       speed. Of course this is only done on files specified on a
       single  command	line  and cannot be used between separate
       curl invokes.

OPTIONS
       -a/--append
	      (FTP) When used in a ftp	upload,	 this  will  tell
	      curl  to append to the target file instead of over-
	      writing it. If the file doesn't exist, it	 will  be
	      created.

	      If  this	option is used twice, the second one will
	      disable append mode again.

       -A/--user-agent <agent string>
	      (HTTP) Specify the User-Agent string to send to the
	      HTTP server. Some badly done CGIs fail if it's not
	      set to "Mozilla/4.0".   To  encode  blanks  in  the
	      string,  surround	 the  string  with  single  quote
	      marks.  This can also be set with	 the  -H/--header
	      flag of course.

	      If  this option is set more than once, the last one
	      will be the one that's used.
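
	      For example (the agent string and host below are only
	      placeholders), to pose as an old Netscape browser:

	      curl -A 'Mozilla/4.0 (compatible)' http://www.example.com/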

       -b/--cookie <name=data>
	      (HTTP) Pass the  data  to	 the  HTTP  server  as	a
	      cookie.	It  is	supposedly  the	 data  previously
	      received from the server in a  "Set-Cookie:"  line.
	      The  data	 should	 be  in the format "NAME1=VALUE1;
	      NAME2=VALUE2".

	      If no '=' letter is used in the line, it is treated
	      as  a  filename  to  use	to read previously stored
	      cookie lines from, which should  be  used	 in  this
	      session if they match. Using this method also acti-
	      vates the "cookie	 parser"  which	 will  make  curl
	      record  incoming cookies too, which may be handy if
	      you're  using  this   in	 combination   with   the
	      -L/--location  option.  The file format of the file
	      to read cookies from should be plain  HTTP  headers
	      or the Netscape/Mozilla cookie file format.

	      NOTE  that  the  file specified with -b/--cookie is
	      only used as input. No cookies will  be  stored  in
	      the  file.  To store cookies, save the HTTP headers
	      to a file using -D/--dump-header!

	      If this option is set more than once, the last  one
	      will be the one that's used.
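
	      For example (www.example.com is a placeholder host), to
	      send two cookies directly, or to read cookies from a
	      previously saved header file:

	      curl -b 'NAME1=VALUE1; NAME2=VALUE2' http://www.example.com/
	      curl -b headers.txt http://www.example.com/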

       -B/--use-ascii
	      Use ASCII transfer when getting an FTP file or LDAP
	      info. For FTP, this can also be enforced	by  using
	      an URL that ends with ";type=A". This option causes
	      data sent to stdout to be in text	 mode  for  win32
	      systems.

	      If  this	option is used twice, the second one will
	      disable ASCII usage.

       --ciphers <list of ciphers>
	      (SSL) Specifies which ciphers to use in the connec-
	      tion.  The  list	of  ciphers  must  be using valid
	      ciphers. Read up on SSL cipher list details on this
	      URL:  http://www.openssl.org/docs/apps/ciphers.html
	      (Option added in curl 7.9)

	      If this option is used several times, the last  one
	      will override the others.

       --connect-timeout <seconds>
	      Maximum  time in seconds that you allow the connec-
	      tion to the server to take.  This only  limits  the
	      connection  phase,  once	curl  has  connected this
	      option is of no more use. See also  the  --max-time
	      option.

	      If  this option is used several times, the last one
	      will be used.

       -c/--cookie-jar <file name>
	      Specify to which file you want curl  to  write  all
	      cookies  after  a	 completed operation. Curl writes
	      all cookies previously read from a  specified  file
	      as   well	 as  all  cookies  received  from  remote
	      server(s). If no cookies are known, no file will be
	      written.	 The  file  will  be  written  using  the
	      Netscape cookie file format. If you  set	the  file
	      name  to	a  single  dash, "-", the cookies will be
	      written to stdout. (Option added in curl 7.9)

	      If this option is used several times, the last
	      specified file name will be used.
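
	      For example (file and host names are placeholders), to
	      read cookies from one file and write the final set of
	      cookies to another:

	      curl -b old-cookies.txt -c new-cookies.txt http://www.example.com/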

       -C/--continue-at <offset>
	      Continue/Resume  a  previous  file  transfer at the
	      given offset. The given offset is the exact  number
	      of  bytes	 that  will  be	 skipped counted from the
	      beginning of the source file before it is trans-
	      ferred to the destination. If used with uploads,
	      the ftp server command SIZE will	not  be	 used  by
	      curl.

	      Use  "-C	-" to tell curl to automatically find out
	      where/how to resume the transfer. It then uses  the
	      given output/input files to figure that out.

	      If  this option is used several times, the last one
	      will be used.
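
	      For example (placeholder URL), to let curl work out the
	      resume offset from the partially downloaded local file:

	      curl -C - -o download.zip http://www.example.com/download.zip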

       --create-dirs
	      When used in conjunction with the -o  option,  curl
	      will create the necessary local directory hierarchy
	      as needed.

       --crlf (FTP) Convert LF to CRLF in upload. Useful for  MVS
	      (OS/390).

	      If this option is used twice, the second will again
	      disable crlf converting.

       -d/--data <data>
	      (HTTP) Sends the specified data in a  POST  request
	      to the HTTP server, in a way that emulates a user
	      having filled in an HTML form and pressed the
	      submit  button.  Note that the data is sent exactly
	      as specified with no  extra  processing  (with  all
	      newlines	cut  off).   The  data	is expected to be
	      "url-encoded". This will cause  curl  to	pass  the
	      data  to the server using the content-type applica-
	      tion/x-www-form-urlencoded. Compare to -F. If  more
	      than  one -d/--data option is used on the same com-
	      mand line, the data pieces specified will be merged
	      together	with  a	 separating &-letter. Thus, using
	      '-d name=daniel -d skill=lousy'  would  generate	a
	      post	  chunk	      that	 looks	     like
	      'name=daniel&skill=lousy'.

	      If you start the data with the letter @,	the  rest
	      should  be  a file name to read the data from, or -
	      if you want curl to read the data from stdin.   The
	      contents	of  the file must already be url-encoded.
	      Multiple files can also be specified. Posting  data
	      from  a file named 'foobar' would thus be done with
	      "--data @foobar".

	      To post data purely binary, you should instead  use
	      the --data-binary option.

	      -d/--data is the same as --data-ascii.

	      If this option is used several times, the ones fol-
	      lowing the first will append data.
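
	      For example (placeholder host), the following two
	      command lines send the same POST body:

	      curl -d 'name=daniel&skill=lousy' http://www.example.com/form.cgi
	      curl -d name=daniel -d skill=lousy http://www.example.com/form.cgi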

       --data-ascii <data>
	      (HTTP) This is an alias for the -d/--data option.

	      If this option is used several times, the ones fol-
	      lowing the first will append data.

       --data-binary <data>
	      (HTTP)  This  posts  data	 in  a	similar manner as
	      --data-ascii does, although when using this  option
	      the entire content of the posted data is kept as-
	      is. If you want to post a binary file  without  the
	      strip-newlines  feature of the --data-ascii option,
	      this is for you.

	      If this option is used several times, the ones fol-
	      lowing the first will append data.

       --disable-epsv
	      (FTP) Tell curl to disable the use of the EPSV com-
	      mand when doing passive FTP  downloads.  Curl  will
	      normally	always	first  attempt to use EPSV before
	      PASV, but with this option, it will not  try  using
	      EPSV.

	      If  this	option is used several times, each occur-
	      rence will toggle this on/off.

       -D/--dump-header <file>
	      Write the protocol headers to the specified file.

	      This option is handy to use when you want to  store
	      the  cookies  that  a  HTTP  site sends to you. The
	      cookies could then be read in a second curl  invoke
	      by using the -b/--cookie option!

	      When used on FTP, the ftp server response lines are
	      considered  being	 "headers"  and	 thus  are  saved
	      there.

	      If  this option is used several times, the last one
	      will be used.
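
	      For example (placeholder host), a first invoke can save
	      the headers, including any cookies, which a second
	      invoke then reads with -b/--cookie:

	      curl -D headers.txt http://www.example.com/login.cgi
	      curl -b headers.txt http://www.example.com/members/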

       -e/--referer <URL>
	      (HTTP) Sends the "Referer Page" information to  the
	      HTTP   server.  This  can	 also  be  set	with  the
	      -H/--header  flag	 of  course.   When   used   with
	      -L/--location you can append ";auto" to the referer
	      URL to make curl automatically set the previous URL
	      when  it	follows	 a  Location: header. The ";auto"
	      string can be used alone, even if you don't set  an
	      initial referer.

	      If  this option is used several times, the last one
	      will be used.
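
	      For example (placeholder URLs), to send a referer and
	      have curl update it while following redirects:

	      curl -e 'http://www.example.com/;auto' -L http://www.example.com/page.html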

       --environment
	      (RISC OS ONLY) Sets a range  of  environment  vari-
	      ables,  using  the names the -w option supports, to
	      easier allow extraction of useful information after
	      having run curl.

	      If  this	option is used several times, each occur-
	      rence will toggle this on/off.

       --egd-file <file>
	      (HTTPS) Specify the path name to the Entropy  Gath-
	      ering Daemon socket. The socket is used to seed the
	      random engine for SSL  connections.  See	also  the
	      --random-file option.

       -E/--cert <certificate[:password]>
	      (HTTPS) Tells curl to use the specified certificate
	      file when getting a file with HTTPS.  The	 certifi-
	      cate  must be in PEM format.  If the optional pass-
	      word isn't specified, it will be queried for on the
	      terminal. Note that this certificate is the private
	      key and the private certificate concatenated!

	      If this option is used several times, the last  one
	      will be used.

       --cacert <CA certificate>
	      (HTTPS) Tells curl to use the specified certificate
	      file to verify the peer. The file may contain  mul-
	      tiple  CA	 certificates. The certificate(s) must be
	      in PEM format.

	      If this option is used several times, the last  one
	      will be used.

       --capath <CA certificate directory>
	      (HTTPS) Tells curl to use the specified certificate
	      directory to verify the peer. The certificates must
	      be  in PEM format, and the directory must have been
	      processed using the c_rehash utility supplied  with
	      openssl.	Certificate directories are not supported
	      under Windows (because c_rehash uses symbolic
	      links  to	 create	 them).	 Using --capath can allow
	      curl to make  https  connections	much  more  effi-
	      ciently  than  using  --cacert if the --cacert file
	      contains many CA certificates.

	      If this option is used several times, the last  one
	      will be used.

       -f/--fail
	      (HTTP)  Fail  silently (no output at all) on server
	      errors. This is mostly done like this to better
	      enable scripts etc. to deal with failed
	      attempts. In normal cases when a HTTP server  fails
	      to  deliver  a document, it returns a HTML document
	      stating so (which	 often	also  describes	 why  and
	      more).  This flag will prevent curl from outputting
	      that and fail silently instead.

	      If this option is used twice, the second will again
	      disable silent failure.

       -F/--form <name=content>
	      (HTTP)  This  lets curl emulate a filled in form in
	      which a user has pressed the  submit  button.  This
	      causes  curl  to	POST  data using the content-type
	      multipart/form-data  according  to  RFC1867.   This
	      enables uploading of binary files etc. To force the
	      'content' part to be a file, prefix the file
	      name  with  an @ sign. To just get the content part
	      from a file, prefix the file name with  the  letter
	      <.  The  difference  between @ and < is then that @
	      makes a file get attached in the	post  as  a  file
	      upload, while the < makes a text field and just gets
	      the contents for that text field from a file.

	      Example, to send your password file to the  server,
	      where  'password'	 is the name of the form-field to
	      which /etc/passwd will be the input:

	      curl -F password=@/etc/passwd www.mypasswords.com

	      To read the file's content from stdin instead of a
	      file,  use  -  where  the file name should've been.
	      This goes for both @ and < constructs.

	      This option can be used multiple times.

       -g/--globoff
	      This option switches off the "URL globbing parser".
	      When you set this option, you can specify URLs that
	      contain the letters {}[] without having them  being
	      interpreted by curl itself. Note that these letters
	      are not normal legal URL contents but  they  should
	      be encoded according to the URI standard.

       -G/--get
	      When used, this option will make all data specified
	      with -d/--data or --data-binary to  be  used  in	a
	      HTTP  GET	 request instead of the POST request that
	      otherwise would be used. The data will be	 appended
	      to  the URL with a '?'  separator. (Option added in
	      curl 7.9)

	      If used in combination with -I, the POST data  will
	      instead be appended to the URL with a HEAD request.

	      If used multiple times, nothing special happens.
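
	      For example (placeholder host), this turns the two data
	      pieces into a query string, requesting
	      http://www.example.com/search?tool=curl&age=old:

	      curl -G -d tool=curl -d age=old http://www.example.com/search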

       -h/--help
	      Usage help.

       -H/--header <header>
	      (HTTP) Extra header to use when getting a web page.
	      You  may	specify any number of extra headers. Note
	      that if you should add a custom header that has the
	      same  name  as  one of the internal ones curl would
	      use,  your  externally  set  header  will	 be  used
	      instead  of  the	internal  one. This allows you to
	      make even trickier stuff than curl  would	 normally
	      do.  You	should not replace internally set headers
	      without knowing perfectly well what  you're  doing.
	      Replacing	 an internal header with one without con-
	      tent on the right side of the  colon  will  prevent
	      that header from appearing.

	      This   option   can   be	used  multiple	times  to
	      add/replace/remove multiple headers.
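
	      For example (placeholder host), to add a custom header
	      and suppress the internally generated User-Agent:
	      header:

	      curl -H 'X-Custom-Header: yes' -H 'User-Agent:' http://www.example.com/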

       -i/--include
	      (HTTP) Include the HTTP-header in the  output.  The
	      HTTP-header  includes things like server-name, date
	      of the document, HTTP-version and more...

	      If this option is used twice, the second will again
	      disable header include.

       --interface <name>
	      Perform  an  operation using a specified interface.
	      You can enter interface name, IP	address	 or  host
	      name. An example could look like:

	      curl --interface eth0:1 http://www.netscape.com/

	      If  this option is used several times, the last one
	      will be used.

       -I/--head
	      (HTTP/FTP) Fetch the HTTP-header only! HTTP-servers
	      feature  the  command  HEAD  which this uses to get
	      nothing but the header of a document. When used  on
	      a FTP file, curl displays the file size only.

	      If this option is used twice, the second will again
	      disable header only.

       -j/--junk-session-cookies
	      (HTTP) When curl is told to  read	 cookies  from	a
	      given  file,  this  option will make it discard all
	      "session cookies". This will basicly have the  same
	      effect  as  if  a	 new  session is started. Typical
	      browsers	always	discard	 session   cookies   when
	      they're closed down. (Added in 7.9.7)

	      If  this	option is used several times, each occur-
	      rence will toggle this on/off.

       -k/--insecure
	      (SSL) This option explicitly allows curl to perform
	      "insecure"  SSL connections and transfers. Starting
	      with  curl  7.10,	 all  SSL  connections	will   be
	      attempted	 to  be	 made secure by using the CA cer-
	      tificate bundle installed by  default.  This  makes
	      all   connections	 considered  "insecure"	 to  fail
	      unless -k/--insecure is used.

	      This option is ignored if --cacert or  --capath  is
	      used!

	      If  this option is used twice, the second time will
	      again disable it.

       --krb4 <level>
	      (FTP) Enable kerberos4 authentication and use.  The
	      level must be entered and should be one of 'clear',
	      'safe', 'confidential' or 'private'. Should you use
	      a	 level	that  is not one of these, 'private' will
	      instead be used.

	      If this option is used several times, the last  one
	      will be used.

       -K/--config <config file>
	      Specify  which  config  file to read curl arguments
	      from. The config file is a text file in which  com-
	      mand  line arguments can be written which then will
	      be used as if they were written on the actual  com-
	      mand  line.  Options  and	 their parameters must be
	      specified on the same  config  file  line.  If  the
	      parameter is to contain white spaces, the parameter
	      must be enclosed within quotes. If the first col-
	      umn  of  a config line is a '#' character, the rest
	      of the line will be treated as a comment.

	      Specify the filename as '-' to make curl	read  the
	      file from stdin.

	      Note that to be able to specify a URL in the config
	      file, you	 need  to  specify  it	using  the  --url
	      option,  and  not	 by simply writing the URL on its
	      own line. So, it could look similar to this:

	      url = "http://curl.haxx.se/docs/"

	      This option can be used multiple times.
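
	      Building on that, a small complete config file could
	      look like this sketch (the chosen options are only an
	      illustration); it would be used with "curl -K config-
	      file":

	      # fetch quietly, but still show an error message on failure
	      --silent
	      --show-error
	      url = "http://curl.haxx.se/docs/"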

       --limit-rate <speed>
	      Specify the maximum transfer rate you want curl  to
	      use.  This  feature is useful if you have a limited
	      pipe and you'd prefer that your transfer not
	      use your entire bandwidth.

	      The given speed is measured in bytes/second, unless
	      a suffix is appended. Appending  'k'  or	'K'  will
	      count the number as kilobytes, 'm' or 'M' makes it
	      megabytes while 'g'  or  'G'  makes  it  gigabytes.
	      Examples: 200K, 3m and 1G.

	      This option was introduced in curl 7.10.

	      If  this option is used several times, the last one
	      will be used.
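
	      For example (placeholder URL), to keep a download at
	      roughly 100 kilobytes per second:

	      curl --limit-rate 100K -O http://www.example.com/archive.tar.gz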

       -l/--list-only
	      (FTP) When listing an FTP	 directory,  this  switch
	      forces  a name-only view.	 Especially useful if you
	      want to machine-parse the contents of an FTP direc-
	      tory  since the normal directory view doesn't use a
	      standard look or format.

	      This option causes an FTP NLST command to be  sent.
	      Some  FTP servers list only files in their response
	      to NLST; they do	not  include  subdirectories  and
	      symbolic links.

	      If this option is used twice, the second will again
	      disable list only.

       -L/--location
	      (HTTP/HTTPS)  If	the  server  reports   that   the
	      requested	 page has a different location (indicated
	      with the header line Location:) this flag will make
	      curl attempt the request again at the new location.
	      If used together with -i or -I,  headers	from  all
	      requested pages will be shown. If this flag is used
	      when making a HTTP POST,	curl  will  automatically
	      switch to GET after the initial POST has been done.

	      If this option is used twice, the second will again
	      disable location following.

       -m/--max-time <seconds>
	      Maximum  time  in	 seconds that you allow the whole
	      operation to take.  This is useful  for  preventing
	      your  batch jobs from hanging for hours due to slow
	      networks or links going down.   This  doesn't  work
	      fully  in	 win32	systems.  See also the --connect-
	      timeout option.

	      If this option is used several times, the last  one
	      will be used.

       -M/--manual
	      Manual. Display the huge help text.

       -n/--netrc
	      Makes  curl scan the .netrc file in the user's home
	      directory for login name and password. This is typ-
	      ically  used  for	 ftp  on unix. If used with http,
	      curl will enable user authentication. See	 netrc(4)
	      or ftp(1) for details on the file format. Curl will
	      not complain if that file hasn't the right  permis-
	      sions  (it should not be world nor group readable).
	      The environment variable "HOME" is used to find the
	      home directory.

	      A	 quick	and very simple example of how to setup a
	      .netrc  to  allow	 curl  to  ftp	to  the	  machine
	      host.domain.com  with  user name 'myself' and pass-
	      word 'secret' should look similar to:

	      machine  host.domain.com	login	myself	 password
	      secret

	      If this option is used twice, the second will again
	      disable netrc usage.

       -N/--no-buffer
	      Disables the buffering of	 the  output  stream.  In
	      normal  work  situations,	 curl will use a standard
	      buffered output stream that will	have  the  effect
	      that  it will output the data in chunks, not neces-
	      sarily exactly when the data arrives.   Using  this
	      option will disable that buffering.

	      If this option is used twice, the second will again
	      switch on buffering.

       -o/--output <file>
	      Write output to <file> instead of	 stdout.  If  you
	      are using {} or [] to fetch multiple documents, you
	      can use '#' followed by  a  number  in  the  <file>
	      specifier.  That variable will be replaced with the
	      current string for the URL being fetched. Like in:

		curl http://{one,two}.site.com -o "file_#1.txt"

	      or use several variables like:

		curl http://{site,host}.host[1-5].com -o "#1_#2"

	      You may use this option as many times as the
	      number of URLs you have.

	      See  also	 the  --create-dirs  option to create the
	      local directories dynamically.

       -O/--remote-name
	      Write output to a local file named like the  remote
	      file we get. (Only the file part of the remote file
	      is used, the path is cut off.)

	      You may use this option as many times as the
	      number of URLs you have.

       -p/--proxytunnel
	      When  an HTTP proxy is used, this option will cause
	      non-HTTP protocols to attempt to tunnel through the
	      proxy  instead  of  merely using it to do HTTP-like
	      operations. The tunnel approach is  made	with  the
	      HTTP  proxy  CONNECT  request and requires that the
	      proxy allows direct connect to the remote port num-
	      ber curl wants to tunnel through to.

	      If this option is used twice, the second will again
	      disable proxy tunnel.

       -P/--ftpport <address>
	      (FTP) Reverses the  initiator/listener  roles  when
	      connecting with ftp. This switch makes Curl use the
	      PORT command instead of  PASV.  In  practice,  PORT
	      tells  the server to connect to the client's speci-
	      fied address and port, while PASV asks  the  server
	      for an ip address and port to connect to. <address>
	      should be one of:

	      interface	  i.e "eth0" to specify which interface's
			  IP address you want to use  (Unix only)

	      IP address  i.e "192.168.10.1" to specify exact  IP
			  number

	      host name	  i.e "my.host.domain" to specify machine

	      -		  (any single-letter string) to	 make  it
			  pick the machine's default

       If this option is used several times, the last one will be
       used.
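
       For example (placeholder server), to request an active (PORT)
       transfer using the machine's default address:

	      curl -P - -O ftp://ftp.example.com/README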

       -q     If used as the first parameter on the command line,
	      the $HOME/.curlrc file will not be read and used as
	      a config file.

       -Q/--quote <command>
	      (FTP) Send an arbitrary command to the  remote  FTP
	      server,  by  using the QUOTE command of the server.
	      Not all servers support this command, and	 the  set
	      of QUOTE commands is server specific! Quote com-
	      mands are sent BEFORE the transfer is taking place.
	      To  make	commands  take	place  after a successful
	      transfer, prefix them with  a  dash  '-'.	 You  may
	      specify any amount of commands to be run before and
	      after the transfer. If the server	 returns  failure
	      for  one of the commands, the entire operation will
	      be aborted.

	      This option can be used multiple times.
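
	      For example (placeholder server and file name), to
	      delete the remote file once it has been downloaded; the
	      quote command is prefixed with a dash so that it runs
	      after the transfer:

	      curl -Q '-DELE oldfile.txt' -O ftp://ftp.example.com/oldfile.txt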

       --random-file <file>
	      (HTTPS) Specify the path name to a file containing
	      what will be considered as random data. The data is
	      used to seed the random engine for SSL connections.
	      See also the --egd-file option.

       -r/--range <range>
	      (HTTP/FTP)  Retrieve  a  byte  range (i.e a partial
	      document) from a HTTP/1.1 or FTP server. Ranges can
	      be specified in a number of ways.

	      0-499	specifies the first 500 bytes

	      500-999	specifies the second 500 bytes

	      -500	specifies the last 500 bytes

	      9500-	specifies the bytes from offset 9500 and
			forward

	      0-0,-1	specifies  the	first	and   last   byte
			only (*)(H)

	      500-700,600-799
			specifies 300 bytes from offset 500(H)

	      100-199,500-599
			     specifies two separate 100-byte
			     ranges (*)(H)

       (*) = NOTE that this will cause the server to reply with a
       multipart response!

       You should also be aware that many HTTP/1.1 servers do not
       have this feature enabled, so that when you attempt to get
       a range, you'll instead get the whole document.

       FTP range downloads only support the simple syntax 'start-
       stop' (optionally with one of  the  numbers  omitted).  It
       depends on the non-RFC command SIZE.

       If this option is used several times, the last one will be
       used.
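
       For example (placeholder URL), to fetch only the first kilobyte
       of a document:

	      curl -r 0-1023 -o first-kilobyte.html http://www.example.com/index.html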

       -R/--remote-time
	      When used, this will make libcurl attempt to figure
	      out  the	timestamp of the remote file, and if that
	      is available make the  local  file  get  that  same
	      timestamp.

	      If  this option is used twice, the second time dis-
	      ables this again.

       -s/--silent
	      Silent mode. Don't show  progress	 meter	or  error
	      messages.	 Makes Curl mute.

	      If this option is used twice, the second will again
	      disable mute.

       -S/--show-error
	      When used with -s it makes curl show an error message
	      if it fails.

	      If this option is used twice, the second will again
	      disable show error.

       --stderr <file>
	      Redirect all writes to stderr to the specified file
	      instead.	If  the	 file  name is a plain '-', it is
	      instead written to stdout. This option has no point
	      when  you're  using a shell with decent redirecting
	      capabilities.

	      If this option is used several times, the last  one
	      will be used.

       -t/--telnet-option <OPT=val>
	      Pass  options  to	 the  telnet  protocol. Supported
	      options are:

	      TTYPE=<term> Sets the terminal type.

	      XDISPLOC=<X display> Sets the X display location.

	      NEW_ENV=<var,val> Sets an environment variable.

       -T/--upload-file <file>
	      This transfers the  specified  local  file  to  the
	      remote  URL. If there is no file part in the speci-
	      fied URL, Curl will append  the  local  file  name.
	      NOTE  that  you  must  use a trailing / on the last
	      directory to really prove to Curl that there is  no
	      file  name or curl will think that your last direc-
	      tory name is the remote file name to use. That will
	      most  likely cause the upload operation to fail. If
	      this is used on a http(s) server, the  PUT  command
	      will be used.

	      Use  the file name "-" (a single dash) to use stdin
	      instead of a given file.

	      If this option is used several times, the last  one
	      will be used.
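
	      For example (placeholder server), note the trailing
	      slash in the first command, which makes curl keep the
	      local file name; the second names the remote file
	      explicitly:

	      curl -T localfile.txt ftp://ftp.example.com/incoming/
	      curl -T localfile.txt ftp://ftp.example.com/incoming/remote.txt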

       --trace <file>
	      Enables  a full trace dump of all incoming and out-
	      going data, including descriptive	 information,  to
	      the  given output file. Use "-" as filename to have
	      the output sent to stdout.

	      If this option is used several times, the last  one
	      will be used. (Added in curl 7.9.7)

       --trace-ascii <file>
	      Enables  a full trace dump of all incoming and out-
	      going data, including descriptive	 information,  to
	      the  given output file. Use "-" as filename to have
	      the output sent to stdout.

	      This is very similar to --trace, but leaves out the
	      hex part and only shows the ASCII part of the dump.
	      It makes smaller output that  might  be  easier  to
	      read for untrained humans.

	      If  this option is used several times, the last one
	      will be used. (Added in curl 7.9.7)

       -u/--user <user:password>
	      Specify user and password to use when fetching. See
	      README.curl  for	detailed  examples  of how to use
	      this. If no password is specified,  curl	will  ask
	      for it interactively.

	      If  this option is used several times, the last one
	      will be used.
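
	      For example (placeholder credentials and host):

	      curl -u myself:secret -O ftp://ftp.example.com/private/file.txt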

       -U/--proxy-user <user:password>
	      Specify user and password to use for Proxy  authen-
	      tication.	 If  no	 password is specified, curl will
	      ask for it interactively.

	      If this option is used several times, the last  one
	      will be used.

       --url <URL>
	      Specify a URL to fetch. This option is mostly handy
	      when you want to specify URL(s) in a config file.

	      This option may be used any  number  of  times.  To
	      control  where  this  URL is written, use the -o or
	      the -O options.

       -v/--verbose
	      Makes the fetching more  verbose/talkative.  Mostly
	      usable for debugging. A line starting with '>' means
	      data sent by curl, '<' means data received by curl
	      that is normally hidden, and a line starting with
	      '*' means additional info provided by curl.

	      Note that if you want to see HTTP	 headers  in  the
	      output, -i/--include might be the option you're looking
	      for.

	      If this option is used twice, the second will again
	      disable verbose.

       -V/--version
	      Displays	the  full  version  of	curl, libcurl and
	      other 3rd party  libraries  linked  with	the  exe-
	      cutable.

       -w/--write-out <format>
	      Defines  what to display after a completed and suc-
	      cessful operation. The format is a string that  may
	      contain  plain  text mixed with any number of vari-
	      ables. The string can be specified as "string",  to
	      get  read	 from  a  particular  file you specify it
	      "@filename" and to tell curl  to	read  the  format
	      from stdin you write "@-".

	      The  variables present in the output format will be
	      substituted by the value or text that  curl  thinks
	      fit,  as	described below. All variables are speci-
	      fied like %{variable_name} and to output a normal %
	      you just write it as %%. You can output a new-
	      line by using \n, a carriage return with \r  and	a
	      tab space with \t.

	      NOTE:  The  %-letter  is	a  special  letter in the
	      win32-environment, where all occurrences of %  must
	      be doubled when using this option.

	      Available variables are at this point:

	      url_effective  The  URL that was fetched last. This
			     is mostly meaningful if you've  told
			     curl to follow location: headers.

	      http_code	     The numerical code that was found in
			     the last retrieved HTTP(S) page.

	      time_total     The total time, in seconds, that the
			     full operation lasted. The time will
			     be displayed with millisecond  reso-
			     lution.

	      time_namelookup
			     The  time,	 in seconds, it took from
			     the start until the  name	resolving
			     was completed.

	      time_connect   The  time,	 in seconds, it took from
			     the start until the connect  to  the
			     remote  host  (or	proxy)	was  com-
			     pleted.

	      time_pretransfer
			     The time, in seconds, it  took  from
			     the start until the file transfer is
			     just about to begin.  This	 includes
			     all  pre-transfer commands and nego-
			     tiations that are	specific  to  the
			     particular protocol(s) involved.

	      time_starttransfer
			     The  time,	 in seconds, it took from
			     the start until the  first	 byte  is
			     just about to be transferred. This
			     includes time_pretransfer	and  also
			     the  time the server needs to calcu-
			     late the result.

	      size_download  The total amount of bytes that  were
			     downloaded.

	      size_upload    The  total amount of bytes that were
			     uploaded.

	      size_header    The total amount  of  bytes  of  the
			     downloaded headers.

	      size_request   The  total amount of bytes that were
			     sent in the HTTP request.

	      speed_download The average download speed that curl
			     measured  for the complete download.

	      speed_upload   The average upload speed  that  curl
			     measured for the complete upload.

	      content_type   The  Content-Type	of  the requested
			     document, if there was  any.  (Added
			     in 7.9.5)

       If this option is used several times, the last one will be
       used.
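
       For example (placeholder URL), to print the HTTP code and total
       time of a silent fetch on their own line:

	      curl -s -o /dev/null -w '%{http_code} %{time_total}\n' http://www.example.com/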

       -x/--proxy <proxyhost[:port]>
	      Use specified HTTP proxy. If the port number is not
	      specified, it is assumed to be port 1080.

	      This  option  overrides  existing environment vari-
	      ables that set the proxy to use. If there's an envi-
	      ronment variable setting a proxy, you can set proxy
	      to "" to override it.

	      Note that all operations that are performed over	a
	      HTTP proxy will transparently be converted to HTTP.
	      It means that certain protocol specific  operations
	      might not be available. This is not the case if you
	      can tunnel through the  proxy,  as  done	with  the
	      -p/--proxytunnel option.

	      If  this option is used several times, the last one
	      will be used.
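
	      For example (placeholder proxy and host names):

	      curl -x proxy.example.com:8080 http://www.example.com/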

       -X/--request <command>
	      (HTTP) Specifies a custom request to use when  com-
	      municating  with	the  HTTP  server.  The specified
	      request will be used instead of the  standard  GET.
	      Read  the	 HTTP  1.1  specification for details and
	      explanations.

	      (FTP) Specifies a custom FTP command to use instead
	      of LIST when doing file lists with ftp.

	      If  this option is used several times, the last one
	      will be used.

       -y/--speed-time <time>
	      If a download is slower than speed-limit bytes  per
	      second  during  a	 speed-time  period, the download
	      gets aborted. If speed-time is  used,  the  default
	      speed-limit will be 1 unless set with -Y.

	      This  option  controls  transfers and thus will not
	      affect slow connects etc. If this is a concern  for
	      you, try the --connect-timeout option.

	      If  this option is used several times, the last one
	      will be used.

       -Y/--speed-limit <speed>
	      If a download is slower than this given  speed,  in
	      bytes  per  second,  for speed-time seconds it gets
	      aborted. speed-time is set with -y and is 30 if not
	      set.

	      If  this option is used several times, the last one
	      will be used.

       -z/--time-cond <date expression>
	      (HTTP) Request to get a file that has been modified
	      later than the given time and date, or one that has
	      been modified before that time. The date expression
	      can  be  all sorts of date strings or if it doesn't
	      match any internal ones, it tries to get	the  time
	      from a given file name instead! See the GNU date(1)
	      or curl_getdate(3) man pages  for	 date  expression
	      details.

	      Start  the  date expression with a dash (-) to make
	      it request for a document that is	 older	than  the
	      given  date/time,	 default  is  a	 document that is
	      newer than the specified date/time.

	      If this option is used several times, the last  one
	      will be used.
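
	      For example (placeholder file and URL), to fetch the
	      page only if it is newer than the local copy, or only
	      if it was modified after a given date:

	      curl -z local.html -o local.html http://www.example.com/index.html
	      curl -z '01 Jan 2002' http://www.example.com/index.html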

       -Z/--max-redirs <num>
	      Set   maximum   number   of  redirection-followings
	      allowed. If -L/--location is used, this option  can
	      be used to prevent curl from following redirections
	      "in absurdum".

	      If this option is used several times, the last  one
	      will be used.

       -3/--sslv3
	      (HTTPS) Forces curl to use SSL version 3 when nego-
	      tiating with a remote SSL server.

       -2/--sslv2
	      (HTTPS) Forces curl to use SSL version 2 when nego-
	      tiating with a remote SSL server.

       -0/--http1.0
	      (HTTP) Forces curl to issue its requests using HTTP
	      1.0 instead of its internally preferred HTTP
	      1.1.

       -#/--progress-bar
	      Make   curl   display  progress  information  as	a
	      progress bar instead of the default statistics.

	      If this option is used twice, the second will again
	      disable the progress bar.

FILES
       ~/.curlrc
	      Default config file.

ENVIRONMENT
       http_proxy [protocol://]<host>[:port]
	      Sets proxy server to use for HTTP.

       HTTPS_PROXY [protocol://]<host>[:port]
	      Sets proxy server to use for HTTPS.

       FTP_PROXY [protocol://]<host>[:port]
	      Sets proxy server to use for FTP.

       GOPHER_PROXY [protocol://]<host>[:port]
	      Sets proxy server to use for GOPHER.

       ALL_PROXY [protocol://]<host>[:port]
	      Sets  proxy  server  to use if no protocol-specific
	      proxy is set.

       NO_PROXY <comma-separated list of hosts>
	      List of host names that shouldn't go through any
	      proxy. If set to an asterisk '*' only, it matches
	      all hosts.
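
       In a Bourne-type shell, any of these can be set for a single
       invoke by prefixing the command line (the proxy host is a
       placeholder):

	      http_proxy=http://proxy.example.com:3128/ curl http://www.example.com/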

EXIT CODES
       A number of different error codes, with corresponding error
       messages, may appear during bad conditions. At the time of
       this writing, the exit codes are:

       1      Unsupported protocol. This build	of  curl  has  no
	      support for this protocol.

       2      Failed to initialize.

       3      URL malformat. The syntax was not correct.

       4      URL  user	 malformatted.	The  user-part of the URL
	      syntax was not correct.

       5      Couldn't resolve proxy. The given proxy host  could
	      not be resolved.

       6      Couldn't	resolve	 host.	The given remote host was
	      not resolved.

       7      Failed to connect to host.

       8      FTP weird server reply. The server sent  data  curl
	      couldn't parse.

       9      FTP access denied. The server denied login.

       10     FTP  user/password  incorrect.  Either  one or both
	      were not accepted by the server.

       11     FTP weird PASS reply. Curl couldn't parse the reply
	      sent to the PASS request.

       12     FTP weird USER reply. Curl couldn't parse the reply
	      sent to the USER request.

       13     FTP weird PASV reply, Curl couldn't parse the reply
	      sent to the PASV request.

       14     FTP  weird  227  format.	Curl  couldn't	parse the
	      227-line the server sent.

       15     FTP can't get host. Couldn't resolve the host IP we
	      got in the 227-line.

       16     FTP  can't  reconnect. Couldn't connect to the host
	      we got in the 227-line.

       17     FTP couldn't set binary. Couldn't	 change	 transfer
	      method to binary.

       18     Partial file. Only a part of the file was trans-
	      ferred.

       19     FTP couldn't download/access the	given  file,  the
	      RETR (or similar) command failed.

       20     FTP  write  error. The transfer was reported bad by
	      the server.

       21     FTP quote error. A  quote	 command  returned  error
	      from the server.

       22     HTTP  not	 found. The requested page was not found.
	      This return code only appears if --fail is used.

       23     Write error. Curl couldn't write data  to	 a  local
	      filesystem or similar.

       24     Malformat user. User name badly specified.

       25     FTP  couldn't STOR file. The server denied the STOR
	      operation.

       26     Read error. Various reading problems.

       27     Out of memory. A memory allocation request  failed.

       28     Operation	 timeout.  The	specified time-out period
	      was reached according to the conditions.

       29     FTP couldn't set	ASCII.	The  server  returned  an
	      unknown reply.

       30     FTP PORT failed. The PORT command failed.

       31     FTP couldn't use REST. The REST command failed.

       32     FTP couldn't use SIZE. The SIZE command failed. The
	      command is an extension to the  original	FTP  spec
	      RFC 959.

       33     HTTP  range error. The range "command" didn't work.

       34     HTTP post error. Internal	 post-request  generation
	      error.

       35     SSL connect error. The SSL handshaking failed.

       36     FTP  bad download resume. Couldn't continue an ear-
	      lier aborted download.

       37     FILE couldn't read file. Failed to open  the  file.
	      Permissions?

       38     LDAP cannot bind. LDAP bind operation failed.

       39     LDAP search failed.

       40     Library  not found. The LDAP library was not found.

       41     Function not found. A required  LDAP  function  was
	      not found.

       42     Aborted  by  callback.  An application told curl to
	      abort the operation.

       43     Internal error. A function was called  with  a  bad
	      parameter.

       44     Internal	error.	A  function  was  called in a bad
	      order.

       45     Interface error.	A  specified  outgoing	interface
	      could not be used.

       46     Bad  password  entered.  An error was signaled when
	      the password was entered.

       47     Too many redirects. When following redirects,  curl
	      hit the maximum amount.

       48     Unknown TELNET option specified.

       49     Malformed telnet option.

       51     The remote peer's SSL certificate wasn't ok

       52     The  server  didn't  reply  anything, which here is
	      considered an error.

       53     SSL crypto engine not found

       54     Cannot set SSL crypto engine as default

       55     Failed sending network data

       56     Failure in receiving network data

       57     Share is in use (internal error)

       58     Problem with the local certificate

       59     Couldn't use specified SSL cipher

       60     Problem with the CA cert (path? permission?)

       61     Unrecognized transfer encoding

       XX     There will appear more error codes here  in  future
	      releases.	 The  existing	ones  are  meant to never
	      change.

BUGS
       If you do find bugs, mail them to curl-bug@haxx.se.

AUTHORS / CONTRIBUTORS
       Daniel Stenberg is the main author, but the whole list  of
       contributors is found in the separate THANKS file.

WWW
       http://curl.haxx.se

FTP
       ftp://ftp.sunet.se/pub/www/utilities/curl/

SEE ALSO
       ftp(1), wget(1), snarf(1)

Curl 7.10		   11 Sep 2002			  curl(1)