hello world

commit fb1609f554
Timothee 'TTimo' Besset, 2011-11-22 15:28:15 -06:00
2155 changed files with 1,017,022 additions and 0 deletions

.gitignore vendored (new file, 1 line)

@@ -0,0 +1 @@
build

COPYING.txt (new file, 643 lines)

@@ -0,0 +1,643 @@
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
ADDITIONAL TERMS APPLICABLE TO THE DOOM 3 GPL SOURCE CODE.
The following additional terms (“Additional Terms”) supplement and modify the GNU General Public License, Version 3 (“GPL”) applicable to the Doom 3 GPL Source Code (“Doom 3 Source Code”). In addition to the terms and conditions of the GPL, the Doom 3 Source Code is subject to the further restrictions below.
1. Replacement of Section 15. Section 15 of the GPL shall be deleted in its entirety and replaced with the following:
“15. Disclaimer of Warranty.
THE PROGRAM IS PROVIDED WITHOUT ANY WARRANTIES, WHETHER EXPRESSED OR IMPLIED, INCLUDING, WITHOUT LIMITATION, IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE, NON-INFRINGEMENT, TITLE AND MERCHANTABILITY. THE PROGRAM IS BEING DELIVERED OR MADE AVAILABLE “AS IS”, “WITH ALL FAULTS” AND WITHOUT WARRANTY OR REPRESENTATION. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.”
2. Replacement of Section 16. Section 16 of the GPL shall be deleted in its entirety and replaced with the following:
“16. LIMITATION OF LIABILITY.
UNDER NO CIRCUMSTANCES SHALL ANY COPYRIGHT HOLDER OR ITS AFFILIATES, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, FOR ANY DAMAGES OR OTHER LIABILITY, INCLUDING ANY GENERAL, DIRECT, INDIRECT, SPECIAL, INCIDENTAL, CONSEQUENTIAL OR PUNITIVE DAMAGES ARISING FROM, OUT OF OR IN CONNECTION WITH THE USE OR INABILITY TO USE THE PROGRAM OR OTHER DEALINGS WITH THE PROGRAM(INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), WHETHER OR NOT ANY COPYRIGHT HOLDER OR SUCH OTHER PARTY RECEIVES NOTICE OF ANY SUCH DAMAGES AND WHETHER OR NOT SUCH DAMAGES COULD HAVE BEEN FORESEEN.”
3. LEGAL NOTICES; NO TRADEMARK LICENSE; ORIGIN. You must reproduce faithfully all trademark, copyright and other proprietary and legal notices on any copies of the Program or any other required author attributions. This license does not grant you rights to use any copyright holder's or any other party's name, logo, or trademarks. Neither the name of the copyright holder or its affiliates, or any other party who modifies and/or conveys the Program may be used to endorse or promote products derived from this software without specific prior written permission. The origin of the Program must not be misrepresented; you must not claim that you wrote the original Program. Altered source versions must be plainly marked as such, and must not be misrepresented as being the original Program.
4. INDEMNIFICATION. IF YOU CONVEY A COVERED WORK AND AGREE WITH ANY RECIPIENT OF THAT COVERED WORK THAT YOU WILL ASSUME ANY LIABILITY FOR THAT COVERED WORK, YOU HEREBY AGREE TO INDEMNIFY, DEFEND AND HOLD HARMLESS THE OTHER LICENSORS AND AUTHORS OF THAT COVERED WORK FOR ANY DAMAGES, DEMANDS, CLAIMS, LOSSES, CAUSES OF ACTION, LAWSUITS, JUDGMENTS, EXPENSES (INCLUDING WITHOUT LIMITATION REASONABLE ATTORNEYS' FEES AND EXPENSES) OR ANY OTHER LIABILITY ARISING FROM, RELATED TO OR IN CONNECTION WITH YOUR ASSUMPTIONS OF LIABILITY.

README.txt (new file, 435 lines)

@@ -0,0 +1,435 @@
Doom 3 GPL source release
=========================
This file contains the following sections:
GENERAL NOTES
LICENSE
GENERAL NOTES
=============
Game data and patching:
-----------------------
This source release does not contain any game data; the game data is still
covered by the original EULA, which must be obeyed as usual.
You must patch the game to the latest version.
Note that Doom 3 and Doom 3: Resurrection of Evil are available from the Steam store at
http://store.steampowered.com/app/9050/
http://store.steampowered.com/app/9070/
Other platforms, updated source code, security issues:
------------------------------------------------------
If you have obtained this source code several weeks after the time of release,
it is likely that you can find modified and improved versions of the engine in
various open source projects across the internet. Depending on your interest in
the source code, those may be a better starting point.
Compiling on win32:
-------------------
A project file for Microsoft Visual Studio 2010 is provided in neo\doom.sln.
We expect the solution file to be compatible with the Express releases.
You will need the Microsoft DirectX SDK installed as well.
If it does not reside in "C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)",
you will need to update the project files accordingly.
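If your SDK lives somewhere else, one way to redirect the projects is a small MSBuild
property sheet. The fragment below is only an illustrative sketch, not part of this release:
the file name DirectXSDK.props and the install path are hypothetical, and the shipped projects
already import property sheets such as _Common.props, so the DirectX paths may simply be
edited where they are already defined.

<?xml version="1.0" encoding="utf-8"?>
<!-- DirectXSDK.props (hypothetical file name): set the SDK location in one place,
     then import this sheet from the .vcxproj files that need it. -->
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- Assumed install location; change this to match your machine. -->
    <DXSDKDir>D:\Microsoft DirectX SDK (June 2010)\</DXSDKDir>
  </PropertyGroup>
  <ItemDefinitionGroup>
    <ClCompile>
      <!-- Compiler: add the SDK headers to the include search path. -->
      <AdditionalIncludeDirectories>$(DXSDKDir)Include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
    </ClCompile>
    <Link>
      <!-- Linker: add the 32-bit SDK libraries to the library search path. -->
      <AdditionalLibraryDirectories>$(DXSDKDir)Lib\x86;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
    </Link>
  </ItemDefinitionGroup>
</Project>

You would then add an <Import Project="DirectXSDK.props" /> line to each affected project,
or make the equivalent change wherever those directories are already set.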
Compiling on GNU/Linux x86:
---------------------------
The build system on GNU/Linux is based on SCons: http://www.scons.org/
Issue the scons command in the neo/ folder.
Compiling on MacOS X:
---------------------------
An Xcode 3.2 project is provided under neo/sys/osx/.
Back End Rendering of Stencil Shadows:
--------------------------------------
The Doom 3 GPL source code release does not include functionality enabling rendering
of stencil shadows via the “depth fail” method, a functionality commonly known as
"Carmack's Reverse".
MayaImport:
---------------------------
The code for our Maya export plugin is included; if you are a Maya licensee,
you can obtain the SDK from Autodesk.
LICENSE
=======
See COPYING.txt for the GNU GENERAL PUBLIC LICENSE
ADDITIONAL TERMS: The Doom 3 GPL Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU GPL which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
EXCLUDED CODE: The code described below and contained in the Doom 3 GPL Source Code release is not part of the Program covered by the GPL and is expressly excluded from its terms. You are solely responsible for obtaining from the copyright holder a license for such code and complying with the applicable license terms.
Curl library
---------------------------------------------------------------------------
lines file(s)
neo/curl/*, neo/curl/README
COPYRIGHT AND PERMISSION NOTICE
Copyright (c) 1996 - 2004, Daniel Stenberg, <daniel@haxx.se>.
All rights reserved.
Permission to use, copy, modify, and distribute this software for any purpose
with or without fee is hereby granted, provided that the above copyright
notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF THIRD PARTY RIGHTS. IN
NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
OR OTHER DEALINGS IN THE SOFTWARE.
Except as contained in this notice, the name of a copyright holder shall not
be used in advertising or otherwise to promote the sale, use or other dealings
in this Software without prior written authorization of the copyright holder.
JPEG library
-----------------------------------------------------------------------------
neo/renderer/jpeg-6/*
Copyright (C) 1991-1995, Thomas G. Lane
Permission is hereby granted to use, copy, modify, and distribute this
software (or portions thereof) for any purpose, without fee, subject to these
conditions:
(1) If any part of the source code for this software is distributed, then this
README file must be included, with this copyright and no-warranty notice
unaltered; and any additions, deletions, or changes to the original files
must be clearly indicated in accompanying documentation.
(2) If only executable code is distributed, then the accompanying
documentation must state that "this software is based in part on the work of
the Independent JPEG Group".
(3) Permission for use of this software is granted only if the user accepts
full responsibility for any undesirable consequences; the authors accept
NO LIABILITY for damages of any kind.
These conditions apply to any software derived from or based on the IJG code,
not just to the unmodified library. If you use our work, you ought to
acknowledge us.
NOTE: unfortunately the README that came with our copy of the library has
been lost, so the one from release 6b is included instead. There are a few
'glue type' modifications to the library to make it easier to use from
the engine, but otherwise the dependency can be easily cleaned up to a
better release of the library.
OggVorbis
---------------------------------------------------------------------------
neo/sound/OggVorbis/*
neo/sound/OggVorbis/ogg/README
neo/sound/OggVorbis/vorbis/README
Copyright (c) 2002, Xiph.org Foundation
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
- Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
- Neither the name of the Xiph.org Foundation nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE FOUNDATION
OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
PropTree
---------------------------------------------------------------------------
neo/tools/common/PropTree/*
Copyright (C) 1998-2001 Scott Ramsay
sramsay@gonavi.com
http://www.gonavi.com
This material is provided "as is", with absolutely no warranty expressed
or implied. Any use is at your own risk.
Permission to use or copy this software for any purpose is hereby granted
without fee, provided the above notices are retained on all copies.
Permission to modify the code and to distribute modified code is granted,
provided the above notices are retained, and a notice that the code was
modified is included with the above copyright notice.
If you use this code, drop me an email. I'd like to know if you find the code
useful.
OpenAL SDK
---------------------------------------------------------------------------
neo/openal/docs/*
neo/openal/include/*
neo/openal/lib/*
neo/openal/osx/*
/**
* OpenAL cross platform audio library
* Copyright (C) 1999-2000 by authors.
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 59 Temple Place - Suite 330,
* Boston, MA 02111-1307, USA.
* Or go to http://www.gnu.org/copyleft/lgpl.html
*/
Base64 implementation
---------------------------------------------------------------------------
lines file(s)
234 neo/idlib/Base64.cpp
Copyright (c) 1996 Lars Wirzenius. All rights reserved.
June 14 2003: TTimo <ttimo@idsoftware.com>
modified + endian bug fixes
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=197039
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT,
INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
IO on .zip files using portions of zlib
---------------------------------------------------------------------------
lines file(s)
4471 src/framework/Unzip.cpp
Copyright (C) 1998 Gilles Vollant
zlib is Copyright (C) 1995-1998 Jean-loup Gailly and Mark Adler
This software is provided 'as-is', without any express or implied
warranty. In no event will the authors be held liable for any damages
arising from the use of this software.
Permission is granted to anyone to use this software for any purpose,
including commercial applications, and to alter it and redistribute it
freely, subject to the following restrictions:
1. The origin of this software must not be misrepresented; you must not
claim that you wrote the original software. If you use this software
in a product, an acknowledgment in the product documentation would be
appreciated but is not required.
2. Altered source versions must be plainly marked as such, and must not be
misrepresented as being the original software.
3. This notice may not be removed or altered from any source distribution.
MD4 Message-Digest Algorithm
-----------------------------------------------------------------------------
lines file(s)
260 neo/idlib/hashing/MD4.cpp
Copyright (C) 1991-2, RSA Data Security, Inc. Created 1991. All
rights reserved.
License to copy and use this software is granted provided that it
is identified as the "RSA Data Security, Inc. MD4 Message-Digest
Algorithm" in all material mentioning or referencing this software
or this function.
License is also granted to make and use derivative works provided
that such works are identified as "derived from the RSA Data
Security, Inc. MD4 Message-Digest Algorithm" in all material
mentioning or referencing the derived work.
RSA Data Security, Inc. makes no representations concerning either
the merchantability of this software or the suitability of this
software for any particular purpose. It is provided "as is"
without express or implied warranty of any kind.
These notices must be retained in any copies of any part of this
documentation and/or software.
MD5 Message-Digest Algorithm
-----------------------------------------------------------------------------
lines file(s)
273 neo/idlib/hashing/MD5.cpp
This code implements the MD5 message-digest algorithm.
The algorithm is due to Ron Rivest. This code was
written by Colin Plumb in 1993, no copyright is claimed.
This code is in the public domain; do with it what you wish.
CRC32 Checksum
-----------------------------------------------------------------------------
lines file(s)
168 neo/idlib/hashing/CRC32.cpp
Copyright (C) 1995-1998 Mark Adler
OpenGL headers
---------------------------------------------------------------------------
lines file(s)
5920 neo/renderer/glext.h
613 neo/renderer/wglext.h
/*
** License Applicability. Except to the extent portions of this file are
** made subject to an alternative license as permitted in the SGI Free
** Software License B, Version 1.1 (the "License"), the contents of this
** file are subject only to the provisions of the License. You may not use
** this file except in compliance with the License. You may obtain a copy
** of the License at Silicon Graphics, Inc., attn: Legal Services, 1600
** Amphitheatre Parkway, Mountain View, CA 94043-1351, or at:
**
** http://oss.sgi.com/projects/FreeB
**
** Note that, as provided in the License, the Software is distributed on an
** "AS IS" basis, with ALL EXPRESS AND IMPLIED WARRANTIES AND CONDITIONS
** DISCLAIMED, INCLUDING, WITHOUT LIMITATION, ANY IMPLIED WARRANTIES AND
** CONDITIONS OF MERCHANTABILITY, SATISFACTORY QUALITY, FITNESS FOR A
** PARTICULAR PURPOSE, AND NON-INFRINGEMENT.
**
** Original Code. The Original Code is: OpenGL Sample Implementation,
** Version 1.2.1, released January 26, 2000, developed by Silicon Graphics,
** Inc. The Original Code is Copyright (c) 1991-2002 Silicon Graphics, Inc.
** Copyright in any portions created by third parties is as indicated
** elsewhere herein. All Rights Reserved.
**
** Additional Notice Provisions: This software was created using the
** OpenGL(R) version 1.2.1 Sample Implementation published by SGI, but has
** not been independently verified as being compliant with the OpenGL(R)
** version 1.2.1 Specification.
*/
NV-CONTROL X Extension
---------------------------------------------------------------------------
neo/sys/linux/libXNVCtrl/*
Copyright NVIDIA Corporation
ExtUtil.h
---------------------------------------------------------------------------
neo/sys/linux/extutil.h
/*
* $Xorg: extutil.h,v 1.4 2001/02/09 02:03:24 xorgcvs Exp $
*
Copyright 1989, 1998 The Open Group
Permission to use, copy, modify, distribute, and sell this software and its
documentation for any purpose is hereby granted without fee, provided that
the above copyright notice appear in all copies and that both that
copyright notice and this permission notice appear in supporting
documentation.
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
OPEN GROUP BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN
AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Except as contained in this notice, the name of The Open Group shall not be
used in advertising or otherwise to promote the sale, use or other dealings
in this Software without prior written authorization from The Open Group.
*
* Author: Jim Fulton, MIT The Open Group
*
* Xlib Extension-Writing Utilities
*
* This package contains utilities for writing the client API for various
* protocol extensions. THESE INTERFACES ARE NOT PART OF THE X STANDARD AND
* ARE SUBJECT TO CHANGE!
*/
OSS headers
---------------------------------------------------------------------------
neo/sys/linux/oss/*
Copyright by 4Front Technologies 1993-2004
Brandelf utility
---------------------------------------------------------------------------
lines file(s)
225 neo/sys/linux/setup/brandelf.c
/*-
* Copyright (c) 1996 Søren Schmidt
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer
* in this position and unchanged.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. The name of the author may not be used to endorse or promote products
* derived from this software without specific prior written permission
*
* THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
* IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
* OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
* IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT,
* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
* DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
* THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
* THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*
* $FreeBSD: src/usr.bin/brandelf/brandelf.c,v 1.16 2000/07/02 03:34:08 imp Exp $
*/
makeself - Make self-extractable archives on Unix
---------------------------------------------------------------------------
neo/sys/linux/setup/makeself/*, neo/sys/linux/setup/makeself/README
Copyright (c) Stéphane Peter
Licensing: GPL v2

base/default.cfg (new file, 1 line)

@@ -0,0 +1 @@
# empty file

neo/MayaImport.vcxproj (new file, 954 lines)

@@ -0,0 +1,954 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug with inlines and memory log|Win32">
<Configuration>Debug with inlines and memory log</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug with inlines|Win32">
<Configuration>Debug with inlines</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Dedicated Debug with inlines|Win32">
<Configuration>Dedicated Debug with inlines</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Dedicated Debug|Win32">
<Configuration>Dedicated Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Dedicated Release|Win32">
<Configuration>Dedicated Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectName>MayaImport</ProjectName>
<ProjectGuid>{49BEC5C6-B964-417A-851E-808886B574F1}</ProjectGuid>
<RootNamespace>MayaImport</RootNamespace>
<SccProjectName>
</SccProjectName>
<SccLocalPath>
</SccLocalPath>
<SccProvider>
</SccProvider>
<Keyword>Win32Proj</Keyword>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Dedicated Release|Win32'" Label="Configuration">
<ConfigurationType>DynamicLibrary</ConfigurationType>
<UseOfMfc>false</UseOfMfc>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug with inlines|Win32'" Label="Configuration">
<ConfigurationType>DynamicLibrary</ConfigurationType>
<UseOfMfc>false</UseOfMfc>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug|Win32'" Label="Configuration">
<ConfigurationType>DynamicLibrary</ConfigurationType>
<UseOfMfc>false</UseOfMfc>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug with inlines and memory log|Win32'" Label="Configuration">
<ConfigurationType>DynamicLibrary</ConfigurationType>
<UseOfMfc>false</UseOfMfc>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug with inlines|Win32'" Label="Configuration">
<ConfigurationType>DynamicLibrary</ConfigurationType>
<UseOfMfc>false</UseOfMfc>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>DynamicLibrary</ConfigurationType>
<UseOfMfc>false</UseOfMfc>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>DynamicLibrary</ConfigurationType>
<UseOfMfc>false</UseOfMfc>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Dedicated Release|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="_Common.props" />
<Import Project="_MayaImport.props" />
<Import Project="_Dedicated.props" />
<Import Project="_Release.props" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug with inlines|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="_Common.props" />
<Import Project="_MayaImport.props" />
<Import Project="_Dedicated.props" />
<Import Project="_Debug.props" />
<Import Project="_WithInlines.props" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="_Common.props" />
<Import Project="_MayaImport.props" />
<Import Project="_Dedicated.props" />
<Import Project="_Debug.props" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug with inlines and memory log|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="_Common.props" />
<Import Project="_MayaImport.props" />
<Import Project="_Debug.props" />
<Import Project="_WithInlines.props" />
<Import Project="_WithMemoryLog.props" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug with inlines|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="_Common.props" />
<Import Project="_MayaImport.props" />
<Import Project="_Debug.props" />
<Import Project="_WithInlines.props" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="_Common.props" />
<Import Project="_MayaImport.props" />
<Import Project="_Release.props" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="_Common.props" />
<Import Project="_MayaImport.props" />
<Import Project="_Debug.props" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<CodeAnalysisRuleSet Condition="'$(Configuration)|$(Platform)'=='Debug with inlines and memory log|Win32'">AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules Condition="'$(Configuration)|$(Platform)'=='Debug with inlines and memory log|Win32'" />
<CodeAnalysisRuleAssemblies Condition="'$(Configuration)|$(Platform)'=='Debug with inlines and memory log|Win32'" />
<CodeAnalysisRuleSet Condition="'$(Configuration)|$(Platform)'=='Debug with inlines|Win32'">AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules Condition="'$(Configuration)|$(Platform)'=='Debug with inlines|Win32'" />
<CodeAnalysisRuleAssemblies Condition="'$(Configuration)|$(Platform)'=='Debug with inlines|Win32'" />
<CodeAnalysisRuleSet Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" />
<CodeAnalysisRuleAssemblies Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" />
<CodeAnalysisRuleSet Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug with inlines|Win32'">AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug with inlines|Win32'" />
<CodeAnalysisRuleAssemblies Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug with inlines|Win32'" />
<CodeAnalysisRuleSet Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug|Win32'">AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug|Win32'" />
<CodeAnalysisRuleAssemblies Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug|Win32'" />
<CodeAnalysisRuleSet Condition="'$(Configuration)|$(Platform)'=='Dedicated Release|Win32'">AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules Condition="'$(Configuration)|$(Platform)'=='Dedicated Release|Win32'" />
<CodeAnalysisRuleAssemblies Condition="'$(Configuration)|$(Platform)'=='Dedicated Release|Win32'" />
<CodeAnalysisRuleSet Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" />
<CodeAnalysisRuleAssemblies Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" />
</PropertyGroup>
<ItemDefinitionGroup>
</ItemDefinitionGroup>
<ItemGroup>
<ClInclude Include="MayaImport\Maya4.5\maya.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\flib.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\ilib.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\M3dView.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAngle.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAnimControl.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAnimCurveChange.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAnimCurveClipboard.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAnimCurveClipboardItem.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAnimCurveClipboardItemArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAnimMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAnimUtil.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MApiVersion.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MArgDatabase.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MArgList.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MArgParser.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MArrayDataBuilder.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MArrayDataHandle.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAttributeIndex.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAttributeSpec.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MAttributeSpecArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MBoundingBox.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MColor.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MColorArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MCommandResult.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MComputation.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MConditionMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MCursor.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDagMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDagModifier.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDagPath.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDagPathArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDataBlock.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDataHandle.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDeviceChannel.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDeviceState.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDGContext.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDGMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDGModifier.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDistance.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDoubleArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDrawData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDrawInfo.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDrawRequest.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDrawRequestQueue.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDynSweptLine.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MDynSweptTriangle.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MEulerRotation.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MEvent.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MEventMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFeedbackLine.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFileIO.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFileObject.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFloatArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFloatMatrix.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFloatPoint.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFloatPointArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFloatVector.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFloatVectorArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFn.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnAirField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnAmbientLight.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnAnimCurve.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnAreaLight.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnArrayAttrsData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnBase.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnBlendShapeDeformer.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnBlinnShader.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnCamera.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnCharacter.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnCircleSweepManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnClip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnComponent.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnComponentListData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnCompoundAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnCurveSegmentManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDagNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDependencyNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDirectionalLight.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDirectionManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDiscManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDistanceManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDoubleArrayData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDoubleIndexedComponent.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDragField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnDynSweptGeometryData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnEnumAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnExpression.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnFluid.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnFreePointTriadManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnGenericAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnGeometryData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnGeometryFilter.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnGravityField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnIkEffector.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnIkHandle.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnIkJoint.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnIkSolver.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnIntArrayData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnLambertShader.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnLattice.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnLatticeData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnLatticeDeformer.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnLight.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnLightDataAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnManip3D.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnMatrixAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnMatrixData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnMesh.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnMeshData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnMessageAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnMotionPath.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNewtonField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNonAmbientLight.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNonExtendedLight.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNumericAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNumericData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNurbsCurve.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNurbsCurveData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNurbsSurface.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnNurbsSurfaceData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnPartition.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnPhongShader.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnPlugin.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnPluginData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnPointArrayData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnPointLight.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnPointOnCurveManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnPointOnSurfaceManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnRadialField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnReflectShader.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnSet.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnSingleIndexedComponent.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnSkinCluster.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnSphereData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnSpotLight.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnStateManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnStringArrayData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnStringData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnSubd.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnSubdData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnSubdNames.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnToggleManip.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnTransform.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnTripleIndexedComponent.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnTurbulenceField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnTypedAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnUInt64ArrayData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnUniformField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnUnitAttribute.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnVectorArrayData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnVolumeAxisField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnVortexField.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnWeightGeometryFilter.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MFnWireDeformer.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MGlobal.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MIkHandleGroup.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MIkSystem.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MImage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MIntArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItCurveCV.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItDag.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItDependencyGraph.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItDependencyNodes.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItGeometry.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItInstancer.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItKeyframe.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItMeshEdge.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItMeshFaceVertex.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItMeshPolygon.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItMeshVertex.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItSelectionList.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MItSurfaceCV.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MLibrary.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MLightLinks.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MManipData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MMaterial.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MMatrix.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MModelMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MNodeMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MObject.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MObjectArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\mocapserial.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\mocapserver.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\mocaptcp.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPlug.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPlugArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPoint.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPointArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxCommand.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxContext.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxContextCommand.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxDeformerNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxDragAndDropBehavior.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxEmitterNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxFieldNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxFileTranslator.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxGeometryData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxGeometryIterator.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxGlBuffer.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxHwShaderNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxIkSolver.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxIkSolverNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxLocatorNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxManipContainer.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxMidiInputDevice.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxPolyTrg.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxSelectionContext.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxSpringNode.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxSurfaceShape.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxSurfaceShapeUI.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxToolCommand.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxTransform.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MPxTransformationMatrix.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MQuaternion.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MRenderCallback.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MRenderData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MRenderShadowData.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MRenderUtil.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MRenderView.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MSceneMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MSelectInfo.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MSelectionList.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MSelectionMask.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MSimple.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MStatus.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MString.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MStringArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MSyntax.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MTesselationParams.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MTime.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MTimeArray.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MToolsInfo.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MTransformationMatrix.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MTypeId.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MTypes.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MUiMessage.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MUint64Array.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MVector.h" />
<ClInclude Include="MayaImport\Maya4.5\include\maya\MVectorArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya.h" />
<ClInclude Include="MayaImport\maya5.0\maya\flib.h" />
<ClInclude Include="MayaImport\maya5.0\maya\ilib.h" />
<ClInclude Include="MayaImport\maya5.0\maya\M3dView.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAngle.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAnimControl.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAnimCurveChange.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAnimCurveClipboard.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAnimCurveClipboardItem.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAnimCurveClipboardItemArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAnimMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAnimUtil.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MApiVersion.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MArgDatabase.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MArgList.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MArgParser.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MArrayDataBuilder.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MArrayDataHandle.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAttributeIndex.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAttributeSpec.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MAttributeSpecArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MBoundingBox.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MColor.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MColorArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MCommandResult.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MComputation.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MConditionMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MCursor.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDagMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDagModifier.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDagPath.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDagPathArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDataBlock.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDataHandle.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDeviceChannel.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDeviceState.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDGContext.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDGMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDGModifier.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDistance.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDoubleArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDrawData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDrawInfo.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDrawRequest.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDrawRequestQueue.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDynSweptLine.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MDynSweptTriangle.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MEulerRotation.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MEvent.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MEventMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFeedbackLine.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFileIO.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFileObject.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFloatArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFloatMatrix.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFloatPoint.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFloatPointArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFloatVector.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFloatVectorArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFn.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnAirField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnAmbientLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnAnimCurve.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnAreaLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnArrayAttrsData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnBase.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnBlendShapeDeformer.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnBlinnShader.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnCamera.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnCharacter.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnCircleSweepManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnClip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnComponent.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnComponentListData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnCompoundAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnCurveSegmentManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDagNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDependencyNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDirectionalLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDirectionManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDiscManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDistanceManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDoubleArrayData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDoubleIndexedComponent.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDragField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnDynSweptGeometryData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnEnumAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnExpression.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnFluid.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnFreePointTriadManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnGenericAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnGeometryData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnGeometryFilter.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnGravityField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnIkEffector.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnIkHandle.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnIkJoint.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnIkSolver.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnIntArrayData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnLambertShader.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnLattice.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnLatticeData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnLatticeDeformer.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnLightDataAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnManip3D.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnMatrixAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnMatrixData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnMesh.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnMeshData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnMessageAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnMotionPath.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNewtonField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNonAmbientLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNonExtendedLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNumericAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNumericData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNurbsCurve.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNurbsCurveData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNurbsSurface.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnNurbsSurfaceData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnParticleSystem.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnPartition.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnPhongShader.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnPlugin.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnPluginData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnPointArrayData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnPointLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnPointOnCurveManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnPointOnSurfaceManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnRadialField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnReflectShader.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnSet.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnSingleIndexedComponent.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnSkinCluster.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnSphereData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnSpotLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnStateManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnStringArrayData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnStringData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnSubd.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnSubdData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnSubdNames.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnToggleManip.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnTransform.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnTripleIndexedComponent.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnTurbulenceField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnTypedAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnUInt64ArrayData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnUniformField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnUnitAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnVectorArrayData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnVolumeAxisField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnVolumeLight.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnVortexField.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnWeightGeometryFilter.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MFnWireDeformer.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MGlobal.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MIkHandleGroup.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MIkSystem.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MImage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MIntArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MIOStream.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItCurveCV.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItDag.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItDependencyGraph.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItDependencyNodes.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItGeometry.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItInstancer.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItKeyframe.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItMeshEdge.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItMeshFaceVertex.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItMeshPolygon.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItMeshVertex.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItSelectionList.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MItSurfaceCV.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MLibrary.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MLightLinks.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MLockMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MManipData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MMaterial.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MMatrix.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MModelMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MNodeMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MObject.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MObjectArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\mocapserial.h" />
<ClInclude Include="MayaImport\maya5.0\maya\mocapserver.h" />
<ClInclude Include="MayaImport\maya5.0\maya\mocaptcp.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPlug.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPlugArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPoint.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPointArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPx3dModelView.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxCommand.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxContext.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxContextCommand.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxDeformerNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxDragAndDropBehavior.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxEmitterNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxFieldNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxFileTranslator.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxGeometryData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxGeometryIterator.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxGlBuffer.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxHwShaderNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxIkSolver.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxIkSolverNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxLocatorNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxManipContainer.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxMidiInputDevice.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxModelEditorCommand.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxObjectSet.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxPolyTrg.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxPolyTweakUVCommand.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxSelectionContext.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxSpringNode.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxSurfaceShape.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxSurfaceShapeUI.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxToolCommand.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxTransform.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MPxTransformationMatrix.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MQuaternion.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MRampAttribute.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MRenderCallback.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MRenderData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MRenderShadowData.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MRenderUtil.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MRenderView.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MSceneMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MSelectInfo.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MSelectionList.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MSelectionMask.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MSimple.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MStatus.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MString.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MStringArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MSyntax.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MTesselationParams.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MTime.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MTimeArray.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MToolsInfo.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MTransformationMatrix.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MTypeId.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MTypes.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MUiMessage.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MUint64Array.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MVector.h" />
<ClInclude Include="MayaImport\maya5.0\maya\MVectorArray.h" />
<ClInclude Include="MayaImport\Maya6.0\maya.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\flib.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\ilib.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\M3dView.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAngle.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAnimControl.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAnimCurveChange.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAnimCurveClipboard.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAnimCurveClipboardItem.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAnimCurveClipboardItemArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAnimMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAnimUtil.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MApiVersion.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MArgDatabase.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MArgList.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MArgParser.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MArrayDataBuilder.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MArrayDataHandle.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAttributeIndex.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAttributeSpec.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MAttributeSpecArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MBoundingBox.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MColor.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MColorArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MCommandResult.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MComputation.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MConditionMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MCursor.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDagMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDagModifier.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDagPath.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDagPathArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDataBlock.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDataHandle.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDeviceChannel.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDeviceState.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDGContext.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDGMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDGModifier.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDistance.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDoubleArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDrawData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDrawInfo.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDrawRequest.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDrawRequestQueue.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDynSweptLine.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MDynSweptTriangle.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MEulerRotation.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MEvent.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MEventMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFeedbackLine.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFileIO.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFileObject.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFloatArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFloatMatrix.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFloatPoint.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFloatPointArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFloatVector.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFloatVectorArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFn.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnAirField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnAmbientLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnAnimCurve.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnAreaLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnArrayAttrsData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnBase.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnBlendShapeDeformer.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnBlinnShader.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnCamera.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnCharacter.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnCircleSweepManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnClip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnComponent.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnComponentListData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnCompoundAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnCurveSegmentManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDagNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDependencyNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDirectionalLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDirectionManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDiscManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDistanceManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDoubleArrayData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDoubleIndexedComponent.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDragField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnDynSweptGeometryData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnEnumAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnExpression.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnFluid.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnFreePointTriadManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnGenericAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnGeometryData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnGeometryFilter.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnGravityField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnIkEffector.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnIkHandle.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnIkJoint.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnIkSolver.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnIntArrayData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnLambertShader.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnLattice.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnLatticeData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnLatticeDeformer.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnLightDataAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnManip3D.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnMatrixAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnMatrixData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnMesh.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnMeshData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnMessageAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnMotionPath.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNewtonField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNonAmbientLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNonExtendedLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNumericAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNumericData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNurbsCurve.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNurbsCurveData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNurbsSurface.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnNurbsSurfaceData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnParticleSystem.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnPartition.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnPhongShader.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnPlugin.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnPluginData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnPointArrayData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnPointLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnPointOnCurveManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnPointOnSurfaceManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnRadialField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnReflectShader.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnRotateManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnScaleManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnSet.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnSingleIndexedComponent.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnSkinCluster.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnSphereData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnSpotLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnStateManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnStringArrayData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnStringData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnSubd.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnSubdData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnSubdNames.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnToggleManip.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnTransform.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnTripleIndexedComponent.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnTurbulenceField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnTypedAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnUInt64ArrayData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnUniformField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnUnitAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnVectorArrayData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnVolumeAxisField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnVolumeLight.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnVortexField.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnWeightGeometryFilter.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFnWireDeformer.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MFStream.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MGlobal.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MIkHandleGroup.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MIkSystem.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MImage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MIntArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MIOStream.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItCurveCV.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItDag.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItDependencyGraph.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItDependencyNodes.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItGeometry.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItInstancer.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItKeyframe.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItMeshEdge.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItMeshFaceVertex.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItMeshPolygon.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItMeshVertex.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItSelectionList.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MItSurfaceCV.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MLibrary.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MLightLinks.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MLockMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MManipData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MMaterial.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MMatrix.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MModelMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MNodeMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MObject.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MObjectArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MObjectHandle.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MObjectSetMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\mocapserial.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\mocapserver.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\mocaptcp.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPlug.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPlugArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPoint.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPointArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPolyMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MProgressWindow.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPx3dModelView.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxCommand.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxContext.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxContextCommand.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxDeformerNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxDragAndDropBehavior.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxEmitterNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxFieldNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxFileTranslator.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxGeometryData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxGeometryIterator.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxGlBuffer.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxHwShaderNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxIkSolver.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxIkSolverNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxLocatorNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxManipContainer.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxMidiInputDevice.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxModelEditorCommand.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxObjectSet.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxPolyTrg.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxPolyTweakUVCommand.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxSelectionContext.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxSpringNode.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxSurfaceShape.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxSurfaceShapeUI.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxToolCommand.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxTransform.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MPxTransformationMatrix.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MQuaternion.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MRampAttribute.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MRenderCallback.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MRenderData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MRenderShadowData.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MRenderUtil.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MRenderView.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MSceneMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MSelectInfo.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MSelectionList.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MSelectionMask.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MSimple.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MStatus.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MString.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MStringArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MSyntax.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MTesselationParams.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MTime.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MTimeArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MToolsInfo.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MTransformationMatrix.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MTrimBoundaryArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MTypeId.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MTypes.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MUiMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MUint64Array.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MUintArray.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MUserEventMessage.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MVector.h" />
<ClInclude Include="MayaImport\Maya6.0\include\maya\MVectorArray.h" />
<ClInclude Include="MayaImport\exporter.h" />
<ClInclude Include="MayaImport\maya_main.h" />
</ItemGroup>
<ItemGroup>
<ClCompile Include="MayaImport\maya_main.cpp" />
<ClCompile Include="idlib\precompiled.cpp">
<PrecompiledHeader Condition="'$(Configuration)|$(Platform)'=='Debug with inlines and memory log|Win32'">Create</PrecompiledHeader>
<PrecompiledHeader Condition="'$(Configuration)|$(Platform)'=='Debug with inlines|Win32'">Create</PrecompiledHeader>
<PrecompiledHeader Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">Create</PrecompiledHeader>
<PrecompiledHeader Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug with inlines|Win32'">Create</PrecompiledHeader>
<PrecompiledHeader Condition="'$(Configuration)|$(Platform)'=='Dedicated Debug|Win32'">Create</PrecompiledHeader>
<PrecompiledHeader Condition="'$(Configuration)|$(Platform)'=='Dedicated Release|Win32'">Create</PrecompiledHeader>
<PrecompiledHeader Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">Create</PrecompiledHeader>
</ClCompile>
</ItemGroup>
<ItemGroup>
<None Include="MayaImport\mayaimport.def" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="idlib.vcxproj">
<Project>{49bec5c6-b964-417a-851e-808886b57400}</Project>
<ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>

File diff suppressed because it is too large

457
neo/MayaImport/exporter.h Normal file
View File

@ -0,0 +1,457 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
#define MAYA_DEFAULT_CAMERA "camera1"
#define ANIM_TX BIT( 0 )
#define ANIM_TY BIT( 1 )
#define ANIM_TZ BIT( 2 )
#define ANIM_QX BIT( 3 )
#define ANIM_QY BIT( 4 )
#define ANIM_QZ BIT( 5 )
typedef enum {
WRITE_MESH,
WRITE_ANIM,
WRITE_CAMERA
} exportType_t;
typedef struct {
idCQuat q;
idVec3 t;
} jointFrame_t;
typedef struct {
idCQuat q;
idVec3 t;
float fov;
} cameraFrame_t;
/*
==============================================================================================
idTokenizer
==============================================================================================
*/
class idTokenizer {
private:
int currentToken;
idStrList tokens;
public:
idTokenizer() { Clear(); };
void Clear( void ) { currentToken = 0; tokens.Clear(); };
int SetTokens( const char *buffer );
const char *NextToken( const char *errorstring = NULL );
bool TokenAvailable( void ) { return currentToken < tokens.Num(); };
int Num( void ) { return tokens.Num(); };
void UnGetToken( void ) { if ( currentToken > 0 ) { currentToken--; } };
const char *GetToken( int index ) { if ( ( index >= 0 ) && ( index < tokens.Num() ) ) { return tokens[ index ]; } else { return NULL; } };
const char *CurrentToken( void ) { return GetToken( currentToken ); };
};
/*
==============================================================================================
idExportOptions
==============================================================================================
*/
class idNamePair {
public:
idStr from;
idStr to;
};
class idAnimGroup {
public:
idStr name;
idStrList joints;
};
class idExportOptions {
private:
idTokenizer tokens;
void Reset( const char *commandline );
public:
idStr commandLine;
idStr src;
idStr dest;
idStr game;
idStr prefix;
float scale;
exportType_t type;
bool ignoreMeshes;
bool clearOrigin;
bool clearOriginAxis;
bool ignoreScale;
int startframe;
int endframe;
int framerate;
float xyzPrecision;
float quatPrecision;
idStr align;
idList<idNamePair> renamejoints;
idList<idNamePair> remapjoints;
idStrList keepjoints;
idStrList skipmeshes;
idStrList keepmeshes;
idList<idAnimGroup *> exportgroups;
idList<idAnimGroup> groups;
float rotate;
float jointThreshold;
int cycleStart;
idExportOptions( const char *commandline, const char *ospath );
bool jointInExportGroup( const char *jointname );
};
/*
==============================================================================
idExportJoint
==============================================================================
*/
class idExportJoint {
public:
idStr name;
idStr realname;
idStr longname;
int index;
int exportNum;
bool keep;
float scale;
float invscale;
MFnDagNode *dagnode;
idHierarchy<idExportJoint> mayaNode;
idHierarchy<idExportJoint> exportNode;
idVec3 t;
idMat3 wm;
idVec3 idt;
idMat3 idwm;
idVec3 bindpos;
idMat3 bindmat;
int animBits;
int firstComponent;
jointFrame_t baseFrame;
int depth;
idExportJoint();
idExportJoint &operator=( const idExportJoint &other );
};
/*
==============================================================================
misc structures
==============================================================================
*/
typedef struct {
idExportJoint *joint;
float jointWeight;
idVec3 offset;
} exportWeight_t;
typedef struct {
idVec3 pos;
idVec2 texCoords;
int startweight;
int numWeights;
} exportVertex_t;
typedef struct {
int indexes[ 3 ];
} exportTriangle_t;
typedef struct {
idVec2 uv[ 3 ];
} exportUV_t;
ID_INLINE int operator==( exportVertex_t a, exportVertex_t b ) {
if ( a.pos != b.pos ) {
return false;
}
if ( ( a.texCoords[ 0 ] != b.texCoords[ 0 ] ) || ( a.texCoords[ 1 ] != b.texCoords[ 1 ] ) ) {
return false;
}
if ( ( a.startweight != b.startweight ) || ( a.numWeights != b.numWeights ) ) {
return false;
}
return true;
}
/*
========================================================================
.MD3 triangle model file format
========================================================================
*/
#define MD3_IDENT (('3'<<24)+('P'<<16)+('D'<<8)+'I')
#define MD3_VERSION 15
// limits
#define MD3_MAX_LODS 4
#define MD3_MAX_TRIANGLES 8192 // per surface
#define MD3_MAX_VERTS 4096 // per surface
#define MD3_MAX_SHADERS 256 // per surface
#define MD3_MAX_FRAMES 1024 // per model
#define MD3_MAX_SURFACES 32 // per model
#define MD3_MAX_TAGS 16 // per frame
// vertex scales
#define MD3_XYZ_SCALE (1.0/64)
// surface geometry should not exceed these limits
#define SHADER_MAX_VERTEXES 1000
#define SHADER_MAX_INDEXES (6*SHADER_MAX_VERTEXES)
// the maximum size of game relative pathnames
#define MAX_Q3PATH 64
typedef struct md3Frame_s {
idVec3 bounds[2];
idVec3 localOrigin;
float radius;
char name[16];
} md3Frame_t;
typedef struct md3Tag_s {
char name[MAX_Q3PATH]; // tag name
idVec3 origin;
idVec3 axis[3];
} md3Tag_t;
/*
** md3Surface_t
**
** CHUNK SIZE
** header sizeof( md3Surface_t )
** shaders sizeof( md3Shader_t ) * numShaders
** triangles[0] sizeof( md3Triangle_t ) * numTriangles
** st sizeof( md3St_t ) * numVerts
** XyzNormals sizeof( md3XyzNormal_t ) * numVerts * numFrames
*/
typedef struct {
int ident; //
char name[MAX_Q3PATH]; // polyset name
int flags;
int numFrames; // all surfaces in a model should have the same
int numShaders; // all surfaces in a model should have the same
int numVerts;
int numTriangles;
int ofsTriangles;
int ofsShaders; // offset from start of md3Surface_t
int ofsSt; // texture coords are common for all frames
int ofsXyzNormals; // numVerts * numFrames
int ofsEnd; // next surface follows
} md3Surface_t;
typedef struct {
char name[MAX_Q3PATH];
int shaderIndex; // for in-game use
} md3Shader_t;
typedef struct {
int indexes[3];
} md3Triangle_t;
typedef struct {
float st[2];
} md3St_t;
typedef struct {
short xyz[3];
short normal;
} md3XyzNormal_t;
typedef struct {
int ident;
int version;
char name[MAX_Q3PATH]; // model name
int flags;
int numFrames;
int numTags;
int numSurfaces;
int numSkins;
int ofsFrames; // offset for first frame
int ofsTags; // numFrames * numTags
int ofsSurfaces; // first surface, others follow
int ofsEnd; // end of file
} md3Header_t;
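/*
Illustrative sketch, not part of the original exporter.h: the CHUNK SIZE table
above implies that a surface's data is laid out sequentially, so the total
chunk size (and therefore ofsEnd) follows from the counts alone. A hypothetical
helper making that explicit:
*/
ID_INLINE int MD3_SurfaceChunkSize( const md3Surface_t &surf ) {
	int size = sizeof( md3Surface_t );									// header
	size += sizeof( md3Shader_t ) * surf.numShaders;					// shaders
	size += sizeof( md3Triangle_t ) * surf.numTriangles;				// triangles[0]
	size += sizeof( md3St_t ) * surf.numVerts;							// st
	size += sizeof( md3XyzNormal_t ) * surf.numVerts * surf.numFrames;	// XyzNormals
	return size;	// for a well-formed surface this should equal surf.ofsEnd
}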
/*
==============================================================================
idExportMesh
==============================================================================
*/
class idExportMesh {
public:
idStr name;
idStr shader;
bool keep;
idList<exportVertex_t> verts;
idList<exportTriangle_t> tris;
idList<exportWeight_t> weights;
idList<exportUV_t> uv;
idExportMesh() { keep = true; };
void ShareVerts( void );
void GetBounds( idBounds &bounds ) const;
void Merge( idExportMesh *mesh );
};
/*
==============================================================================
idExportModel
==============================================================================
*/
class idExportModel {
public:
idExportJoint *exportOrigin;
idList<idExportJoint> joints;
idHierarchy<idExportJoint> mayaHead;
idHierarchy<idExportJoint> exportHead;
idList<int> cameraCuts;
idList<cameraFrame_t> camera;
idList<idBounds> bounds;
idList<jointFrame_t> jointFrames;
idList<jointFrame_t *> frames;
int frameRate;
int numFrames;
int skipjoints;
int export_joints;
idList<idExportMesh *> meshes;
idExportModel();
~idExportModel();
idExportJoint *FindJointReal( const char *name );
idExportJoint *FindJoint( const char *name );
bool WriteMesh( const char *filename, idExportOptions &options );
bool WriteAnim( const char *filename, idExportOptions &options );
bool WriteCamera( const char *filename, idExportOptions &options );
};
/*
==============================================================================
Maya
==============================================================================
*/
class idMayaExport {
private:
idExportModel model;
idExportOptions &options;
void FreeDagNodes( void );
float TimeForFrame( int num ) const;
int GetMayaFrameNum( int num ) const;
void SetFrame( int num );
void GetBindPose( MObject &jointNode, idExportJoint *joint, float scale );
void GetLocalTransform( idExportJoint *joint, idVec3 &pos, idMat3 &mat );
void GetWorldTransform( idExportJoint *joint, idVec3 &pos, idMat3 &mat, float scale );
void CreateJoints( float scale );
void PruneJoints( idStrList &keepjoints, idStr &prefix );
void RenameJoints( idList<idNamePair> &renamejoints, idStr &prefix );
bool RemapParents( idList<idNamePair> &remapjoints );
MObject FindShader( MObject& setNode );
void GetTextureForMesh( idExportMesh *mesh, MFnDagNode &dagNode );
idExportMesh *CopyMesh( MFnSkinCluster &skinCluster, float scale );
void CreateMesh( float scale );
void CombineMeshes( void );
void GetAlignment( idStr &alignName, idMat3 &align, float rotate, int startframe );
const char *GetObjectType( MObject object );
float GetCameraFov( idExportJoint *joint );
void GetCameraFrame( idExportJoint *camera, idMat3 &align, cameraFrame_t *cam );
void CreateCameraAnim( idMat3 &align );
void GetDefaultPose( idMat3 &align );
void CreateAnimation( idMat3 &align );
public:
idMayaExport( idExportOptions &exportOptions ) : options( exportOptions ) { };
~idMayaExport();
void ConvertModel( void );
void ConvertToMD3( void );
};

3149
neo/MayaImport/maya_main.cpp Normal file

File diff suppressed because it is too large

45
neo/MayaImport/maya_main.h Normal file
View File

@ -0,0 +1,45 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
#ifndef __MAYA_MAIN_H__
#define __MAYA_MAIN_H__
/*
==============================================================
Maya Import
==============================================================
*/
typedef bool ( *exporterDLLEntry_t )( int version, idCommon *common, idSys *sys );
typedef const char *( *exporterInterface_t )( const char *ospath, const char *commandline );
typedef void ( *exporterShutdown_t )( void );
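// Descriptive note, not in the original header: judging from mayaimport.def,
// the DLL exports three functions that the engine presumably resolves against
// these typedefs: dllEntry as exporterDLLEntry_t (version / idCommon / idSys
// handshake), Maya_ConvertModel as exporterInterface_t (performs an export for
// the given os path and command line, returning a message string), and
// Maya_Shutdown as exporterShutdown_t.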
#endif /* !__MAYA_MAIN_H__ */

4
neo/MayaImport/mayaimport.def Normal file
View File

@ -0,0 +1,4 @@
EXPORTS
dllEntry
Maya_ConvertModel
Maya_Shutdown

539
neo/SConstruct Normal file
View File

@ -0,0 +1,539 @@
# -*- mode: python -*-
# DOOM build script
# TTimo <ttimo@idsoftware.com>
# http://scons.sourceforge.net
import sys, os, time, commands, re, pickle, StringIO, popen2, pdb, zipfile, string
import SCons
sys.path.append( 'sys/scons' )
import scons_utils
conf_filename='site.conf'
# choose configuration variables which should be saved between runs
# ( we handle all those as strings )
serialized=['CC', 'CXX', 'JOBS', 'BUILD', 'IDNET_HOST', 'GL_HARDLINK', 'DEDICATED',
'DEBUG_MEMORY', 'LIBC_MALLOC', 'ID_NOLANADDRESS', 'ID_MCHECK', 'ALSA',
'TARGET_CORE', 'TARGET_GAME', 'TARGET_D3XP', 'TARGET_MONO', 'TARGET_DEMO', 'NOCURL',
'BUILD_ROOT', 'BUILD_GAMEPAK', 'BASEFLAGS', 'SILENT' ]
# global build mode ------------------------------
g_sdk = not os.path.exists( 'sys/scons/SConscript.core' )
# ------------------------------------------------
# help -------------------------------------------
help_string = """
Usage: scons [OPTIONS] [TARGET] [CONFIG]
[OPTIONS] and [TARGET] are covered in command line options, use scons -H
[CONFIG]: KEY="VALUE" [...]
a number of configuration options saved between runs in the """ + conf_filename + """ file
erase """ + conf_filename + """ to start with default settings again
CC (default gcc)
CXX (default g++)
Specify C and C++ compilers (defaults gcc and g++)
ex: CC="gcc-3.3"
You can use ccache and distcc, for instance:
CC="ccache distcc gcc" CXX="ccache distcc g++"
JOBS (default 1)
Parallel build
BUILD (default debug)
Use debug-all/debug/release to select build settings
ex: BUILD="release"
debug-all: no optimisations, debugging symbols
debug: -O -g
release: all optimisations, including CPU target etc.
BUILD_ROOT (default 'build')
change the build root directory
TARGET_GAME (default 1)
Build the base game code
TARGET_D3XP (default 1)
Build the d3xp game code
BUILD_GAMEPAK (default 0)
Build a game pak
BASEFLAGS (default '')
Add compile flags
NOCONF (default 0, not saved)
ignore site configuration and use defaults + command line only
SILENT ( default 0, saved )
hide the compiler output, unless error
"""
if ( not g_sdk ):
help_string += """
DEDICATED (default 0)
Control regular / dedicated type of build:
0 - client
1 - dedicated server
2 - both
TARGET_CORE (default 1)
Build the core
TARGET_MONO (default 0)
Build a monolithic binary
TARGET_DEMO (default 0)
Build demo client ( both a core and game, no mono )
NOTE: if you *only* want the demo client, set TARGET_CORE and TARGET_GAME to 0
IDNET_HOST (default to source hardcoded)
Override builtin IDNET_HOST with your own settings
GL_HARDLINK (default 0)
Instead of dynamically loading the OpenGL libraries, use implicit dependencies
NOTE: no GL logging capability and no r_glDriver with GL_HARDLINK 1
DEBUG_MEMORY (default 0)
Enables memory logging to file
LIBC_MALLOC (default 1)
Toggle idHeap memory / libc malloc usage
When libc malloc is on, memory size statistics are wrong ( no _msize )
ID_NOLANADDRESS (default 0)
Don't recognize any IP as LAN address. This is useful when debugging network
code where LAN / not LAN influences application behaviour
ID_MCHECK (default 2)
Perform heap consistency checking
0: on in Debug / off in Release
1 forces on, 2 forces off
note that Doom has its own block allocator/checking
this should not be considered a replacement, but an additional tool
ALSA (default 1)
enable ALSA sound backend support
SETUP (default 0, not saved)
build a setup. implies release build
SDK (default 0, not saved)
build an SDK release
NOCURL (default 0)
set to 1 to disable usage of libcurl and http/ftp downloads feature
"""
Help( help_string )
# end help ---------------------------------------
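# Example invocations (illustrative, not part of the original script), using
# only options documented in the help text above:
#   scons BUILD="release" JOBS="4"                  # optimised parallel build
#   scons CC="ccache distcc gcc" CXX="ccache distcc g++"
#   scons DEDICATED="1" TARGET_CORE="1"             # dedicated server build
# Options named in the 'serialized' list above are saved to site.conf between
# runs, unless NOCONF=1 is given.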
# sanity -----------------------------------------
EnsureSConsVersion( 0, 96 )
# end sanity -------------------------------------
# system detection -------------------------------
# CPU type
cpu = commands.getoutput('uname -m')
exp = re.compile('.*i?86.*')
if exp.match(cpu):
cpu = 'x86'
else:
cpu = commands.getoutput('uname -p')
if ( cpu == 'powerpc' ):
cpu = 'ppc'
else:
cpu = 'cpu'
g_os = 'Linux'
# end system detection ---------------------------
# default settings -------------------------------
CC = 'gcc'
CXX = 'g++'
JOBS = '1'
BUILD = 'debug'
DEDICATED = '0'
TARGET_CORE = '1'
TARGET_GAME = '1'
TARGET_D3XP = '1'
TARGET_MONO = '0'
TARGET_DEMO = '0'
IDNET_HOST = ''
GL_HARDLINK = '0'
DEBUG_MEMORY = '0'
LIBC_MALLOC = '1'
ID_NOLANADDRESS = '0'
ID_MCHECK = '2'
BUILD_ROOT = 'build'
ALSA = '1'
SETUP = '0'
SDK = '0'
NOCONF = '0'
NOCURL = '0'
BUILD_GAMEPAK = '0'
BASEFLAGS = ''
SILENT = '0'
# end default settings ---------------------------
# site settings ----------------------------------
if ( not ARGUMENTS.has_key( 'NOCONF' ) or ARGUMENTS['NOCONF'] != '1' ):
site_dict = {}
if (os.path.exists(conf_filename)):
site_file = open(conf_filename, 'r')
p = pickle.Unpickler(site_file)
site_dict = p.load()
print 'Loading build configuration from ' + conf_filename + ':'
for k, v in site_dict.items():
exec_cmd = k + '=\'' + v + '\''
print ' ' + exec_cmd
exec(exec_cmd)
else:
print 'Site settings ignored'
# end site settings ------------------------------
# command line settings --------------------------
for k in ARGUMENTS.keys():
exec_cmd = k + '=\'' + ARGUMENTS[k] + '\''
print 'Command line: ' + exec_cmd
exec( exec_cmd )
# end command line settings ----------------------
# save site configuration ----------------------
if ( not ARGUMENTS.has_key( 'NOCONF' ) or ARGUMENTS['NOCONF'] != '1' ):
for k in serialized:
exec_cmd = 'site_dict[\'' + k + '\'] = ' + k
exec(exec_cmd)
site_file = open(conf_filename, 'w')
p = pickle.Pickler(site_file)
p.dump(site_dict)
site_file.close()
# end save site configuration ------------------
# configuration rules --------------------------
if ( SETUP != '0' ):
DEDICATED = '2'
BUILD = 'release'
if ( g_sdk or SDK != '0' ):
TARGET_CORE = '0'
TARGET_GAME = '1'
TARGET_D3XP = '1'
TARGET_MONO = '0'
TARGET_DEMO = '0'
# end configuration rules ----------------------
# general configuration, target selection --------
g_build = BUILD_ROOT + '/' + BUILD
SConsignFile( 'scons.signatures' )
if ( GL_HARDLINK != '0' ):
g_build += '-hardlink'
if ( DEBUG_MEMORY != '0' ):
g_build += '-debugmem'
if ( LIBC_MALLOC != '1' ):
g_build += '-nolibcmalloc'
SetOption('num_jobs', JOBS)
LINK = CXX
# common flags
# BASE + CORE + OPT for engine
# BASE + GAME + OPT for game
# _noopt versions of the environments are built without the OPT
BASECPPFLAGS = [ ]
CORECPPPATH = [ ]
CORELIBPATH = [ ]
CORECPPFLAGS = [ ]
GAMECPPFLAGS = [ ]
BASELINKFLAGS = [ ]
CORELINKFLAGS = [ ]
# for release build, further optimisations that may not work on all files
OPTCPPFLAGS = [ ]
BASECPPFLAGS.append( BASEFLAGS )
BASECPPFLAGS.append( '-pipe' )
# warn all
BASECPPFLAGS.append( '-Wall' )
BASECPPFLAGS.append( '-Wno-unknown-pragmas' )
# this define is necessary to make sure threading support is enabled in X
CORECPPFLAGS.append( '-DXTHREADS' )
# don't wrap gcc messages
BASECPPFLAGS.append( '-fmessage-length=0' )
# gcc 4.0
BASECPPFLAGS.append( '-fpermissive' )
if ( g_os == 'Linux' ):
# gcc 4.x option only - only export what we mean to from the game SO
BASECPPFLAGS.append( '-fvisibility=hidden' )
# get the 64 bits machine on the distcc array to produce 32 bit binaries :)
BASECPPFLAGS.append( '-m32' )
BASELINKFLAGS.append( '-m32' )
if ( g_sdk or SDK != '0' ):
BASECPPFLAGS.append( '-D_D3SDK' )
if ( BUILD == 'debug-all' ):
OPTCPPFLAGS = [ '-g', '-D_DEBUG' ]
if ( ID_MCHECK == '0' ):
ID_MCHECK = '1'
elif ( BUILD == 'debug' ):
OPTCPPFLAGS = [ '-g', '-O1', '-D_DEBUG' ]
if ( ID_MCHECK == '0' ):
ID_MCHECK = '1'
elif ( BUILD == 'release' ):
# -fomit-frame-pointer: "-O also turns on -fomit-frame-pointer on machines where doing so does not interfere with debugging."
# on x86 have to set it explicitly
# -finline-functions: implicit at -O3
# -fschedule-insns2: implicit at -O2
# no-unsafe-math-optimizations: that should be on by default really. hit some wonko bugs in physics code because of that
OPTCPPFLAGS = [ '-O3', '-march=pentium3', '-Winline', '-ffast-math', '-fno-unsafe-math-optimizations', '-fomit-frame-pointer' ]
if ( ID_MCHECK == '0' ):
ID_MCHECK = '2'
else:
print 'Unknown build configuration ' + BUILD
sys.exit(0)
if ( GL_HARDLINK != '0' ):
CORECPPFLAGS.append( '-DID_GL_HARDLINK' )
if ( DEBUG_MEMORY != '0' ):
BASECPPFLAGS += [ '-DID_DEBUG_MEMORY', '-DID_REDIRECT_NEWDELETE' ]
if ( LIBC_MALLOC != '1' ):
BASECPPFLAGS.append( '-DUSE_LIBC_MALLOC=0' )
if ( len( IDNET_HOST ) ):
CORECPPFLAGS.append( '-DIDNET_HOST=\\"%s\\"' % IDNET_HOST)
if ( ID_NOLANADDRESS != '0' ):
CORECPPFLAGS.append( '-DID_NOLANADDRESS' )
if ( ID_MCHECK == '1' ):
BASECPPFLAGS.append( '-DID_MCHECK' )
# create the build environments
g_base_env = Environment( ENV = os.environ, CC = CC, CXX = CXX, LINK = LINK, CPPFLAGS = BASECPPFLAGS, LINKFLAGS = BASELINKFLAGS, CPPPATH = CORECPPPATH, LIBPATH = CORELIBPATH )
scons_utils.SetupUtils( g_base_env )
g_env = g_base_env.Clone()
g_env['CPPFLAGS'] += OPTCPPFLAGS
g_env['CPPFLAGS'] += CORECPPFLAGS
g_env['LINKFLAGS'] += CORELINKFLAGS
g_env_noopt = g_base_env.Clone()
g_env_noopt['CPPFLAGS'] += CORECPPFLAGS
g_game_env = g_base_env.Clone()
g_game_env['CPPFLAGS'] += OPTCPPFLAGS
g_game_env['CPPFLAGS'] += GAMECPPFLAGS
# maintain this dangerous optimization off at all times
g_env.Append( CPPFLAGS = '-fno-strict-aliasing' )
g_env_noopt.Append( CPPFLAGS = '-fno-strict-aliasing' )
g_game_env.Append( CPPFLAGS = '-fno-strict-aliasing' )
if ( int(JOBS) > 1 ):
print 'Using buffered process output'
silent = False
if ( SILENT == '1' ):
silent = True
scons_utils.SetupBufferedOutput( g_env, silent )
scons_utils.SetupBufferedOutput( g_game_env, silent )
# mark the globals
local_dedicated = 0
# 0 for monolithic build
local_gamedll = 1
# carry around rather than using .a, avoids binutils bugs
idlib_objects = []
game_objects = []
local_demo = 0
# curl usage. there is a global toggle flag
local_curl = 0
curl_lib = []
# if idlib should produce PIC objects ( depending on core or game inclusion )
local_idlibpic = 0
# switch between base game build and d3xp game build
local_d3xp = 0
GLOBALS = 'g_env g_env_noopt g_game_env g_os ID_MCHECK ALSA idlib_objects game_objects local_dedicated local_gamedll local_demo local_idlibpic curl_lib local_curl local_d3xp OPTCPPFLAGS'
# end general configuration ----------------------
# targets ----------------------------------------
Export( 'GLOBALS ' + GLOBALS )
doom = None
doomded = None
game = None
doom_mono = None
doom_demo = None
game_demo = None
# build curl if needed
if ( NOCURL == '0' and ( TARGET_CORE == '1' or TARGET_MONO == '1' ) ):
# 1: debug, 2: release
if ( BUILD == 'release' ):
local_curl = 2
else:
local_curl = 1
Export( 'GLOBALS ' + GLOBALS )
curl_lib = SConscript( 'sys/scons/SConscript.curl' )
if ( TARGET_CORE == '1' ):
local_gamedll = 1
local_demo = 0
local_idlibpic = 0
if ( DEDICATED == '0' or DEDICATED == '2' ):
local_dedicated = 0
Export( 'GLOBALS ' + GLOBALS )
VariantDir( g_build + '/core/glimp', '.', duplicate = 1 )
SConscript( g_build + '/core/glimp/sys/scons/SConscript.gl' )
VariantDir( g_build + '/core', '.', duplicate = 0 )
idlib_objects = SConscript( g_build + '/core/sys/scons/SConscript.idlib' )
Export( 'GLOBALS ' + GLOBALS ) # update idlib_objects
doom = SConscript( g_build + '/core/sys/scons/SConscript.core' )
InstallAs( '#doom.' + cpu, doom )
if ( DEDICATED == '1' or DEDICATED == '2' ):
local_dedicated = 1
Export( 'GLOBALS ' + GLOBALS )
VariantDir( g_build + '/dedicated/glimp', '.', duplicate = 1 )
SConscript( g_build + '/dedicated/glimp/sys/scons/SConscript.gl' )
VariantDir( g_build + '/dedicated', '.', duplicate = 0 )
idlib_objects = SConscript( g_build + '/dedicated/sys/scons/SConscript.idlib' )
Export( 'GLOBALS ' + GLOBALS )
doomded = SConscript( g_build + '/dedicated/sys/scons/SConscript.core' )
InstallAs( '#doomded.' + cpu, doomded )
if ( TARGET_GAME == '1' or TARGET_D3XP == '1' ):
local_gamedll = 1
local_demo = 0
local_dedicated = 0
local_idlibpic = 1
Export( 'GLOBALS ' + GLOBALS )
dupe = 0
if ( SDK == '1' ):
# building an SDK, use scons for dependencies walking
# clear the build directory to be safe
g_env.PreBuildSDK( [ g_build + '/game', g_build + '/d3xp' ] )
dupe = 1
VariantDir( g_build + '/game', '.', duplicate = dupe )
idlib_objects = SConscript( g_build + '/game/sys/scons/SConscript.idlib' )
if ( TARGET_GAME == '1' ):
local_d3xp = 0
Export( 'GLOBALS ' + GLOBALS )
game = SConscript( g_build + '/game/sys/scons/SConscript.game' )
game_base = InstallAs( '#game%s-base.so' % cpu, game )
if ( BUILD_GAMEPAK == '1' ):
Command( '#game01-base.pk4', [ game_base, game ], Action( g_env.BuildGamePak ) )
if ( TARGET_D3XP == '1' ):
# uses idlib as compiled for game/
local_d3xp = 1
VariantDir( g_build + '/d3xp', '.', duplicate = dupe )
Export( 'GLOBALS ' + GLOBALS )
d3xp = SConscript( g_build + '/d3xp/sys/scons/SConscript.game' )
game_d3xp = InstallAs( '#game%s-d3xp.so' % cpu, d3xp )
if ( BUILD_GAMEPAK == '1' ):
Command( '#game01-d3xp.pk4', [ game_d3xp, d3xp ], Action( g_env.BuildGamePak ) )
if ( TARGET_MONO == '1' ):
# NOTE: no D3XP atm. add a TARGET_MONO_D3XP
local_gamedll = 0
local_dedicated = 0
local_demo = 0
local_idlibpic = 0
local_d3xp = 0
Export( 'GLOBALS ' + GLOBALS )
VariantDir( g_build + '/mono/glimp', '.', duplicate = 1 )
SConscript( g_build + '/mono/glimp/sys/scons/SConscript.gl' )
VariantDir( g_build + '/mono', '.', duplicate = 0 )
idlib_objects = SConscript( g_build + '/mono/sys/scons/SConscript.idlib' )
game_objects = SConscript( g_build + '/mono/sys/scons/SConscript.game' )
Export( 'GLOBALS ' + GLOBALS )
doom_mono = SConscript( g_build + '/mono/sys/scons/SConscript.core' )
InstallAs( '#doom-mon.' + cpu, doom_mono )
if ( TARGET_DEMO == '1' ):
# NOTE: no D3XP atm. add a TARGET_DEMO_D3XP
local_demo = 1
local_dedicated = 0
local_gamedll = 1
local_idlibpic = 0
local_curl = 0
local_d3xp = 0
curl_lib = []
Export( 'GLOBALS ' + GLOBALS )
VariantDir( g_build + '/demo/glimp', '.', duplicate = 1 )
SConscript( g_build + '/demo/glimp/sys/scons/SConscript.gl' )
VariantDir( g_build + '/demo', '.', duplicate = 0 )
idlib_objects = SConscript( g_build + '/demo/sys/scons/SConscript.idlib' )
Export( 'GLOBALS ' + GLOBALS )
doom_demo = SConscript( g_build + '/demo/sys/scons/SConscript.core' )
InstallAs( '#doom-demo.' + cpu, doom_demo )
local_idlibpic = 1
Export( 'GLOBALS ' + GLOBALS )
VariantDir( g_build + '/demo/game', '.', duplicate = 0 )
idlib_objects = SConscript( g_build + '/demo/game/sys/scons/SConscript.idlib' )
Export( 'GLOBALS ' + GLOBALS )
game_demo = SConscript( g_build + '/demo/game/sys/scons/SConscript.game' )
InstallAs( '#game%s-demo.so' % cpu, game_demo )
if ( SETUP != '0' ):
brandelf = Program( 'brandelf', 'sys/linux/setup/brandelf.c' )
if ( TARGET_CORE == '1' and TARGET_GAME == '1' and TARGET_D3XP == '1' ):
setup = Command( 'setup', [ brandelf, doom, doomded, game, d3xp ], Action( g_env.BuildSetup ) )
else:
print 'Skipping main setup: TARGET_CORE == 0 or TARGET_GAME == 0'
if ( TARGET_DEMO == '1' ):
setup_demo = Command( 'setup-demo', [ brandelf, doom_demo, game_demo ], Action( g_env.BuildSetup ) )
# if building two setups, make sure JOBS doesn't parallelize them
try:
g_env.Depends( setup_demo, setup )
except:
pass
else:
print 'Skipping demo setup ( TARGET_DEMO == 0 )'
if ( SDK != '0' ):
setup_sdk = Command( 'sdk', [ ], Action( g_env.BuildSDK ) )
g_env.Depends( setup_sdk, [ game, d3xp ] )
# end targets ------------------------------------

1050
neo/TypeInfo/TypeInfoGen.cpp Normal file

File diff suppressed because it is too large

116
neo/TypeInfo/TypeInfoGen.h Normal file
View File

@ -0,0 +1,116 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
#ifndef __TYPEINFOGEN_H__
#define __TYPEINFOGEN_H__
/*
===================================================================================
Type Info Generator
- template classes are commented out (different instantiations are not identified)
- bit fields are commented out (cannot get the address of bit fields)
- multiple inheritance is not supported (only tracks a single super type)
===================================================================================
*/
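// Usage sketch (illustrative; mirrors the driver in neo/TypeInfo/main.cpp below):
//   idTypeInfoGen *gen = new idTypeInfoGen;
//   gen->AddDefine( "__cplusplus" );
//   gen->AddDefine( "GAME_DLL" );
//   gen->AddDefine( "ID_TYPEINFO" );
//   gen->CreateTypeInfo( "../" SOURCE_CODE_BASE_FOLDER "/game" );
//   gen->WriteTypeInfo( "../" SOURCE_CODE_BASE_FOLDER "/game/gamesys/GameTypeInfo.h" );
//   delete gen;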
class idConstantInfo {
public:
idStr name;
idStr type;
idStr value;
};
class idEnumValueInfo {
public:
idStr name;
int value;
};
class idEnumTypeInfo {
public:
idStr typeName;
idStr scope;
bool unnamed;
bool isTemplate;
idList<idEnumValueInfo> values;
};
class idClassVariableInfo {
public:
idStr name;
idStr type;
int bits;
};
class idClassTypeInfo {
public:
idStr typeName;
idStr superType;
idStr scope;
bool unnamed;
bool isTemplate;
idList<idClassVariableInfo> variables;
};
class idTypeInfoGen {
public:
idTypeInfoGen( void );
~idTypeInfoGen( void );
void AddDefine( const char *define );
void CreateTypeInfo( const char *path );
void WriteTypeInfo( const char *fileName ) const;
private:
idStrList defines;
idList<idConstantInfo *> constants;
idList<idEnumTypeInfo *> enums;
idList<idClassTypeInfo *> classes;
int numTemplates;
int maxInheritance;
idStr maxInheritanceClass;
int GetInheritance( const char *typeName ) const;
int EvaluateIntegerString( const idStr &string );
float EvaluateFloatString( const idStr &string );
idConstantInfo * FindConstant( const char *name );
int GetIntegerConstant( const char *scope, const char *name, idParser &src );
float GetFloatConstant( const char *scope, const char *name, idParser &src );
int ParseArraySize( const char *scope, idParser &src );
void ParseConstantValue( const char *scope, idParser &src, idStr &value );
idEnumTypeInfo * ParseEnumType( const char *scope, bool isTemplate, bool typeDef, idParser &src );
idClassTypeInfo * ParseClassType( const char *scope, const char *templateArgs, bool isTemplate, bool typeDef, idParser &src );
void ParseScope( const char *scope, bool isTemplate, idParser &src, idClassTypeInfo *typeInfo );
};
#endif /* !__TYPEINFOGEN_H__ */

307
neo/TypeInfo/main.cpp Normal file
View File

@ -0,0 +1,307 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
#include "../idlib/precompiled.h"
#include "../sys/sys_local.h"
#pragma hdrstop
#include "TypeInfoGen.h"
idSession * session = NULL;
idDeclManager * declManager = NULL;
idEventLoop * eventLoop = NULL;
int idEventLoop::JournalLevel( void ) const { return 0; }
/*
==============================================================
idCommon
==============================================================
*/
#define STDIO_PRINT( pre, post ) \
va_list argptr; \
va_start( argptr, fmt ); \
printf( pre ); \
vprintf( fmt, argptr ); \
printf( post ); \
va_end( argptr )
class idCommonLocal : public idCommon {
public:
idCommonLocal( void ) {}
virtual void Init( int argc, const char **argv, const char *cmdline ) {}
virtual void Shutdown( void ) {}
virtual void Quit( void ) {}
virtual bool IsInitialized( void ) const { return true; }
virtual void Frame( void ) {}
virtual void GUIFrame( bool execCmd, bool network ) {}
virtual void Async( void ) {}
virtual void StartupVariable( const char *match, bool once ) {}
virtual void InitTool( const toolFlag_t tool, const idDict *dict ) {}
virtual void ActivateTool( bool active ) {}
virtual void WriteConfigToFile( const char *filename ) {}
virtual void WriteFlaggedCVarsToFile( const char *filename, int flags, const char *setCmd ) {}
virtual void BeginRedirect( char *buffer, int buffersize, void (*flush)( const char * ) ) {}
virtual void EndRedirect( void ) {}
virtual void SetRefreshOnPrint( bool set ) {}
virtual void Printf( const char *fmt, ... ) { STDIO_PRINT( "", "" ); }
virtual void VPrintf( const char *fmt, va_list arg ) { vprintf( fmt, arg ); }
virtual void DPrintf( const char *fmt, ... ) { /*STDIO_PRINT( "", "" );*/ }
virtual void Warning( const char *fmt, ... ) { STDIO_PRINT( "WARNING: ", "\n" ); }
virtual void DWarning( const char *fmt, ...) { /*STDIO_PRINT( "WARNING: ", "\n" );*/ }
virtual void PrintWarnings( void ) {}
virtual void ClearWarnings( const char *reason ) {}
virtual void Error( const char *fmt, ... ) { STDIO_PRINT( "ERROR: ", "\n" ); exit(0); }
virtual void FatalError( const char *fmt, ... ) { STDIO_PRINT( "FATAL ERROR: ", "\n" ); exit(0); }
virtual const idLangDict *GetLanguageDict() { return NULL; }
virtual const char * KeysFromBinding( const char *bind ) { return NULL; }
virtual const char * BindingFromKey( const char *key ) { return NULL; }
virtual int ButtonState( int key ) { return 0; }
virtual int KeyState( int key ) { return 0; }
};
idCVar com_developer( "developer", "0", CVAR_BOOL|CVAR_SYSTEM, "developer mode" );
idCommonLocal commonLocal;
idCommon * common = &commonLocal;
/*
==============================================================
idSys
==============================================================
*/
void Sys_Mkdir( const char *path ) {}
ID_TIME_T Sys_FileTimeStamp( FILE *fp ) { return 0; }
#ifdef _WIN32
#include <io.h>
#include <direct.h>
const char *Sys_Cwd( void ) {
static char cwd[1024];
_getcwd( cwd, sizeof( cwd ) - 1 );
cwd[sizeof( cwd ) - 1] = 0;
int i = idStr::FindText( cwd, CD_BASEDIR, false );
if ( i >= 0 ) {
cwd[i + strlen( CD_BASEDIR )] = '\0';
}
return cwd;
}
const char *Sys_DefaultCDPath( void ) {
return "";
}
const char *Sys_DefaultBasePath( void ) {
return Sys_Cwd();
}
const char *Sys_DefaultSavePath( void ) {
return cvarSystem->GetCVarString( "fs_basepath" );
}
const char *Sys_EXEPath( void ) {
return "";
}
int Sys_ListFiles( const char *directory, const char *extension, idStrList &list ) {
idStr search;
struct _finddata_t findinfo;
int findhandle;
int flag;
if ( !extension) {
extension = "";
}
// passing a slash as extension will find directories
if ( extension[0] == '/' && extension[1] == 0 ) {
extension = "";
flag = 0;
} else {
flag = _A_SUBDIR;
}
sprintf( search, "%s\\*%s", directory, extension );
// search
list.Clear();
findhandle = _findfirst( search, &findinfo );
if ( findhandle == -1 ) {
return -1;
}
do {
if ( flag ^ ( findinfo.attrib & _A_SUBDIR ) ) {
list.Append( findinfo.name );
}
} while ( _findnext( findhandle, &findinfo ) != -1 );
_findclose( findhandle );
return list.Num();
}
#else
const char * Sys_DefaultCDPath( void ) { return ""; }
const char * Sys_DefaultBasePath( void ) { return ""; }
const char * Sys_DefaultSavePath( void ) { return ""; }
int Sys_ListFiles( const char *directory, const char *extension, idStrList &list ) { return 0; }
#endif
xthreadInfo * g_threads[MAX_THREADS];
int g_thread_count;
void Sys_CreateThread( xthread_t function, void *parms, xthreadPriority priority, xthreadInfo &info, const char *name, xthreadInfo *threads[MAX_THREADS], int *thread_count ) {}
void Sys_DestroyThread( xthreadInfo& info ) {}
void Sys_EnterCriticalSection( int index ) {}
void Sys_LeaveCriticalSection( int index ) {}
void Sys_WaitForEvent( int index ) {}
void Sys_TriggerEvent( int index ) {}
/*
==============
idSysLocal stub
==============
*/
void idSysLocal::DebugPrintf( const char *fmt, ... ) {}
void idSysLocal::DebugVPrintf( const char *fmt, va_list arg ) {}
double idSysLocal::GetClockTicks( void ) { return 0.0; }
double idSysLocal::ClockTicksPerSecond( void ) { return 1.0; }
cpuid_t idSysLocal::GetProcessorId( void ) { return (cpuid_t)0; }
const char * idSysLocal::GetProcessorString( void ) { return ""; }
const char * idSysLocal::FPU_GetState( void ) { return ""; }
bool idSysLocal::FPU_StackIsEmpty( void ) { return true; }
void idSysLocal::FPU_SetFTZ( bool enable ) {}
void idSysLocal::FPU_SetDAZ( bool enable ) {}
bool idSysLocal::LockMemory( void *ptr, int bytes ) { return false; }
bool idSysLocal::UnlockMemory( void *ptr, int bytes ) { return false; }
void idSysLocal::GetCallStack( address_t *callStack, const int callStackSize ) { memset( callStack, 0, callStackSize * sizeof( callStack[0] ) ); }
const char * idSysLocal::GetCallStackStr( const address_t *callStack, const int callStackSize ) { return ""; }
const char * idSysLocal::GetCallStackCurStr( int depth ) { return ""; }
void idSysLocal::ShutdownSymbols( void ) {}
int idSysLocal::DLL_Load( const char *dllName ) { return 0; }
void * idSysLocal::DLL_GetProcAddress( int dllHandle, const char *procName ) { return NULL; }
void idSysLocal::DLL_Unload( int dllHandle ) { }
void idSysLocal::DLL_GetFileName( const char *baseName, char *dllName, int maxLength ) { }
sysEvent_t idSysLocal::GenerateMouseButtonEvent( int button, bool down ) { sysEvent_t ev; memset( &ev, 0, sizeof( ev ) ); return ev; }
sysEvent_t idSysLocal::GenerateMouseMoveEvent( int deltax, int deltay ) { sysEvent_t ev; memset( &ev, 0, sizeof( ev ) ); return ev; }
void idSysLocal::OpenURL( const char *url, bool quit ) { }
void idSysLocal::StartProcess( const char *exeName, bool quit ) { }
void idSysLocal::FPU_EnableExceptions( int exceptions ) { }
idSysLocal sysLocal;
idSys * sys = &sysLocal;
/*
==============================================================
main
==============================================================
*/
int main( int argc, char** argv ) {
idStr fileName, sourcePath;
idTypeInfoGen *generator;
idLib::common = common;
idLib::cvarSystem = cvarSystem;
idLib::fileSystem = fileSystem;
idLib::sys = sys;
idLib::Init();
cmdSystem->Init();
cvarSystem->Init();
idCVar::RegisterStaticVars();
fileSystem->Init();
generator = new idTypeInfoGen;
if ( argc > 1 ) {
sourcePath = idStr( "../"SOURCE_CODE_BASE_FOLDER"/" ) + argv[1];
} else {
sourcePath = "../"SOURCE_CODE_BASE_FOLDER"/game";
}
if ( argc > 2 ) {
fileName = idStr( "../"SOURCE_CODE_BASE_FOLDER"/" ) + argv[2];
} else {
fileName = "../"SOURCE_CODE_BASE_FOLDER"/game/gamesys/GameTypeInfo.h";
}
if ( argc > 3 ) {
for ( int i = 3; i < argc; i++ ) {
generator->AddDefine( argv[i] );
}
} else {
generator->AddDefine( "__cplusplus" );
generator->AddDefine( "GAME_DLL" );
generator->AddDefine( "ID_TYPEINFO" );
}
generator->CreateTypeInfo( sourcePath );
generator->WriteTypeInfo( fileName );
delete generator;
fileName.Clear();
sourcePath.Clear();
fileSystem->Shutdown( false );
cvarSystem->Shutdown();
cmdSystem->Shutdown();
idLib::ShutDown();
return 0;
}

17
neo/_Common.props Normal file
View File

@ -0,0 +1,17 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>Common Project Properties</_PropertySheetDisplayName>
<OutDir>..\build\$(PlatformName)\$(Configuration)\</OutDir>
<IntDir>..\build\$(PlatformName)\$(Configuration)\intermediate\$(ProjectName)\</IntDir>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<PreprocessorDefinitions>WIN32;_WINDOWS;_CRT_SECURE_NO_DEPRECATE;_CRT_NONSTDC_NO_DEPRECATE;_USE_32BIT_TIME_T;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<WarningLevel>Level4</WarningLevel>
<PrecompiledHeaderFile>
</PrecompiledHeaderFile>
</ClCompile>
</ItemDefinitionGroup>
</Project>

13
neo/_Curl.props Normal file
View File

@ -0,0 +1,13 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>Curl Library</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<PreprocessorDefinitions>USRDLL;CURLLIB_EXPORTS;%(PreprocessorDefinitions)</PreprocessorDefinitions>
</ClCompile>
</ItemDefinitionGroup>
</Project>

24
neo/_Debug.props Normal file
View File

@ -0,0 +1,24 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>Debug</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>_DEBUG;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<BasicRuntimeChecks>EnableFastChecks</BasicRuntimeChecks>
<SmallerTypeCheck>false</SmallerTypeCheck>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
<RuntimeTypeInfo>true</RuntimeTypeInfo>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<AdditionalDependencies>nafxcwd.lib;libcmtd.lib;%(AdditionalDependencies)</AdditionalDependencies>
<IgnoreSpecificDefaultLibraries>nafxcwd.lib;libcmtd.lib;%(IgnoreSpecificDefaultLibraries)</IgnoreSpecificDefaultLibraries>
<GenerateDebugInformation>true</GenerateDebugInformation>
<GenerateMapFile>true</GenerateMapFile>
</Link>
</ItemDefinitionGroup>
</Project>

12
neo/_Dedicated.props Normal file
View File

@ -0,0 +1,12 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>Dedicated</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<PreprocessorDefinitions>ID_DEDICATED;%(PreprocessorDefinitions)</PreprocessorDefinitions>
</ClCompile>
</ItemDefinitionGroup>
</Project>

26
neo/_DoomDLL.props Normal file
View File

@ -0,0 +1,26 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>Doom III Executable</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<AdditionalIncludeDirectories>sound/vorbis/include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>__DOOM__;__DOOM_DLL__;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
</ClCompile>
<Link>
<AdditionalDependencies>dbghelp.lib;dinput8.lib;dsound.lib;dxguid.lib;DxErr.lib;eaxguid.lib;glu32.lib;iphlpapi.lib;odbc32.lib;odbccp32.lib;opengl32.lib;winmm.lib;wsock32.lib;%(AdditionalDependencies)</AdditionalDependencies>
<AdditionalLibraryDirectories>C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\Lib\x86;openal\lib;sys\win32\dongle;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
<OutputFile>$(OutDir)DOOM3.exe</OutputFile>
<AdditionalManifestDependencies>%(AdditionalManifestDependencies)</AdditionalManifestDependencies>
<SubSystem>Windows</SubSystem>
<StackReserveSize>16777216</StackReserveSize>
<StackCommitSize>16777216</StackCommitSize>
<LargeAddressAware>true</LargeAddressAware>
</Link>
</ItemDefinitionGroup>
</Project>

20
neo/_Game-d3xp.props Normal file
View File

@ -0,0 +1,20 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>Game d3xp Library</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<PreprocessorDefinitions>__DOOM__;GAME_DLL;_D3XP;CTF;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
</ClCompile>
<Link>
<OutputFile>$(OutDir)gamex86.dll</OutputFile>
<ModuleDefinitionFile>.\d3xp\game.def</ModuleDefinitionFile>
</Link>
<PreBuildEvent>
<Command>..\build\Win32\"$(Configuration)"\TypeInfo.exe</Command>
</PreBuildEvent>
</ItemDefinitionGroup>
</Project>

22
neo/_Game.props Normal file
View File

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>Game Library</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<PreprocessorDefinitions>__DOOM__;GAME_DLL;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PrecompiledHeader>
</PrecompiledHeader>
</ClCompile>
<Link>
<OutputFile>$(OutDir)gamex86.dll</OutputFile>
<ModuleDefinitionFile>.\game\game.def</ModuleDefinitionFile>
</Link>
<PreBuildEvent>
<Command>..\build\Win32\"$(Configuration)"\TypeInfo.exe</Command>
</PreBuildEvent>
</ItemDefinitionGroup>
</Project>

19
neo/_MayaImport.props Normal file
View File

@ -0,0 +1,19 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>MayaImport Library</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<AdditionalIncludeDirectories>mssdk/include;MayaImport/Maya5.0;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>_USRDLL;MAYAIMPORT_EXPORTS;MAYA_IMPORT;REQUIRE_IOSTREAM;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<PrecompiledHeader>Use</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
</ClCompile>
<Link>
<AdditionalDependencies>Foundation.lib;OpenMaya.lib;OpenMayaAnim.lib;%(AdditionalDependencies)</AdditionalDependencies>
<AdditionalLibraryDirectories>MayaImport/maya5.0/libs;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
</Link>
</ItemDefinitionGroup>
</Project>

30
neo/_Release.props Normal file
View File

@ -0,0 +1,30 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>Release</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<Optimization>MaxSpeed</Optimization>
<InlineFunctionExpansion>AnySuitable</InlineFunctionExpansion>
<IntrinsicFunctions>true</IntrinsicFunctions>
<OmitFramePointers>true</OmitFramePointers>
<PreprocessorDefinitions>NDEBUG;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<StringPooling>true</StringPooling>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<BufferSecurityCheck>false</BufferSecurityCheck>
<FunctionLevelLinking>true</FunctionLevelLinking>
<RuntimeTypeInfo>true</RuntimeTypeInfo>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<AdditionalDependencies>nafxcw.lib;libcmt.lib;%(AdditionalDependencies)</AdditionalDependencies>
<IgnoreSpecificDefaultLibraries>nafxcw.lib;libcmt.lib;%(IgnoreSpecificDefaultLibraries)</IgnoreSpecificDefaultLibraries>
<GenerateDebugInformation>false</GenerateDebugInformation>
<GenerateMapFile>false</GenerateMapFile>
<OptimizeReferences>true</OptimizeReferences>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
</Link>
</ItemDefinitionGroup>
</Project>

16
neo/_TypeInfo.props Normal file
View File

@ -0,0 +1,16 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>TypeInfo</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<PreprocessorDefinitions>ID_ENABLE_CURL=0;ID_TYPEINFO;__DOOM_DLL__;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<PrecompiledHeader>Use</PrecompiledHeader>
</ClCompile>
<Link>
<SubSystem>Console</SubSystem>
</Link>
</ItemDefinitionGroup>
</Project>

13
neo/_WithInlines.props Normal file
View File

@ -0,0 +1,13 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>With Inlines</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<InlineFunctionExpansion>OnlyExplicitInline</InlineFunctionExpansion>
<PreprocessorDefinitions>_INLINEDEBUG;%(PreprocessorDefinitions)</PreprocessorDefinitions>
</ClCompile>
</ItemDefinitionGroup>
</Project>

12
neo/_WithMemoryLog.props Normal file
View File

@ -0,0 +1,12 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>With Memory Log</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<PreprocessorDefinitions>ID_REDIRECT_NEWDELETE;ID_DEBUG_MEMORY;ID_DEBUG_UNINITIALIZED_MEMORY;%(PreprocessorDefinitions)</PreprocessorDefinitions>
</ClCompile>
</ItemDefinitionGroup>
</Project>

13
neo/_idlib.props Normal file
View File

@ -0,0 +1,13 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<_PropertySheetDisplayName>idlib</_PropertySheetDisplayName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<PreprocessorDefinitions>__IDLIB__;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<PrecompiledHeader>Use</PrecompiledHeader>
</ClCompile>
</ItemDefinitionGroup>
</Project>

53
neo/clean.bat Normal file
View File

@ -0,0 +1,53 @@
rmdir debug /s /q
rmdir release /s /q
del quake3.ncb
del quake3.opt
del quake3.plg
del quake3.stt
rmdir cgame\debug /s /q
rmdir cgame\release /s /q
del cgame\cgame.ncb
del cgame\cgame.opt
del cgame\cgame.plg
del cgame\cgame.stt
rmdir game\debug /s /q
rmdir game\release /s /q
del game\game.ncb
del game\game.opt
del game\game.plg
del game\game.stt
rmdir ui\debug /s /q
rmdir ui\release /s /q
del ui\ui.ncb
del ui\ui.opt
del ui\ui.plg
del ui\ui.stt
rmdir renderer\debug /s /q
rmdir renderer\release /s /q
del renderer\renderer.ncb
del renderer\renderer.opt
del renderer\renderer.plg
del renderer\renderer.stt
rmdir botlib\debug /s /q
rmdir botlib\release /s /q
del botlib\botlib.ncb
del botlib\botlib.opt
del botlib\botlib.plg
del botlib\botlib.stt
rmdir botai\debug /s /q
rmdir botai\release /s /q
del botai\botai.dsp
del botai\botai.plg
rmdir bspc\debug /s /q
rmdir bspc\release /s /q
del bspc\bspc.exe
del bspc\bspc.log
del bspc\bspc.ncb
del bspc\bspc.opt
del bspc\bspc.pdb
del bspc\bspc.plg
rmdir unix\debugi386-glibc /s /q
rmdir unix\releasei386-glibc /s /q
rmdir "mac\MacQuake3 Data" /s /q
rmdir macosx\Client\Q3Test.app /s /q
rmdir macosx\Client\Q3Test.build /s /q

148
neo/cm/CollisionModel.h Normal file

@ -0,0 +1,148 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
#ifndef __COLLISIONMODELMANAGER_H__
#define __COLLISIONMODELMANAGER_H__
/*
===============================================================================
Trace model vs. polygonal model collision detection.
Short translations are the least expensive. Retrieving contact points is
about as cheap as a short translation. Position tests are more expensive
and rotations are most expensive.
There is no position test at the start of a translation or rotation. In other
words if a translation with start != end or a rotation with angle != 0 starts
in solid, this goes unnoticed and the collision result is undefined.
A translation with start == end or a rotation with angle == 0 performs
a position test and fills in the trace_t structure accordingly.
===============================================================================
*/
// contact type
typedef enum {
CONTACT_NONE, // no contact
CONTACT_EDGE, // trace model edge hits model edge
CONTACT_MODELVERTEX, // model vertex hits trace model polygon
CONTACT_TRMVERTEX // trace model vertex hits model polygon
} contactType_t;
// contact info
typedef struct {
contactType_t type; // contact type
idVec3 point; // point of contact
idVec3 normal; // contact plane normal
float dist; // contact plane distance
int contents; // contents at other side of surface
const idMaterial * material; // surface material
int modelFeature; // contact feature on model
int trmFeature; // contact feature on trace model
int entityNum; // entity the contact surface is a part of
int id; // id of clip model the contact surface is part of
} contactInfo_t;
// trace result
typedef struct trace_s {
float fraction; // fraction of movement completed, 1.0 = didn't hit anything
idVec3 endpos; // final position of trace model
idMat3 endAxis; // final axis of trace model
contactInfo_t c; // contact information, only valid if fraction < 1.0
} trace_t;
typedef int cmHandle_t;
#define CM_CLIP_EPSILON 0.25f // always stay this distance away from any model
#define CM_BOX_EPSILON 1.0f // should always be larger than clip epsilon
#define CM_MAX_TRACE_DIST 4096.0f // maximum distance a trace model may be traced, point traces are unlimited
class idCollisionModelManager {
public:
virtual ~idCollisionModelManager( void ) {}
// Loads collision models from a map file.
virtual void LoadMap( const idMapFile *mapFile ) = 0;
// Frees all the collision models.
virtual void FreeMap( void ) = 0;
// Gets the clip handle for a model.
virtual cmHandle_t LoadModel( const char *modelName, const bool precache ) = 0;
// Sets up a trace model for collision with other trace models.
virtual cmHandle_t SetupTrmModel( const idTraceModel &trm, const idMaterial *material ) = 0;
// Creates a trace model from a collision model, returns true if successful.
virtual bool TrmFromModel( const char *modelName, idTraceModel &trm ) = 0;
// Gets the name of a model.
virtual const char * GetModelName( cmHandle_t model ) const = 0;
// Gets the bounds of a model.
virtual bool GetModelBounds( cmHandle_t model, idBounds &bounds ) const = 0;
// Gets all contents flags of brushes and polygons of a model ORed together.
virtual bool GetModelContents( cmHandle_t model, int &contents ) const = 0;
// Gets a vertex of a model.
virtual bool GetModelVertex( cmHandle_t model, int vertexNum, idVec3 &vertex ) const = 0;
// Gets an edge of a model.
virtual bool GetModelEdge( cmHandle_t model, int edgeNum, idVec3 &start, idVec3 &end ) const = 0;
// Gets a polygon of a model.
virtual bool GetModelPolygon( cmHandle_t model, int polygonNum, idFixedWinding &winding ) const = 0;
// Translates a trace model and reports the first collision if any.
virtual void Translation( trace_t *results, const idVec3 &start, const idVec3 &end,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis ) = 0;
// Rotates a trace model and reports the first collision if any.
virtual void Rotation( trace_t *results, const idVec3 &start, const idRotation &rotation,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis ) = 0;
// Returns the contents touched by the trace model or 0 if the trace model is in free space.
virtual int Contents( const idVec3 &start,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis ) = 0;
// Stores all contact points of the trace model with the model, returns the number of contacts.
virtual int Contacts( contactInfo_t *contacts, const int maxContacts, const idVec3 &start, const idVec6 &dir, const float depth,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis ) = 0;
// Tests collision detection.
virtual void DebugOutput( const idVec3 &origin ) = 0;
// Draws a model.
virtual void DrawModel( cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis,
const idVec3 &viewOrigin, const float radius ) = 0;
// Prints model information, use -1 handle for accumulated model info.
virtual void ModelInfo( cmHandle_t model ) = 0;
// Lists all loaded models.
virtual void ListModels( void ) = 0;
// Writes a collision model file for the given map entity.
virtual bool WriteCollisionModelForMapEntity( const idMapEntity *mapEnt, const char *filename, const bool testTraceModel = true ) = 0;
};
extern idCollisionModelManager * collisionModelManager;
#endif /* !__COLLISIONMODELMANAGER_H__ */
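The class above is pure interface; callers go through the global collisionModelManager pointer declared at the end of the header. Below is a minimal usage sketch, assuming the idLib types (idBounds, idTraceModel, idMat3) behave as they are used elsewhere in this commit; the model name, box size and content mask are illustrative choices, not values the header prescribes.

// Hedged usage sketch: trace a 32x32x64 box from 'start' to 'end' through a
// named collision model. The model name and all literal values are illustrative only.
void Example_TraceBox( const idMapFile *mapFile, const idVec3 &start, const idVec3 &end ) {
	collisionModelManager->LoadMap( mapFile );
	cmHandle_t model = collisionModelManager->LoadModel( "worldMap", false );

	idBounds bounds( idVec3( -16.0f, -16.0f, 0.0f ), idVec3( 16.0f, 16.0f, 64.0f ) );
	idTraceModel trm( bounds );		// box-shaped trace model, as in the speed test code later in this commit

	idMat3 axis;
	axis.Identity();

	trace_t results;
	collisionModelManager->Translation( &results, start, end, &trm, axis,
	                                    CONTENTS_SOLID, model, vec3_origin, axis );

	if ( results.fraction < 1.0f ) {
		// blocked: results.endpos is where the box stopped, results.c describes the contact
	}
}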


@ -0,0 +1,75 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
/*
===============================================================================
Trace model vs. polygonal model collision detection.
===============================================================================
*/
#include "../idlib/precompiled.h"
#pragma hdrstop
#include "CollisionModel_local.h"
/*
===============================================================================
Retrieving contacts
===============================================================================
*/
/*
==================
idCollisionModelManagerLocal::Contacts
==================
*/
int idCollisionModelManagerLocal::Contacts( contactInfo_t *contacts, const int maxContacts, const idVec3 &start, const idVec6 &dir, const float depth,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &origin, const idMat3 &modelAxis ) {
trace_t results;
idVec3 end;
// same as Translation but instead of storing the first collision we store all collisions as contacts
idCollisionModelManagerLocal::getContacts = true;
idCollisionModelManagerLocal::contacts = contacts;
idCollisionModelManagerLocal::maxContacts = maxContacts;
idCollisionModelManagerLocal::numContacts = 0;
end = start + dir.SubVec3(0) * depth;
idCollisionModelManagerLocal::Translation( &results, start, end, trm, trmAxis, contentMask, model, origin, modelAxis );
if ( dir.SubVec3(1).LengthSqr() != 0.0f ) {
// FIXME: rotational contacts
}
idCollisionModelManagerLocal::getContacts = false;
idCollisionModelManagerLocal::maxContacts = 0;
return idCollisionModelManagerLocal::numContacts;
}
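For reference on the idVec6 parameter above: SubVec3(0) is the translational probe direction, scaled by depth to build the end point, while SubVec3(1) is the rotational part the FIXME leaves unhandled. A short sketch of a caller probing straight down follows; the buffer size, depth and surrounding names are assumptions made for illustration.

// Hedged usage sketch: gather up to 8 contacts within 4 units below 'origin'.
// Only the translational half of the idVec6 direction is meaningful here.
void Example_GroundContacts( const idVec3 &origin, const idTraceModel &trm, cmHandle_t model ) {
	contactInfo_t contacts[8];

	idMat3 axis;
	axis.Identity();

	idVec6 dir;
	dir.SubVec3(0) = idVec3( 0.0f, 0.0f, -1.0f );	// translational probe direction
	dir.SubVec3(1) = vec3_origin;					// rotational part unused (see the FIXME above)

	int num = collisionModelManager->Contacts( contacts, 8, origin, dir, 4.0f,
	                                           &trm, axis, CONTENTS_SOLID,
	                                           model, vec3_origin, axis );
	for ( int i = 0; i < num; i++ ) {
		// contacts[i].point, .normal and .material describe each touching surface
	}
}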


@ -0,0 +1,641 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
/*
===============================================================================
Trace model vs. polygonal model collision detection.
===============================================================================
*/
#include "../idlib/precompiled.h"
#pragma hdrstop
#include "CollisionModel_local.h"
/*
===============================================================================
Contents test
===============================================================================
*/
/*
================
idCollisionModelManagerLocal::TestTrmVertsInBrush
returns true if any of the trm vertices is inside the brush
================
*/
bool idCollisionModelManagerLocal::TestTrmVertsInBrush( cm_traceWork_t *tw, cm_brush_t *b ) {
int i, j, numVerts, bestPlane;
float d, bestd;
idVec3 *p;
if ( b->checkcount == idCollisionModelManagerLocal::checkCount ) {
return false;
}
b->checkcount = idCollisionModelManagerLocal::checkCount;
if ( !(b->contents & tw->contents) ) {
return false;
}
// if the brush bounds don't intersect the trace bounds
if ( !b->bounds.IntersectsBounds( tw->bounds ) ) {
return false;
}
if ( tw->pointTrace ) {
numVerts = 1;
}
else {
numVerts = tw->numVerts;
}
for ( j = 0; j < numVerts; j++ ) {
p = &tw->vertices[j].p;
// see if the point is inside the brush
bestPlane = 0;
bestd = -idMath::INFINITY;
for ( i = 0; i < b->numPlanes; i++ ) {
d = b->planes[i].Distance( *p );
if ( d >= 0.0f ) {
break;
}
if ( d > bestd ) {
bestd = d;
bestPlane = i;
}
}
if ( i >= b->numPlanes ) {
tw->trace.fraction = 0.0f;
tw->trace.c.type = CONTACT_TRMVERTEX;
tw->trace.c.normal = b->planes[bestPlane].Normal();
tw->trace.c.dist = b->planes[bestPlane].Dist();
tw->trace.c.contents = b->contents;
tw->trace.c.material = b->material;
tw->trace.c.point = *p;
tw->trace.c.modelFeature = 0;
tw->trace.c.trmFeature = j;
return true;
}
}
return false;
}
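// The loop above is the standard point-in-convex-volume test: a point is
// inside a brush only if it lies behind every bounding plane, and while
// checking, the plane it penetrates least deeply (the largest, least negative
// distance) is kept as the best contact plane. A hedged standalone sketch of
// the same test with plain types, independent of the idLib classes, and using
// an illustrative plane convention:
struct Plane3 { float normal[3]; float dist; };	// plane equation: normal . x - dist = 0

// Returns -1 if the point lies outside the convex brush, otherwise the index
// of the plane it penetrates least deeply, i.e. the natural contact plane.
inline int PointInConvexBrush( const float point[3], const Plane3 *planes, int numPlanes ) {
	int bestPlane = -1;
	float bestd = -1e30f;
	for ( int i = 0; i < numPlanes; i++ ) {
		const float d = planes[i].normal[0] * point[0] +
						planes[i].normal[1] * point[1] +
						planes[i].normal[2] * point[2] - planes[i].dist;
		if ( d >= 0.0f ) {
			return -1;		// in front of this plane: outside the brush
		}
		if ( d > bestd ) {
			bestd = d;
			bestPlane = i;
		}
	}
	return bestPlane;
}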
/*
================
CM_SetTrmEdgeSidedness
================
*/
#define CM_SetTrmEdgeSidedness( edge, bpl, epl, bitNum ) { \
if ( !(edge->sideSet & (1<<bitNum)) ) { \
float fl; \
fl = (bpl).PermutedInnerProduct( epl ); \
edge->side = (edge->side & ~(1<<bitNum)) | (FLOATSIGNBITSET(fl) << bitNum); \
edge->sideSet |= (1 << bitNum); \
} \
}
/*
================
CM_SetTrmPolygonSidedness
================
*/
#define CM_SetTrmPolygonSidedness( v, plane, bitNum ) { \
if ( !((v)->sideSet & (1<<bitNum)) ) { \
float fl; \
fl = plane.Distance( (v)->p ); \
/* cannot use float sign bit because it is undetermined when fl == 0.0f */ \
if ( fl < 0.0f ) { \
(v)->side |= (1 << bitNum); \
} \
else { \
(v)->side &= ~(1 << bitNum); \
} \
(v)->sideSet |= (1 << bitNum); \
} \
}
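// Both macros above implement the same lazy sign cache: bit 'bitNum' of
// 'side' stores the sign of one expensive test (a Pluecker permuted inner
// product or a plane distance), and the matching bit of 'sideSet' records
// whether that sign has been computed yet, so each edge or vertex pays for a
// given test at most once per trace. A stripped-down sketch of the pattern,
// with hypothetical names:
struct SignCache {
	unsigned int side;		// cached sign bits, one per test slot
	unsigned int sideSet;	// which bits of 'side' are valid
};

// Evaluate 'test()' for slot 'bitNum' at most once; afterwards the cached
// sign bit is returned without re-running the test.
template<class Test>
inline bool CachedNegative( SignCache &cache, int bitNum, Test test ) {
	if ( !( cache.sideSet & ( 1u << bitNum ) ) ) {
		const float value = test();		// the expensive part, e.g. a plane distance
		cache.side = ( cache.side & ~( 1u << bitNum ) ) | ( ( value < 0.0f ? 1u : 0u ) << bitNum );
		cache.sideSet |= 1u << bitNum;
	}
	return ( ( cache.side >> bitNum ) & 1u ) != 0u;
}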
/*
================
idCollisionModelManagerLocal::TestTrmInPolygon
returns true if the trm intersects the polygon
================
*/
bool idCollisionModelManagerLocal::TestTrmInPolygon( cm_traceWork_t *tw, cm_polygon_t *p ) {
int i, j, k, edgeNum, flip, trmEdgeNum, bitNum, bestPlane;
int sides[MAX_TRACEMODEL_VERTS];
float d, bestd;
cm_trmEdge_t *trmEdge;
cm_edge_t *edge;
cm_vertex_t *v, *v1, *v2;
// if already checked this polygon
if ( p->checkcount == idCollisionModelManagerLocal::checkCount ) {
return false;
}
p->checkcount = idCollisionModelManagerLocal::checkCount;
// if this polygon does not have the right contents behind it
if ( !(p->contents & tw->contents) ) {
return false;
}
// if the polygon bounds don't intersect the trace bounds
if ( !p->bounds.IntersectsBounds( tw->bounds ) ) {
return false;
}
// bounds should cross polygon plane
switch( tw->bounds.PlaneSide( p->plane ) ) {
case PLANESIDE_CROSS:
break;
case PLANESIDE_FRONT:
if ( tw->model->isConvex ) {
tw->quickExit = true;
return true;
}
default:
return false;
}
// if the trace model is convex
if ( tw->isConvex ) {
// test if any polygon vertices are inside the trm
for ( i = 0; i < p->numEdges; i++ ) {
edgeNum = p->edges[i];
edge = tw->model->edges + abs(edgeNum);
// if this edge is already tested
if ( edge->checkcount == idCollisionModelManagerLocal::checkCount ) {
continue;
}
for ( j = 0; j < 2; j++ ) {
v = &tw->model->vertices[edge->vertexNum[j]];
// if this vertex is already tested
if ( v->checkcount == idCollisionModelManagerLocal::checkCount ) {
continue;
}
bestPlane = 0;
bestd = -idMath::INFINITY;
for ( k = 0; k < tw->numPolys; k++ ) {
d = tw->polys[k].plane.Distance( v->p );
if ( d >= 0.0f ) {
break;
}
if ( d > bestd ) {
bestd = d;
bestPlane = k;
}
}
if ( k >= tw->numPolys ) {
tw->trace.fraction = 0.0f;
tw->trace.c.type = CONTACT_MODELVERTEX;
tw->trace.c.normal = -tw->polys[bestPlane].plane.Normal();
tw->trace.c.dist = -tw->polys[bestPlane].plane.Dist();
tw->trace.c.contents = p->contents;
tw->trace.c.material = p->material;
tw->trace.c.point = v->p;
tw->trace.c.modelFeature = edge->vertexNum[j];
tw->trace.c.trmFeature = 0;
return true;
}
}
}
}
for ( i = 0; i < p->numEdges; i++ ) {
edgeNum = p->edges[i];
edge = tw->model->edges + abs(edgeNum);
// reset sidedness cache if this is the first time we encounter this edge
if ( edge->checkcount != idCollisionModelManagerLocal::checkCount ) {
edge->sideSet = 0;
}
// pluecker coordinate for edge
tw->polygonEdgePlueckerCache[i].FromLine( tw->model->vertices[edge->vertexNum[0]].p,
tw->model->vertices[edge->vertexNum[1]].p );
v = &tw->model->vertices[edge->vertexNum[INTSIGNBITSET(edgeNum)]];
// reset sidedness cache if this is the first time we encounter this vertex
if ( v->checkcount != idCollisionModelManagerLocal::checkCount ) {
v->sideSet = 0;
}
v->checkcount = idCollisionModelManagerLocal::checkCount;
}
// get side of polygon for each trm vertex
for ( i = 0; i < tw->numVerts; i++ ) {
d = p->plane.Distance( tw->vertices[i].p );
sides[i] = d < 0.0f ? -1 : 1;
}
// test if any trm edges go through the polygon
for ( i = 1; i <= tw->numEdges; i++ ) {
// if the trm edge does not cross the polygon plane
if ( sides[tw->edges[i].vertexNum[0]] == sides[tw->edges[i].vertexNum[1]] ) {
continue;
}
// check from which side to which side the trm edge goes
flip = INTSIGNBITSET( sides[tw->edges[i].vertexNum[0]] );
// test if trm edge goes through the polygon between the polygon edges
for ( j = 0; j < p->numEdges; j++ ) {
edgeNum = p->edges[j];
edge = tw->model->edges + abs(edgeNum);
#if 1
CM_SetTrmEdgeSidedness( edge, tw->edges[i].pl, tw->polygonEdgePlueckerCache[j], i );
if ( INTSIGNBITSET(edgeNum) ^ ((edge->side >> i) & 1) ^ flip ) {
break;
}
#else
d = tw->edges[i].pl.PermutedInnerProduct( tw->polygonEdgePlueckerCache[j] );
if ( flip ) {
d = -d;
}
if ( edgeNum > 0 ) {
if ( d <= 0.0f ) {
break;
}
}
else {
if ( d >= 0.0f ) {
break;
}
}
#endif
}
if ( j >= p->numEdges ) {
tw->trace.fraction = 0.0f;
tw->trace.c.type = CONTACT_EDGE;
tw->trace.c.normal = p->plane.Normal();
tw->trace.c.dist = p->plane.Dist();
tw->trace.c.contents = p->contents;
tw->trace.c.material = p->material;
tw->trace.c.point = tw->vertices[tw->edges[i].vertexNum[ !flip ]].p;
tw->trace.c.modelFeature = *reinterpret_cast<int *>(&p);
tw->trace.c.trmFeature = i;
return true;
}
}
// test if any polygon edges go through the trm polygons
for ( i = 0; i < p->numEdges; i++ ) {
edgeNum = p->edges[i];
edge = tw->model->edges + abs(edgeNum);
if ( edge->checkcount == idCollisionModelManagerLocal::checkCount ) {
continue;
}
edge->checkcount = idCollisionModelManagerLocal::checkCount;
for ( j = 0; j < tw->numPolys; j++ ) {
#if 1
v1 = tw->model->vertices + edge->vertexNum[0];
CM_SetTrmPolygonSidedness( v1, tw->polys[j].plane, j );
v2 = tw->model->vertices + edge->vertexNum[1];
CM_SetTrmPolygonSidedness( v2, tw->polys[j].plane, j );
// if the polygon edge does not cross the trm polygon plane
if ( !(((v1->side ^ v2->side) >> j) & 1) ) {
continue;
}
flip = (v1->side >> j) & 1;
#else
float d1, d2;
v1 = tw->model->vertices + edge->vertexNum[0];
d1 = tw->polys[j].plane.Distance( v1->p );
v2 = tw->model->vertices + edge->vertexNum[1];
d2 = tw->polys[j].plane.Distance( v2->p );
// if the polygon edge does not cross the trm polygon plane
if ( (d1 >= 0.0f && d2 >= 0.0f) || (d1 <= 0.0f && d2 <= 0.0f) ) {
continue;
}
flip = false;
if ( d1 < 0.0f ) {
flip = true;
}
#endif
// test if polygon edge goes through the trm polygon between the trm polygon edges
for ( k = 0; k < tw->polys[j].numEdges; k++ ) {
trmEdgeNum = tw->polys[j].edges[k];
trmEdge = tw->edges + abs(trmEdgeNum);
#if 1
bitNum = abs(trmEdgeNum);
CM_SetTrmEdgeSidedness( edge, trmEdge->pl, tw->polygonEdgePlueckerCache[i], bitNum );
if ( INTSIGNBITSET(trmEdgeNum) ^ ((edge->side >> bitNum) & 1) ^ flip ) {
break;
}
#else
d = trmEdge->pl.PermutedInnerProduct( tw->polygonEdgePlueckerCache[i] );
if ( flip ) {
d = -d;
}
if ( trmEdgeNum > 0 ) {
if ( d <= 0.0f ) {
break;
}
}
else {
if ( d >= 0.0f ) {
break;
}
}
#endif
}
if ( k >= tw->polys[j].numEdges ) {
tw->trace.fraction = 0.0f;
tw->trace.c.type = CONTACT_EDGE;
tw->trace.c.normal = -tw->polys[j].plane.Normal();
tw->trace.c.dist = -tw->polys[j].plane.Dist();
tw->trace.c.contents = p->contents;
tw->trace.c.material = p->material;
tw->trace.c.point = tw->model->vertices[edge->vertexNum[ !flip ]].p;
tw->trace.c.modelFeature = edgeNum;
tw->trace.c.trmFeature = j;
return true;
}
}
}
return false;
}
/*
================
idCollisionModelManagerLocal::PointNode
================
*/
cm_node_t *idCollisionModelManagerLocal::PointNode( const idVec3 &p, cm_model_t *model ) {
cm_node_t *node;
node = model->node;
while ( node->planeType != -1 ) {
if (p[node->planeType] > node->planeDist) {
node = node->children[0];
}
else {
node = node->children[1];
}
assert( node != NULL );
}
return node;
}
/*
================
idCollisionModelManagerLocal::PointContents
================
*/
int idCollisionModelManagerLocal::PointContents( const idVec3 p, cmHandle_t model ) {
int i;
float d;
cm_node_t *node;
cm_brushRef_t *bref;
cm_brush_t *b;
idPlane *plane;
node = idCollisionModelManagerLocal::PointNode( p, idCollisionModelManagerLocal::models[model] );
for ( bref = node->brushes; bref; bref = bref->next ) {
b = bref->b;
// test if the point is within the brush bounds
for ( i = 0; i < 3; i++ ) {
if ( p[i] < b->bounds[0][i] ) {
break;
}
if ( p[i] > b->bounds[1][i] ) {
break;
}
}
if ( i < 3 ) {
continue;
}
// test if the point is inside the brush
plane = b->planes;
for ( i = 0; i < b->numPlanes; i++, plane++ ) {
d = plane->Distance( p );
if ( d >= 0 ) {
break;
}
}
if ( i >= b->numPlanes ) {
return b->contents;
}
}
return 0;
}
/*
==================
idCollisionModelManagerLocal::TransformedPointContents
==================
*/
int idCollisionModelManagerLocal::TransformedPointContents( const idVec3 &p, cmHandle_t model, const idVec3 &origin, const idMat3 &modelAxis ) {
idVec3 p_l;
// subtract origin offset
p_l = p - origin;
if ( modelAxis.IsRotated() ) {
p_l *= modelAxis;
}
return idCollisionModelManagerLocal::PointContents( p_l, model );
}
/*
==================
idCollisionModelManagerLocal::ContentsTrm
==================
*/
int idCollisionModelManagerLocal::ContentsTrm( trace_t *results, const idVec3 &start,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis ) {
int i;
bool model_rotated, trm_rotated;
idMat3 invModelAxis, tmpAxis;
idVec3 dir;
ALIGN16( cm_traceWork_t tw );
// fast point case
if ( !trm || ( trm->bounds[1][0] - trm->bounds[0][0] <= 0.0f &&
trm->bounds[1][1] - trm->bounds[0][1] <= 0.0f &&
trm->bounds[1][2] - trm->bounds[0][2] <= 0.0f ) ) {
results->c.contents = idCollisionModelManagerLocal::TransformedPointContents( start, model, modelOrigin, modelAxis );
results->fraction = ( results->c.contents == 0 );
results->endpos = start;
results->endAxis = trmAxis;
return results->c.contents;
}
idCollisionModelManagerLocal::checkCount++;
tw.trace.fraction = 1.0f;
tw.trace.c.contents = 0;
tw.trace.c.type = CONTACT_NONE;
tw.contents = contentMask;
tw.isConvex = true;
tw.rotation = false;
tw.positionTest = true;
tw.pointTrace = false;
tw.quickExit = false;
tw.numContacts = 0;
tw.model = idCollisionModelManagerLocal::models[model];
tw.start = start - modelOrigin;
tw.end = tw.start;
model_rotated = modelAxis.IsRotated();
if ( model_rotated ) {
invModelAxis = modelAxis.Transpose();
}
// setup trm structure
idCollisionModelManagerLocal::SetupTrm( &tw, trm );
trm_rotated = trmAxis.IsRotated();
// calculate vertex positions
if ( trm_rotated ) {
for ( i = 0; i < tw.numVerts; i++ ) {
// rotate trm around the start position
tw.vertices[i].p *= trmAxis;
}
}
for ( i = 0; i < tw.numVerts; i++ ) {
// set trm at start position
tw.vertices[i].p += tw.start;
}
if ( model_rotated ) {
for ( i = 0; i < tw.numVerts; i++ ) {
// rotate trm around model instead of rotating the model
tw.vertices[i].p *= invModelAxis;
}
}
// add offset to start point
if ( trm_rotated ) {
dir = trm->offset * trmAxis;
tw.start += dir;
tw.end += dir;
} else {
tw.start += trm->offset;
tw.end += trm->offset;
}
if ( model_rotated ) {
// rotate trace instead of model
tw.start *= invModelAxis;
tw.end *= invModelAxis;
}
// setup trm vertices
tw.size.Clear();
for ( i = 0; i < tw.numVerts; i++ ) {
// get axial trm size after rotations
tw.size.AddPoint( tw.vertices[i].p - tw.start );
}
// setup trm edges
for ( i = 1; i <= tw.numEdges; i++ ) {
// edge start, end and pluecker coordinate
tw.edges[i].start = tw.vertices[tw.edges[i].vertexNum[0]].p;
tw.edges[i].end = tw.vertices[tw.edges[i].vertexNum[1]].p;
tw.edges[i].pl.FromLine( tw.edges[i].start, tw.edges[i].end );
}
// setup trm polygons
if ( trm_rotated & model_rotated ) {
tmpAxis = trmAxis * invModelAxis;
for ( i = 0; i < tw.numPolys; i++ ) {
tw.polys[i].plane *= tmpAxis;
}
} else if ( trm_rotated ) {
for ( i = 0; i < tw.numPolys; i++ ) {
tw.polys[i].plane *= trmAxis;
}
} else if ( model_rotated ) {
for ( i = 0; i < tw.numPolys; i++ ) {
tw.polys[i].plane *= invModelAxis;
}
}
for ( i = 0; i < tw.numPolys; i++ ) {
tw.polys[i].plane.FitThroughPoint( tw.edges[abs(tw.polys[i].edges[0])].start );
}
// bounds for full trace, a little bit larger for epsilons
for ( i = 0; i < 3; i++ ) {
if ( tw.start[i] < tw.end[i] ) {
tw.bounds[0][i] = tw.start[i] + tw.size[0][i] - CM_BOX_EPSILON;
tw.bounds[1][i] = tw.end[i] + tw.size[1][i] + CM_BOX_EPSILON;
} else {
tw.bounds[0][i] = tw.end[i] + tw.size[0][i] - CM_BOX_EPSILON;
tw.bounds[1][i] = tw.start[i] + tw.size[1][i] + CM_BOX_EPSILON;
}
if ( idMath::Fabs(tw.size[0][i]) > idMath::Fabs(tw.size[1][i]) ) {
tw.extents[i] = idMath::Fabs( tw.size[0][i] ) + CM_BOX_EPSILON;
} else {
tw.extents[i] = idMath::Fabs( tw.size[1][i] ) + CM_BOX_EPSILON;
}
}
// trace through the model
idCollisionModelManagerLocal::TraceThroughModel( &tw );
*results = tw.trace;
results->fraction = ( results->c.contents == 0 );
results->endpos = start;
results->endAxis = trmAxis;
return results->c.contents;
}
/*
==================
idCollisionModelManagerLocal::Contents
==================
*/
int idCollisionModelManagerLocal::Contents( const idVec3 &start,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis ) {
trace_t results;
if ( model < 0 || model > idCollisionModelManagerLocal::maxModels || model > MAX_SUBMODELS ) {
common->Printf("idCollisionModelManagerLocal::Contents: invalid model handle\n");
return 0;
}
if ( !idCollisionModelManagerLocal::models || !idCollisionModelManagerLocal::models[model] ) {
common->Printf("idCollisionModelManagerLocal::Contents: invalid model\n");
return 0;
}
return ContentsTrm( &results, start, trm, trmAxis, contentMask, model, modelOrigin, modelAxis );
}
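A short usage sketch of the Contents entry point above: a typical caller passes a small box trace model and a content mask and tests the returned flags. The probe size and the use of CONTENTS_WATER as the mask are illustrative assumptions, not requirements of the interface.

// Hedged usage sketch: returns true if a small box around 'origin' touches water.
bool Example_IsInWater( const idVec3 &origin, cmHandle_t model ) {
	idBounds bounds( idVec3( -4.0f, -4.0f, -4.0f ), idVec3( 4.0f, 4.0f, 4.0f ) );
	idTraceModel trm( bounds );

	idMat3 axis;
	axis.Identity();

	int contents = collisionModelManager->Contents( origin, &trm, axis, CONTENTS_WATER,
	                                                model, vec3_origin, axis );
	return ( contents & CONTENTS_WATER ) != 0;
}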


@ -0,0 +1,488 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
/*
===============================================================================
Trace model vs. polygonal model collision detection.
===============================================================================
*/
#include "../idlib/precompiled.h"
#pragma hdrstop
#include "CollisionModel_local.h"
/*
===============================================================================
Visualisation code
===============================================================================
*/
const char *cm_contentsNameByIndex[] = {
"none", // 0
"solid", // 1
"opaque", // 2
"water", // 3
"playerclip", // 4
"monsterclip", // 5
"moveableclip", // 6
"ikclip", // 7
"blood", // 8
"body", // 9
"corpse", // 10
"trigger", // 11
"aas_solid", // 12
"aas_obstacle", // 13
"flashlight_trigger", // 14
NULL
};
int cm_contentsFlagByIndex[] = {
-1, // 0
CONTENTS_SOLID, // 1
CONTENTS_OPAQUE, // 2
CONTENTS_WATER, // 3
CONTENTS_PLAYERCLIP, // 4
CONTENTS_MONSTERCLIP, // 5
CONTENTS_MOVEABLECLIP, // 6
CONTENTS_IKCLIP, // 7
CONTENTS_BLOOD, // 8
CONTENTS_BODY, // 9
CONTENTS_CORPSE, // 10
CONTENTS_TRIGGER, // 11
CONTENTS_AAS_SOLID, // 12
CONTENTS_AAS_OBSTACLE, // 13
CONTENTS_FLASHLIGHT_TRIGGER, // 14
0
};
idCVar cm_drawMask( "cm_drawMask", "none", CVAR_GAME, "collision mask", cm_contentsNameByIndex, idCmdSystem::ArgCompletion_String<cm_contentsNameByIndex> );
idCVar cm_drawColor( "cm_drawColor", "1 0 0 .5", CVAR_GAME, "color used to draw the collision models" );
idCVar cm_drawFilled( "cm_drawFilled", "0", CVAR_GAME | CVAR_BOOL, "draw filled polygons" );
idCVar cm_drawInternal( "cm_drawInternal", "1", CVAR_GAME | CVAR_BOOL, "draw internal edges green" );
idCVar cm_drawNormals( "cm_drawNormals", "0", CVAR_GAME | CVAR_BOOL, "draw polygon and edge normals" );
idCVar cm_backFaceCull( "cm_backFaceCull", "0", CVAR_GAME | CVAR_BOOL, "cull back facing polygons" );
idCVar cm_debugCollision( "cm_debugCollision", "0", CVAR_GAME | CVAR_BOOL, "debug the collision detection" );
static idVec4 cm_color;
/*
================
idCollisionModelManagerLocal::ContentsFromString
================
*/
int idCollisionModelManagerLocal::ContentsFromString( const char *string ) const {
int i, contents = 0;
idLexer src( string, idStr::Length( string ), "ContentsFromString" );
idToken token;
while( src.ReadToken( &token ) ) {
if ( token == "," ) {
continue;
}
for ( i = 1; cm_contentsNameByIndex[i] != NULL; i++ ) {
if ( token.Icmp( cm_contentsNameByIndex[i] ) == 0 ) {
contents |= cm_contentsFlagByIndex[i];
break;
}
}
}
return contents;
}
/*
================
idCollisionModelManagerLocal::StringFromContents
================
*/
const char *idCollisionModelManagerLocal::StringFromContents( const int contents ) const {
int i, length = 0;
static char contentsString[MAX_STRING_CHARS];
contentsString[0] = '\0';
for ( i = 1; cm_contentsFlagByIndex[i] != 0; i++ ) {
if ( contents & cm_contentsFlagByIndex[i] ) {
if ( length != 0 ) {
length += idStr::snPrintf( contentsString + length, sizeof( contentsString ) - length, "," );
}
length += idStr::snPrintf( contentsString + length, sizeof( contentsString ) - length, cm_contentsNameByIndex[i] );
}
}
return contentsString;
}
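// The two converters above are inverses over the tables at the top of this
// file: ContentsFromString ORs together the flags named in a comma separated
// list, and StringFromContents rebuilds such a list from a flag word. A hedged
// usage sketch; 'cm' stands for any reachable idCollisionModelManagerLocal
// instance and the input string is illustrative:
inline void Example_ContentsRoundTrip( idCollisionModelManagerLocal &cm ) {
	int flags = cm.ContentsFromString( "solid,playerclip" );
	// with the tables above this equals CONTENTS_SOLID | CONTENTS_PLAYERCLIP
	const char *names = cm.StringFromContents( flags );
	common->Printf( "%x -> %s\n", flags, names );
}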
/*
================
idCollisionModelManagerLocal::DrawEdge
================
*/
void idCollisionModelManagerLocal::DrawEdge( cm_model_t *model, int edgeNum, const idVec3 &origin, const idMat3 &axis ) {
int side;
cm_edge_t *edge;
idVec3 start, end, mid;
bool isRotated;
isRotated = axis.IsRotated();
edge = model->edges + abs(edgeNum);
side = edgeNum < 0;
start = model->vertices[edge->vertexNum[side]].p;
end = model->vertices[edge->vertexNum[!side]].p;
if ( isRotated ) {
start *= axis;
end *= axis;
}
start += origin;
end += origin;
if ( edge->internal ) {
if ( cm_drawInternal.GetBool() ) {
session->rw->DebugArrow( colorGreen, start, end, 1 );
}
} else {
if ( edge->numUsers > 2 ) {
session->rw->DebugArrow( colorBlue, start, end, 1 );
} else {
session->rw->DebugArrow( cm_color, start, end, 1 );
}
}
if ( cm_drawNormals.GetBool() ) {
mid = (start + end) * 0.5f;
if ( isRotated ) {
end = mid + 5 * (axis * edge->normal);
} else {
end = mid + 5 * edge->normal;
}
session->rw->DebugArrow( colorCyan, mid, end, 1 );
}
}
/*
================
idCollisionModelManagerLocal::DrawPolygon
================
*/
void idCollisionModelManagerLocal::DrawPolygon( cm_model_t *model, cm_polygon_t *p, const idVec3 &origin, const idMat3 &axis, const idVec3 &viewOrigin ) {
int i, edgeNum;
cm_edge_t *edge;
idVec3 center, end, dir;
if ( cm_backFaceCull.GetBool() ) {
edgeNum = p->edges[0];
edge = model->edges + abs(edgeNum);
dir = model->vertices[edge->vertexNum[0]].p - viewOrigin;
if ( dir * p->plane.Normal() > 0.0f ) {
return;
}
}
if ( cm_drawNormals.GetBool() ) {
center = vec3_origin;
for ( i = 0; i < p->numEdges; i++ ) {
edgeNum = p->edges[i];
edge = model->edges + abs(edgeNum);
center += model->vertices[edge->vertexNum[edgeNum < 0]].p;
}
center *= (1.0f / p->numEdges);
if ( axis.IsRotated() ) {
center = center * axis + origin;
end = center + 5 * (axis * p->plane.Normal());
} else {
center += origin;
end = center + 5 * p->plane.Normal();
}
session->rw->DebugArrow( colorMagenta, center, end, 1 );
}
if ( cm_drawFilled.GetBool() ) {
idFixedWinding winding;
for ( i = p->numEdges - 1; i >= 0; i-- ) {
edgeNum = p->edges[i];
edge = model->edges + abs(edgeNum);
winding += origin + model->vertices[edge->vertexNum[INTSIGNBITSET(edgeNum)]].p * axis;
}
session->rw->DebugPolygon( cm_color, winding );
} else {
for ( i = 0; i < p->numEdges; i++ ) {
edgeNum = p->edges[i];
edge = model->edges + abs(edgeNum);
if ( edge->checkcount == checkCount ) {
continue;
}
edge->checkcount = checkCount;
DrawEdge( model, edgeNum, origin, axis );
}
}
}
/*
================
idCollisionModelManagerLocal::DrawNodePolygons
================
*/
void idCollisionModelManagerLocal::DrawNodePolygons( cm_model_t *model, cm_node_t *node,
const idVec3 &origin, const idMat3 &axis,
const idVec3 &viewOrigin, const float radius ) {
int i;
cm_polygon_t *p;
cm_polygonRef_t *pref;
while (1) {
for ( pref = node->polygons; pref; pref = pref->next ) {
p = pref->p;
if ( radius ) {
// polygon bounds should overlap with trace bounds
for ( i = 0; i < 3; i++ ) {
if ( p->bounds[0][i] > viewOrigin[i] + radius ) {
break;
}
if ( p->bounds[1][i] < viewOrigin[i] - radius ) {
break;
}
}
if ( i < 3 ) {
continue;
}
}
if ( p->checkcount == checkCount ) {
continue;
}
if ( !( p->contents & cm_contentsFlagByIndex[cm_drawMask.GetInteger()] ) ) {
continue;
}
DrawPolygon( model, p, origin, axis, viewOrigin );
p->checkcount = checkCount;
}
if ( node->planeType == -1 ) {
break;
}
if ( radius && viewOrigin[node->planeType] > node->planeDist + radius ) {
node = node->children[0];
} else if ( radius && viewOrigin[node->planeType] < node->planeDist - radius ) {
node = node->children[1];
} else {
DrawNodePolygons( model, node->children[1], origin, axis, viewOrigin, radius );
node = node->children[0];
}
}
}
/*
================
idCollisionModelManagerLocal::DrawModel
================
*/
void idCollisionModelManagerLocal::DrawModel( cmHandle_t handle, const idVec3 &modelOrigin, const idMat3 &modelAxis,
const idVec3 &viewOrigin, const float radius ) {
cm_model_t *model;
idVec3 viewPos;
if ( handle < 0 || handle >= numModels ) {
return;
}
if ( cm_drawColor.IsModified() ) {
sscanf( cm_drawColor.GetString(), "%f %f %f %f", &cm_color.x, &cm_color.y, &cm_color.z, &cm_color.w );
cm_drawColor.ClearModified();
}
model = models[ handle ];
viewPos = (viewOrigin - modelOrigin) * modelAxis.Transpose();
checkCount++;
DrawNodePolygons( model, model->node, modelOrigin, modelAxis, viewPos, radius );
}
/*
===============================================================================
Speed test code
===============================================================================
*/
static idCVar cm_testCollision( "cm_testCollision", "0", CVAR_GAME | CVAR_BOOL, "" );
static idCVar cm_testRotation( "cm_testRotation", "1", CVAR_GAME | CVAR_BOOL, "" );
static idCVar cm_testModel( "cm_testModel", "0", CVAR_GAME | CVAR_INTEGER, "" );
static idCVar cm_testTimes( "cm_testTimes", "1000", CVAR_GAME | CVAR_INTEGER, "" );
static idCVar cm_testRandomMany( "cm_testRandomMany", "0", CVAR_GAME | CVAR_BOOL, "" );
static idCVar cm_testOrigin( "cm_testOrigin", "0 0 0", CVAR_GAME, "" );
static idCVar cm_testReset( "cm_testReset", "0", CVAR_GAME | CVAR_BOOL, "" );
static idCVar cm_testBox( "cm_testBox", "-16 -16 0 16 16 64", CVAR_GAME, "" );
static idCVar cm_testBoxRotation( "cm_testBoxRotation", "0 0 0", CVAR_GAME, "" );
static idCVar cm_testWalk( "cm_testWalk", "1", CVAR_GAME | CVAR_BOOL, "" );
static idCVar cm_testLength( "cm_testLength", "1024", CVAR_GAME | CVAR_FLOAT, "" );
static idCVar cm_testRadius( "cm_testRadius", "64", CVAR_GAME | CVAR_FLOAT, "" );
static idCVar cm_testAngle( "cm_testAngle", "60", CVAR_GAME | CVAR_FLOAT, "" );
static int total_translation;
static int min_translation = 999999;
static int max_translation = -999999;
static int num_translation = 0;
static int total_rotation;
static int min_rotation = 999999;
static int max_rotation = -999999;
static int num_rotation = 0;
static idVec3 start;
static idVec3 *testend;
#include "../sys/sys_public.h"
void idCollisionModelManagerLocal::DebugOutput( const idVec3 &origin ) {
int i, k, t;
char buf[128];
idVec3 end;
idAngles boxAngles;
idMat3 modelAxis, boxAxis;
idBounds bounds;
trace_t trace;
if ( !cm_testCollision.GetBool() ) {
return;
}
testend = (idVec3 *) Mem_Alloc( cm_testTimes.GetInteger() * sizeof(idVec3) );
if ( cm_testReset.GetBool() || ( cm_testWalk.GetBool() && !start.Compare( origin ) ) ) {
total_translation = total_rotation = 0;
min_translation = min_rotation = 999999;
max_translation = max_rotation = -999999;
num_translation = num_rotation = 0;
cm_testReset.SetBool( false );
}
if ( cm_testWalk.GetBool() ) {
start = origin;
cm_testOrigin.SetString( va( "%1.2f %1.2f %1.2f", start[0], start[1], start[2] ) );
} else {
sscanf( cm_testOrigin.GetString(), "%f %f %f", &start[0], &start[1], &start[2] );
}
sscanf( cm_testBox.GetString(), "%f %f %f %f %f %f", &bounds[0][0], &bounds[0][1], &bounds[0][2],
&bounds[1][0], &bounds[1][1], &bounds[1][2] );
sscanf( cm_testBoxRotation.GetString(), "%f %f %f", &boxAngles[0], &boxAngles[1], &boxAngles[2] );
boxAxis = boxAngles.ToMat3();
modelAxis.Identity();
idTraceModel itm( bounds );
idRandom random( 0 );
idTimer timer;
if ( cm_testRandomMany.GetBool() ) {
// if many traces in one random direction
for ( i = 0; i < 3; i++ ) {
testend[0][i] = start[i] + random.CRandomFloat() * cm_testLength.GetFloat();
}
for ( k = 1; k < cm_testTimes.GetInteger(); k++ ) {
testend[k] = testend[0];
}
} else {
// many traces each in a different random direction
for ( k = 0; k < cm_testTimes.GetInteger(); k++ ) {
for ( i = 0; i < 3; i++ ) {
testend[k][i] = start[i] + random.CRandomFloat() * cm_testLength.GetFloat();
}
}
}
// translational collision detection
timer.Clear();
timer.Start();
for ( i = 0; i < cm_testTimes.GetInteger(); i++ ) {
Translation( &trace, start, testend[i], &itm, boxAxis, CONTENTS_SOLID|CONTENTS_PLAYERCLIP, cm_testModel.GetInteger(), vec3_origin, modelAxis );
}
timer.Stop();
t = timer.Milliseconds();
if ( t < min_translation ) min_translation = t;
if ( t > max_translation ) max_translation = t;
num_translation++;
total_translation += t;
if ( cm_testTimes.GetInteger() > 9999 ) {
sprintf( buf, "%3dK", (int ) ( cm_testTimes.GetInteger() / 1000 ) );
} else {
sprintf( buf, "%4d", cm_testTimes.GetInteger() );
}
common->Printf("%s translations: %4d milliseconds, (min = %d, max = %d, av = %1.1f)\n", buf, t, min_translation, max_translation, (float) total_translation / num_translation );
if ( cm_testRandomMany.GetBool() ) {
// if many traces in one random direction
for ( i = 0; i < 3; i++ ) {
testend[0][i] = start[i] + random.CRandomFloat() * cm_testRadius.GetFloat();
}
for ( k = 1; k < cm_testTimes.GetInteger(); k++ ) {
testend[k] = testend[0];
}
} else {
// many traces each in a different random direction
for ( k = 0; k < cm_testTimes.GetInteger(); k++ ) {
for ( i = 0; i < 3; i++ ) {
testend[k][i] = start[i] + random.CRandomFloat() * cm_testRadius.GetFloat();
}
}
}
if ( cm_testRotation.GetBool() ) {
// rotational collision detection
idVec3 vec( random.CRandomFloat(), random.CRandomFloat(), random.RandomFloat() );
vec.Normalize();
idRotation rotation( vec3_origin, vec, cm_testAngle.GetFloat() );
timer.Clear();
timer.Start();
for ( i = 0; i < cm_testTimes.GetInteger(); i++ ) {
rotation.SetOrigin( testend[i] );
Rotation( &trace, start, rotation, &itm, boxAxis, CONTENTS_SOLID|CONTENTS_PLAYERCLIP, cm_testModel.GetInteger(), vec3_origin, modelAxis );
}
timer.Stop();
t = timer.Milliseconds();
if ( t < min_rotation ) min_rotation = t;
if ( t > max_rotation ) max_rotation = t;
num_rotation++;
total_rotation += t;
if ( cm_testTimes.GetInteger() > 9999 ) {
sprintf( buf, "%3dK", (int ) ( cm_testTimes.GetInteger() / 1000 ) );
} else {
sprintf( buf, "%4d", cm_testTimes.GetInteger() );
}
common->Printf("%s rotation: %4d milliseconds, (min = %d, max = %d, av = %1.1f)\n", buf, t, min_rotation, max_rotation, (float) total_rotation / num_rotation );
}
Mem_Free( testend );
testend = NULL;
}


@ -0,0 +1,616 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
/*
===============================================================================
Trace model vs. polygonal model collision detection.
===============================================================================
*/
#include "../idlib/precompiled.h"
#pragma hdrstop
#include "CollisionModel_local.h"
#define CM_FILE_EXT "cm"
#define CM_FILEID "CM"
#define CM_FILEVERSION "1.00"
/*
===============================================================================
Writing of collision model file
===============================================================================
*/
void CM_GetNodeBounds( idBounds *bounds, cm_node_t *node );
int CM_GetNodeContents( cm_node_t *node );
/*
================
idCollisionModelManagerLocal::WriteNodes
================
*/
void idCollisionModelManagerLocal::WriteNodes( idFile *fp, cm_node_t *node ) {
fp->WriteFloatString( "\t( %d %f )\n", node->planeType, node->planeDist );
if ( node->planeType != -1 ) {
WriteNodes( fp, node->children[0] );
WriteNodes( fp, node->children[1] );
}
}
/*
================
idCollisionModelManagerLocal::CountPolygonMemory
================
*/
int idCollisionModelManagerLocal::CountPolygonMemory( cm_node_t *node ) const {
cm_polygonRef_t *pref;
cm_polygon_t *p;
int memory;
memory = 0;
for ( pref = node->polygons; pref; pref = pref->next ) {
p = pref->p;
if ( p->checkcount == checkCount ) {
continue;
}
p->checkcount = checkCount;
memory += sizeof( cm_polygon_t ) + ( p->numEdges - 1 ) * sizeof( p->edges[0] );
}
if ( node->planeType != -1 ) {
memory += CountPolygonMemory( node->children[0] );
memory += CountPolygonMemory( node->children[1] );
}
return memory;
}
/*
================
idCollisionModelManagerLocal::WritePolygons
================
*/
void idCollisionModelManagerLocal::WritePolygons( idFile *fp, cm_node_t *node ) {
cm_polygonRef_t *pref;
cm_polygon_t *p;
int i;
for ( pref = node->polygons; pref; pref = pref->next ) {
p = pref->p;
if ( p->checkcount == checkCount ) {
continue;
}
p->checkcount = checkCount;
fp->WriteFloatString( "\t%d (", p->numEdges );
for ( i = 0; i < p->numEdges; i++ ) {
fp->WriteFloatString( " %d", p->edges[i] );
}
fp->WriteFloatString( " ) ( %f %f %f ) %f", p->plane.Normal()[0], p->plane.Normal()[1], p->plane.Normal()[2], p->plane.Dist() );
fp->WriteFloatString( " ( %f %f %f )", p->bounds[0][0], p->bounds[0][1], p->bounds[0][2] );
fp->WriteFloatString( " ( %f %f %f )", p->bounds[1][0], p->bounds[1][1], p->bounds[1][2] );
fp->WriteFloatString( " \"%s\"\n", p->material->GetName() );
}
if ( node->planeType != -1 ) {
WritePolygons( fp, node->children[0] );
WritePolygons( fp, node->children[1] );
}
}
/*
================
idCollisionModelManagerLocal::CountBrushMemory
================
*/
int idCollisionModelManagerLocal::CountBrushMemory( cm_node_t *node ) const {
cm_brushRef_t *bref;
cm_brush_t *b;
int memory;
memory = 0;
for ( bref = node->brushes; bref; bref = bref->next ) {
b = bref->b;
if ( b->checkcount == checkCount ) {
continue;
}
b->checkcount = checkCount;
memory += sizeof( cm_brush_t ) + ( b->numPlanes - 1 ) * sizeof( b->planes[0] );
}
if ( node->planeType != -1 ) {
memory += CountBrushMemory( node->children[0] );
memory += CountBrushMemory( node->children[1] );
}
return memory;
}
/*
================
idCollisionModelManagerLocal::WriteBrushes
================
*/
void idCollisionModelManagerLocal::WriteBrushes( idFile *fp, cm_node_t *node ) {
cm_brushRef_t *bref;
cm_brush_t *b;
int i;
for ( bref = node->brushes; bref; bref = bref->next ) {
b = bref->b;
if ( b->checkcount == checkCount ) {
continue;
}
b->checkcount = checkCount;
fp->WriteFloatString( "\t%d {\n", b->numPlanes );
for ( i = 0; i < b->numPlanes; i++ ) {
fp->WriteFloatString( "\t\t( %f %f %f ) %f\n", b->planes[i].Normal()[0], b->planes[i].Normal()[1], b->planes[i].Normal()[2], b->planes[i].Dist() );
}
fp->WriteFloatString( "\t} ( %f %f %f )", b->bounds[0][0], b->bounds[0][1], b->bounds[0][2] );
fp->WriteFloatString( " ( %f %f %f ) \"%s\"\n", b->bounds[1][0], b->bounds[1][1], b->bounds[1][2], StringFromContents( b->contents ) );
}
if ( node->planeType != -1 ) {
WriteBrushes( fp, node->children[0] );
WriteBrushes( fp, node->children[1] );
}
}
/*
================
idCollisionModelManagerLocal::WriteCollisionModel
================
*/
void idCollisionModelManagerLocal::WriteCollisionModel( idFile *fp, cm_model_t *model ) {
int i, polygonMemory, brushMemory;
fp->WriteFloatString( "collisionModel \"%s\" {\n", model->name.c_str() );
// vertices
fp->WriteFloatString( "\tvertices { /* numVertices = */ %d\n", model->numVertices );
for ( i = 0; i < model->numVertices; i++ ) {
fp->WriteFloatString( "\t/* %d */ ( %f %f %f )\n", i, model->vertices[i].p[0], model->vertices[i].p[1], model->vertices[i].p[2] );
}
fp->WriteFloatString( "\t}\n" );
// edges
fp->WriteFloatString( "\tedges { /* numEdges = */ %d\n", model->numEdges );
for ( i = 0; i < model->numEdges; i++ ) {
fp->WriteFloatString( "\t/* %d */ ( %d %d ) %d %d\n", i, model->edges[i].vertexNum[0], model->edges[i].vertexNum[1], model->edges[i].internal, model->edges[i].numUsers );
}
fp->WriteFloatString( "\t}\n" );
// nodes
fp->WriteFloatString( "\tnodes {\n" );
WriteNodes( fp, model->node );
fp->WriteFloatString( "\t}\n" );
// polygons
checkCount++;
polygonMemory = CountPolygonMemory( model->node );
fp->WriteFloatString( "\tpolygons /* polygonMemory = */ %d {\n", polygonMemory );
checkCount++;
WritePolygons( fp, model->node );
fp->WriteFloatString( "\t}\n" );
// brushes
checkCount++;
brushMemory = CountBrushMemory( model->node );
fp->WriteFloatString( "\tbrushes /* brushMemory = */ %d {\n", brushMemory );
checkCount++;
WriteBrushes( fp, model->node );
fp->WriteFloatString( "\t}\n" );
// closing brace
fp->WriteFloatString( "}\n" );
}
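// For reference, WriteCollisionModel above emits a plain text block of the
// following shape; the name, counts and values are illustrative only, and a
// real file additionally starts with the CM file id, version and map CRC
// written by WriteCollisionModelsToFile below. Edges and brushes follow the
// same style as the vertex and polygon lines shown here.
//
//	collisionModel "worldMap" {
//		vertices { /* numVertices = */ 3
//		/* 0 */ ( 0.000000 0.000000 0.000000 )
//		/* 1 */ ( 64.000000 0.000000 0.000000 )
//		/* 2 */ ( 0.000000 64.000000 0.000000 )
//		}
//		nodes {
//		( -1 0.000000 )
//		}
//		polygons /* polygonMemory = */ 96 {
//		3 ( 1 2 3 ) ( 0.000000 0.000000 1.000000 ) 0.000000 ( 0.000000 0.000000 0.000000 ) ( 64.000000 64.000000 0.000000 ) "textures/base_wall/example"
//		}
//	}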
/*
================
idCollisionModelManagerLocal::WriteCollisionModelsToFile
================
*/
void idCollisionModelManagerLocal::WriteCollisionModelsToFile( const char *filename, int firstModel, int lastModel, unsigned int mapFileCRC ) {
int i;
idFile *fp;
idStr name;
name = filename;
name.SetFileExtension( CM_FILE_EXT );
common->Printf( "writing %s\n", name.c_str() );
// _D3XP was saving to fs_cdpath
fp = fileSystem->OpenFileWrite( name, "fs_devpath" );
if ( !fp ) {
common->Warning( "idCollisionModelManagerLocal::WriteCollisionModelsToFile: Error opening file %s\n", name.c_str() );
return;
}
// write file id and version
fp->WriteFloatString( "%s \"%s\"\n\n", CM_FILEID, CM_FILEVERSION );
// write the map file crc
fp->WriteFloatString( "%u\n\n", mapFileCRC );
// write the collision models
for ( i = firstModel; i < lastModel; i++ ) {
WriteCollisionModel( fp, models[ i ] );
}
fileSystem->CloseFile( fp );
}
/*
================
idCollisionModelManagerLocal::WriteCollisionModelForMapEntity
================
*/
bool idCollisionModelManagerLocal::WriteCollisionModelForMapEntity( const idMapEntity *mapEnt, const char *filename, const bool testTraceModel ) {
idFile *fp;
idStr name;
cm_model_t *model;
SetupHash();
model = CollisionModelForMapEntity( mapEnt );
model->name = filename;
name = filename;
name.SetFileExtension( CM_FILE_EXT );
common->Printf( "writing %s\n", name.c_str() );
fp = fileSystem->OpenFileWrite( name, "fs_devpath" );
if ( !fp ) {
common->Printf( "idCollisionModelManagerLocal::WriteCollisionModelForMapEntity: Error opening file %s\n", name.c_str() );
FreeModel( model );
return false;
}
// write file id and version
fp->WriteFloatString( "%s \"%s\"\n\n", CM_FILEID, CM_FILEVERSION );
// write the map file crc
fp->WriteFloatString( "%u\n\n", 0 );
// write the collision model
WriteCollisionModel( fp, model );
fileSystem->CloseFile( fp );
if ( testTraceModel ) {
idTraceModel trm;
TrmFromModel( model, trm );
}
FreeModel( model );
return true;
}
/*
===============================================================================
Loading of collision model file
===============================================================================
*/
/*
================
idCollisionModelManagerLocal::ParseVertices
================
*/
void idCollisionModelManagerLocal::ParseVertices( idLexer *src, cm_model_t *model ) {
int i;
src->ExpectTokenString( "{" );
model->numVertices = src->ParseInt();
model->maxVertices = model->numVertices;
model->vertices = (cm_vertex_t *) Mem_Alloc( model->maxVertices * sizeof( cm_vertex_t ) );
for ( i = 0; i < model->numVertices; i++ ) {
src->Parse1DMatrix( 3, model->vertices[i].p.ToFloatPtr() );
model->vertices[i].side = 0;
model->vertices[i].sideSet = 0;
model->vertices[i].checkcount = 0;
}
src->ExpectTokenString( "}" );
}
/*
================
idCollisionModelManagerLocal::ParseEdges
================
*/
void idCollisionModelManagerLocal::ParseEdges( idLexer *src, cm_model_t *model ) {
int i;
src->ExpectTokenString( "{" );
model->numEdges = src->ParseInt();
model->maxEdges = model->numEdges;
model->edges = (cm_edge_t *) Mem_Alloc( model->maxEdges * sizeof( cm_edge_t ) );
for ( i = 0; i < model->numEdges; i++ ) {
src->ExpectTokenString( "(" );
model->edges[i].vertexNum[0] = src->ParseInt();
model->edges[i].vertexNum[1] = src->ParseInt();
src->ExpectTokenString( ")" );
model->edges[i].side = 0;
model->edges[i].sideSet = 0;
model->edges[i].internal = src->ParseInt();
model->edges[i].numUsers = src->ParseInt();
model->edges[i].normal = vec3_origin;
model->edges[i].checkcount = 0;
model->numInternalEdges += model->edges[i].internal;
}
src->ExpectTokenString( "}" );
}
/*
================
idCollisionModelManagerLocal::ParseNodes
================
*/
cm_node_t *idCollisionModelManagerLocal::ParseNodes( idLexer *src, cm_model_t *model, cm_node_t *parent ) {
cm_node_t *node;
model->numNodes++;
node = AllocNode( model, model->numNodes < NODE_BLOCK_SIZE_SMALL ? NODE_BLOCK_SIZE_SMALL : NODE_BLOCK_SIZE_LARGE );
node->brushes = NULL;
node->polygons = NULL;
node->parent = parent;
src->ExpectTokenString( "(" );
node->planeType = src->ParseInt();
node->planeDist = src->ParseFloat();
src->ExpectTokenString( ")" );
if ( node->planeType != -1 ) {
node->children[0] = ParseNodes( src, model, node );
node->children[1] = ParseNodes( src, model, node );
}
return node;
}
/*
================
idCollisionModelManagerLocal::ParsePolygons
================
*/
void idCollisionModelManagerLocal::ParsePolygons( idLexer *src, cm_model_t *model ) {
cm_polygon_t *p;
int i, numEdges;
idVec3 normal;
idToken token;
if ( src->CheckTokenType( TT_NUMBER, 0, &token ) ) {
model->polygonBlock = (cm_polygonBlock_t *) Mem_Alloc( sizeof( cm_polygonBlock_t ) + token.GetIntValue() );
model->polygonBlock->bytesRemaining = token.GetIntValue();
model->polygonBlock->next = ( (byte *) model->polygonBlock ) + sizeof( cm_polygonBlock_t );
}
src->ExpectTokenString( "{" );
while ( !src->CheckTokenString( "}" ) ) {
// parse polygon
numEdges = src->ParseInt();
p = AllocPolygon( model, numEdges );
p->numEdges = numEdges;
src->ExpectTokenString( "(" );
for ( i = 0; i < p->numEdges; i++ ) {
p->edges[i] = src->ParseInt();
}
src->ExpectTokenString( ")" );
src->Parse1DMatrix( 3, normal.ToFloatPtr() );
p->plane.SetNormal( normal );
p->plane.SetDist( src->ParseFloat() );
src->Parse1DMatrix( 3, p->bounds[0].ToFloatPtr() );
src->Parse1DMatrix( 3, p->bounds[1].ToFloatPtr() );
src->ExpectTokenType( TT_STRING, 0, &token );
// get material
p->material = declManager->FindMaterial( token );
p->contents = p->material->GetContentFlags();
p->checkcount = 0;
// filter polygon into tree
R_FilterPolygonIntoTree( model, model->node, NULL, p );
}
}
/*
================
idCollisionModelManagerLocal::ParseBrushes
================
*/
void idCollisionModelManagerLocal::ParseBrushes( idLexer *src, cm_model_t *model ) {
cm_brush_t *b;
int i, numPlanes;
idVec3 normal;
idToken token;
if ( src->CheckTokenType( TT_NUMBER, 0, &token ) ) {
model->brushBlock = (cm_brushBlock_t *) Mem_Alloc( sizeof( cm_brushBlock_t ) + token.GetIntValue() );
model->brushBlock->bytesRemaining = token.GetIntValue();
model->brushBlock->next = ( (byte *) model->brushBlock ) + sizeof( cm_brushBlock_t );
}
src->ExpectTokenString( "{" );
while ( !src->CheckTokenString( "}" ) ) {
// parse brush
numPlanes = src->ParseInt();
b = AllocBrush( model, numPlanes );
b->numPlanes = numPlanes;
src->ExpectTokenString( "{" );
for ( i = 0; i < b->numPlanes; i++ ) {
src->Parse1DMatrix( 3, normal.ToFloatPtr() );
b->planes[i].SetNormal( normal );
b->planes[i].SetDist( src->ParseFloat() );
}
src->ExpectTokenString( "}" );
src->Parse1DMatrix( 3, b->bounds[0].ToFloatPtr() );
src->Parse1DMatrix( 3, b->bounds[1].ToFloatPtr() );
src->ReadToken( &token );
if ( token.type == TT_NUMBER ) {
b->contents = token.GetIntValue(); // old .cm files use a single integer
} else {
b->contents = ContentsFromString( token );
}
b->checkcount = 0;
b->primitiveNum = 0;
// filter brush into tree
R_FilterBrushIntoTree( model, model->node, NULL, b );
}
}
/*
================
idCollisionModelManagerLocal::ParseCollisionModel
================
*/
bool idCollisionModelManagerLocal::ParseCollisionModel( idLexer *src ) {
cm_model_t *model;
idToken token;
if ( numModels >= MAX_SUBMODELS ) {
common->Error( "LoadModel: no free slots" );
return false;
}
model = AllocModel();
models[numModels] = model;
numModels++;
// parse the file
src->ExpectTokenType( TT_STRING, 0, &token );
model->name = token;
src->ExpectTokenString( "{" );
while ( !src->CheckTokenString( "}" ) ) {
src->ReadToken( &token );
if ( token == "vertices" ) {
ParseVertices( src, model );
continue;
}
if ( token == "edges" ) {
ParseEdges( src, model );
continue;
}
if ( token == "nodes" ) {
src->ExpectTokenString( "{" );
model->node = ParseNodes( src, model, NULL );
src->ExpectTokenString( "}" );
continue;
}
if ( token == "polygons" ) {
ParsePolygons( src, model );
continue;
}
if ( token == "brushes" ) {
ParseBrushes( src, model );
continue;
}
src->Error( "ParseCollisionModel: bad token \"%s\"", token.c_str() );
}
// calculate edge normals
checkCount++;
CalculateEdgeNormals( model, model->node );
// get model bounds from brush and polygon bounds
CM_GetNodeBounds( &model->bounds, model->node );
// get model contents
model->contents = CM_GetNodeContents( model->node );
// total memory used by this model
model->usedMemory = model->numVertices * sizeof(cm_vertex_t) +
model->numEdges * sizeof(cm_edge_t) +
model->polygonMemory +
model->brushMemory +
model->numNodes * sizeof(cm_node_t) +
model->numPolygonRefs * sizeof(cm_polygonRef_t) +
model->numBrushRefs * sizeof(cm_brushRef_t);
return true;
}
/*
================
idCollisionModelManagerLocal::LoadCollisionModelFile
================
*/
bool idCollisionModelManagerLocal::LoadCollisionModelFile( const char *name, unsigned int mapFileCRC ) {
idStr fileName;
idToken token;
idLexer *src;
unsigned int crc;
// load it
fileName = name;
fileName.SetFileExtension( CM_FILE_EXT );
src = new idLexer( fileName );
src->SetFlags( LEXFL_NOSTRINGCONCAT | LEXFL_NODOLLARPRECOMPILE );
if ( !src->IsLoaded() ) {
delete src;
return false;
}
if ( !src->ExpectTokenString( CM_FILEID ) ) {
common->Warning( "%s is not an CM file.", fileName.c_str() );
delete src;
return false;
}
if ( !src->ReadToken( &token ) || token != CM_FILEVERSION ) {
common->Warning( "%s has version %s instead of %s", fileName.c_str(), token.c_str(), CM_FILEVERSION );
delete src;
return false;
}
if ( !src->ExpectTokenType( TT_NUMBER, TT_INTEGER, &token ) ) {
common->Warning( "%s has no map file CRC", fileName.c_str() );
delete src;
return false;
}
crc = token.GetUnsignedLongValue();
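// a zero mapFileCRC disables the out-of-date check below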
if ( mapFileCRC && crc != mapFileCRC ) {
common->Printf( "%s is out of date\n", fileName.c_str() );
delete src;
return false;
}
// parse the file
while ( 1 ) {
if ( !src->ReadToken( &token ) ) {
break;
}
if ( token == "collisionModel" ) {
if ( !ParseCollisionModel( src ) ) {
delete src;
return false;
}
continue;
}
src->Error( "idCollisionModelManagerLocal::LoadCollisionModelFile: bad token \"%s\"", token.c_str() );
}
delete src;
return true;
}

File diff suppressed because it is too large

@ -0,0 +1,527 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
/*
===============================================================================
Trace model vs. polygonal model collision detection.
===============================================================================
*/
#include "CollisionModel.h"
#define MIN_NODE_SIZE 64.0f
#define MAX_NODE_POLYGONS 128
#define CM_MAX_POLYGON_EDGES 64
#define CIRCLE_APPROXIMATION_LENGTH 64.0f
#define MAX_SUBMODELS 2048
#define TRACE_MODEL_HANDLE MAX_SUBMODELS
#define VERTEX_HASH_BOXSIZE (1<<6) // must be power of 2
#define VERTEX_HASH_SIZE (VERTEX_HASH_BOXSIZE*VERTEX_HASH_BOXSIZE)
#define EDGE_HASH_SIZE (1<<14)
#define NODE_BLOCK_SIZE_SMALL 8
#define NODE_BLOCK_SIZE_LARGE 256
#define REFERENCE_BLOCK_SIZE_SMALL 8
#define REFERENCE_BLOCK_SIZE_LARGE 256
#define MAX_WINDING_LIST 128 // quite a few are generated at times
#define INTEGRAL_EPSILON 0.01f
#define VERTEX_EPSILON 0.1f
#define CHOP_EPSILON 0.1f
typedef struct cm_windingList_s {
int numWindings; // number of windings
idFixedWinding w[MAX_WINDING_LIST]; // windings
idVec3 normal; // normal for all windings
idBounds bounds; // bounds of all windings in list
idVec3 origin; // origin for radius
float radius; // radius relative to origin for all windings
int contents; // winding surface contents
int primitiveNum; // number of primitive the windings came from
} cm_windingList_t;
/*
===============================================================================
Collision model
===============================================================================
*/
typedef struct cm_vertex_s {
idVec3 p; // vertex point
int checkcount; // for multi-check avoidance
unsigned long side; // each bit tells at which side this vertex passes one of the trace model edges
unsigned long sideSet; // each bit tells if sidedness for the trace model edge has been calculated yet
} cm_vertex_t;
typedef struct cm_edge_s {
int checkcount; // for multi-check avoidance
unsigned short internal; // a trace model can never collide with internal edges
unsigned short numUsers; // number of polygons using this edge
unsigned long side; // each bit tells at which side of this edge one of the trace model vertices passes
unsigned long sideSet; // each bit tells if sidedness for the trace model vertex has been calculated yet
int vertexNum[2]; // start and end point of edge
idVec3 normal; // edge normal
} cm_edge_t;
typedef struct cm_polygonBlock_s {
int bytesRemaining;
byte * next;
} cm_polygonBlock_t;
typedef struct cm_polygon_s {
idBounds bounds; // polygon bounds
int checkcount; // for multi-check avoidance
int contents; // contents behind polygon
const idMaterial * material; // material
idPlane plane; // polygon plane
int numEdges; // number of edges
int edges[1]; // variable sized, indexes into cm_edge_t list
} cm_polygon_t;
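// Illustrative only, not part of the engine: the trailing edges[1] member is
// the usual variable-length struct idiom, so the storage needed for a polygon
// with numEdges edge indices is roughly the amount computed below, which
// AllocPolygon carves out of the model's polygonBlock when one was preallocated.
// The helper name is made up for this example.
ID_INLINE size_t CM_ExamplePolygonSize( int numEdges ) {
	return sizeof( cm_polygon_t ) + ( numEdges - 1 ) * sizeof( int );
}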
typedef struct cm_polygonRef_s {
cm_polygon_t * p; // pointer to polygon
struct cm_polygonRef_s *next; // next polygon in chain
} cm_polygonRef_t;
typedef struct cm_polygonRefBlock_s {
cm_polygonRef_t * nextRef; // next polygon reference in block
struct cm_polygonRefBlock_s *next; // next block with polygon references
} cm_polygonRefBlock_t;
typedef struct cm_brushBlock_s {
int bytesRemaining;
byte * next;
} cm_brushBlock_t;
typedef struct cm_brush_s {
int checkcount; // for multi-check avoidance
idBounds bounds; // brush bounds
int contents; // contents of brush
const idMaterial * material; // material
int primitiveNum; // number of brush primitive
int numPlanes; // number of bounding planes
idPlane planes[1]; // variable sized
} cm_brush_t;
typedef struct cm_brushRef_s {
cm_brush_t * b; // pointer to brush
struct cm_brushRef_s * next; // next brush in chain
} cm_brushRef_t;
typedef struct cm_brushRefBlock_s {
cm_brushRef_t * nextRef; // next brush reference in block
struct cm_brushRefBlock_s *next; // next block with brush references
} cm_brushRefBlock_t;
typedef struct cm_node_s {
int planeType; // node axial plane type
float planeDist; // node plane distance
cm_polygonRef_t * polygons; // polygons in node
cm_brushRef_t * brushes; // brushes in node
struct cm_node_s * parent; // parent of this node
struct cm_node_s * children[2]; // node children
} cm_node_t;
typedef struct cm_nodeBlock_s {
cm_node_t * nextNode; // next node in block
struct cm_nodeBlock_s *next; // next block with nodes
} cm_nodeBlock_t;
typedef struct cm_model_s {
idStr name; // model name
idBounds bounds; // model bounds
int contents; // all contents of the model ored together
bool isConvex; // set if model is convex
// model geometry
int maxVertices; // size of vertex array
int numVertices; // number of vertices
cm_vertex_t * vertices; // array with all vertices used by the model
int maxEdges; // size of edge array
int numEdges; // number of edges
cm_edge_t * edges; // array with all edges used by the model
cm_node_t * node; // first node of spatial subdivision
// blocks with allocated memory
cm_nodeBlock_t * nodeBlocks; // list with blocks of nodes
cm_polygonRefBlock_t * polygonRefBlocks; // list with blocks of polygon references
cm_brushRefBlock_t * brushRefBlocks; // list with blocks of brush references
cm_polygonBlock_t * polygonBlock; // memory block with all polygons
cm_brushBlock_t * brushBlock; // memory block with all brushes
// statistics
int numPolygons;
int polygonMemory;
int numBrushes;
int brushMemory;
int numNodes;
int numBrushRefs;
int numPolygonRefs;
int numInternalEdges;
int numSharpEdges;
int numRemovedPolys;
int numMergedPolys;
int usedMemory;
} cm_model_t;
/*
===============================================================================
Data used during collision detection calculations
===============================================================================
*/
typedef struct cm_trmVertex_s {
int used; // true if this vertex is used for collision detection
idVec3 p; // vertex position
idVec3 endp; // end point of vertex after movement
int polygonSide; // side of polygon this vertex is on (rotational collision)
idPluecker pl; // pluecker coordinate for vertex movement
idVec3 rotationOrigin; // rotation origin for this vertex
idBounds rotationBounds; // rotation bounds for this vertex
} cm_trmVertex_t;
typedef struct cm_trmEdge_s {
int used; // true when this edge is used for collision detection
idVec3 start; // start of edge
idVec3 end; // end of edge
int vertexNum[2]; // indexes into cm_traceWork_t->vertices
idPluecker pl; // pluecker coordinate for edge
idVec3 cross; // (z,-y,x) of cross product between edge dir and movement dir
idBounds rotationBounds; // rotation bounds for this edge
idPluecker plzaxis; // pluecker coordinate for rotation about the z-axis
unsigned short bitNum; // vertex bit number
} cm_trmEdge_t;
typedef struct cm_trmPolygon_s {
int used;
idPlane plane; // polygon plane
int numEdges; // number of edges
int edges[MAX_TRACEMODEL_POLYEDGES]; // index into cm_traceWork_t->edges
idBounds rotationBounds; // rotation bounds for this polygon
} cm_trmPolygon_t;
typedef struct cm_traceWork_s {
int numVerts;
cm_trmVertex_t vertices[MAX_TRACEMODEL_VERTS]; // trm vertices
int numEdges;
cm_trmEdge_t edges[MAX_TRACEMODEL_EDGES+1]; // trm edges
int numPolys;
cm_trmPolygon_t polys[MAX_TRACEMODEL_POLYS]; // trm polygons
cm_model_t *model; // model colliding with
idVec3 start; // start of trace
idVec3 end; // end of trace
idVec3 dir; // trace direction
idBounds bounds; // bounds of full trace
idBounds size; // bounds of transformed trm relative to start
idVec3 extents; // largest of abs(size[0]) and abs(size[1]) for BSP trace
int contents; // ignore polygons that do not have any of these contents flags
trace_t trace; // collision detection result
bool rotation; // true if calculating rotational collision
bool pointTrace; // true if only tracing a point
bool positionTest; // true if not tracing but doing a position test
bool isConvex; // true if the trace model is convex
bool axisIntersectsTrm; // true if the rotation axis intersects the trace model
bool getContacts; // true if retrieving contacts
bool quickExit; // set to quickly stop the collision detection calculations
idVec3 origin; // origin of rotation in model space
idVec3 axis; // rotation axis in model space
idMat3 matrix; // rotates axis of rotation to the z-axis
float angle; // angle for rotational collision
float maxTan; // max tangent of half the positive angle used instead of fraction
float radius; // rotation radius of trm start
idRotation modelVertexRotation; // inverse rotation for model vertices
contactInfo_t *contacts; // array with contacts
int maxContacts; // max size of contact array
int numContacts; // number of contacts found
idPlane heartPlane1; // polygons should be near enough to the trace heart planes
float maxDistFromHeartPlane1;
idPlane heartPlane2;
float maxDistFromHeartPlane2;
idPluecker polygonEdgePlueckerCache[CM_MAX_POLYGON_EDGES];
idPluecker polygonVertexPlueckerCache[CM_MAX_POLYGON_EDGES];
idVec3 polygonRotationOriginCache[CM_MAX_POLYGON_EDGES];
} cm_traceWork_t;
/*
===============================================================================
Collision Map
===============================================================================
*/
typedef struct cm_procNode_s {
idPlane plane;
int children[2]; // negative numbers are (-1 - areaNumber), 0 = solid
} cm_procNode_t;
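// Illustrative only, not part of the engine: decoding a children[] value as
// described in the comment above. Positive values index another proc node,
// 0 marks solid space, and negative values encode an area as (-1 - areaNumber).
// The helper name is made up for this example.
ID_INLINE int CM_ExampleProcChildToAreaNum( int child ) {
	return -1 - child;	// only meaningful for negative child values
}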
class idCollisionModelManagerLocal : public idCollisionModelManager {
public:
// load collision models from a map file
void LoadMap( const idMapFile *mapFile );
// frees all the collision models
void FreeMap( void );
// get clip handle for model
cmHandle_t LoadModel( const char *modelName, const bool precache );
// sets up a trace model for collision with other trace models
cmHandle_t SetupTrmModel( const idTraceModel &trm, const idMaterial *material );
// create trace model from a collision model, returns true if successful
bool TrmFromModel( const char *modelName, idTraceModel &trm );
// name of the model
const char * GetModelName( cmHandle_t model ) const;
// bounds of the model
bool GetModelBounds( cmHandle_t model, idBounds &bounds ) const;
// all contents flags of brushes and polygons ored together
bool GetModelContents( cmHandle_t model, int &contents ) const;
// get the vertex of a model
bool GetModelVertex( cmHandle_t model, int vertexNum, idVec3 &vertex ) const;
// get the edge of a model
bool GetModelEdge( cmHandle_t model, int edgeNum, idVec3 &start, idVec3 &end ) const;
// get the polygon of a model
bool GetModelPolygon( cmHandle_t model, int polygonNum, idFixedWinding &winding ) const;
// translates a trm and reports the first collision if any
void Translation( trace_t *results, const idVec3 &start, const idVec3 &end,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis );
// rotates a trm and reports the first collision if any
void Rotation( trace_t *results, const idVec3 &start, const idRotation &rotation,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis );
// returns the contents the trm is stuck in or 0 if the trm is in free space
int Contents( const idVec3 &start,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis );
// stores all contact points of the trm with the model, returns the number of contacts
int Contacts( contactInfo_t *contacts, const int maxContacts, const idVec3 &start, const idVec6 &dir, const float depth,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis );
// test collision detection
void DebugOutput( const idVec3 &origin );
// draw a model
void DrawModel( cmHandle_t model, const idVec3 &origin, const idMat3 &axis,
const idVec3 &viewOrigin, const float radius );
// print model information, use -1 handle for accumulated model info
void ModelInfo( cmHandle_t model );
// list all loaded models
void ListModels( void );
// write a collision model file for the map entity
bool WriteCollisionModelForMapEntity( const idMapEntity *mapEnt, const char *filename, const bool testTraceModel = true );
private: // CollisionMap_translate.cpp
int TranslateEdgeThroughEdge( idVec3 &cross, idPluecker &l1, idPluecker &l2, float *fraction );
void TranslateTrmEdgeThroughPolygon( cm_traceWork_t *tw, cm_polygon_t *poly, cm_trmEdge_t *trmEdge );
void TranslateTrmVertexThroughPolygon( cm_traceWork_t *tw, cm_polygon_t *poly, cm_trmVertex_t *v, int bitNum );
void TranslatePointThroughPolygon( cm_traceWork_t *tw, cm_polygon_t *poly, cm_trmVertex_t *v );
void TranslateVertexThroughTrmPolygon( cm_traceWork_t *tw, cm_trmPolygon_t *trmpoly, cm_polygon_t *poly, cm_vertex_t *v, idVec3 &endp, idPluecker &pl );
bool TranslateTrmThroughPolygon( cm_traceWork_t *tw, cm_polygon_t *p );
void SetupTranslationHeartPlanes( cm_traceWork_t *tw );
void SetupTrm( cm_traceWork_t *tw, const idTraceModel *trm );
private: // CollisionMap_rotate.cpp
int CollisionBetweenEdgeBounds( cm_traceWork_t *tw, const idVec3 &va, const idVec3 &vb,
const idVec3 &vc, const idVec3 &vd, float tanHalfAngle,
idVec3 &collisionPoint, idVec3 &collisionNormal );
int RotateEdgeThroughEdge( cm_traceWork_t *tw, const idPluecker &pl1,
const idVec3 &vc, const idVec3 &vd,
const float minTan, float &tanHalfAngle );
int EdgeFurthestFromEdge( cm_traceWork_t *tw, const idPluecker &pl1,
const idVec3 &vc, const idVec3 &vd,
float &tanHalfAngle, float &dir );
void RotateTrmEdgeThroughPolygon( cm_traceWork_t *tw, cm_polygon_t *poly, cm_trmEdge_t *trmEdge );
int RotatePointThroughPlane( const cm_traceWork_t *tw, const idVec3 &point, const idPlane &plane,
const float angle, const float minTan, float &tanHalfAngle );
int PointFurthestFromPlane( const cm_traceWork_t *tw, const idVec3 &point, const idPlane &plane,
const float angle, float &tanHalfAngle, float &dir );
int RotatePointThroughEpsilonPlane( const cm_traceWork_t *tw, const idVec3 &point, const idVec3 &endPoint,
const idPlane &plane, const float angle, const idVec3 &origin,
float &tanHalfAngle, idVec3 &collisionPoint, idVec3 &endDir );
void RotateTrmVertexThroughPolygon( cm_traceWork_t *tw, cm_polygon_t *poly, cm_trmVertex_t *v, int vertexNum);
void RotateVertexThroughTrmPolygon( cm_traceWork_t *tw, cm_trmPolygon_t *trmpoly, cm_polygon_t *poly,
cm_vertex_t *v, idVec3 &rotationOrigin );
bool RotateTrmThroughPolygon( cm_traceWork_t *tw, cm_polygon_t *p );
void BoundsForRotation( const idVec3 &origin, const idVec3 &axis, const idVec3 &start, const idVec3 &end, idBounds &bounds );
void Rotation180( trace_t *results, const idVec3 &rorg, const idVec3 &axis,
const float startAngle, const float endAngle, const idVec3 &start,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &origin, const idMat3 &modelAxis );
private: // CollisionMap_contents.cpp
bool TestTrmVertsInBrush( cm_traceWork_t *tw, cm_brush_t *b );
bool TestTrmInPolygon( cm_traceWork_t *tw, cm_polygon_t *p );
cm_node_t * PointNode( const idVec3 &p, cm_model_t *model );
int PointContents( const idVec3 p, cmHandle_t model );
int TransformedPointContents( const idVec3 &p, cmHandle_t model, const idVec3 &origin, const idMat3 &modelAxis );
int ContentsTrm( trace_t *results, const idVec3 &start,
const idTraceModel *trm, const idMat3 &trmAxis, int contentMask,
cmHandle_t model, const idVec3 &modelOrigin, const idMat3 &modelAxis );
private: // CollisionMap_trace.cpp
void TraceTrmThroughNode( cm_traceWork_t *tw, cm_node_t *node );
void TraceThroughAxialBSPTree_r( cm_traceWork_t *tw, cm_node_t *node, float p1f, float p2f, idVec3 &p1, idVec3 &p2);
void TraceThroughModel( cm_traceWork_t *tw );
void RecurseProcBSP_r( trace_t *results, int parentNodeNum, int nodeNum, float p1f, float p2f, const idVec3 &p1, const idVec3 &p2 );
private: // CollisionMap_load.cpp
void Clear( void );
void FreeTrmModelStructure( void );
// model deallocation
void RemovePolygonReferences_r( cm_node_t *node, cm_polygon_t *p );
void RemoveBrushReferences_r( cm_node_t *node, cm_brush_t *b );
void FreeNode( cm_node_t *node );
void FreePolygonReference( cm_polygonRef_t *pref );
void FreeBrushReference( cm_brushRef_t *bref );
void FreePolygon( cm_model_t *model, cm_polygon_t *poly );
void FreeBrush( cm_model_t *model, cm_brush_t *brush );
void FreeTree_r( cm_model_t *model, cm_node_t *headNode, cm_node_t *node );
void FreeModel( cm_model_t *model );
// merging polygons
void ReplacePolygons( cm_model_t *model, cm_node_t *node, cm_polygon_t *p1, cm_polygon_t *p2, cm_polygon_t *newp );
cm_polygon_t * TryMergePolygons( cm_model_t *model, cm_polygon_t *p1, cm_polygon_t *p2 );
bool MergePolygonWithTreePolygons( cm_model_t *model, cm_node_t *node, cm_polygon_t *polygon );
void MergeTreePolygons( cm_model_t *model, cm_node_t *node );
// finding internal edges
bool PointInsidePolygon( cm_model_t *model, cm_polygon_t *p, idVec3 &v );
void FindInternalEdgesOnPolygon( cm_model_t *model, cm_polygon_t *p1, cm_polygon_t *p2 );
void FindInternalPolygonEdges( cm_model_t *model, cm_node_t *node, cm_polygon_t *polygon );
void FindInternalEdges( cm_model_t *model, cm_node_t *node );
void FindContainedEdges( cm_model_t *model, cm_polygon_t *p );
// loading of proc BSP tree
void ParseProcNodes( idLexer *src );
void LoadProcBSP( const char *name );
// removal of contained polygons
int R_ChoppedAwayByProcBSP( int nodeNum, idFixedWinding *w, const idVec3 &normal, const idVec3 &origin, const float radius );
int ChoppedAwayByProcBSP( const idFixedWinding &w, const idPlane &plane, int contents );
void ChopWindingListWithBrush( cm_windingList_t *list, cm_brush_t *b );
void R_ChopWindingListWithTreeBrushes( cm_windingList_t *list, cm_node_t *node );
idFixedWinding *WindingOutsideBrushes( idFixedWinding *w, const idPlane &plane, int contents, int patch, cm_node_t *headNode );
// creation of axial BSP tree
cm_model_t * AllocModel( void );
cm_node_t * AllocNode( cm_model_t *model, int blockSize );
cm_polygonRef_t*AllocPolygonReference( cm_model_t *model, int blockSize );
cm_brushRef_t * AllocBrushReference( cm_model_t *model, int blockSize );
cm_polygon_t * AllocPolygon( cm_model_t *model, int numEdges );
cm_brush_t * AllocBrush( cm_model_t *model, int numPlanes );
void AddPolygonToNode( cm_model_t *model, cm_node_t *node, cm_polygon_t *p );
void AddBrushToNode( cm_model_t *model, cm_node_t *node, cm_brush_t *b );
void SetupTrmModelStructure( void );
void R_FilterPolygonIntoTree( cm_model_t *model, cm_node_t *node, cm_polygonRef_t *pref, cm_polygon_t *p );
void R_FilterBrushIntoTree( cm_model_t *model, cm_node_t *node, cm_brushRef_t *pref, cm_brush_t *b );
cm_node_t * R_CreateAxialBSPTree( cm_model_t *model, cm_node_t *node, const idBounds &bounds );
cm_node_t * CreateAxialBSPTree( cm_model_t *model, cm_node_t *node );
// creation of raw polygons
void SetupHash(void);
void ShutdownHash(void);
void ClearHash( idBounds &bounds );
int HashVec(const idVec3 &vec);
int GetVertex( cm_model_t *model, const idVec3 &v, int *vertexNum );
int GetEdge( cm_model_t *model, const idVec3 &v1, const idVec3 &v2, int *edgeNum, int v1num );
void CreatePolygon( cm_model_t *model, idFixedWinding *w, const idPlane &plane, const idMaterial *material, int primitiveNum );
void PolygonFromWinding( cm_model_t *model, idFixedWinding *w, const idPlane &plane, const idMaterial *material, int primitiveNum );
void CalculateEdgeNormals( cm_model_t *model, cm_node_t *node );
void CreatePatchPolygons( cm_model_t *model, idSurface_Patch &mesh, const idMaterial *material, int primitiveNum );
void ConvertPatch( cm_model_t *model, const idMapPatch *patch, int primitiveNum );
void ConvertBrushSides( cm_model_t *model, const idMapBrush *mapBrush, int primitiveNum );
void ConvertBrush( cm_model_t *model, const idMapBrush *mapBrush, int primitiveNum );
void PrintModelInfo( const cm_model_t *model );
void AccumulateModelInfo( cm_model_t *model );
void RemapEdges( cm_node_t *node, int *edgeRemap );
void OptimizeArrays( cm_model_t *model );
void FinishModel( cm_model_t *model );
void BuildModels( const idMapFile *mapFile );
cmHandle_t FindModel( const char *name );
cm_model_t * CollisionModelForMapEntity( const idMapEntity *mapEnt ); // brush/patch model from .map
cm_model_t * LoadRenderModel( const char *fileName ); // ASE/LWO models
bool TrmFromModel_r( idTraceModel &trm, cm_node_t *node );
bool TrmFromModel( const cm_model_t *model, idTraceModel &trm );
private: // CollisionMap_files.cpp
// writing
void WriteNodes( idFile *fp, cm_node_t *node );
int CountPolygonMemory( cm_node_t *node ) const;
void WritePolygons( idFile *fp, cm_node_t *node );
int CountBrushMemory( cm_node_t *node ) const;
void WriteBrushes( idFile *fp, cm_node_t *node );
void WriteCollisionModel( idFile *fp, cm_model_t *model );
void WriteCollisionModelsToFile( const char *filename, int firstModel, int lastModel, unsigned int mapFileCRC );
// loading
cm_node_t * ParseNodes( idLexer *src, cm_model_t *model, cm_node_t *parent );
void ParseVertices( idLexer *src, cm_model_t *model );
void ParseEdges( idLexer *src, cm_model_t *model );
void ParsePolygons( idLexer *src, cm_model_t *model );
void ParseBrushes( idLexer *src, cm_model_t *model );
bool ParseCollisionModel( idLexer *src );
bool LoadCollisionModelFile( const char *name, unsigned int mapFileCRC );
private: // CollisionMap_debug
int ContentsFromString( const char *string ) const;
const char * StringFromContents( const int contents ) const;
void DrawEdge( cm_model_t *model, int edgeNum, const idVec3 &origin, const idMat3 &axis );
void DrawPolygon( cm_model_t *model, cm_polygon_t *p, const idVec3 &origin, const idMat3 &axis,
const idVec3 &viewOrigin );
void DrawNodePolygons( cm_model_t *model, cm_node_t *node, const idVec3 &origin, const idMat3 &axis,
const idVec3 &viewOrigin, const float radius );
private: // collision map data
idStr mapName;
ID_TIME_T mapFileTime;
int loaded;
// for multi-check avoidance
int checkCount;
// models
int maxModels;
int numModels;
cm_model_t ** models;
// polygons and brush for trm model
cm_polygonRef_t*trmPolygons[MAX_TRACEMODEL_POLYS];
cm_brushRef_t * trmBrushes[1];
const idMaterial *trmMaterial;
// for data pruning
int numProcNodes;
cm_procNode_t * procNodes;
// for retrieving contact points
bool getContacts;
contactInfo_t * contacts;
int maxContacts;
int numContacts;
};
// for debugging
extern idCVar cm_debugCollision;
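// Illustrative only, not part of this header: game and physics code reach the
// collision model manager through the public interface above, typically via
// the global collisionModelManager pointer. A translation query looks roughly
// like this, where start/end, the trace model trm and its axis, the model
// handle and the model placement are assumed to be set up by the caller:
//
//	trace_t results;
//	collisionModelManager->Translation( &results, start, end, &trm, trmAxis,
//										CONTENTS_SOLID, handle, modelOrigin, modelAxis );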

File diff suppressed because it is too large

@ -0,0 +1,256 @@
/*
===========================================================================
Doom 3 GPL Source Code
Copyright (C) 1999-2011 id Software LLC, a ZeniMax Media company.
This file is part of the Doom 3 GPL Source Code ("Doom 3 Source Code").
Doom 3 Source Code is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Doom 3 Source Code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with Doom 3 Source Code. If not, see <http://www.gnu.org/licenses/>.
In addition, the Doom 3 Source Code is also subject to certain additional terms. You should have received a copy of these additional terms immediately following the terms and conditions of the GNU General Public License which accompanied the Doom 3 Source Code. If not, please request a copy in writing from id Software at the address below.
If you have questions concerning this license or the applicable additional terms, you may contact in writing id Software LLC, c/o ZeniMax Media Inc., Suite 120, Rockville, Maryland 20850 USA.
===========================================================================
*/
/*
===============================================================================
Trace model vs. polygonal model collision detection.
===============================================================================
*/
#include "../idlib/precompiled.h"
#pragma hdrstop
#include "CollisionModel_local.h"
/*
===============================================================================
Trace through the spatial subdivision
===============================================================================
*/
/*
================
idCollisionModelManagerLocal::TraceTrmThroughNode
================
*/
void idCollisionModelManagerLocal::TraceTrmThroughNode( cm_traceWork_t *tw, cm_node_t *node ) {
cm_polygonRef_t *pref;
cm_brushRef_t *bref;
// position test
if ( tw->positionTest ) {
// if already stuck in solid
if ( tw->trace.fraction == 0.0f ) {
return;
}
// test if any of the trm vertices is inside a brush
for ( bref = node->brushes; bref; bref = bref->next ) {
if ( idCollisionModelManagerLocal::TestTrmVertsInBrush( tw, bref->b ) ) {
return;
}
}
// if just testing a point we're done
if ( tw->pointTrace ) {
return;
}
// test if the trm is stuck in any polygons
for ( pref = node->polygons; pref; pref = pref->next ) {
if ( idCollisionModelManagerLocal::TestTrmInPolygon( tw, pref->p ) ) {
return;
}
}
}
else if ( tw->rotation ) {
// rotate through all polygons in this leaf
for ( pref = node->polygons; pref; pref = pref->next ) {
if ( idCollisionModelManagerLocal::RotateTrmThroughPolygon( tw, pref->p ) ) {
return;
}
}
}
else {
// trace through all polygons in this leaf
for ( pref = node->polygons; pref; pref = pref->next ) {
if ( idCollisionModelManagerLocal::TranslateTrmThroughPolygon( tw, pref->p ) ) {
return;
}
}
}
}
/*
================
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r
================
*/
//#define NO_SPATIAL_SUBDIVISION
void idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( cm_traceWork_t *tw, cm_node_t *node, float p1f, float p2f, idVec3 &p1, idVec3 &p2) {
float t1, t2, offset;
float frac, frac2;
float idist;
idVec3 mid;
int side;
float midf;
if ( !node ) {
return;
}
if ( tw->quickExit ) {
return; // stop immediately
}
if ( tw->trace.fraction <= p1f ) {
return; // already hit something nearer
}
// if we need to test this node for collisions
if ( node->polygons || (tw->positionTest && node->brushes) ) {
// trace through node with collision data
idCollisionModelManagerLocal::TraceTrmThroughNode( tw, node );
}
// if already stuck in solid
if ( tw->positionTest && tw->trace.fraction == 0.0f ) {
return;
}
// if this is a leaf node
if ( node->planeType == -1 ) {
return;
}
#ifdef NO_SPATIAL_SUBDIVISION
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, node->children[0], p1f, p2f, p1, p2 );
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, node->children[1], p1f, p2f, p1, p2 );
return;
#endif
// distance from plane for trace start and end
t1 = p1[node->planeType] - node->planeDist;
t2 = p2[node->planeType] - node->planeDist;
// adjust the plane distance appropriately for mins/maxs
offset = tw->extents[node->planeType];
// see which sides we need to consider
if ( t1 >= offset && t2 >= offset ) {
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, node->children[0], p1f, p2f, p1, p2 );
return;
}
if ( t1 < -offset && t2 < -offset ) {
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, node->children[1], p1f, p2f, p1, p2 );
return;
}
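// the trace crosses the plane: recurse into the near child up to the crossing, then into the far child from the crossing on, widening the crossing interval by the trace extents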
if ( t1 < t2 ) {
idist = 1.0f / (t1-t2);
side = 1;
frac2 = (t1 + offset) * idist;
frac = (t1 - offset) * idist;
} else if (t1 > t2) {
idist = 1.0f / (t1-t2);
side = 0;
frac2 = (t1 - offset) * idist;
frac = (t1 + offset) * idist;
} else {
side = 0;
frac = 1.0f;
frac2 = 0.0f;
}
// move up to the node
if ( frac < 0.0f ) {
frac = 0.0f;
}
else if ( frac > 1.0f ) {
frac = 1.0f;
}
midf = p1f + (p2f - p1f)*frac;
mid[0] = p1[0] + frac*(p2[0] - p1[0]);
mid[1] = p1[1] + frac*(p2[1] - p1[1]);
mid[2] = p1[2] + frac*(p2[2] - p1[2]);
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, node->children[side], p1f, midf, p1, mid );
// go past the node
if ( frac2 < 0.0f ) {
frac2 = 0.0f;
}
else if ( frac2 > 1.0f ) {
frac2 = 1.0f;
}
midf = p1f + (p2f - p1f)*frac2;
mid[0] = p1[0] + frac2*(p2[0] - p1[0]);
mid[1] = p1[1] + frac2*(p2[1] - p1[1]);
mid[2] = p1[2] + frac2*(p2[2] - p1[2]);
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, node->children[side^1], midf, p2f, mid, p2 );
}
/*
================
idCollisionModelManagerLocal::TraceThroughModel
================
*/
void idCollisionModelManagerLocal::TraceThroughModel( cm_traceWork_t *tw ) {
float d;
int i, numSteps;
idVec3 start, end;
idRotation rot;
if ( !tw->rotation ) {
// trace through spatial subdivision and then through leafs
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, tw->model->node, 0, 1, tw->start, tw->end );
}
else {
// approximate the rotation with a series of straight line movements
// total length covered along circle
d = tw->radius * DEG2RAD( tw->angle );
// if more than one step
if ( d > CIRCLE_APPROXIMATION_LENGTH ) {
// number of steps for the approximation, one chord per CIRCLE_APPROXIMATION_LENGTH units of arc
numSteps = (int) ( d / CIRCLE_APPROXIMATION_LENGTH );
// start of approximation
start = tw->start;
// trace circle approximation steps through the BSP tree
for ( i = 0; i < numSteps; i++ ) {
// calculate next point on approximated circle
rot.Set( tw->origin, tw->axis, tw->angle * ((float) (i+1) / numSteps) );
end = start * rot;
// trace through spatial subdivision and then through leafs
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, tw->model->node, 0, 1, start, end );
// no need to continue if something was hit already
if ( tw->trace.fraction < 1.0f ) {
return;
}
start = end;
}
}
else {
start = tw->start;
}
// last step of the approximation
idCollisionModelManagerLocal::TraceThroughAxialBSPTree_r( tw, tw->model->node, 0, 1, start, tw->end );
}
}
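// Illustrative only, not part of the engine: the chord approximation above
// steps a point along the circle with idRotation. Assuming idLib's
// ( origin, axis, angle ) constructor and the vector-times-rotation operator
// used in TraceThroughModel, a 180 degree rotation about the z-axis through
// the origin maps ( 10, 0, 0 ) onto ( -10, 0, 0 ):
//
//	idRotation rot180( vec3_origin, idVec3( 0.0f, 0.0f, 1.0f ), 180.0f );
//	idVec3 p( 10.0f, 0.0f, 0.0f );
//	idVec3 q = p * rot180;	// ( -10, 0, 0 ), up to floating point error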

File diff suppressed because it is too large

2078
neo/curl/CHANGES Normal file

File diff suppressed because it is too large

21
neo/curl/COPYING Normal file

@ -0,0 +1,21 @@
COPYRIGHT AND PERMISSION NOTICE
Copyright (c) 1996 - 2004, Daniel Stenberg, <daniel@haxx.se>.
All rights reserved.
Permission to use, copy, modify, and distribute this software for any purpose
with or without fee is hereby granted, provided that the above copyright
notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF THIRD PARTY RIGHTS. IN
NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
OR OTHER DEALINGS IN THE SOFTWARE.
Except as contained in this notice, the name of a copyright holder shall not
be used in advertising or otherwise to promote the sale, use or other dealings
in this Software without prior written authorization of the copyright holder.

113
neo/curl/Makefile.am Normal file

@ -0,0 +1,113 @@
#***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) 1998 - 2004, Daniel Stenberg, <daniel@haxx.se>, et al.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at http://curl.haxx.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
# $Id: Makefile.am,v 1.49 2004/03/15 10:18:38 bagder Exp $
###########################################################################
AUTOMAKE_OPTIONS = foreign
EXTRA_DIST = CHANGES COPYING maketgz reconf Makefile.dist \
curl-config.in curl-style.el sample.emacs testcurl.sh RELEASE-NOTES
bin_SCRIPTS = curl-config
SUBDIRS = lib src
DIST_SUBDIRS = $(SUBDIRS) tests include packages docs
dist-hook:
rm -rf $(top_builddir)/tests/log
find $(distdir) -name "*.dist" -exec rm {} \;
(distit=`find $(srcdir) -name "*.dist"`; \
for file in $$distit; do \
strip=`echo $$file | sed -e s/^$(srcdir)// -e s/\.dist//`; \
cp $$file $(distdir)$$strip; \
done)
html:
cd docs; make html
pdf:
cd docs; make pdf
check: test
test:
@(cd tests; $(MAKE) all quiet-test)
test-full:
@(cd tests; $(MAKE) all full-test)
#
# Build source and binary rpms. For rpm-3.0 and above, the ~/.rpmmacros
# must contain the following line:
# %_topdir /home/loic/local/rpm
# and that /home/loic/local/rpm contains the directory SOURCES, BUILD etc.
#
# cd /home/loic/local/rpm ; mkdir -p SOURCES BUILD RPMS/i386 SPECS SRPMS
#
# If additional configure flags are needed to build the package, add the
# following in ~/.rpmmacros
# %configure CFLAGS="%{optflags}" ./configure %{_target_platform} --prefix=%{_prefix} ${AM_CONFIGFLAGS}
# and run make rpm in the following way:
# AM_CONFIGFLAGS='--with-uri=/home/users/loic/local/RedHat-6.2' make rpm
#
rpms:
$(MAKE) RPMDIST=curl rpm
$(MAKE) RPMDIST=curl-ssl rpm
rpm:
RPM_TOPDIR=`rpm --showrc | $(PERL) -n -e 'print if(s/.*_topdir\s+(.*)/$$1/)'` ; \
cp $(srcdir)/packages/Linux/RPM/$(RPMDIST).spec $$RPM_TOPDIR/SPECS ; \
cp $(PACKAGE)-$(VERSION).tar.gz $$RPM_TOPDIR/SOURCES ; \
rpm -ba --clean --rmsource $$RPM_TOPDIR/SPECS/$(RPMDIST).spec ; \
mv $$RPM_TOPDIR/RPMS/i386/$(RPMDIST)-*.rpm . ; \
mv $$RPM_TOPDIR/SRPMS/$(RPMDIST)-*.src.rpm .
#
# Build a Solaris pkgadd format file
# run 'make pkgadd' once you've done './configure' and 'make' to make a Solaris pkgadd format
# file (which ends up back in this directory).
# The pkgadd file is in 'pkgtrans' format, so to install on Solaris, do
# pkgadd -d ./HAXXcurl-*
#
# gak - libtool requires an absolute directory, hence the pwd below...
pkgadd:
umask 022 ; \
make install DESTDIR=`/bin/pwd`/packages/Solaris/root ; \
cat COPYING > $(srcdir)/packages/Solaris/copyright ; \
cd $(srcdir)/packages/Solaris && $(MAKE) package
#
# Build a cygwin binary tarball installation file
# resulting .tar.bz2 file will end up at packages/Win32/cygwin
cygwinbin:
$(MAKE) -C packages/Win32/cygwin cygwinbin
# We extend the standard install with a custom hook:
install-data-hook:
cd include && $(MAKE) install
cd docs && $(MAKE) install
# We extend the standard uninstall with a custom hook:
uninstall-hook:
cd include && $(MAKE) uninstall
cd docs && $(MAKE) uninstall

756
neo/curl/Makefile.in Normal file

@ -0,0 +1,756 @@
# Makefile.in generated by automake 1.8.3 from Makefile.am.
# @configure_input@
# Copyright (C) 1994, 1995, 1996, 1997, 1998, 1999, 2000, 2001, 2002,
# 2003, 2004 Free Software Foundation, Inc.
# This Makefile.in is free software; the Free Software Foundation
# gives unlimited permission to copy and/or distribute it,
# with or without modifications, as long as this notice is preserved.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY, to the extent permitted by law; without
# even the implied warranty of MERCHANTABILITY or FITNESS FOR A
# PARTICULAR PURPOSE.
@SET_MAKE@
#***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) 1998 - 2004, Daniel Stenberg, <daniel@haxx.se>, et al.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at http://curl.haxx.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
# $Id: Makefile.am,v 1.49 2004/03/15 10:18:38 bagder Exp $
###########################################################################
srcdir = @srcdir@
top_srcdir = @top_srcdir@
VPATH = @srcdir@
pkgdatadir = $(datadir)/@PACKAGE@
pkglibdir = $(libdir)/@PACKAGE@
pkgincludedir = $(includedir)/@PACKAGE@
top_builddir = .
am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd
INSTALL = @INSTALL@
install_sh_DATA = $(install_sh) -c -m 644
install_sh_PROGRAM = $(install_sh) -c
install_sh_SCRIPT = $(install_sh) -c
INSTALL_HEADER = $(INSTALL_DATA)
transform = $(program_transform_name)
NORMAL_INSTALL = :
PRE_INSTALL = :
POST_INSTALL = :
NORMAL_UNINSTALL = :
PRE_UNINSTALL = :
POST_UNINSTALL = :
host_triplet = @host@
DIST_COMMON = README $(am__configure_deps) $(srcdir)/Makefile.am \
$(srcdir)/Makefile.in $(srcdir)/curl-config.in \
$(top_srcdir)/configure COPYING config.guess config.sub \
depcomp install-sh ltmain.sh missing mkinstalldirs
subdir = .
ACLOCAL_M4 = $(top_srcdir)/aclocal.m4
am__aclocal_m4_deps = $(top_srcdir)/acinclude.m4 \
$(top_srcdir)/configure.ac
am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \
$(ACLOCAL_M4)
am__CONFIG_DISTCLEAN_FILES = config.status config.cache config.log \
configure.lineno configure.status.lineno
mkinstalldirs = $(SHELL) $(top_srcdir)/mkinstalldirs
CONFIG_HEADER = $(top_builddir)/lib/config.h \
$(top_builddir)/src/config.h
CONFIG_CLEAN_FILES = curl-config
am__installdirs = "$(DESTDIR)$(bindir)"
binSCRIPT_INSTALL = $(INSTALL_SCRIPT)
SCRIPTS = $(bin_SCRIPTS)
SOURCES =
DIST_SOURCES =
RECURSIVE_TARGETS = all-recursive check-recursive dvi-recursive \
html-recursive info-recursive install-data-recursive \
install-exec-recursive install-info-recursive \
install-recursive installcheck-recursive installdirs-recursive \
pdf-recursive ps-recursive uninstall-info-recursive \
uninstall-recursive
ETAGS = etags
CTAGS = ctags
DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST)
distdir = $(PACKAGE)-$(VERSION)
top_distdir = $(distdir)
am__remove_distdir = \
{ test ! -d $(distdir) \
|| { find $(distdir) -type d ! -perm -200 -exec chmod u+w {} ';' \
&& rm -fr $(distdir); }; }
DIST_ARCHIVES = $(distdir).tar.gz
GZIP_ENV = --best
distuninstallcheck_listfiles = find . -type f -print
distcleancheck_listfiles = find . -type f -print
ACLOCAL = @ACLOCAL@
AMDEP_FALSE = @AMDEP_FALSE@
AMDEP_TRUE = @AMDEP_TRUE@
AMTAR = @AMTAR@
AR = @AR@
AS = @AS@
AUTOCONF = @AUTOCONF@
AUTOHEADER = @AUTOHEADER@
AUTOMAKE = @AUTOMAKE@
AWK = @AWK@
CABUNDLE_FALSE = @CABUNDLE_FALSE@
CABUNDLE_TRUE = @CABUNDLE_TRUE@
CC = @CC@
CCDEPMODE = @CCDEPMODE@
CFLAGS = @CFLAGS@
CPP = @CPP@
CPPFLAGS = @CPPFLAGS@
CURL_CA_BUNDLE = @CURL_CA_BUNDLE@
CURL_DISABLE_DICT = @CURL_DISABLE_DICT@
CURL_DISABLE_FILE = @CURL_DISABLE_FILE@
CURL_DISABLE_FTP = @CURL_DISABLE_FTP@
CURL_DISABLE_GOPHER = @CURL_DISABLE_GOPHER@
CURL_DISABLE_HTTP = @CURL_DISABLE_HTTP@
CURL_DISABLE_LDAP = @CURL_DISABLE_LDAP@
CURL_DISABLE_TELNET = @CURL_DISABLE_TELNET@
CXX = @CXX@
CXXCPP = @CXXCPP@
CXXDEPMODE = @CXXDEPMODE@
CXXFLAGS = @CXXFLAGS@
CYGPATH_W = @CYGPATH_W@
DEFS = @DEFS@
DEPDIR = @DEPDIR@
DLLTOOL = @DLLTOOL@
ECHO = @ECHO@
ECHO_C = @ECHO_C@
ECHO_N = @ECHO_N@
ECHO_T = @ECHO_T@
EGREP = @EGREP@
EXEEXT = @EXEEXT@
F77 = @F77@
FFLAGS = @FFLAGS@
HAVE_ARES = @HAVE_ARES@
HAVE_LIBZ = @HAVE_LIBZ@
HAVE_LIBZ_FALSE = @HAVE_LIBZ_FALSE@
HAVE_LIBZ_TRUE = @HAVE_LIBZ_TRUE@
INSTALL_DATA = @INSTALL_DATA@
INSTALL_PROGRAM = @INSTALL_PROGRAM@
INSTALL_SCRIPT = @INSTALL_SCRIPT@
INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@
IPV6_ENABLED = @IPV6_ENABLED@
KRB4_ENABLED = @KRB4_ENABLED@
LDFLAGS = @LDFLAGS@
LIBOBJS = @LIBOBJS@
LIBS = @LIBS@
LIBTOOL = @LIBTOOL@
LN_S = @LN_S@
LTLIBOBJS = @LTLIBOBJS@
MAINT = @MAINT@
MAINTAINER_MODE_FALSE = @MAINTAINER_MODE_FALSE@
MAINTAINER_MODE_TRUE = @MAINTAINER_MODE_TRUE@
MAKEINFO = @MAKEINFO@
MANOPT = @MANOPT@
MIMPURE_FALSE = @MIMPURE_FALSE@
MIMPURE_TRUE = @MIMPURE_TRUE@
NO_UNDEFINED_FALSE = @NO_UNDEFINED_FALSE@
NO_UNDEFINED_TRUE = @NO_UNDEFINED_TRUE@
NROFF = @NROFF@
OBJDUMP = @OBJDUMP@
OBJEXT = @OBJEXT@
OPENSSL_ENABLED = @OPENSSL_ENABLED@
PACKAGE = @PACKAGE@
PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@
PACKAGE_NAME = @PACKAGE_NAME@
PACKAGE_STRING = @PACKAGE_STRING@
PACKAGE_TARNAME = @PACKAGE_TARNAME@
PACKAGE_VERSION = @PACKAGE_VERSION@
PATH_SEPARATOR = @PATH_SEPARATOR@
PERL = @PERL@
PKGADD_NAME = @PKGADD_NAME@
PKGADD_PKG = @PKGADD_PKG@
PKGADD_VENDOR = @PKGADD_VENDOR@
PKGCONFIG = @PKGCONFIG@
RANDOM_FILE = @RANDOM_FILE@
RANLIB = @RANLIB@
SED = @SED@
SET_MAKE = @SET_MAKE@
SHELL = @SHELL@
STRIP = @STRIP@
USE_MANUAL_FALSE = @USE_MANUAL_FALSE@
USE_MANUAL_TRUE = @USE_MANUAL_TRUE@
VERSION = @VERSION@
VERSIONNUM = @VERSIONNUM@
YACC = @YACC@
ac_ct_AR = @ac_ct_AR@
ac_ct_AS = @ac_ct_AS@
ac_ct_CC = @ac_ct_CC@
ac_ct_CXX = @ac_ct_CXX@
ac_ct_DLLTOOL = @ac_ct_DLLTOOL@
ac_ct_F77 = @ac_ct_F77@
ac_ct_OBJDUMP = @ac_ct_OBJDUMP@
ac_ct_RANLIB = @ac_ct_RANLIB@
ac_ct_STRIP = @ac_ct_STRIP@
am__fastdepCC_FALSE = @am__fastdepCC_FALSE@
am__fastdepCC_TRUE = @am__fastdepCC_TRUE@
am__fastdepCXX_FALSE = @am__fastdepCXX_FALSE@
am__fastdepCXX_TRUE = @am__fastdepCXX_TRUE@
am__include = @am__include@
am__leading_dot = @am__leading_dot@
am__quote = @am__quote@
bindir = @bindir@
build = @build@
build_alias = @build_alias@
build_cpu = @build_cpu@
build_os = @build_os@
build_vendor = @build_vendor@
datadir = @datadir@
exec_prefix = @exec_prefix@
host = @host@
host_alias = @host_alias@
host_cpu = @host_cpu@
host_os = @host_os@
host_vendor = @host_vendor@
includedir = @includedir@
infodir = @infodir@
install_sh = @install_sh@
libdir = @libdir@
libexecdir = @libexecdir@
localstatedir = @localstatedir@
mandir = @mandir@
mkdir_p = @mkdir_p@
oldincludedir = @oldincludedir@
prefix = @prefix@
program_transform_name = @program_transform_name@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
subdirs = @subdirs@
sysconfdir = @sysconfdir@
target_alias = @target_alias@
AUTOMAKE_OPTIONS = foreign
EXTRA_DIST = CHANGES COPYING maketgz reconf Makefile.dist \
curl-config.in curl-style.el sample.emacs testcurl.sh RELEASE-NOTES
bin_SCRIPTS = curl-config
SUBDIRS = lib src
DIST_SUBDIRS = $(SUBDIRS) tests include packages docs
all: all-recursive
.SUFFIXES:
am--refresh:
@:
$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps)
@for dep in $?; do \
case '$(am__configure_deps)' in \
*$$dep*) \
echo ' cd $(srcdir) && $(AUTOMAKE) --foreign '; \
cd $(srcdir) && $(AUTOMAKE) --foreign \
&& exit 0; \
exit 1;; \
esac; \
done; \
echo ' cd $(top_srcdir) && $(AUTOMAKE) --foreign Makefile'; \
cd $(top_srcdir) && \
$(AUTOMAKE) --foreign Makefile
.PRECIOUS: Makefile
Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status
@case '$?' in \
*config.status*) \
echo ' $(SHELL) ./config.status'; \
$(SHELL) ./config.status;; \
*) \
echo ' cd $(top_builddir) && $(SHELL) ./config.status $@ $(am__depfiles_maybe)'; \
cd $(top_builddir) && $(SHELL) ./config.status $@ $(am__depfiles_maybe);; \
esac;
$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES)
$(SHELL) ./config.status --recheck
$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps)
cd $(srcdir) && $(AUTOCONF)
$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps)
cd $(srcdir) && $(ACLOCAL) $(ACLOCAL_AMFLAGS)
curl-config: $(top_builddir)/config.status $(srcdir)/curl-config.in
cd $(top_builddir) && $(SHELL) ./config.status $@
install-binSCRIPTS: $(bin_SCRIPTS)
@$(NORMAL_INSTALL)
test -z "$(bindir)" || $(mkdir_p) "$(DESTDIR)$(bindir)"
@list='$(bin_SCRIPTS)'; for p in $$list; do \
if test -f "$$p"; then d=; else d="$(srcdir)/"; fi; \
if test -f $$d$$p; then \
f=`echo "$$p" | sed 's|^.*/||;$(transform)'`; \
echo " $(binSCRIPT_INSTALL) '$$d$$p' '$(DESTDIR)$(bindir)/$$f'"; \
$(binSCRIPT_INSTALL) "$$d$$p" "$(DESTDIR)$(bindir)/$$f"; \
else :; fi; \
done
uninstall-binSCRIPTS:
@$(NORMAL_UNINSTALL)
@list='$(bin_SCRIPTS)'; for p in $$list; do \
f=`echo "$$p" | sed 's|^.*/||;$(transform)'`; \
echo " rm -f '$(DESTDIR)$(bindir)/$$f'"; \
rm -f "$(DESTDIR)$(bindir)/$$f"; \
done
mostlyclean-libtool:
-rm -f *.lo
clean-libtool:
-rm -rf .libs _libs
distclean-libtool:
-rm -f libtool
uninstall-info-am:
# This directory's subdirectories are mostly independent; you can cd
# into them and run `make' without going through this Makefile.
# To change the values of `make' variables: instead of editing Makefiles,
# (1) if the variable is set in `config.status', edit `config.status'
# (which will cause the Makefiles to be regenerated when you run `make');
# (2) otherwise, pass the desired values on the `make' command line.
$(RECURSIVE_TARGETS):
@set fnord $$MAKEFLAGS; amf=$$2; \
dot_seen=no; \
target=`echo $@ | sed s/-recursive//`; \
list='$(SUBDIRS)'; for subdir in $$list; do \
echo "Making $$target in $$subdir"; \
if test "$$subdir" = "."; then \
dot_seen=yes; \
local_target="$$target-am"; \
else \
local_target="$$target"; \
fi; \
(cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \
|| case "$$amf" in *=*) exit 1;; *k*) fail=yes;; *) exit 1;; esac; \
done; \
if test "$$dot_seen" = "no"; then \
$(MAKE) $(AM_MAKEFLAGS) "$$target-am" || exit 1; \
fi; test -z "$$fail"
mostlyclean-recursive clean-recursive distclean-recursive \
maintainer-clean-recursive:
@set fnord $$MAKEFLAGS; amf=$$2; \
dot_seen=no; \
case "$@" in \
distclean-* | maintainer-clean-*) list='$(DIST_SUBDIRS)' ;; \
*) list='$(SUBDIRS)' ;; \
esac; \
rev=''; for subdir in $$list; do \
if test "$$subdir" = "."; then :; else \
rev="$$subdir $$rev"; \
fi; \
done; \
rev="$$rev ."; \
target=`echo $@ | sed s/-recursive//`; \
for subdir in $$rev; do \
echo "Making $$target in $$subdir"; \
if test "$$subdir" = "."; then \
local_target="$$target-am"; \
else \
local_target="$$target"; \
fi; \
(cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \
|| case "$$amf" in *=*) exit 1;; *k*) fail=yes;; *) exit 1;; esac; \
done && test -z "$$fail"
tags-recursive:
list='$(SUBDIRS)'; for subdir in $$list; do \
test "$$subdir" = . || (cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) tags); \
done
ctags-recursive:
list='$(SUBDIRS)'; for subdir in $$list; do \
test "$$subdir" = . || (cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) ctags); \
done
ID: $(HEADERS) $(SOURCES) $(LISP) $(TAGS_FILES)
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) ' { files[$$0] = 1; } \
END { for (i in files) print i; }'`; \
mkid -fID $$unique
tags: TAGS
TAGS: tags-recursive $(HEADERS) $(SOURCES) $(TAGS_DEPENDENCIES) \
$(TAGS_FILES) $(LISP)
tags=; \
here=`pwd`; \
if (etags --etags-include --version) >/dev/null 2>&1; then \
include_option=--etags-include; \
else \
include_option=--include; \
fi; \
list='$(SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
test -f $$subdir/TAGS && \
tags="$$tags $$include_option=$$here/$$subdir/TAGS"; \
fi; \
done; \
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) ' { files[$$0] = 1; } \
END { for (i in files) print i; }'`; \
test -z "$(ETAGS_ARGS)$$tags$$unique" \
|| $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \
$$tags $$unique
ctags: CTAGS
CTAGS: ctags-recursive $(HEADERS) $(SOURCES) $(TAGS_DEPENDENCIES) \
$(TAGS_FILES) $(LISP)
tags=; \
here=`pwd`; \
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) ' { files[$$0] = 1; } \
END { for (i in files) print i; }'`; \
test -z "$(CTAGS_ARGS)$$tags$$unique" \
|| $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \
$$tags $$unique
GTAGS:
here=`$(am__cd) $(top_builddir) && pwd` \
&& cd $(top_srcdir) \
&& gtags -i $(GTAGS_ARGS) $$here
distclean-tags:
-rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags
distdir: $(DISTFILES)
$(am__remove_distdir)
mkdir $(distdir)
$(mkdir_p) $(distdir)/. $(distdir)/packages/EPM $(distdir)/packages/Linux/RPM
@srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`; \
topsrcdirstrip=`echo "$(top_srcdir)" | sed 's|.|.|g'`; \
list='$(DISTFILES)'; for file in $$list; do \
case $$file in \
$(srcdir)/*) file=`echo "$$file" | sed "s|^$$srcdirstrip/||"`;; \
$(top_srcdir)/*) file=`echo "$$file" | sed "s|^$$topsrcdirstrip/|$(top_builddir)/|"`;; \
esac; \
if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \
dir=`echo "$$file" | sed -e 's,/[^/]*$$,,'`; \
if test "$$dir" != "$$file" && test "$$dir" != "."; then \
dir="/$$dir"; \
$(mkdir_p) "$(distdir)$$dir"; \
else \
dir=''; \
fi; \
if test -d $$d/$$file; then \
if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \
cp -pR $(srcdir)/$$file $(distdir)$$dir || exit 1; \
fi; \
cp -pR $$d/$$file $(distdir)$$dir || exit 1; \
else \
test -f $(distdir)/$$file \
|| cp -p $$d/$$file $(distdir)/$$file \
|| exit 1; \
fi; \
done
list='$(DIST_SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
test -d "$(distdir)/$$subdir" \
|| mkdir "$(distdir)/$$subdir" \
|| exit 1; \
(cd $$subdir && \
$(MAKE) $(AM_MAKEFLAGS) \
top_distdir="../$(top_distdir)" \
distdir="../$(distdir)/$$subdir" \
distdir) \
|| exit 1; \
fi; \
done
$(MAKE) $(AM_MAKEFLAGS) \
top_distdir="$(top_distdir)" distdir="$(distdir)" \
dist-hook
-find $(distdir) -type d ! -perm -777 -exec chmod a+rwx {} \; -o \
! -type d ! -perm -444 -links 1 -exec chmod a+r {} \; -o \
! -type d ! -perm -400 -exec chmod a+r {} \; -o \
! -type d ! -perm -444 -exec $(SHELL) $(install_sh) -c -m a+r {} {} \; \
|| chmod -R a+r $(distdir)
dist-gzip: distdir
$(AMTAR) chof - $(distdir) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).tar.gz
$(am__remove_distdir)
dist-bzip2: distdir
$(AMTAR) chof - $(distdir) | bzip2 -9 -c >$(distdir).tar.bz2
$(am__remove_distdir)
dist-tarZ: distdir
$(AMTAR) chof - $(distdir) | compress -c >$(distdir).tar.Z
$(am__remove_distdir)
dist-shar: distdir
shar $(distdir) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).shar.gz
$(am__remove_distdir)
dist-zip: distdir
-rm -f $(distdir).zip
zip -rq $(distdir).zip $(distdir)
$(am__remove_distdir)
dist dist-all: distdir
$(AMTAR) chof - $(distdir) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).tar.gz
$(am__remove_distdir)
# This target untars the dist file and tries a VPATH configuration. Then
# it guarantees that the distribution is self-contained by making another
# tarfile.
distcheck: dist
case '$(DIST_ARCHIVES)' in \
*.tar.gz*) \
GZIP=$(GZIP_ENV) gunzip -c $(distdir).tar.gz | $(AMTAR) xf - ;;\
*.tar.bz2*) \
bunzip2 -c $(distdir).tar.bz2 | $(AMTAR) xf - ;;\
*.tar.Z*) \
uncompress -c $(distdir).tar.Z | $(AMTAR) xf - ;;\
*.shar.gz*) \
GZIP=$(GZIP_ENV) gunzip -c $(distdir).tar.gz | unshar ;;\
*.zip*) \
unzip $(distdir).zip ;;\
esac
chmod -R a-w $(distdir); chmod a+w $(distdir)
mkdir $(distdir)/_build
mkdir $(distdir)/_inst
chmod a-w $(distdir)
dc_install_base=`$(am__cd) $(distdir)/_inst && pwd | sed -e 's,^[^:\\/]:[\\/],/,'` \
&& dc_destdir="$${TMPDIR-/tmp}/am-dc-$$$$/" \
&& cd $(distdir)/_build \
&& ../configure --srcdir=.. --prefix="$$dc_install_base" \
$(DISTCHECK_CONFIGURE_FLAGS) \
&& $(MAKE) $(AM_MAKEFLAGS) \
&& $(MAKE) $(AM_MAKEFLAGS) dvi \
&& $(MAKE) $(AM_MAKEFLAGS) check \
&& $(MAKE) $(AM_MAKEFLAGS) install \
&& $(MAKE) $(AM_MAKEFLAGS) installcheck \
&& $(MAKE) $(AM_MAKEFLAGS) uninstall \
&& $(MAKE) $(AM_MAKEFLAGS) distuninstallcheck_dir="$$dc_install_base" \
distuninstallcheck \
&& chmod -R a-w "$$dc_install_base" \
&& ({ \
(cd ../.. && umask 077 && mkdir "$$dc_destdir") \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" install \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" uninstall \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" \
distuninstallcheck_dir="$$dc_destdir" distuninstallcheck; \
} || { rm -rf "$$dc_destdir"; exit 1; }) \
&& rm -rf "$$dc_destdir" \
&& $(MAKE) $(AM_MAKEFLAGS) dist \
&& rm -rf $(DIST_ARCHIVES) \
&& $(MAKE) $(AM_MAKEFLAGS) distcleancheck
$(am__remove_distdir)
@(echo "$(distdir) archives ready for distribution: "; \
list='$(DIST_ARCHIVES)'; for i in $$list; do echo $$i; done) | \
sed -e '1{h;s/./=/g;p;x;}' -e '$${p;x;}'
distuninstallcheck:
@cd $(distuninstallcheck_dir) \
&& test `$(distuninstallcheck_listfiles) | wc -l` -le 1 \
|| { echo "ERROR: files left after uninstall:" ; \
if test -n "$(DESTDIR)"; then \
echo " (check DESTDIR support)"; \
fi ; \
$(distuninstallcheck_listfiles) ; \
exit 1; } >&2
distcleancheck: distclean
@if test '$(srcdir)' = . ; then \
echo "ERROR: distcleancheck can only run from a VPATH build" ; \
exit 1 ; \
fi
@test `$(distcleancheck_listfiles) | wc -l` -eq 0 \
|| { echo "ERROR: files left in build directory after distclean:" ; \
$(distcleancheck_listfiles) ; \
exit 1; } >&2
check-am: all-am
check: check-recursive
all-am: Makefile $(SCRIPTS)
installdirs: installdirs-recursive
installdirs-am:
for dir in "$(DESTDIR)$(bindir)"; do \
test -z "$$dir" || $(mkdir_p) "$$dir"; \
done
install: install-recursive
install-exec: install-exec-recursive
install-data: install-data-recursive
uninstall: uninstall-recursive
install-am: all-am
@$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am
installcheck: installcheck-recursive
install-strip:
$(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
`test -z '$(STRIP)' || \
echo "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'"` install
mostlyclean-generic:
clean-generic:
distclean-generic:
-rm -f $(CONFIG_CLEAN_FILES)
maintainer-clean-generic:
@echo "This command is intended for maintainers to use"
@echo "it deletes files that may require special tools to rebuild."
clean: clean-recursive
clean-am: clean-generic clean-libtool mostlyclean-am
distclean: distclean-recursive
-rm -f $(am__CONFIG_DISTCLEAN_FILES)
-rm -f Makefile
distclean-am: clean-am distclean-generic distclean-libtool \
distclean-tags
dvi: dvi-recursive
dvi-am:
info: info-recursive
info-am:
install-data-am:
@$(NORMAL_INSTALL)
$(MAKE) $(AM_MAKEFLAGS) install-data-hook
install-exec-am: install-binSCRIPTS
install-info: install-info-recursive
install-man:
installcheck-am:
maintainer-clean: maintainer-clean-recursive
-rm -f $(am__CONFIG_DISTCLEAN_FILES)
-rm -rf $(top_srcdir)/autom4te.cache
-rm -f Makefile
maintainer-clean-am: distclean-am maintainer-clean-generic
mostlyclean: mostlyclean-recursive
mostlyclean-am: mostlyclean-generic mostlyclean-libtool
pdf-am:
ps: ps-recursive
ps-am:
uninstall-am: uninstall-binSCRIPTS uninstall-info-am
@$(NORMAL_INSTALL)
$(MAKE) $(AM_MAKEFLAGS) uninstall-hook
uninstall-info: uninstall-info-recursive
.PHONY: $(RECURSIVE_TARGETS) CTAGS GTAGS all all-am am--refresh check \
check-am clean clean-generic clean-libtool clean-recursive \
ctags ctags-recursive dist dist-all dist-bzip2 dist-gzip \
dist-shar dist-tarZ dist-zip distcheck distclean \
distclean-generic distclean-libtool distclean-recursive \
distclean-tags distcleancheck distdir distuninstallcheck dvi \
dvi-am html html-am info info-am install install-am \
install-binSCRIPTS install-data install-data-am install-exec \
install-exec-am install-info install-info-am install-man \
install-strip installcheck installcheck-am installdirs \
installdirs-am maintainer-clean maintainer-clean-generic \
maintainer-clean-recursive mostlyclean mostlyclean-generic \
mostlyclean-libtool mostlyclean-recursive pdf pdf-am ps ps-am \
tags tags-recursive uninstall uninstall-am \
uninstall-binSCRIPTS uninstall-info-am
dist-hook:
rm -rf $(top_builddir)/tests/log
find $(distdir) -name "*.dist" -exec rm {} \;
(distit=`find $(srcdir) -name "*.dist"`; \
for file in $$distit; do \
strip=`echo $$file | sed -e s/^$(srcdir)// -e s/\.dist//`; \
cp $$file $(distdir)$$strip; \
done)
html:
cd docs; make html
pdf:
cd docs; make pdf
check: test
test:
@(cd tests; $(MAKE) all quiet-test)
test-full:
@(cd tests; $(MAKE) all full-test)
#
# Build source and binary rpms. For rpm-3.0 and above, the ~/.rpmmacros
# must contain the following line:
# %_topdir /home/loic/local/rpm
# and /home/loic/local/rpm must contain the directories SOURCES, BUILD etc.
#
# cd /home/loic/local/rpm ; mkdir -p SOURCES BUILD RPMS/i386 SPECS SRPMS
#
# If additional configure flags are needed to build the package, add the
# following in ~/.rpmmacros
# %configure CFLAGS="%{optflags}" ./configure %{_target_platform} --prefix=%{_prefix} ${AM_CONFIGFLAGS}
# and run make rpm in the following way:
# AM_CONFIGFLAGS='--with-uri=/home/users/loic/local/RedHat-6.2' make rpm
#
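#
# A rough end-to-end sketch (illustrative only; the %_topdir path is the same
# example path used above, adjust it to your own setup). Note that the rpm
# target copies $(PACKAGE)-$(VERSION).tar.gz, so create the dist tarball first:
#
#   echo '%_topdir /home/loic/local/rpm' >> ~/.rpmmacros
#   ( cd /home/loic/local/rpm && mkdir -p SOURCES BUILD RPMS/i386 SPECS SRPMS )
#   make dist
#   make rpms
#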
rpms:
$(MAKE) RPMDIST=curl rpm
$(MAKE) RPMDIST=curl-ssl rpm
rpm:
RPM_TOPDIR=`rpm --showrc | $(PERL) -n -e 'print if(s/.*_topdir\s+(.*)/$$1/)'` ; \
cp $(srcdir)/packages/Linux/RPM/$(RPMDIST).spec $$RPM_TOPDIR/SPECS ; \
cp $(PACKAGE)-$(VERSION).tar.gz $$RPM_TOPDIR/SOURCES ; \
rpm -ba --clean --rmsource $$RPM_TOPDIR/SPECS/$(RPMDIST).spec ; \
mv $$RPM_TOPDIR/RPMS/i386/$(RPMDIST)-*.rpm . ; \
mv $$RPM_TOPDIR/SRPMS/$(RPMDIST)-*.src.rpm .
#
# Build a Solaris pkgadd format file
# run 'make pkgadd' once you've done './configure' and 'make' to make a Solaris pkgadd format
# file (which ends up back in this directory).
# The pkgadd file is in 'pkgtrans' format, so to install on Solaris, do
# pkgadd -d ./HAXXcurl-*
#
# gak - libtool requires an absolute directory, hence the pwd below...
pkgadd:
umask 022 ; \
make install DESTDIR=`/bin/pwd`/packages/Solaris/root ; \
cat COPYING > $(srcdir)/packages/Solaris/copyright ; \
cd $(srcdir)/packages/Solaris && $(MAKE) package
#
# Build a cygwin binary tarball installation file
# resulting .tar.bz2 file will end up at packages/Win32/cygwin
cygwinbin:
$(MAKE) -C packages/Win32/cygwin cygwinbin
# We extend the standard install with a custom hook:
install-data-hook:
cd include && $(MAKE) install
cd docs && $(MAKE) install
# We extend the standard uninstall with a custom hook:
uninstall-hook:
cd include && $(MAKE) uninstall
cd docs && $(MAKE) uninstall
# Tell versions [3.59,3.63) of GNU make to not export all variables.
# Otherwise a system limit (for SysV at least) may be exceeded.
.NOEXPORT:

75
neo/curl/README Normal file

@ -0,0 +1,75 @@
      _   _ ____  _
  ___| | | |  _ \| |
 / __| | | | |_) | |
| (__| |_| |  _ <| |___
 \___|\___/|_| \_\_____|
README
Curl is a command line tool for transferring data specified with URL
syntax. Find out how to use Curl by reading the curl.1 man page or the
MANUAL document. Find out how to install Curl by reading the INSTALL
document.
libcurl is the library curl is using to do its job. It is readily
available to be used by your software. Read the libcurl.3 man page to
learn how!
You find answers to the most frequent questions we get in the FAQ document.
Study the COPYING file for distribution terms and similar.
CONTACT
If you have problems, questions, ideas or suggestions, please contact us
by posting to a suitable mailing list. See http://curl.haxx.se/mail/
Many major contributors to the project are listed in the THANKS document.
WEB SITE
Visit the curl web site or mirrors for the latest news:
Sweden -- http://curl.haxx.se/
Australia -- http://curl.planetmirror.com/
Estonia -- http://curl.dope-brothers.com/
Germany -- http://curl.mirror.at.stealer.net/
Russia -- http://curl.tsuren.net/
Thailand -- http://curl.siamu.ac.th/
US (CA) -- http://curl.mirror.redwire.net/
DOWNLOAD
The official download mirror sites are:
Australia -- http://curl.planetmirror.com/download/
Estonia -- http://curl.dope-brothers.com/download/
Germany -- ftp://ftp.fu-berlin.de/pub/unix/network/curl/
Hongkong -- http://www.execve.net/curl/
Russia -- http://curl.tsuren.net/download/
Sweden -- ftp://ftp.sunet.se/pub/www/utilities/curl/
Sweden -- http://cool.haxx.se/curl/
Thailand -- http://curl.siamu.ac.th/download/
US (CA) -- http://curl.mirror.redwire.net/download/
CVS
To download the very latest source off the CVS server do this:
cvs -d :pserver:cvsread@cvs.php.net:/repository login
(enter "phpfi" when asked for password)
cvs -d :pserver:cvsread@cvs.php.net:/repository co curl
(you'll get a directory named curl created, filled with the source code)
cvs -d :pserver:cvsread@cvs.php.net:/repository logout
(you're off the hook!)
NOTICE
Curl contains pieces of source code that is Copyright (c) 1998, 1999
Kungliga Tekniska Högskolan. This notice is included here to comply with the
distribution terms.

89
neo/curl/RELEASE-NOTES Normal file

@ -0,0 +1,89 @@
Curl and libcurl 7.11.1. A bugfix release.
Public curl release number: 79
Releases counted from the very beginning: 106
Available command line options: 94
Available curl_easy_setopt() options: 112
This release includes the following changes:
o CURLOPT_POSTFIELDSIZE_LARGE added to offer POSTs larger than 2GB
o CURL_VERSION_LARGEFILE is a feature bit returned by libcurls that feature
large file support
o libcurl only requires winsock 1.1 on windows now
o when doing FTP, curl now sends QUIT before disconnecting
o name resolves can now timeout on windows too
o $HOME is now recognized better when looking for .netrc files
o now re-uses the ares handle when re-using curl handles
o SO_BINDTODEVICE is used for network interface binding
o configure --disable-manual disables the built-in huge manual from the
command line tool
o the default Accept: header used in HTTP requests changed
o asynch dns lookups now require the c-ares library
o curl --socks can be used to set a SOCKS5 proxy to use
o response-headers received after a (proxy) CONNECT request are now passed
to the header callback just like other headers
This release includes the following bugfixes:
o builds and runs on Novell NetWare
o Windows builds now report OS as "i386-pc-win32"
o received signals during SSL connect are handled better
o improved PUT/POST with NTLM/Digest authentication
o following redirects and doing NTLM/Digest (where the first connection gets
closed) with the multi interface work better now
o file: progress meter and getinfo variables work now
o CURLOPT_FRESH_CONNECT and CURLAUTH_NTLM now work when set together
o share interface usage without (un)lock functions segfaulted
o --limit-rate no longer cripples the --speed-limit feature
o fixed verbose output problem with ipv6-enabled re-used connections
o fixed the socks5 code to check version in the socks response properly
o dns cache bug - fixed the 'inuse' counter
o large file fix for Content-Length
o better docs for the share interface
o several configure fixes for mingw/msys
o setting a Host: header is no longer affecting the Host: header used when
libcurl follows a Location:
o fixed numerous compiler warnings on several operating systems and compilers
o PUTing from stdin couldn't disable chunked transfer-encoding
o corrected the mingw makefiles
o improved the configure libz detection
o fixed EPRT/PORT use when doing FTP on ipv6-enabled AIX hosts
o *nroff commands that only support -mandoc and not -man are now supported
(for the built-in manual text in the command line tool)
o fixed the unconditional #include of config.h in hugehelp.c
o builds fine on MPE/iX
o upload using chunked transfer-encoding now sends the last chunk properly
terminated with an extra CRLF
o Fixed the progress meter display for files >2GB
o persistent connections over a proxy messed up the proxy name/password
o the socks5 code segfaulted if no username/password was set
o the *_LARGE options now take curl_off_t types as parameters and this will
make it possible to handle large files on windows too
o builds with large file support even on systems without strtoll()
Other curl-related news since the previous public release:
o Many platforms are being used to autobuild and autotest curl on a daily
basis. Please join in and test curl on your systems:
http://curl.haxx.se/auto/
o the curl mailing lists moved, (re-)subscribe to the new ones from here:
http://curl.haxx.se/mail/
o c-ares 1.1.0 was released: http://daniel.haxx.se/projects/c-ares/
o TclCurl 0.11.0 was released:
http://personal1.iddeo.es/andresgarci/tclcurl/english/
o PycURL 7.11.0 was released: http://pycurl.sourceforge.net/
o the libcurl D binding was released:
http://www.atari-soldiers.com/libcurl.html
o new Estonian web site mirror: http://curl.dope-brothers.com/
This release would not have looked like this without help, code, reports and
advice from friends like these:
Gisle Vanem, Vincent Bronner, Richard Bramante, Dirk Manske, Dan Fandrich,
Ken Hirsch, Stadler Stephan, Domenico Andreoli, Patrick Smith, Tor Arntsen,
Andrés García, Tim Baker, Len Krause, Gilad, Ken Rastatter, P R Schaffner,
Greg Hewgill, Ben Greear, Jeff Lawson, Grigory Entin, Doug Porter, David
Byron, Andy Serpa, Joe Halpin, Christopher R. Palmer, Günter Knauf
Thanks! (and sorry if I forgot to mention someone)

592
neo/curl/acinclude.m4 Normal file

@ -0,0 +1,592 @@
dnl Check for how to set a socket to non-blocking state. There seems to exist
dnl four known different ways, with the one used almost everywhere being POSIX
dnl and XPG3, while the other different ways for different systems (old BSD,
dnl Windows and Amiga).
dnl
dnl There are two known platforms (AIX 3.x and SunOS 4.1.x) where the
dnl O_NONBLOCK define is found but does not work. This condition is attempted
dnl to get caught in this script by using an excessive number of #ifdefs...
dnl
AC_DEFUN([CURL_CHECK_NONBLOCKING_SOCKET],
[
AC_MSG_CHECKING([non-blocking sockets style])
AC_TRY_COMPILE([
/* headers for O_NONBLOCK test */
#include <sys/types.h>
#include <unistd.h>
#include <fcntl.h>
],[
/* try to compile O_NONBLOCK */
#if defined(sun) || defined(__sun__) || defined(__SUNPRO_C) || defined(__SUNPRO_CC)
# if defined(__SVR4) || defined(__srv4__)
# define PLATFORM_SOLARIS
# else
# define PLATFORM_SUNOS4
# endif
#endif
#if (defined(_AIX) || defined(__xlC__)) && !defined(_AIX4)
# define PLATFORM_AIX_V3
#endif
#if defined(PLATFORM_SUNOS4) || defined(PLATFORM_AIX_V3) || defined(__BEOS__)
#error "O_NONBLOCK does not work on this platform"
#endif
int socket;
int flags = fcntl(socket, F_SETFL, flags | O_NONBLOCK);
],[
dnl the O_NONBLOCK test was fine
nonblock="O_NONBLOCK"
AC_DEFINE(HAVE_O_NONBLOCK, 1, [use O_NONBLOCK for non-blocking sockets])
],[
dnl the code was bad, try a different program now, test 2
AC_TRY_COMPILE([
/* headers for FIONBIO test */
#include <unistd.h>
#include <stropts.h>
],[
/* FIONBIO source test (old-style unix) */
int socket;
int flags = ioctl(socket, FIONBIO, &flags);
],[
dnl FIONBIO test was good
nonblock="FIONBIO"
AC_DEFINE(HAVE_FIONBIO, 1, [use FIONBIO for non-blocking sockets])
],[
dnl FIONBIO test was also bad
dnl the code was bad, try a different program now, test 3
AC_TRY_COMPILE([
/* headers for ioctlsocket test (cygwin?) */
#include <windows.h>
],[
/* ioctlsocket source code */
int socket;
int flags = ioctlsocket(socket, FIONBIO, &flags);
],[
dnl ioctlsocket test was good
nonblock="ioctlsocket"
AC_DEFINE(HAVE_IOCTLSOCKET, 1, [use ioctlsocket() for non-blocking sockets])
],[
dnl ioctlsocket didn't compile, go to test 4
AC_TRY_LINK([
/* headers for IoctlSocket test (Amiga?) */
#include <sys/ioctl.h>
],[
/* IoctlSocket source code */
int socket;
int flags = IoctlSocket(socket, FIONBIO, (long)1);
],[
dnl ioctlsocket test was good
nonblock="IoctlSocket"
AC_DEFINE(HAVE_IOCTLSOCKET_CASE, 1, [use Ioctlsocket() for non-blocking sockets])
],[
dnl Ioctlsocket didn't compile, do test 5!
AC_TRY_COMPILE([
/* headers for SO_NONBLOCK test (BeOS) */
#include <sys/types.h>
#include <unistd.h>
#include <fcntl.h>
],[
/* SO_NONBLOCK source code */
long b = 1;
int socket;
int flags = setsockopt(socket, SOL_SOCKET, SO_NONBLOCK, &b, sizeof(b));
],[
dnl the SO_NONBLOCK test was good
nonblock="SO_NONBLOCK"
AC_DEFINE(HAVE_SO_NONBLOCK, 1, [use SO_NONBLOCK for non-blocking sockets])
],[
dnl test 5 didn't compile!
nonblock="nada"
AC_DEFINE(HAVE_DISABLED_NONBLOCKING, 1, [disabled non-blocking sockets])
])
dnl end of fifth test
])
dnl end of fourth test
])
dnl end of third test
])
dnl end of second test
])
dnl end of non-blocking try-compile test
AC_MSG_RESULT($nonblock)
if test "$nonblock" = "nada"; then
AC_MSG_WARN([non-block sockets disabled])
fi
])
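dnl
dnl Usage note: configure.ac invokes CURL_CHECK_NONBLOCKING_SOCKET once; exactly
dnl one of the HAVE_O_NONBLOCK, HAVE_FIONBIO, HAVE_IOCTLSOCKET,
dnl HAVE_IOCTLSOCKET_CASE, HAVE_SO_NONBLOCK or HAVE_DISABLED_NONBLOCKING defines
dnl ends up set, and the library code that switches a socket into non-blocking
dnl mode #ifdefs on whichever one was chosen.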
dnl Check for socklen_t: historically on BSD it is an int, and in
dnl POSIX 1g it is a type of its own, but some platforms use different
dnl types for the argument to getsockopt, getpeername, etc. So we
dnl have to test to find something that will work.
AC_DEFUN([TYPE_SOCKLEN_T],
[
AC_CHECK_TYPE([socklen_t], ,[
AC_MSG_CHECKING([for socklen_t equivalent])
AC_CACHE_VAL([curl_cv_socklen_t_equiv],
[
# Systems have either "struct sockaddr *" or
# "void *" as the second argument to getpeername
curl_cv_socklen_t_equiv=
for arg2 in "struct sockaddr" void; do
for t in int size_t unsigned long "unsigned long"; do
AC_TRY_COMPILE([
#ifdef HAVE_SYS_TYPES_H
#include <sys/types.h>
#endif
#ifdef HAVE_SYS_SOCKET_H
#include <sys/socket.h>
#endif
int getpeername (int, $arg2 *, $t *);
],[
$t len;
getpeername(0,0,&len);
],[
curl_cv_socklen_t_equiv="$t"
break
])
done
done
if test "x$curl_cv_socklen_t_equiv" = x; then
AC_MSG_ERROR([Cannot find a type to use in place of socklen_t])
fi
])
AC_MSG_RESULT($curl_cv_socklen_t_equiv)
AC_DEFINE_UNQUOTED(socklen_t, $curl_cv_socklen_t_equiv,
[type to use in place of socklen_t if not defined])],
[#include <sys/types.h>
#include <sys/socket.h>])
])
dnl Check for in_addr_t: it is used to receive the return code of inet_addr()
dnl and a few other things. If not found, we set it to unsigned int, as even
dnl 64-bit implementations usually set it to a 32-bit type.
AC_DEFUN([TYPE_IN_ADDR_T],
[
AC_CHECK_TYPE([in_addr_t], ,[
AC_MSG_CHECKING([for in_addr_t equivalent])
AC_CACHE_VAL([curl_cv_in_addr_t_equiv],
[
curl_cv_in_addr_t_equiv=
for t in "unsigned long" int size_t unsigned long; do
AC_TRY_COMPILE([
#ifdef HAVE_SYS_TYPES_H
#include <sys/types.h>
#endif
#ifdef HAVE_SYS_SOCKET_H
#include <sys/socket.h>
#endif
#ifdef HAVE_ARPA_INET_H
#include <arpa/inet.h>
#endif
],[
$t data = inet_addr ("1.2.3.4");
],[
curl_cv_in_addr_t_equiv="$t"
break
])
done
if test "x$curl_cv_in_addr_t_equiv" = x; then
AC_MSG_ERROR([Cannot find a type to use in place of in_addr_t])
fi
])
AC_MSG_RESULT($curl_cv_in_addr_t_equiv)
AC_DEFINE_UNQUOTED(in_addr_t, $curl_cv_in_addr_t_equiv,
[type to use in place of in_addr_t if not defined])],
[#include <sys/types.h>
#include <sys/socket.h>
#include <arpa/inet.h>])
])
dnl ************************************************************
dnl check for "localhost", if it doesn't exist, we can't do the
dnl gethostbyname_r tests!
dnl
AC_DEFUN([CURL_CHECK_WORKING_RESOLVER],[
AC_MSG_CHECKING([if "localhost" resolves])
AC_TRY_RUN([
#include <string.h>
#include <sys/types.h>
#include <netdb.h>
int
main () {
struct hostent *h;
h = gethostbyname("localhost");
exit (h == NULL ? 1 : 0); }],[
AC_MSG_RESULT(yes)],[
AC_MSG_RESULT(no)
AC_MSG_ERROR([can't figure out gethostbyname_r() since localhost doesn't resolve])
]
)
])
dnl ************************************************************
dnl check for working getaddrinfo()
dnl
AC_DEFUN([CURL_CHECK_WORKING_GETADDRINFO],[
AC_CACHE_CHECK(for working getaddrinfo, ac_cv_working_getaddrinfo,[
AC_TRY_RUN( [
#include <netdb.h>
#include <sys/types.h>
#include <sys/socket.h>
void main(void) {
struct addrinfo hints, *ai;
int error;
memset(&hints, 0, sizeof(hints));
hints.ai_family = AF_UNSPEC;
hints.ai_socktype = SOCK_STREAM;
error = getaddrinfo("127.0.0.1", "8080", &hints, &ai);
if (error) {
exit(1);
}
else {
exit(0);
}
}
],[
ac_cv_working_getaddrinfo="yes"
],[
ac_cv_working_getaddrinfo="no"
],[
ac_cv_working_getaddrinfo="yes"
])])
if test "$ac_cv_working_getaddrinfo" = "yes"; then
AC_DEFINE(HAVE_GETADDRINFO, 1, [Define if getaddrinfo exists and works])
AC_DEFINE(ENABLE_IPV6, 1, [Define if you want to enable IPv6 support])
IPV6_ENABLED=1
AC_SUBST(IPV6_ENABLED)
fi
])
AC_DEFUN([CURL_CHECK_LOCALTIME_R],
[
dnl check for a few thread-safe functions
AC_CHECK_FUNCS(localtime_r,[
AC_MSG_CHECKING(whether localtime_r is declared)
AC_EGREP_CPP(localtime_r,[
#include <time.h>],[
AC_MSG_RESULT(yes)],[
AC_MSG_RESULT(no)
AC_MSG_CHECKING(whether localtime_r with -D_REENTRANT is declared)
AC_EGREP_CPP(localtime_r,[
#define _REENTRANT
#include <time.h>],[
AC_DEFINE(NEED_REENTRANT)
AC_MSG_RESULT(yes)],
AC_MSG_RESULT(no))])])
])
AC_DEFUN([CURL_CHECK_INET_NTOA_R],
[
dnl determine if function definition for inet_ntoa_r exists.
AC_CHECK_FUNCS(inet_ntoa_r,[
AC_MSG_CHECKING(whether inet_ntoa_r is declared)
AC_EGREP_CPP(inet_ntoa_r,[
#include <arpa/inet.h>],[
AC_DEFINE(HAVE_INET_NTOA_R_DECL, 1, [inet_ntoa_r() is declared])
AC_MSG_RESULT(yes)],[
AC_MSG_RESULT(no)
AC_MSG_CHECKING(whether inet_ntoa_r with -D_REENTRANT is declared)
AC_EGREP_CPP(inet_ntoa_r,[
#define _REENTRANT
#include <arpa/inet.h>],[
AC_DEFINE(HAVE_INET_NTOA_R_DECL, 1, [inet_ntoa_r() is declared])
AC_DEFINE(NEED_REENTRANT, 1, [need REENTRANT defined])
AC_MSG_RESULT(yes)],
AC_MSG_RESULT(no))])])
])
AC_DEFUN([CURL_CHECK_GETHOSTBYADDR_R],
[
dnl check for number of arguments to gethostbyaddr_r. it might take
dnl either 5, 7, or 8 arguments.
AC_CHECK_FUNCS(gethostbyaddr_r,[
AC_MSG_CHECKING(if gethostbyaddr_r takes 5 arguments)
AC_TRY_COMPILE([
#include <sys/types.h>
#include <netdb.h>],[
char * address;
int length;
int type;
struct hostent h;
struct hostent_data hdata;
int rc;
rc = gethostbyaddr_r(address, length, type, &h, &hdata);],[
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_GETHOSTBYADDR_R_5, 1, [gethostbyaddr_r() takes 5 args])
ac_cv_gethostbyaddr_args=5],[
AC_MSG_RESULT(no)
AC_MSG_CHECKING(if gethostbyaddr_r with -D_REENTRANT takes 5 arguments)
AC_TRY_COMPILE([
#define _REENTRANT
#include <sys/types.h>
#include <netdb.h>],[
char * address;
int length;
int type;
struct hostent h;
struct hostent_data hdata;
int rc;
rc = gethostbyaddr_r(address, length, type, &h, &hdata);],[
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_GETHOSTBYADDR_R_5, 1, [gethostbyaddr_r() takes 5 args])
AC_DEFINE(NEED_REENTRANT, 1, [need REENTRANT])
ac_cv_gethostbyaddr_args=5],[
AC_MSG_RESULT(no)
AC_MSG_CHECKING(if gethostbyaddr_r takes 7 arguments)
AC_TRY_COMPILE([
#include <sys/types.h>
#include <netdb.h>],[
char * address;
int length;
int type;
struct hostent h;
char buffer[8192];
int h_errnop;
struct hostent * hp;
hp = gethostbyaddr_r(address, length, type, &h,
buffer, 8192, &h_errnop);],[
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_GETHOSTBYADDR_R_7, 1, [gethostbyaddr_r() takes 7 args] )
ac_cv_gethostbyaddr_args=7],[
AC_MSG_RESULT(no)
AC_MSG_CHECKING(if gethostbyaddr_r takes 8 arguments)
AC_TRY_COMPILE([
#include <sys/types.h>
#include <netdb.h>],[
char * address;
int length;
int type;
struct hostent h;
char buffer[8192];
int h_errnop;
struct hostent * hp;
int rc;
rc = gethostbyaddr_r(address, length, type, &h,
buffer, 8192, &hp, &h_errnop);],[
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_GETHOSTBYADDR_R_8, 1, [gethostbyaddr_r() takes 8 args])
ac_cv_gethostbyaddr_args=8],[
AC_MSG_RESULT(no)
have_missing_r_funcs="$have_missing_r_funcs gethostbyaddr_r"])])])])])
])
AC_DEFUN([CURL_CHECK_GETHOSTBYNAME_R],
[
dnl check for number of arguments to gethostbyname_r. it might take
dnl either 3, 5, or 6 arguments.
AC_CHECK_FUNCS(gethostbyname_r,[
AC_MSG_CHECKING([if gethostbyname_r takes 3 arguments])
AC_TRY_COMPILE([
#include <string.h>
#include <sys/types.h>
#include <netdb.h>
#undef NULL
#define NULL (void *)0
int
gethostbyname_r(const char *, struct hostent *, struct hostent_data *);],[
struct hostent_data data;
gethostbyname_r(NULL, NULL, NULL);],[
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_GETHOSTBYNAME_R_3, 1, [gethostbyname_r() takes 3 args])
ac_cv_gethostbyname_args=3],[
AC_MSG_RESULT(no)
AC_MSG_CHECKING([if gethostbyname_r with -D_REENTRANT takes 3 arguments])
AC_TRY_COMPILE([
#define _REENTRANT
#include <string.h>
#include <sys/types.h>
#include <netdb.h>
#undef NULL
#define NULL (void *)0
int
gethostbyname_r(const char *,struct hostent *, struct hostent_data *);],[
struct hostent_data data;
gethostbyname_r(NULL, NULL, NULL);],[
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_GETHOSTBYNAME_R_3, 1, [gethostbyname_r() takes 3 args])
AC_DEFINE(NEED_REENTRANT, 1, [needs REENTRANT])
ac_cv_gethostbyname_args=3],[
AC_MSG_RESULT(no)
AC_MSG_CHECKING([if gethostbyname_r takes 5 arguments])
AC_TRY_COMPILE([
#include <sys/types.h>
#include <netdb.h>
#undef NULL
#define NULL (void *)0
struct hostent *
gethostbyname_r(const char *, struct hostent *, char *, int, int *);],[
gethostbyname_r(NULL, NULL, NULL, 0, NULL);],[
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_GETHOSTBYNAME_R_5, 1, [gethostbyname_r() takes 5 args])
ac_cv_gethostbyname_args=5],[
AC_MSG_RESULT(no)
AC_MSG_CHECKING([if gethostbyname_r takes 6 arguments])
AC_TRY_COMPILE([
#include <sys/types.h>
#include <netdb.h>
#undef NULL
#define NULL (void *)0
int
gethostbyname_r(const char *, struct hostent *, char *, size_t,
struct hostent **, int *);],[
gethostbyname_r(NULL, NULL, NULL, 0, NULL, NULL);],[
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_GETHOSTBYNAME_R_6, 1, [gethostbyname_r() takes 6 args])
ac_cv_gethostbyname_args=6],[
AC_MSG_RESULT(no)
have_missing_r_funcs="$have_missing_r_funcs gethostbyname_r"],
[ac_cv_gethostbyname_args=0])],
[ac_cv_gethostbyname_args=0])],
[ac_cv_gethostbyname_args=0])],
[ac_cv_gethostbyname_args=0])])
if test "$ac_cv_func_gethostbyname_r" = "yes"; then
if test "$ac_cv_gethostbyname_args" = "0"; then
dnl there's a gethostbyname_r() function, but we don't know how
dnl many arguments it wants!
AC_MSG_ERROR([couldn't figure out how to use gethostbyname_r()])
fi
fi
])
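dnl
dnl Usage note: the resolver code later picks the matching gethostbyname_r()
dnl call form based on which of HAVE_GETHOSTBYNAME_R_3/_5/_6 got defined here
dnl (and likewise for the gethostbyaddr_r() checks above).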
dnl We create a function for detecting which compiler we use and then set as
dnl pedantic compiler options as possible for that particular compiler. The
dnl options are only used for debug-builds.
AC_DEFUN([CURL_CC_DEBUG_OPTS],
[
if test "$GCC" = "yes"; then
dnl figure out gcc version!
AC_MSG_CHECKING([gcc version])
gccver=`$CC -dumpversion`
num1=`echo $gccver | cut -d . -f1`
num2=`echo $gccver | cut -d . -f2`
gccnum=`(expr $num1 "*" 100 + $num2) 2>/dev/null`
AC_MSG_RESULT($gccver)
AC_MSG_CHECKING([if this is icc in disguise])
AC_EGREP_CPP([^__INTEL_COMPILER], [__INTEL_COMPILER],
dnl action if the text is found, thus it has not been replaced by the
dnl cpp
ICC="no"
AC_MSG_RESULT([no]),
dnl the text was not found, it was replaced by the cpp
ICC="yes"
AC_MSG_RESULT([yes])
)
if test "$ICC" = "yes"; then
dnl this is icc, not gcc.
dnl ICC warnings we ignore:
dnl * 269 warns on our "%Od" printf formatters for curl_off_t output:
dnl "invalid format string conversion"
dnl * 279 warns on static conditions in while expressions
dnl * 981 warns on "operands are evaluated in unspecified order"
dnl * 1419 warns on "external declaration in primary source file"
dnl which we know and do on purpose.
WARN="-wd279,269,1419,981"
if test "$gccnum" -gt "600"; then
dnl icc 6.0 and older doesn't have the -Wall flag
WARN="-Wall $WARN"
fi
else dnl $ICC = yes
dnl this is a set of options we believe *ALL* gcc versions support:
WARN="-W -Wall -Wwrite-strings -pedantic -Wno-long-long -Wpointer-arith -Wnested-externs -Winline -Wmissing-declarations -Wmissing-prototypes -Wsign-compare"
dnl -Wcast-align is a bit too annoying on all gcc versions ;-)
if test "$gccnum" -gt "295"; then
dnl only if the compiler is newer than 2.95 since we got lots of
dnl "`_POSIX_C_SOURCE' is not defined" in system headers with
dnl gcc 2.95.4 on FreeBSD 4.9!
WARN="$WARN -Wundef"
fi
if test "$gccnum" -ge "296"; then
dnl gcc 2.96 or later
WARN="$WARN -Wfloat-equal"
fi
if test "$gccnum" -gt "296"; then
dnl this option does not exist in 2.96
WARN="$WARN -Wno-format-nonliteral"
fi
dnl -Wunreachable-code seems totally unreliable on my gcc 3.3.2 on
dnl i686-Linux as it gives us heaps of false positives
if test "$gccnum" -ge "303"; then
dnl gcc 3.3 and later
WARN="$WARN -Wendif-labels -Wstrict-prototypes"
fi
for flag in $CPPFLAGS; do
case "$flag" in
-I*)
dnl Include path, provide a -isystem option for the same dir
dnl to prevent warnings in those dirs. The -isystem was not very
dnl reliable on earlier gcc versions.
add=`echo $flag | sed 's/^-I/-isystem /g'`
WARN="$WARN $add"
;;
esac
done
fi dnl $ICC = no
CFLAGS="$CFLAGS $WARN"
AC_MSG_NOTICE([Added this set of compiler options: $WARN])
else dnl $GCC = yes
AC_MSG_NOTICE([Added no extra compiler options])
fi dnl $GCC = yes
dnl strip off optimizer flags
NEWFLAGS=""
for flag in $CFLAGS; do
case "$flag" in
-O*)
dnl echo "cut off $flag"
;;
*)
NEWFLAGS="$NEWFLAGS $flag"
;;
esac
done
CFLAGS=$NEWFLAGS
]) dnl end of AC_DEFUN()

6876
neo/curl/aclocal.m4 vendored Normal file

File diff suppressed because it is too large

1435
neo/curl/config.guess vendored Normal file

File diff suppressed because it is too large

1537
neo/curl/config.sub vendored Normal file

File diff suppressed because it is too large

31052
neo/curl/configure vendored Executable file

File diff suppressed because it is too large

1274
neo/curl/configure.ac Normal file

File diff suppressed because it is too large

133
neo/curl/curl-config.in Normal file

@ -0,0 +1,133 @@
#! /bin/sh
#
# The idea for this kind of setup info script was stolen from numerous
# other packages, such as neon, libxml and gnome.
#
# $Id: curl-config.in,v 1.18 2003/12/08 10:00:21 bagder Exp $
#
prefix=@prefix@
exec_prefix=@exec_prefix@
includedir=@includedir@
usage()
{
cat <<EOF
Usage: curl-config [OPTION]
Available values for OPTION include:
--ca ca bundle install path
--cc compiler
--cflags pre-processor and compiler flags
--feature newline separated list of enabled features
--help display this help and exit
--libs library linking information
--prefix curl install prefix
--version output version information
--vernum output the version information as a number (hexadecimal)
EOF
exit $1
}
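# As a typical example (illustrative; 'app.c' is just a placeholder source
# file), a program is compiled and linked against libcurl with:
#
#   cc -o app app.c `curl-config --cflags` `curl-config --libs`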
if test $# -eq 0; then
usage 1
fi
while test $# -gt 0; do
case "$1" in
# this deals with options in the style
# --option=value and extracts the value part
# [not currently used]
-*=*) value=`echo "$1" | sed 's/[-_a-zA-Z0-9]*=//'` ;;
*) value= ;;
esac
case "$1" in
--ca)
echo @CURL_CA_BUNDLE@
;;
--cc)
echo @CC@
;;
--prefix)
echo $prefix
;;
--feature)
if test "@OPENSSL_ENABLED@" = "1"; then
echo "SSL"
fi
if test "@KRB4_ENABLED@" = "1"; then
echo "KRB4"
fi
if test "@IPV6_ENABLED@" = "1"; then
echo "IPv6"
fi
if test "@HAVE_LIBZ@" = "1"; then
echo "libz"
fi
if test "@CURL_DISABLE_HTTP@" = "1"; then
echo "HTTP-disabled"
fi
if test "@CURL_DISABLE_FTP@" = "1"; then
echo "FTP-disabled"
fi
if test "@CURL_DISABLE_GOPHER@" = "1"; then
echo "GOPHER-disabled"
fi
if test "@CURL_DISABLE_FILE@" = "1"; then
echo "FILE-disabled"
fi
if test "@CURL_DISABLE_TELNET@" = "1"; then
echo "TELNET-disabled"
fi
if test "@CURL_DISABLE_LDAP@" = "1"; then
echo "LDAP-disabled"
fi
if test "@CURL_DISABLE_DICT@" = "1"; then
echo "DICT-disabled"
fi
if test "@HAVE_ARES@" = "1"; then
echo "AsynchDNS"
fi
;;
--version)
echo libcurl @VERSION@
exit 0
;;
--vernum)
echo @VERSIONNUM@
exit 0
;;
--help)
usage 0
;;
--cflags)
if test "X@includedir@" = "X/usr/include"; then
echo ""
else
echo "-I@includedir@"
fi
;;
--libs)
echo -L@libdir@ -lcurl @LDFLAGS@ @LIBS@
;;
*)
echo "unknown option: $1"
usage
exit 1
;;
esac
shift
done
exit 0

50
neo/curl/curl-style.el Normal file

@ -0,0 +1,50 @@
;;;; Emacs Lisp help for writing curl code. ;;;;
;;;; $Id: curl-style.el,v 1.7 2004/03/09 22:55:47 bagder Exp $
;;; The curl hacker's C conventions.
;;; After loading this file and adding the mode-hook you can put something
;;; like this in your C files to use the curl style automatically:
;;
;; /* -----------------------------------------------------------------
;; * local variables:
;; * eval: (set c-file-style "curl")
;; * end:
;; */
;;
(defconst curl-c-style
'((c-basic-offset . 2)
(c-comment-only-line-offset . 0)
(c-hanging-braces-alist . ((substatement-open before after)))
(c-offsets-alist . ((topmost-intro . 0)
(topmost-intro-cont . 0)
(substatement . +)
(substatement-open . 0)
(statement-case-intro . +)
(statement-case-open . 0)
(case-label . 0)
))
)
"Curl C Programming Style")
;; Customizations for all of c-mode, c++-mode, and objc-mode
(defun curl-c-mode-common-hook ()
"Curl C mode hook"
;; add curl style and set it for the current buffer
(c-add-style "curl" curl-c-style t)
(setq tab-width 8
indent-tabs-mode nil ; Use spaces. Not tabs.
comment-column 40
c-font-lock-extra-types (append '("bool" "CURL" "CURLcode" "ssize_t" "size_t" "socklen_t" "fd_set" "time_t" "curl_off_t" "curl_socket_t"))
)
;; keybindings for C, C++, and Objective-C. We can put these in
;; c-mode-base-map because of inheritance ...
(define-key c-mode-base-map "\M-q" 'c-fill-paragraph)
(setq c-recognize-knr-p nil)
)
;; Set this in your .emacs if you want to use the c-mode-hook as
;; defined here right out of the box.
; (add-hook 'c-mode-common-hook 'curl-c-mode-common-hook)

479
neo/curl/depcomp Normal file

@ -0,0 +1,479 @@
#! /bin/sh
# depcomp - compile a program generating dependencies as side-effects
# Copyright 1999, 2000, 2003 Free Software Foundation, Inc.
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2, or (at your option)
# any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA
# 02111-1307, USA.
# As a special exception to the GNU General Public License, if you
# distribute this file as part of a program that contains a
# configuration script generated by Autoconf, you may include it under
# the same distribution terms that you use for the rest of that program.
# Originally written by Alexandre Oliva <oliva@dcc.unicamp.br>.
if test -z "$depmode" || test -z "$source" || test -z "$object"; then
echo "depcomp: Variables source, object and depmode must be set" 1>&2
exit 1
fi
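# For reference, make normally drives this script by exporting the variables
# above and then running it in front of the real compile command, roughly
# like this (illustrative):
#
#   source=foo.c object=foo.o depmode=gcc3 ./depcomp gcc -c -o foo.o foo.c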
# `libtool' can also be set to `yes' or `no'.
if test -z "$depfile"; then
base=`echo "$object" | sed -e 's,^.*/,,' -e 's,\.\([^.]*\)$,.P\1,'`
dir=`echo "$object" | sed 's,/.*$,/,'`
if test "$dir" = "$object"; then
dir=
fi
# FIXME: should be _deps on DOS.
depfile="$dir.deps/$base"
fi
tmpdepfile=${tmpdepfile-`echo "$depfile" | sed 's/\.\([^.]*\)$/.T\1/'`}
rm -f "$tmpdepfile"
# Some modes work just like other modes, but use different flags. We
# parameterize here, but still list the modes in the big case below,
# to make depend.m4 easier to write. Note that we *cannot* use a case
# here, because this file can only contain one case statement.
if test "$depmode" = hp; then
# HP compiler uses -M and no extra arg.
gccflag=-M
depmode=gcc
fi
if test "$depmode" = dashXmstdout; then
# This is just like dashmstdout with a different argument.
dashmflag=-xM
depmode=dashmstdout
fi
case "$depmode" in
gcc3)
## gcc 3 implements dependency tracking that does exactly what
## we want. Yay! Note: for some reason libtool 1.4 doesn't like
## it if -MD -MP comes after the -MF stuff. Hmm.
"$@" -MT "$object" -MD -MP -MF "$tmpdepfile"
stat=$?
if test $stat -eq 0; then :
else
rm -f "$tmpdepfile"
exit $stat
fi
mv "$tmpdepfile" "$depfile"
;;
gcc)
## There are various ways to get dependency output from gcc. Here's
## why we pick this rather obscure method:
## - Don't want to use -MD because we'd like the dependencies to end
## up in a subdir. Having to rename by hand is ugly.
## (We might end up doing this anyway to support other compilers.)
## - The DEPENDENCIES_OUTPUT environment variable makes gcc act like
## -MM, not -M (despite what the docs say).
## - Using -M directly means running the compiler twice (even worse
## than renaming).
if test -z "$gccflag"; then
gccflag=-MD,
fi
"$@" -Wp,"$gccflag$tmpdepfile"
stat=$?
if test $stat -eq 0; then :
else
rm -f "$tmpdepfile"
exit $stat
fi
rm -f "$depfile"
echo "$object : \\" > "$depfile"
alpha=ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz
## The second -e expression handles DOS-style file names with drive letters.
sed -e 's/^[^:]*: / /' \
-e 's/^['$alpha']:\/[^:]*: / /' < "$tmpdepfile" >> "$depfile"
## This next piece of magic avoids the `deleted header file' problem.
## The problem is that when a header file which appears in a .P file
## is deleted, the dependency causes make to die (because there is
## typically no way to rebuild the header). We avoid this by adding
## dummy dependencies for each header file. Too bad gcc doesn't do
## this for us directly.
tr ' ' '
' < "$tmpdepfile" |
## Some versions of gcc put a space before the `:'. On the theory
## that the space means something, we add a space to the output as
## well.
## Some versions of the HPUX 10.20 sed can't process this invocation
## correctly. Breaking it into two sed invocations is a workaround.
sed -e 's/^\\$//' -e '/^$/d' -e '/:$/d' | sed -e 's/$/ :/' >> "$depfile"
rm -f "$tmpdepfile"
;;
hp)
# This case exists only to let depend.m4 do its work. It works by
# looking at the text of this script. This case will never be run,
# since it is checked for above.
exit 1
;;
sgi)
if test "$libtool" = yes; then
"$@" "-Wp,-MDupdate,$tmpdepfile"
else
"$@" -MDupdate "$tmpdepfile"
fi
stat=$?
if test $stat -eq 0; then :
else
rm -f "$tmpdepfile"
exit $stat
fi
rm -f "$depfile"
if test -f "$tmpdepfile"; then # yes, the sourcefile depend on other files
echo "$object : \\" > "$depfile"
# Clip off the initial element (the dependent). Don't try to be
# clever and replace this with sed code, as IRIX sed won't handle
# lines with more than a fixed number of characters (4096 in
# IRIX 6.2 sed, 8192 in IRIX 6.5). We also remove comment lines;
# the IRIX cc adds comments like `#:fec' to the end of the
# dependency line.
tr ' ' '
' < "$tmpdepfile" \
| sed -e 's/^.*\.o://' -e 's/#.*$//' -e '/^$/ d' | \
tr '
' ' ' >> $depfile
echo >> $depfile
# The second pass generates a dummy entry for each header file.
tr ' ' '
' < "$tmpdepfile" \
| sed -e 's/^.*\.o://' -e 's/#.*$//' -e '/^$/ d' -e 's/$/:/' \
>> $depfile
else
# The sourcefile does not contain any dependencies, so just
# store a dummy comment line, to avoid errors with the Makefile
# "include basename.Plo" scheme.
echo "#dummy" > "$depfile"
fi
rm -f "$tmpdepfile"
;;
aix)
# The C for AIX Compiler uses -M and outputs the dependencies
# in a .u file. In older versions, this file always lives in the
# current directory. Also, the AIX compiler puts `$object:' at the
# start of each line; $object doesn't have directory information.
# Version 6 uses the directory in both cases.
stripped=`echo "$object" | sed 's/\(.*\)\..*$/\1/'`
tmpdepfile="$stripped.u"
if test "$libtool" = yes; then
"$@" -Wc,-M
else
"$@" -M
fi
stat=$?
if test -f "$tmpdepfile"; then :
else
stripped=`echo "$stripped" | sed 's,^.*/,,'`
tmpdepfile="$stripped.u"
fi
if test $stat -eq 0; then :
else
rm -f "$tmpdepfile"
exit $stat
fi
if test -f "$tmpdepfile"; then
outname="$stripped.o"
# Each line is of the form `foo.o: dependent.h'.
# Do two passes, one to just change these to
# `$object: dependent.h' and one to simply `dependent.h:'.
sed -e "s,^$outname:,$object :," < "$tmpdepfile" > "$depfile"
sed -e "s,^$outname: \(.*\)$,\1:," < "$tmpdepfile" >> "$depfile"
else
# The sourcefile does not contain any dependencies, so just
# store a dummy comment line, to avoid errors with the Makefile
# "include basename.Plo" scheme.
echo "#dummy" > "$depfile"
fi
rm -f "$tmpdepfile"
;;
icc)
# Intel's C compiler understands `-MD -MF file'. However on
# icc -MD -MF foo.d -c -o sub/foo.o sub/foo.c
# ICC 7.0 will fill foo.d with something like
# foo.o: sub/foo.c
# foo.o: sub/foo.h
# which is wrong. We want:
# sub/foo.o: sub/foo.c
# sub/foo.o: sub/foo.h
# sub/foo.c:
# sub/foo.h:
# ICC 7.1 will output
# foo.o: sub/foo.c sub/foo.h
# and will wrap long lines using \ :
# foo.o: sub/foo.c ... \
# sub/foo.h ... \
# ...
"$@" -MD -MF "$tmpdepfile"
stat=$?
if test $stat -eq 0; then :
else
rm -f "$tmpdepfile"
exit $stat
fi
rm -f "$depfile"
# Each line is of the form `foo.o: dependent.h',
# or `foo.o: dep1.h dep2.h \', or ` dep3.h dep4.h \'.
# Do two passes, one to just change these to
# `$object: dependent.h' and one to simply `dependent.h:'.
sed "s,^[^:]*:,$object :," < "$tmpdepfile" > "$depfile"
# Some versions of the HPUX 10.20 sed can't process this invocation
# correctly. Breaking it into two sed invocations is a workaround.
sed 's,^[^:]*: \(.*\)$,\1,;s/^\\$//;/^$/d;/:$/d' < "$tmpdepfile" |
sed -e 's/$/ :/' >> "$depfile"
rm -f "$tmpdepfile"
;;
tru64)
# The Tru64 compiler uses -MD to generate dependencies as a side
# effect. `cc -MD -o foo.o ...' puts the dependencies into `foo.o.d'.
# At least on Alpha/Redhat 6.1, Compaq CCC V6.2-504 seems to put
# dependencies in `foo.d' instead, so we check for that too.
# Subdirectories are respected.
dir=`echo "$object" | sed -e 's|/[^/]*$|/|'`
test "x$dir" = "x$object" && dir=
base=`echo "$object" | sed -e 's|^.*/||' -e 's/\.o$//' -e 's/\.lo$//'`
if test "$libtool" = yes; then
tmpdepfile1="$dir.libs/$base.lo.d"
tmpdepfile2="$dir.libs/$base.d"
"$@" -Wc,-MD
else
tmpdepfile1="$dir$base.o.d"
tmpdepfile2="$dir$base.d"
"$@" -MD
fi
stat=$?
if test $stat -eq 0; then :
else
rm -f "$tmpdepfile1" "$tmpdepfile2"
exit $stat
fi
if test -f "$tmpdepfile1"; then
tmpdepfile="$tmpdepfile1"
else
tmpdepfile="$tmpdepfile2"
fi
if test -f "$tmpdepfile"; then
sed -e "s,^.*\.[a-z]*:,$object:," < "$tmpdepfile" > "$depfile"
# That's a tab and a space in the [].
sed -e 's,^.*\.[a-z]*:[ ]*,,' -e 's,$,:,' < "$tmpdepfile" >> "$depfile"
else
echo "#dummy" > "$depfile"
fi
rm -f "$tmpdepfile"
;;
#nosideeffect)
# This comment above is used by automake to tell side-effect
# dependency tracking mechanisms from slower ones.
dashmstdout)
# Important note: in order to support this mode, a compiler *must*
# always write the preprocessed file to stdout, regardless of -o.
"$@" || exit $?
# Remove the call to Libtool.
if test "$libtool" = yes; then
while test $1 != '--mode=compile'; do
shift
done
shift
fi
# Remove `-o $object'.
IFS=" "
for arg
do
case $arg in
-o)
shift
;;
$object)
shift
;;
*)
set fnord "$@" "$arg"
shift # fnord
shift # $arg
;;
esac
done
test -z "$dashmflag" && dashmflag=-M
# Require at least two characters before searching for `:'
# in the target name. This is to cope with DOS-style filenames:
# a dependency such as `c:/foo/bar' could be seen as target `c' otherwise.
"$@" $dashmflag |
sed 's:^[ ]*[^: ][^:][^:]*\:[ ]*:'"$object"'\: :' > "$tmpdepfile"
rm -f "$depfile"
cat < "$tmpdepfile" > "$depfile"
tr ' ' '
' < "$tmpdepfile" | \
## Some versions of the HPUX 10.20 sed can't process this invocation
## correctly. Breaking it into two sed invocations is a workaround.
sed -e 's/^\\$//' -e '/^$/d' -e '/:$/d' | sed -e 's/$/ :/' >> "$depfile"
rm -f "$tmpdepfile"
;;
dashXmstdout)
# This case only exists to satisfy depend.m4. It is never actually
# run, as this mode is specially recognized in the preamble.
exit 1
;;
makedepend)
"$@" || exit $?
# Remove any Libtool call
if test "$libtool" = yes; then
while test $1 != '--mode=compile'; do
shift
done
shift
fi
# X makedepend
shift
cleared=no
for arg in "$@"; do
case $cleared in
no)
set ""; shift
cleared=yes ;;
esac
case "$arg" in
-D*|-I*)
set fnord "$@" "$arg"; shift ;;
# Strip any option that makedepend may not understand. Remove
# the object too, otherwise makedepend will parse it as a source file.
-*|$object)
;;
*)
set fnord "$@" "$arg"; shift ;;
esac
done
obj_suffix="`echo $object | sed 's/^.*\././'`"
touch "$tmpdepfile"
${MAKEDEPEND-makedepend} -o"$obj_suffix" -f"$tmpdepfile" "$@"
rm -f "$depfile"
cat < "$tmpdepfile" > "$depfile"
sed '1,2d' "$tmpdepfile" | tr ' ' '
' | \
## Some versions of the HPUX 10.20 sed can't process this invocation
## correctly. Breaking it into two sed invocations is a workaround.
sed -e 's/^\\$//' -e '/^$/d' -e '/:$/d' | sed -e 's/$/ :/' >> "$depfile"
rm -f "$tmpdepfile" "$tmpdepfile".bak
;;
cpp)
# Important note: in order to support this mode, a compiler *must*
# always write the preprocessed file to stdout.
"$@" || exit $?
# Remove the call to Libtool.
if test "$libtool" = yes; then
while test $1 != '--mode=compile'; do
shift
done
shift
fi
# Remove `-o $object'.
IFS=" "
for arg
do
case $arg in
-o)
shift
;;
$object)
shift
;;
*)
set fnord "$@" "$arg"
shift # fnord
shift # $arg
;;
esac
done
"$@" -E |
sed -n '/^# [0-9][0-9]* "\([^"]*\)".*/ s:: \1 \\:p' |
sed '$ s: \\$::' > "$tmpdepfile"
rm -f "$depfile"
echo "$object : \\" > "$depfile"
cat < "$tmpdepfile" >> "$depfile"
sed < "$tmpdepfile" '/^$/d;s/^ //;s/ \\$//;s/$/ :/' >> "$depfile"
rm -f "$tmpdepfile"
;;
msvisualcpp)
# Important note: in order to support this mode, a compiler *must*
# always write the preprocessed file to stdout, regardless of -o,
# because we must use -o when running libtool.
"$@" || exit $?
IFS=" "
for arg
do
case "$arg" in
"-Gm"|"/Gm"|"-Gi"|"/Gi"|"-ZI"|"/ZI")
set fnord "$@"
shift
shift
;;
*)
set fnord "$@" "$arg"
shift
shift
;;
esac
done
"$@" -E |
sed -n '/^#line [0-9][0-9]* "\([^"]*\)"/ s::echo "`cygpath -u \\"\1\\"`":p' | sort | uniq > "$tmpdepfile"
rm -f "$depfile"
echo "$object : \\" > "$depfile"
. "$tmpdepfile" | sed 's% %\\ %g' | sed -n '/^\(.*\)$/ s:: \1 \\:p' >> "$depfile"
echo " " >> "$depfile"
. "$tmpdepfile" | sed 's% %\\ %g' | sed -n '/^\(.*\)$/ s::\1\::p' >> "$depfile"
rm -f "$tmpdepfile"
;;
none)
exec "$@"
;;
*)
echo "Unknown depmode $depmode" 1>&2
exit 1
;;
esac
exit 0

119
neo/curl/docs/BINDINGS Normal file

@ -0,0 +1,119 @@
      _   _ ____  _
  ___| | | |  _ \| |
 / __| | | | |_) | |
| (__| |_| |  _ <| |___
 \___|\___/|_| \_\_____|
libcurl bindings
Creative people have written bindings or interfaces for various environments
and programming languages. Using one of these allows you to take advantage of
curl powers from within your favourite language or system.
This is a list of all known interfaces as of this writing.
The bindings listed below are not part of the curl/libcurl distribution
archives, but must be downloaded and installed separately.
Ada95
Written by Andreas Almroth.
http://www.almroth.com/adacurl/index.html
Basic
ScriptBasic bindings to libcurl. Written by Peter Verhas.
http://scriptbasic.com/
C++
Written by Jean-Philippe Barrette-LaPierre.
http://www.sourceforge.net/projects/curlpp
Cocoa
Written by Dan Wood.
http://curlhandle.sourceforge.net/
D
Written by Charles Sanders and James Wavro
http://www.atari-soldiers.com/libcurl.html
Dylan
Written by Chris Double.
http://dylanlibs.sourceforge.net/
Euphoria
Written by Ray Smith.
http://rays-web.com/eulibcurl.htm
Ferite
http://www.ferite.org/
Java
Written by Daniel Stenberg.
http://curl.haxx.se/libcurl/java/
Lua
Written by Steve Dekorte.
http://curl.haxx.se/libcurl/lua/
Object-Pascal
Free Pascal, Delphi and Kylix binding written by Christophe Espern.
http://www.tekool.com/opcurl
O'Caml
Written by Lars Nilsson.
http://sourceforge.net/projects/ocurl/
Pascal
Free Pascal, Delphi and Kylix binding written by Jeffrey Pohlmeyer.
http://houston.quik.com/jkp/curlpas/
Perl
Maintained by Cris Bailiff.
http://curl.haxx.se/libcurl/perl/
PHP
Written by Sterling Hughes.
http://curl.haxx.se/libcurl/php/
PostgreSQL
Written by Gian Paolo Ciceri.
http://gborg.postgresql.org/project/pgcurl/projdisplay.php
Python
Written by Kjetil Jacobsen.
http://pycurl.sourceforge.net/
Rexx
Written by Mark Hessling.
http://rexxcurl.sourceforge.net/
Ruby
Written by Hirotaka Matsuyuki.
http://www.d1.dion.ne.jp/~matuyuki/ruby.html
Scheme
Bigloo binding written by Kirill Lisovsky.
http://curl.haxx.se/libcurl/scheme/
Tcl
Written by Andrés García.
http://personal1.iddeo.es/andresgarci/tclcurl/english/docs.html

81
neo/curl/docs/BUGS Normal file

@ -0,0 +1,81 @@
$Id: BUGS,v 1.7 2003/08/18 15:24:46 bagder Exp $
      _   _ ____  _
  ___| | | |  _ \| |
 / __| | | | |_) | |
| (__| |_| |  _ <| |___
 \___|\___/|_| \_\_____|
BUGS
Curl and libcurl have grown substantially since the beginning. At the time
of writing (August 2003), there are about 40000 lines of source code, and by
the time you read this it has probably grown even more.
Of course there are lots of bugs left. And lots of misfeatures.
To help us make curl the stable and solid product we want it to be, we need
bug reports and bug fixes.
WHERE TO REPORT
If you can't fix a bug yourself and submit a fix for it, try to send as
detailed a report as possible to a curl mailing list to allow one of us to
have a go at a solution. You should also post your bug/problem at curl's bug
tracking system over at
http://sourceforge.net/bugs/?group_id=976
(but please read the sections below first before doing that)
If you feel you need to ask around first, find a suitable mailing list and
post there. The lists are available on http://curl.haxx.se/mail/
WHAT TO REPORT
When reporting a bug, you should include all information that will help us
understand what's wrong, what you expected to happen and how to repeat the
bad behavior. You therefore need to tell us:
- your operating system's name and version number (uname -a under a unix
is fine)
- what version of curl you're using (curl -V is fine)
- what URL you were working with (if possible), at least which protocol
and anything and everything else you think matters. Tell us what you
expected to happen, tell us what did happen, tell us how you could make it
work another way. Dig around, try out, test. Then include all the tiny bits
and pieces in your report. You will benefit from this yourself, as it will
enable us to help you quicker and more accurately.
Since curl deals with networks, it often helps us if you include a protocol
debug dump with your bug report. The output you get by using the -v or
--trace options.
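For example, a protocol dump for a failing URL can be captured like this (the
URL is of course only a placeholder):

  curl -v http://example.com/ 2>verbose.txt >/dev/null
  curl --trace-ascii trace.txt http://example.com/

and verbose.txt or trace.txt can then be attached to the report.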
If curl crashed, causing a core dump (in unix), there is hardly any use to
send that huge file to any of us. Unless we have the exact same system
setup as you, we can't do much with it. Instead we ask you to get a stack
trace and send that (much smaller) output to us instead!
The address and how to subscribe to the mailing lists are detailed in the
MANUAL file.
HOW TO GET A STACK TRACE
First, you must make sure that you compile all sources with -g and that you
don't 'strip' the final executable. Try to avoid optimizing the code as
well, remove -O, -O2 etc from the compiler options.
Run the program until it cores.
Run your debugger on the core file, like '<debugger> curl core'. <debugger>
should be replaced with the name of your debugger, in most cases that will
be 'gdb', but 'dbx' and others also occur.
When the debugger has finished loading the core file and presents you a
prompt, enter 'where' (without the quotes) and press return.
The list that is presented is the stack trace. If everything worked, it is
supposed to contain the chain of functions that were called when curl
crashed. Include the stack trace with your detailed bug report. It'll help a
lot.
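A typical session could look something like this (illustrative only; the exact
commands depend on your system, shell and debugger):

  CFLAGS="-g -O0" ./configure
  make
  ./src/curl http://url.that.crashes.example/     (run until it dumps core)
  gdb ./src/curl core
  (gdb) where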

159
neo/curl/docs/CONTRIBUTE Normal file

@ -0,0 +1,159 @@
      _   _ ____  _
  ___| | | |  _ \| |
 / __| | | | |_) | |
| (__| |_| |  _ <| |___
 \___|\___/|_| \_\_____|
To Think About When Contributing Source Code
This document is intended to offer some simple guidelines that can be useful
to keep in mind when you decide to contribute to the project. This concerns
new features as well as corrections to existing flaws or bugs.
Join the Community
Skip over to http://curl.haxx.se/mail/ and join the appropriate mailing
list(s). Read up on details before you post questions. Read this file before
you start sending patches! We prefer patches and discussions being held on
the mailing list(s), not sent to individuals.
The License Issue
When contributing with code, you agree to put your changes and new code under
the same license curl and libcurl is already using unless stated otherwise.
If you add a larger piece of code, you can opt to make that file or set of
files to use a different license as long as they don't enforce any changes to
the rest of the package and they make sense. Such "separate parts" can not be
GPL (as we don't want the GPL virus to attack users of libcurl) but they must
use "GPL compatible" licenses.
What To Read
Source code, the man pages, the INTERNALS document, the TODO, the most recent
CHANGES. Just lurking on the libcurl mailing list is gonna give you a lot of
insights on what's going on right now. Asking there is a good idea too.
Naming
Try using a non-confusing naming scheme for your new functions and variable
names. It doesn't necessarily have to mean that you should use the same as in
other places of the code, just that the names should be logical,
understandable and be named according to what they're used for. File-local
functions should be made static. We like lower case names.
See the INTERNALS document on how we name non-exported library-global
symbols.
Indenting
Please try using the same indenting levels and bracing method as all the
other code already does. It makes the source code a lot easier to follow if
all of it is written using the same style. We don't ask you to like it, we
just ask you to follow the tradition! ;-) This mainly means: 2-level indents,
using spaces only (no tabs) and having the opening brace ({) on the same line
as the if() or while().
Commenting
Comment your source code extensively using C comments (/* comment */), DO NOT
use C++ comments (// this style). Commented code is quality code and enables
future modifications much more. Uncommented code risks having to be completely
replaced when someone wants to extend things, since other persons' source
code can get quite hard to read.
General Style
Keep your functions small. If they're small you avoid a lot of mistakes and
you don't accidentally mix up variables etc.
Non-clobbering All Over
When you write new functionality or fix bugs, it is important that you don't
fiddle all over the source files and functions. Remember that it is likely
that other people have done changes in the same source files as you have and
possibly even in the same functions. If you bring completely new
functionality, try writing it in a new source file. If you fix bugs, try to
fix one bug at a time and send them as separate patches.
Platform Dependent Code
Use #ifdef HAVE_FEATURE to do conditional code. We avoid checking for
particular operating systems or hardware in the #ifdef lines. The
HAVE_FEATURE shall be generated by the configure script for unix-like systems
and they are hard-coded in the config-[system].h files for the others.
Separate Patches
It is annoying when you get a huge patch from someone that is said to fix 511
odd problems, but discussions and opinions don't agree with 510 of them - or
509 of them were already fixed in a different way. Then the patcher needs to
extract the single interesting patch from somewhere within the huge pile of
source, and that gives a lot of extra work. Preferably, all fixes that
correct different problems should be in their own patch with an attached
description exactly what they correct so that all patches can be selectively
applied by the maintainer or other interested parties.
Patch Against Recent Sources
Please try to get the latest available sources to make your patches
against. It makes the life of the developers so much easier. The very best is
if you get the most up-to-date sources from the CVS repository, but the
latest release archive is quite OK as well!
Document
Writing docs is dead boring and one of the big problems with many open source
projects. Someone's gotta do it. It makes it a lot easier if you submit a
small description of your fix or your new features with every contribution so
that it can be swiftly added to the package documentation.
The documentation is always made in man pages (nroff formatted) or plain
ASCII files. All HTML files on the web site and in the release archives are
generated from the nroff/ASCII versions.
Write Access to CVS Repository
If you are a frequent contributor, or have another good reason, you can of
course get write access to the CVS repository and then you'll be able to
check-in all your changes straight into the CVS tree instead of sending all
changes by mail as patches. Just ask if this is what you'd want. You will be
required to have posted a few quality patches first, before you can be
granted write access.
Test Cases
Since the introduction of the test suite, we can quickly verify that the main
features are working as they're supposed to. To maintain this situation and
improve it, all new features and functions that are added need to be tested
in the test suite. Every feature that is added should get at least one valid
test case that verifies that it works as documented. If every submitter also
posts a few test cases, it won't end up as a heavy burden on a single person!
How To Make a Patch
Keep a copy of the unmodified curl sources. Make your changes in a separate
source tree. When you think you have something that you want to offer the
curl community, use GNU diff to generate patches.
If you have modified a single file, try something like:
diff -u unmodified-file.c my-changed-one.c > my-fixes.diff
If you have modified several files, possibly in different directories, you
can use diff recursively:
diff -ur curl-original-dir curl-modified-sources-dir > my-fixes.diff
The GNU diff and GNU patch tools exist for virtually all platforms, including
all kinds of unixes and Windows:
For unix-like operating systems:
http://www.fsf.org/software/patch/patch.html
http://www.gnu.org/directory/diffutils.html
For Windows:
http://gnuwin32.sourceforge.net/packages/patch.htm
http://gnuwin32.sourceforge.net/packages/diffutils.htm

neo/curl/docs/FAQ Normal file
@ -0,0 +1,796 @@
Updated: March 16, 2004 (http://curl.haxx.se/docs/faq.html)
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
FAQ
1. Philosophy
1.1 What is cURL?
1.2 What is libcurl?
1.3 What is cURL not?
1.4 When will you make curl do XXXX ?
1.5 Who makes cURL?
1.6 What do you get for making cURL?
1.7 What about CURL from curl.com?
1.8 I have a problem, who do I mail?
2. Install Related Problems
2.1 configure doesn't find OpenSSL even when it is installed
2.1.1. native linker doesn't find OpenSSL
2.1.2. only the libssl lib is missing
2.2 Does curl work/build with other SSL libraries?
2.3 Where can I find a copy of LIBEAY32.DLL?
2.4 Does cURL support Socks (RFC 1928) ?
3. Usage Problems
3.1 curl: (1) SSL is disabled, https: not supported
3.2 How do I tell curl to resume a transfer?
3.3 Why doesn't my posting using -F work?
3.4 How do I tell curl to run custom FTP commands?
3.5 How can I disable the Pragma: nocache header?
3.6 Does curl support ASP, XML, XHTML or HTML version Y?
3.7 Can I use curl to delete/rename a file through FTP?
3.8 How do I tell curl to follow HTTP redirects?
3.9 How do I use curl in my favorite programming language?
3.10 What about SOAP, WebDAV, XML-RPC or similar protocols over HTTP?
3.11 How do I POST with a different Content-Type?
3.12 Why do FTP specific features over HTTP proxy fail?
3.13 Why do my single/double quotes fail?
3.14 Does curl support javascript or pac (automated proxy config)?
3.15 Can I do recursive fetches with curl?
4. Running Problems
4.1 Problems connecting to SSL servers.
4.2 Why do I get problems when I use & or % in the URL?
4.3 How can I use {, }, [ or ] to specify multiple URLs?
4.4 Why do I get downloaded data even though the web page doesn't exist?
4.5 Why do I get return code XXX from a HTTP server?
4.5.1 "400 Bad Request"
4.5.2 "401 Unauthorized"
4.5.3 "403 Forbidden"
4.5.4 "404 Not Found"
4.5.5 "405 Method Not Allowed"
4.5.6 "301 Moved Permanently"
4.6 Can you tell me what error code 142 means?
4.7 How do I keep user names and passwords secret in Curl command lines?
4.8 I found a bug!
4.9 Curl can't authenticate to the server that requires NTLM?
4.10 My HTTP request using HEAD, PUT or DELETE doesn't work!
4.11 Why do my HTTP range requests return the full document?
4.12 Why do I get "certificate verify failed" ?
5. libcurl Issues
5.1 Is libcurl thread-safe?
5.2 How can I receive all data into a large memory chunk?
5.3 How do I fetch multiple files with libcurl?
5.4 Does libcurl do Winsock initing on win32 systems?
5.5 Do CURLOPT_WRITEDATA and CURLOPT_READDATA work on win32?
5.6 What about Keep-Alive or persistent connections?
5.7 Link errors when building libcurl on Windows!
6. License Issues
6.1 I have a GPL program, can I use the libcurl library?
6.2 I have a closed-source program, can I use the libcurl library?
6.3 I have a BSD licensed program, can I use the libcurl library?
6.4 I have a program that uses LGPL libraries, can I use libcurl?
6.5 Can I modify curl/libcurl for my program and keep the changes secret?
6.6 Can you please change the curl/libcurl license to XXXX?
==============================================================================
1. Philosophy
1.1 What is cURL?
cURL (or simply just 'curl') is a command line tool for getting or sending
files using URL syntax. The name is a play on 'Client for URLs', originally
with URL spelled in uppercase to make it obvious it deals with URLs. The
fact that it can also be pronounced 'see URL' helped too; it works as an
abbreviation for "Client URL Request Library" or, why not, the recursive
version: "Curl URL Request Library".
Curl supports a range of common Internet protocols, currently including
HTTP, HTTPS, FTP, FTPS, GOPHER, LDAP, DICT, TELNET and FILE.
We spell it cURL or just curl. We pronounce it with an initial k sound:
[kurl].
NOTE: there are numerous sub-projects and related projects that also use the
word curl in the project names in various combinations, but you should take
notice that this FAQ is directed at the command-line tool named curl (and
libcurl the library), and may therefore not be valid for other curl-related
projects.
1.2 What is libcurl?
libcurl is a reliable and portable library which provides you with an easy
interface to a range of common Internet protocols.
You can use libcurl for free in your application, be it open source,
commercial or closed-source.
1.3 What is cURL not?
Curl is *not* a wget clone. That is a common misconception. Never, during
curl's development, have we intended curl to replace wget or compete with it.
Curl is targeted at single-shot file transfers.
Curl is not a web site mirroring program. If you want to use curl to mirror
something: fine, go ahead and write a script that wraps around curl to make
it reality (like curlmirror.pl does).
Curl is not an FTP site mirroring program. Sure, get and send FTP with curl
but if you want systematic and sequential behavior you should write a
script (or write a new program that interfaces libcurl) and do it.
Curl is not a PHP tool, even though it works perfectly well when used from
or with PHP.
Curl is not a single-OS program. Curl exists, compiles, builds and runs
under a wide range of operating systems, including all modern Unixes (and a
bunch of older ones too), Windows, Amiga, BeOS, OS/2, OS X, QNX etc.
1.4 When will you make curl do XXXX ?
We love suggestions of what to change in order to make curl and libcurl
better. We do however believe in a few rules when it comes to the future of
curl:
* Curl -- the command line tool -- is to remain a non-graphical command line
tool. If you want GUIs or fancy scripting capabilities, you should look
for another tool that uses libcurl.
* We do not add things to curl that other small, readily available tools
already do very well on the side. Curl's output is fine to pipe into another
program or redirect to another file for the next program to interpret.
* We focus on protocol related issues and improvements. If you want to do more
magic with the supported protocols than curl currently does, chances are
good we will agree. If you want to add more protocols, we may very well
agree.
* If you want someone else to do all the work while you wait for us to
implement it for you, that is not a very friendly attitude. We already spend
considerable time maintaining and developing curl. In order to get more out
of us, you should consider trading in some of your own time and effort in
return.
* If you write the code yourself, chances are better that it will get into
curl faster.
1.5 Who makes cURL?
cURL and libcurl are not made by any single individual. Sure, Daniel
Stenberg writes the major parts, but other persons' submissions are
important and crucial. Anyone can contribute and post their changes and
improvements and have them inserted in the main sources (of course on the
condition that the developers agree that the fixes are good).
The list of contributors in the docs/THANKS file is only a small part of all
the people that every day provide us with bug reports, suggestions, ideas
and source code.
curl is developed by a community, with Daniel at the wheel.
1.6 What do you get for making cURL?
Project cURL is entirely free and open. No person gets paid for developing
curl. We do this voluntarily on our spare time.
We get some help from companies. Contactor Data hosts the curl web site,
Haxx owns the curl web site's domain and sourceforge.net hosts project
services we take advantage of, like the bug tracker.
If you want to support our project with a donation or similar, one way of
doing that would be to buy "gift certificates" at useful online shopping
sites, such as amazon.com or thinkgeek.com. Another way would be to sponsor
us through a banner program or, even better, by helping us with coding,
documenting, testing etc. You're welcome to send us a buck using paypal, as
described here: http://curl.haxx.se/donation.html
1.7 What about CURL from curl.com?
During the summer 2001, curl.com was busy advertising their client-side
programming language for the web, named CURL.
We are in no way associated with curl.com or their CURL programming
language.
Our project name curl has been in effective use since 1998. We were not the
first computer related project to use the name "curl" and do not claim any
first-hand rights to the name.
We recognize that we will be living in parallel with curl.com and wish them
every success.
1.8 I have a problem, who do I mail?
Please do not mail any single individual unless you really need to. Keep
curl-related questions on a suitable mailing list. All available mailing
lists are listed in the MANUAL document and online at
http://curl.haxx.se/mail/
Keeping curl-related questions and discussions on mailing lists allows
others to join in and help, to share their ideas, contribute their
suggestions and spread their wisdom. Keeping discussions on public mailing
lists also allows for others to learn from this (both current and future
users thanks to the web based archives of the mailing lists), thus saving us
from having to repeat ourselves even more. Thanks for respecting this.
2. Install Related Problems
2.1. configure doesn't find OpenSSL even when it is installed
This may be because of several reasons.
2.1.1. native linker doesn't find openssl
Affected platforms:
Solaris (native cc compiler)
HPUX (native cc compiler)
SGI IRIX (native cc compiler)
SCO UNIX (native cc compiler)
When configuring curl, I specify --with-ssl. OpenSSL is installed in
/usr/local/ssl. Configure reports SSL in /usr/local/ssl, but fails to find
CRYPTO_lock in -lcrypto.
Cause: The cc for this test places the -L/usr/local/ssl/lib AFTER
-lcrypto, so ld can't find the library. This is due to a bug in the GNU
autoconf tool.
Workaround: Specifying "LDFLAGS=-L/usr/local/ssl/lib" in front of
./configure places the -L/usr/local/ssl/lib early enough in the command
line to make things work
Solution submitted by: Bob Allison <allisonb@users.sourceforge.net>
2.1.2. only the libssl lib is missing
If all include files and the libcrypto lib are present, with only
libssl missing according to configure, this is most likely because
a few functions are left out of libssl.
If the missing function names include RSA or RSAREF, you can be certain
that this is because libssl requires the RSA and RSAREF libs to build.
See the INSTALL file section that explains how to add those libs to
configure. Make sure that you remove the config.cache file before you
rerun configure with the new flags.
2.2. Does curl work/build with other SSL libraries?
Curl has been written to use OpenSSL, although there should not be many
problems using a different library. If anyone does "port" curl to use a
different SSL library, we are of course very interested in getting the
patch!
2.3. Where can I find a copy of LIBEAY32.DLL?
That is an OpenSSL binary built for Windows.
Curl uses OpenSSL to do the SSL stuff. The LIBEAY32.DLL is what curl needs
on a windows machine to do https://. Check out the curl web site to find
accurate and up-to-date pointers to recent OpenSSL DLLs and other binary
packages.
2.4. Does cURL support Socks (RFC 1928) ?
Yes, SOCKS5 is supported when curl is built with IPv6 support disabled.
3. Usage problems
3.1. curl: (1) SSL is disabled, https: not supported
If you get this output when trying to get anything from a https:// server,
it means that the configure script couldn't find all libs and include files
it requires for SSL to work. If the configure script fails to find them,
curl is simply built without SSL support.
To get the https:// support into a curl that was previously built but that
reports that https:// is not supported, you should dig through the document
and logs and check out why the configure script doesn't find the SSL libs
and/or include files.
Also, check out the other paragraph in this FAQ labeled "configure doesn't
find OpenSSL even when it is installed".
3.2. How do I tell curl to resume a transfer?
Curl supports resumed transfers both ways on both FTP and HTTP.
Try the -C option.
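For libcurl users, the rough equivalent is the CURLOPT_RESUME_FROM option; a
minimal sketch (the URL and byte offset are made up for the example):
  #include <curl/curl.h>

  int resume_download(void)
  {
    CURL *curl = curl_easy_init();
    CURLcode res;
    if(!curl)
      return 1;
    curl_easy_setopt(curl, CURLOPT_URL, "ftp://example.com/bigfile");
    curl_easy_setopt(curl, CURLOPT_RESUME_FROM, 100000L); /* continue at this offset */
    res = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return (int)res;
  }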
3.3. Why doesn't my posting using -F work?
You can't simply use -F or -d at your own choice. The web server that will
receive your post expects one specific format. If the form you're trying to
"fake" sets the type to 'multipart/form-data', then and only then must you
use the -F type. In all the most common cases, you should use -d which then
causes a posting with the type 'application/x-www-form-urlencoded'.
This is described in some detail in the MANUAL and TheArtOfHttpScripting
documents, and if you don't understand it the first time, read it again
before you post questions about this to the mailing list. Also, try reading
through the mailing list archives for old postings and questions regarding
this.
3.4. How do I tell curl to run custom FTP commands?
You can tell curl to perform optional commands both before and/or after a
file transfer. Study the -Q/--quote option.
Since curl is used for file transfers, you don't use curl to just perform
FTP commands without transferring anything. Therefore you must always specify
a URL to transfer to/from even when doing custom FTP commands.
3.5. How can I disable the Pragma: nocache header?
You can change all internally generated headers by adding a replacement with
the -H/--header option. By adding a header with empty contents you safely
disable that one. Use -H "Pragma:" to disable that specific header.
3.6. Does curl support ASP, XML, XHTML or HTML version Y?
To curl, all contents are alike. It doesn't matter how the page was
generated. It may be ASP, PHP, Perl, shell-script, SSI or plain
HTML-files. There's no difference to curl and it doesn't even know what kind
of language that generated the page.
See also item 3.14 regarding javascript.
3.7. Can I use curl to delete/rename a file through FTP?
Yes. You specify custom FTP commands with -Q/--quote.
One example would be to delete a file after you have downloaded it:
curl -O ftp://download.com/coolfile -Q '-DELE coolfile'
3.8 How do I tell curl to follow HTTP redirects?
Curl does not follow so-called redirects by default. The Location: header
that informs the client about this is only interpreted if you're using the
-L/--location option. As in:
curl -L http://redirector.com
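From libcurl, the same thing is done with CURLOPT_FOLLOWLOCATION; a minimal
sketch (the redirect cap is an arbitrary choice for the example):
  #include <curl/curl.h>

  int fetch_following_redirects(const char *url)
  {
    CURL *curl = curl_easy_init();
    CURLcode res;
    if(!curl)
      return 1;
    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L); /* the counterpart of -L */
    curl_easy_setopt(curl, CURLOPT_MAXREDIRS, 5L);      /* don't follow forever */
    res = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return (int)res;
  }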
3.9 How do I use curl in my favorite programming language?
There exist many language interfaces/bindings for curl that integrate it
better with various languages. If you are fluent in a scripting language, you
may very well opt to use such an interface instead of the command line
tool.
Find out more about which languages that support curl directly, and how to
install and use them, in the libcurl section of the curl web site:
http://curl.haxx.se/libcurl/
In February 2003, there are interfaces available for the following
languages: Basic, C, C++, Cocoa, Dylan, Euphoria, Java, Lua, Object-Pascal,
Pascal, Perl, PHP, PostgreSQL, Python, Rexx, Ruby, Scheme and Tcl. By the
time you read this, additional ones may have appeared!
3.10 What about SOAP, WebDAV, XML-RPC or similar protocols over HTTP?
Curl adheres to the HTTP spec, which basically means you can play with *any*
protocol that is built on top of HTTP. Protocols such as SOAP, WEBDAV and
XML-RPC are all such ones. You can use -X to set custom requests and -H to
set custom headers (or replace internally generated ones).
Using libcurl is of course just as fine and you'd just use the proper
library options to do the same.
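A hedged libcurl sketch of the same idea; the PROPFIND method and Depth
header are just WebDAV-flavored examples, not something curl requires:
  #include <curl/curl.h>

  int webdav_propfind(const char *url)
  {
    CURL *curl = curl_easy_init();
    struct curl_slist *headers = NULL;
    CURLcode res;
    if(!curl)
      return 1;
    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_CUSTOMREQUEST, "PROPFIND"); /* like -X */
    headers = curl_slist_append(headers, "Depth: 1");          /* like -H */
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    res = curl_easy_perform(curl);
    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    return (int)res;
  }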
3.11 How do I POST with a different Content-Type?
You can always replace the internally generated headers with -H/--header.
To make a simple HTTP POST with text/xml as content-type, do something like:
curl -d "datatopost" -H "Content-Type: text/xml" [URL]
3.12 Why do FTP specific features over HTTP proxy fail?
Because when you use a HTTP proxy, the protocol spoken on the network will
be HTTP, even if you specify a FTP URL. This effectively means that you
normally can't use FTP specific features such as FTP upload and FTP quote
etc.
There is one exception to this rule, and that is if you can "tunnel through"
the given HTTP proxy. Proxy tunneling is enabled with a special option (-p)
and is generally not available as proxy admins usually disable tunneling to
other ports than 443 (which is used for HTTPS access through proxies).
3.13 Why do my single/double quotes fail?
To specify a command line option that includes spaces, you might need to
put the entire option within quotes. Like in:
curl -d " with spaces " url.com
or perhaps
curl -d ' with spaces ' url.com
Exactly what kind of quotes and how to do this is entirely up to the shell
or command line interpreter that you are using. For most unix shells, you
can more or less pick either single (') or double (") quotes. For
Windows/DOS prompts I believe you're forced to use double (") quotes.
Please study the documentation for your particular environment. Examples in
the curl docs will use a mix of both these ones as shown above. You must
adjust them to work in your environment.
Remember that curl works and runs on more operating systems than most single
individuals have ever tried.
3.14 Does curl support javascript or pac (automated proxy config)?
Many web pages do magic stuff using embedded javascript. Curl and libcurl
have no built-in support for that, so it will be treated just like any other
contents.
.pac files are a netscape invention and are sometimes used by organizations
to allow them to differentiate which proxies to use. The .pac contents is
just a javascript program that gets invoked by the browser and that returns
the name of the proxy to connect to. Since curl doesn't support javascript,
it can't support .pac proxy configuration either.
Some work-arounds usually suggested to overcome this javascript dependency:
- Depending on the javascript complexity, write up a script that
translates it to another language and execute that.
- Read the javascript code and rewrite the same logic in another language.
- Implement a javascript interpreter, people have successfully used the
Mozilla javascript engine in the past.
- Ask your admins to stop this, for a static proxy setup or similar.
3.15 Can I do recursive fetches with curl?
No. curl itself has no code that performs recursive operations, such as
those performed by wget.
There exist wrapper scripts with that functionality (for example the
curlmirror perl script), and you can write programs based on libcurl to do
it, but the command line tool curl itself cannot.
4. Running Problems
4.1. Problems connecting to SSL servers.
It took a very long time before we could sort out why curl had problems to
connect to certain SSL servers when using SSLeay or OpenSSL v0.9+. The
error sometimes showed up similar to:
16570:error:1407D071:SSL routines:SSL2_READ:bad mac decode:s2_pkt.c:233:
It turned out to be because many older SSL servers don't deal with SSLv3
requests properly. To correct this problem, tell curl to select SSLv2 from
the command line (-2/--sslv2).
There have also been examples where the remote server didn't like the SSLv2
request and instead you had to force curl to use SSLv3 with -3/--sslv3.
4.2. Why do I get problems when I use & or % in the URL?
In general unix shells, the & character is treated specially and, when used,
it runs the specified command in the background. To safely send the & as a part
of a URL, you should quote the entire URL by using single (') or double (")
quotes around it.
An example that would invoke a remote CGI that uses &-letters could be:
curl 'http://www.altavista.com/cgi-bin/query?text=yes&q=curl'
In Windows, the standard DOS shell treats the %-letter specially and you
need to use TWO %-letters for each single one you want to use in the URL.
Also note that if you want the literal %-letter to be part of the data you
pass in a POST using -d/--data you must encode it as '%25' (which then also
needs the %-letter doubled on Windows machines).
4.3. How can I use {, }, [ or ] to specify multiple URLs?
Because those letters have a special meaning to the shell, they must be
quoted when used literally in a URL given to curl.
An example that downloads two URLs (sequentially) would do:
curl '{curl,www}.haxx.se'
To be able to use those letters as actual parts of the URL (without using
them for the curl URL "globbing" system), use the -g/--globoff option:
curl -g 'www.site.com/weirdname[].html'
4.4. Why do I get downloaded data even though the web page doesn't exist?
Curl asks remote servers for the page you specify. If the page doesn't exist
at the server, the HTTP protocol defines how the server should respond and
that means that headers and a "page" will be returned. That's simply how
HTTP works.
By using the --fail option you can tell curl explicitly to not get any data
if the HTTP return code doesn't say success.
4.5 Why do I get return code XXX from a HTTP server?
RFC2616 clearly explains the return codes. This is a short transcript. Go
read the RFC for exact details:
4.5.1 "400 Bad Request"
The request could not be understood by the server due to malformed
syntax. The client SHOULD NOT repeat the request without modifications.
4.5.2 "401 Unauthorized"
The request requires user authentication.
4.5.3 "403 Forbidden"
The server understood the request, but is refusing to fulfill it.
Authorization will not help and the request SHOULD NOT be repeated.
4.5.4 "404 Not Found"
The server has not found anything matching the Request-URI. No indication
is given of whether the condition is temporary or permanent.
4.5.5 "405 Method Not Allowed"
The method specified in the Request-Line is not allowed for the resource
identified by the Request-URI. The response MUST include an Allow header
containing a list of valid methods for the requested resource.
4.5.6 "301 Moved Permanently"
If you get this return code and an HTML output similar to this:
<H1>Moved Permanently</H1> The document has moved <A
HREF="http://same_url_now_with_a_trailing_slash/">here</A>.
it might be because you requested a directory URL but without the trailing
slash. Try the same operation again _with_ the trailing slash, or use the
-L/--location option to follow the redirection.
4.6. Can you tell me what error code 142 means?
All error codes that are larger than the highest documented error code mean
that curl has exited due to a crash. This is a serious error, and we
appreciate a detailed bug report from you that describes how we can
reproduce it!
4.7. How do I keep user names and passwords secret in Curl command lines?
This problem has two sides:
The first part is to avoid having clear-text passwords in the command line
so that they don't appear in 'ps' outputs and similar. That is easily
avoided by using the "-K" option to tell curl to read parameters from a file
or stdin to which you can pass the secret info. curl itself will also
attempt to "hide" the given password by blanking out the option - this
doesn't work on all platforms.
To keep the passwords in your account secret from the rest of the world is
not a task that curl addresses. You could of course encrypt them somehow to
at least hide them from being read by human eyes, but that is not what
anyone would call security.
Also note that regular HTTP (using Basic authentication) and FTP passwords
are sent in clear across the network. All it takes for anyone to fetch them
is to listen on the network. Eavesdropping is very easy. Use more secure
authentication methods (like Digest, Negotiate or even NTLM) or consider the
SSL-based alternatives HTTPS and FTPS.
4.8 I found a bug!
It is not a bug if the behavior is documented. Read the docs first.
Especially check out the KNOWN_BUGS file, it may be a documented bug!
If it is a problem with a binary you've downloaded or a package for your
particular platform, try contacting the person who built the package/archive
you have.
If there is a bug, read the BUGS document first. Then report it as described
in there.
4.9. Curl can't authenticate to the server that requires NTLM?
This is supported in curl 7.10.6 or later. No earlier curl version knows
of this magic.
NTLM is a Microsoft proprietary protocol. Proprietary formats are evil. You
should not use such ones.
4.10 My HTTP request using HEAD, PUT or DELETE doesn't work!
Many web servers allow or demand that the administrator configures the
server properly for these requests to work on the web server.
Some servers seem to support HEAD only on certain kinds of URLs.
To fully grasp this, try the documentation for the particular server
software you're trying to interact with. This is not anything curl can do
anything about.
4.11 Why do my HTTP range requests return the full document?
Because the range may not be supported by the server, or the server may
choose to ignore it and return the full document anyway.
4.12 Why do I get "certificate verify failed" ?
You invoke curl 7.10 or later to communicate on a https:// URL and get an
error back looking something similar to this:
curl: (35) SSL: error:14090086:SSL routines:
SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
Then it means that curl couldn't verify that the server's certificate was
good. Curl verifies the certificate using the CA cert bundle that comes with
the curl installation.
To disable the verification (which makes it act like curl did before 7.10),
use -k. This does however enable man-in-the-middle attacks.
If you get this failure but have a CA cert bundle installed and used,
the server's certificate is not signed by one of the CAs in the bundle. It
might for example be self-signed. You then correct this problem by obtaining
a valid CA cert for the server. Or again, decrease the security by disabling
this check.
Details are also in the SSLCERTS file in the release archives, found online
here: http://curl.haxx.se/docs/sslcerts.html
5. libcurl Issues
5.1. Is libcurl thread-safe?
Yes.
We have written the libcurl code specifically adjusted for multi-threaded
programs. libcurl will use thread-safe functions instead of non-thread-safe
ones if your system provides them.
We would appreciate some kind of report or README file from those who have
used libcurl in a threaded environment.
5.2 How can I receive all data into a large memory chunk?
[ See also the examples/getinmemory.c source ]
You are in full control of the callback function that gets called every time
there is data received from the remote server. You can make that callback do
whatever you want. You do not have to write the received data to a file.
One solution to this problem could be to have a pointer to a struct that you
pass to the callback function. You set the pointer using the
curl_easy_setopt(CURLOPT_FILE) function. Then that pointer will be passed to
the callback instead of a FILE * to a file:
/* imaginary struct */
struct MemoryStruct {
  char *memory;
  size_t size;
};

/* imaginary callback function */
size_t
WriteMemoryCallback(void *ptr, size_t size, size_t nmemb, void *data)
{
  size_t realsize = size * nmemb;
  struct MemoryStruct *mem = (struct MemoryStruct *)data;
  /* grow the buffer with a temporary pointer so the old block is not
     lost if realloc() fails */
  char *bigger = (char *)realloc(mem->memory, mem->size + realsize + 1);

  if (!bigger)
    return 0; /* out of memory; returning less than realsize aborts the transfer */

  mem->memory = bigger;
  memcpy(&(mem->memory[mem->size]), ptr, realsize);
  mem->size += realsize;
  mem->memory[mem->size] = 0; /* keep the chunk zero terminated */

  return realsize;
}
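A sketch of how the callback above could be wired up, assuming an already
initialized easy handle 'curl' (CURLOPT_FILE is the option name this FAQ uses
for the callback's user pointer):
  struct MemoryStruct chunk;
  chunk.memory = NULL; /* realloc(NULL, ...) behaves like malloc() */
  chunk.size = 0;

  curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
  curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, WriteMemoryCallback);
  curl_easy_setopt(curl, CURLOPT_FILE, (void *)&chunk);
  curl_easy_perform(curl);
  /* chunk.memory now holds chunk.size bytes, plus a terminating zero byte */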
5.3 How do I fetch multiple files with libcurl?
libcurl has excellent support for transferring multiple files. You should
just repeatedly set new URLs with curl_easy_setopt() and then transfer it
with curl_easy_perform(). The handle you get from curl_easy_init() is not
only reusable, but you're even encouraged to reuse it if you can, as that
will enable libcurl to use persistent connections.
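A minimal sketch of that reuse pattern (error handling trimmed for brevity):
  #include <curl/curl.h>

  int fetch_all(const char **urls, int count)
  {
    CURL *curl = curl_easy_init();
    int i;
    if(!curl)
      return 1;
    for(i = 0; i < count; i++) {
      curl_easy_setopt(curl, CURLOPT_URL, urls[i]);
      curl_easy_perform(curl); /* same handle, so connections can be reused */
    }
    curl_easy_cleanup(curl);
    return 0;
  }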
5.4 Does libcurl do Winsock initialization on win32 systems?
Yes, if told to in the curl_global_init() call.
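For example (CURL_GLOBAL_WIN32 is the bit that asks for the Winsock init, and
CURL_GLOBAL_ALL includes it):
  #include <curl/curl.h>

  int main(void)
  {
    curl_global_init(CURL_GLOBAL_ALL);  /* once, before any other libcurl call */

    /* ... use libcurl here ... */

    curl_global_cleanup();              /* once, when entirely done with libcurl */
    return 0;
  }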
5.5 Do CURLOPT_WRITEDATA and CURLOPT_READDATA work on win32?
Yes, but you cannot open a FILE * and pass the pointer to a DLL and have
that DLL use the FILE * (as the DLL and the client application cannot access
each others' variable memory areas). If you set CURLOPT_WRITEDATA you must
also use CURLOPT_WRITEFUNCTION to set a function that writes the
file, even if that simply writes the data to the specified FILE *.
Similarly, if you use CURLOPT_READDATA you must also specify
CURLOPT_READFUNCTION.
(Provided by Joel DeYoung and Bob Schader)
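A minimal sketch of that pairing; the write function is compiled into the
application, so the FILE * never crosses the DLL boundary:
  #include <stdio.h>
  #include <curl/curl.h>

  /* runs in the application, never inside the libcurl DLL */
  static size_t write_to_file(void *ptr, size_t size, size_t nmemb, void *stream)
  {
    return fwrite(ptr, size, nmemb, (FILE *)stream);
  }

  static void setup_output(CURL *curl, FILE *out)
  {
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_to_file);
  }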
5.6 What about Keep-Alive or persistent connections?
curl and libcurl have excellent support for persistent connections when
transferring several files from the same server. Curl will attempt to reuse
connections for all URLs specified on the same command line/config file, and
libcurl will reuse connections for all transfers that are made using the
same libcurl handle.
5.7 Link errors when building libcurl on Windows!
You need to make sure that your project, and all the libraries (both static
and dynamic) that it links against, are compiled/linked against the same run
time library.
This is determined by the /MD, /ML, /MT (and their corresponding /M?d)
options to the command line compiler. /MD (linking against MSVCRT dll) seems
to be the most commonly used option.
(Provided by Andrew Francis)
6. License Issues
Curl and libcurl are released under an MIT/X derivative license. The license
is very liberal and should not pose a problem for your project. This section
is just a brief summary of the cases we get the most questions about. (Parts
of this section were much enhanced by Bjorn Reese.)
6.1. I have a GPL program, can I use the libcurl library?
Yes!
Since libcurl may be distributed under the MIT/X derivative license, it can be
used together with GPL in any software.
6.2. I have a closed-source program, can I use the libcurl library?
Yes!
libcurl does not put any restrictions on the program that uses the library.
6.3. I have a BSD licensed program, can I use the libcurl library?
Yes!
libcurl does not put any restrictions on the program that uses the library.
6.4. I have a program that uses LGPL libraries, can I use libcurl?
Yes!
The LGPL license doesn't clash with other licenses.
6.5. Can I modify curl/libcurl for my program and keep the changes secret?
Yes!
The MIT/X derivative license practically allows you to do almost anything with
the sources, on the condition that the copyright texts in the sources are
left intact.
6.6. Can you please change the curl/libcurl license to XXXX?
No.
We have carefully picked this license after years of development and
discussions, and a large number of people have contributed source code
knowing that this is the license we use. This license puts the restrictions
we want on curl/libcurl and it does not spread to other programs or
libraries that use it. It should be possible for everyone to use libcurl or
curl in their projects, no matter what license they already have in use.

neo/curl/docs/FEATURES Normal file
@ -0,0 +1,114 @@
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
FEATURES
Misc
- full URL syntax
- custom maximum download time
- custom least download speed acceptable
- custom output result after completion
- multiple URLs
- guesses protocol from host name unless specified
- uses .netrc
- progress bar/time specs while downloading
- "standard" proxy environment variables support
- config file support
- compiles on win32 (reported builds on 40+ operating systems)
- redirectable stderr
- selectable network interface for outgoing traffic
- IPv6 support
- persistent connections
- socks5 support
- supports user name + password in proxy environment variables
- operations through proxy "tunnel" (using CONNECT)
- supports transfers of large files (>2GB and >4GB)
HTTP
- HTTP/1.1 compliant (optionally uses 1.0)
- GET
- PUT
- HEAD
- POST
- multipart formpost (RFC1867-style)
- authentication: Basic, Digest, NTLM(*1), GSS-Negotiate/Negotiate(*3) and
SPNEGO (*4)
- resume (both GET and PUT)
- follow redirects
- maximum amount of redirects to follow
- custom HTTP request
- cookie get/send fully parsed
- reads/writes the netscape cookie file format
- custom headers (replace/remove internally generated headers)
- custom user-agent string
- custom referer string
- range
- proxy authentication
- time conditions
- via http-proxy
- retrieve file modification date
- Content-Encoding support for deflate and gzip
- "Transfer-Encoding: chunked" support for "uploads"
HTTPS (*1)
- (all the HTTP features)
- using certificates
- verify server certificate
- via http-proxy
- select desired encryption
- force usage of a specific SSL version (SSLv2, SSLv3 or TLSv1)
FTP
- download
- authentication
- kerberos4 (*5)
- active/passive using PORT, EPRT, PASV or EPSV
- single file size information (compare to HTTP HEAD)
- 'type=' URL support
- dir listing
- dir listing names-only
- upload
- upload append
- upload via http-proxy as HTTP PUT
- download resume
- upload resume
- custom ftp commands (before and/or after the transfer)
- simple "range" support
- via http-proxy
- all operations can be tunneled through a http-proxy
- customizable to retrieve file modification date
FTPS (*1)
- explicit ftps:// support that use SSL on both connections
- implicit "AUTH TSL" and "AUTH SSL" usage to "upgrade" plain ftp://
connection to use SSL for both or one of the connections
TELNET
- connection negotiation
- custom telnet options
- stdin/stdout I/O
LDAP (*2)
- full LDAP URL support
DICT
- extended DICT URL support
GOPHER
- GET
- via http-proxy
FILE
- URL support
FOOTNOTES
=========
*1 = requires OpenSSL
*2 = requires OpenLDAP
*3 = requires a GSSAPI-compliant library, such as Heimdal or similar.
*4 = requires FBopenssl
*5 = requires a krb4 library, such as the MIT one or similar.

neo/curl/docs/HISTORY Normal file
@ -0,0 +1,128 @@
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
How cURL Became Like This
In the second half of 1997, Daniel Stenberg came up with the idea to make
currency-exchange calculations available to Internet Relay Chat (IRC)
users. All the necessary data are published on the Web; he just needed to
automate their retrieval.
Daniel simply adopted an existing command-line open-source tool, httpget, that
Brazilian Rafael Sagula had written. After a few minor adjustments, it did
just what he needed.
Soon, he found currencies on a GOPHER site, so support for that had to go in,
and before long FTP download support was added as well. The name of the
project was changed to urlget to better fit what it actually did now, since
the http-only days were already past.
The project slowly grew bigger. When upload capabilities were added and the
name once again was misleading, a second name change was made and on March 20,
1998 curl 4 was released. (The version numbering from the previous names was
kept.)
(Unrelated to this project a company called Curl Corporation registered a US
trademark on the name "CURL" on May 18 1998. That company had then already
registered the curl.com domain back in November of the previous year. All this
was revealed to us much later.)
SSL support was added, powered by the SSLeay library.
August 1998, first announcement of curl on freshmeat.net.
October 1998, with the curl 4.9 release and the introduction of cookie
support, curl was no longer released under the GPL license. Now at 4000
lines of code, we switched over to the MPL license to restrict the effects of
"copyleft".
November 1998, configure script and reported successful compiles on several
major operating systems. The never-quite-understood -F option was added and
curl could now simulate quite a lot of a browser. TELNET support was added.
Curl 5 was released in December 1998 and introduced the first ever curl man
page. People started making Linux RPM packages out of it.
January 1999, DICT support added.
OpenSSL took over where SSLeay was abandoned.
May 1999, first Debian package.
August 1999, LDAP:// and FILE:// support added. The curl web site gets 1300
visits weekly.
Released curl 6.0 in September. 15000 lines of code.
December 28 1999, added the project on Sourceforge and started using its
services for managing the project.
Spring 2000, major internal overhaul to provide a suitable library interface.
The first non-beta release was named 7.1 and arrived in August. This offered
the easy interface and turned out to be the beginning of actually getting
other software and programs to get based on and powered by libcurl. Almost
20000 lines of code.
August 2000, the curl web site gets 4000 visits weekly.
The PHP guys adopted libcurl already the same month, when the first ever third
party libcurl binding showed up. CURL has been a supported module in PHP since
the release of PHP 4.0.2. This would soon get followers. More than 16
different bindings exist at the time of this writing.
September 2000, kerberos4 support was added.
In November 2000, work started on a test suite for curl. It was later
re-written from scratch.
January 2001, Daniel released curl 7.5.2 under a new license again: MIT (or
MPL). The MIT license is extremely liberal and can be used combined with GPL
in other projects. This would finally put an end to the "complaints" from
people involved in GPLed projects that previously were prohibited from using
libcurl while it was released under MPL only. (Due to the fact that MPL is
deemed "GPL incompatible".)
curl supports HTTP 1.1 starting with the release of 7.7, March 22 2001. This
also introduced libcurl's ability to do persistent connections. 24000 lines of
code.
The first experimental ftps:// support was added in March 2001.
August 2001. curl is bundled in Mac OS X, 10.1. It was already becoming more
and more of a standard utility of Linux distributions and a regular in the BSD
ports collections. The curl web site gets 8000 visits weekly. Curl Corporation
contacted Daniel to discuss "the name issue". After Daniel's reply, they have
never since got in touch again.
September 2001, libcurl 7.9 introduces cookie jar and curl_formadd(). During
the forthcoming 7.9.x releases, we introduced the multi interface slowly and
without much whistles.
June 2002, the curl web site gets 13000 visits weekly. curl and libcurl is
35000 lines of code. Reported successful compiles on more than 40 combinations
of CPUs and operating systems.
To estimate the number of users of the curl tool or libcurl library is next to
impossible. Around 5000 packages downloaded each week from the main site gives
a hint, but the packages are mirrored extensively, bundled with numerous OS
distributions and otherwise retrieved as part of other software.
September 2002, with the release of curl 7.10 it is released under the MIT
license only.
February 2003, the curl site averages at 20000 visits weekly. At any given
moment, there's an average of 3 people browsing the curl.haxx.se site.
Multiple new authentication schemes are supported: Digest (May), NTLM (June)
and Negotiate (June).
November 2003: curl 7.10.8 is released. 45000 lines of code. ~55000 unique
visitors to the curl.haxx.se site. Five official web mirrors.
December 2003, full-fledged SSL for FTP is supported.
January 2004: curl 7.11.0 introduced large file support.

neo/curl/docs/INSTALL Normal file
@ -0,0 +1,551 @@
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
How To Compile
Installing Binary Packages
==========================
Lots of people download binary distributions of curl and libcurl. This
document does not describe how to install curl or libcurl using such a
binary package. This document describes how to compile, build and install
curl and libcurl from source code.
UNIX
====
A normal unix installation is made in three or four steps (after you've
unpacked the source archive):
./configure
make
make test (optional)
make install
You probably need to be root when doing the last command.
If you have checked out the sources from the CVS repository, read the
CVS-INFO on how to proceed.
Get a full listing of all available configure options by invoking it like:
./configure --help
If you want to install curl in a different file hierarchy than /usr/local,
you need to specify that already when running configure:
./configure --prefix=/path/to/curl/tree
If you happen to have write permission in that directory, you can do 'make
install' without being root. An example of this would be to make a local
install in your own home directory:
./configure --prefix=$HOME
make
make install
The configure script always tries to find a working SSL library unless
explicitly told not to. If you have OpenSSL installed in the default search
path for your compiler/linker, you don't need to do anything special. If
you have OpenSSL installed in /usr/local/ssl, you can run configure like:
./configure --with-ssl
If you have OpenSSL installed somewhere else (for example, /opt/OpenSSL),
you can run configure like this:
./configure --with-ssl=/opt/OpenSSL
If you insist on forcing a build without SSL support, even though you may
have OpenSSL installed in your system, you can run configure like this:
./configure --without-ssl
If you have OpenSSL installed, but with the libraries in one place and the
header files somewhere else, you have to set the LDFLAGS and CPPFLAGS
environment variables prior to running configure. Something like this
should work:
(with the Bourne shell and its clones):
CPPFLAGS="-I/path/to/ssl/include" LDFLAGS="-L/path/to/ssl/lib" \
./configure
(with csh, tcsh and their clones):
env CPPFLAGS="-I/path/to/ssl/include" LDFLAGS="-L/path/to/ssl/lib" \
./configure
If your SSL library was compiled with rsaref (usually for use in the United
States), you may also need to set:
LIBS=-lRSAglue -lrsaref
(as suggested by Doug Kaufman)
MORE OPTIONS
To force configure to use the standard cc compiler if both cc and gcc are
present, run configure like
CC=cc ./configure
or
env CC=cc ./configure
To force a static library compile, disable the shared library creation
by running configure like:
./configure --disable-shared
To tell the configure script to skip searching for thread-safe functions,
add an option like:
./configure --disable-thread
To build curl with kerberos4 support enabled, curl requires the krb4 libs
and headers installed. You can then use a set of options to tell
configure where those are:
--with-krb4-includes[=DIR] Specify location of kerberos4 headers
--with-krb4-libs[=DIR] Specify location of kerberos4 libs
--with-krb4[=DIR] where to look for Kerberos4
In most cases, /usr/athena is the install prefix and then it works with
./configure --with-krb4=/usr/athena
If you're a curl developer and use gcc, you might want to enable more
debug options with the --enable-debug option.
Win32
=====
Without SSL:
MingW32 (GCC-2.95) style
------------------------
Run the 'mingw32.bat' file to get the proper environment variables
set, then run 'make mingw32' in the root dir.
If you have any problems linking libraries or finding header files, be
sure to verify that the provided "Makefile.m32" files use the proper
paths, and adjust as necessary.
Cygwin style
------------
Almost identical to the unix installation. Run the configure script in
the curl root with 'sh configure'. Make sure you have the sh
executable in /bin/ or you'll see the configure fail towards the end.
Run 'make'
Microsoft command line style
----------------------------
Run the 'vcvars32.bat' file to get the proper environment variables
set, then run 'nmake vc' in the root dir.
The vcvars32.bat file is part of the Microsoft development
environment.
IDE-style
-------------------------
If you use VC++, Borland or similar compilers. Include all lib source
files in a static lib "project" (all .c and .h files that is).
(you should name it libcurl or similar)
Make the sources in the src/ drawer be a "win32 console application"
project. Name it curl.
With VC++, add 'ws2_32.lib' to the link libs when you build curl!
Borland seems to do that itself magically. Of course you have to make
sure it links with the libcurl too!
For VC++ 6, there's an included Makefile.vc6 that should be possible
to use out-of-the-box.
Microsoft note: add /Zm200 to the compiler options to increase the
compiler's memory allocation limit, as the hugehelp.c won't compile
due to "too long puts string".
With SSL:
MingW32 (GCC-2.95) style
------------------------
Run the 'mingw32.bat' file to get the proper environment variables
set, then run 'make mingw32-ssl' in the root dir.
If you have any problems linking libraries or finding header files, be
sure to look at the provided "Makefile.m32" files for the proper
paths, and adjust as necessary.
Cygwin style
------------
We haven't done this, nor received any reports on how to do it. It should,
however, be identical to the unix setup for the same purpose. See above.
Microsoft command line style
----------------------------
Please read the OpenSSL documentation on how to compile and install
the OpenSSL libraries. The build process of OpenSSL generates the
libeay32.dll and ssleay32.dll files in the out32dll subdirectory in
the OpenSSL home directory. OpenSSL static libraries (libeay32.lib,
ssleay32.lib, RSAglue.lib) are created in the out32 subdirectory.
Run the 'vcvars32.bat' file to get a proper environment. The
vcvars32.bat file is part of the Microsoft development environment and
you may find it in 'C:\Program Files\Microsoft Visual Studio\vc98\bin'
provided that you installed Visual C/C++ 6 in the default directory.
Before running nmake define the OPENSSL_PATH environment variable with
the root/base directory of OpenSSL, for example:
set OPENSSL_PATH=c:\openssl-0.9.7a
lib/Makefile.vc6 depends on zlib (http://www.gzip.org/zlib/) as well.
Please read the zlib documentation on how to compile zlib. Define the
ZLIB_PATH environment variable to the location of zlib.h and zlib.lib,
for example:
set ZLIB_PATH=c:\zlib-1.1.4
Then run 'nmake vc-ssl' or 'nmake vc-ssl-dll' in curl's root
directory. 'nmake vc-ssl' will create a libcurl static and dynamic
libraries in the lib subdirectory, as well as a statically linked
version of curl.exe in the src subdirectory. This statically linked
version is a standalone executable not requiring any DLL at
runtime. This make method requires that you have the static OpenSSL
libraries available in OpenSSL's out32 subdirectory.
'nmake vc-ssl-dll' creates the libcurl dynamic library and
links curl.exe against libcurl and OpenSSL dynamically.
This executable requires libcurl.dll and the OpenSSL DLLs
at runtime.
Microsoft / Borland style
-------------------------
If you have OpenSSL, and want curl to take advantage of it, edit your
project properties to use the SSL include path, link with the SSL libs
and define the USE_SSLEAY symbol.
Using Borland C++ compiler version 5.5.1 (available as free download
from Borland's site)
---------------------------------------------------------------------
compile openssl
Make sure you include the paths to curl/include and openssl/inc32 in
your bcc32.cnf file
eg : -I"c:\Bcc55\include;c:\path_curl\include;c:\path_openssl\inc32"
Check to make sure that all of the sources listed in lib/Makefile.b32
are present in the /path_to_curl/lib directory. (Check the src
directory for missing ones.)
Make sure the environment variable "BCCDIR" is set to the install
location for the compiler eg : c:\Borland\BCC55
command line:
make -f /path_to_curl/lib/Makefile-ssl.b32
compile simplessl.c with appropriate links
c:\curl\docs\examples\> bcc32 -L c:\path_to_curl\lib\libcurl.lib
-L c:\borland\bcc55\lib\psdk\ws2_32.lib
-L c:\openssl\out32\libeay32.lib
-L c:\openssl\out32\ssleay32.lib
simplessl.c
Disabling Specific Protocols:
The configure utility is unfortunately not available for the Windows
environment, so you cannot use its various disable-protocol
options on this platform.
However, you can use the following defines to disable specific
protocols:
HTTP_ONLY disables all protocols except HTTP
CURL_DISABLE_FTP disables FTP
CURL_DISABLE_LDAP disables LDAP
CURL_DISABLE_TELNET disables TELNET
CURL_DISABLE_DICT disables DICT
CURL_DISABLE_FILE disables FILE
CURL_DISABLE_GOPHER disables GOPHER
If you want to set any of these defines you have the following
possibilities:
- Modify lib/setup.h
- Modify lib/Makefile.vc6
- Add defines to Project/Settings/C/C++/General/Preprocessor Definitions
in the curllib.dsw/curllib.dsp Visual C++ 6 IDE project.
IBM OS/2
========
Building under OS/2 is not much different from building under unix.
You need:
- emx 0.9d
- GNU make
- GNU patch
- ksh
- GNU bison
- GNU file utilities
- GNU sed
- autoconf 2.13
If you want to build with OpenSSL or OpenLDAP support, you'll need to
download those libraries, too. Dirk Ohme has done some work to port SSL
libraries under OS/2, but it looks like he doesn't care about emx. You'll
find his patches on: http://come.to/Dirk_Ohme
If during the linking you get an error about _errno being an undefined
symbol referenced from the text segment, you need to add -D__ST_MT_ERRNO__
in your definitions.
If everything seems to work fine but there's no curl.exe, you need to add
-Zexe to your linker flags.
If you're getting huge binaries, probably your makefiles have the -g in
CFLAGS.
VMS
===
(The VMS section is in whole contributed by the friendly Nico Baggus)
Curl seems to work with FTP & HTTP; other protocols are not tested. (The
perl http/ftp testing server supplied as a testing tool cannot work on VMS
because VMS has no concept of fork(). I tried to give it a whack, but
that's of no use.)
SSL stuff has not been ported.
Telnet has about the same issues as for Win32. When the changes for Win32
are clear maybe they'll work for VMS too. The basic problem is that select
ONLY works for sockets.
Marked instances of fopen/[f]stat that might become a problem, especially
for non-stream files. In this regard, the files opened for writing will be
created stream/lf and will thus be safe. Just keep in mind that non-binary
read/write from/to files will have a record size limit of 32767 bytes
imposed.
Stat to get the size of the files is again only safe for stream files &
fixed record files without implied CC.
-- My guess is that only allowing access to stream files is the quickest
way to get around most issues. Therefore all files need to be
checked to be sure they will be stream/lf before processing them. This is
the easiest way out, I know. The reason for this is that code that needs to
report the filesize will become a pain in the ass otherwise.
Exit status... Well, we needed something done here.
VMS has a structured exit status:
| 3 | 2 | 1 | 0|
|1098|765432109876|5432109876543|210|
+----+------------+-------------+---+
|Ctrl| Facility | Error code |sev|
+----+------------+-------------+---+
With the Ctrl-bits an application can tell if part or the whole message has
already been printed by the program, so DCL doesn't need to print it again.
Facility - basically the program ID. A code assigned to the program; the
name can be fetched from external or internal message libraries.
Errorcode - the error codes assigned by the application.
Sev. - severity: even = error, odd = non-error:
0 = Warning
1 = Success
2 = Error
3 = Information
4 = Fatal
<5-7> reserved.
This all presents itself with:
%<FACILITY>-<SeV>-<Errorname>, <Error message>
See also the src/curlmsg.msg file; it has the source for the messages. In
src/main.c a section is devoted to message status values; the global values
create symbols with certain values, referenced from a compiled message
file. All exit functions use an exit status derived from a translation
table with the compiled message codes.
This was all compiled with:
Compaq C V6.2-003 on OpenVMS Alpha V7.1-1H2
So far for porting notes as of:
13-jul-2001
N. Baggus
QNX
===
(This section was graciously brought to us by David Bentham)
As QNX is targeted at resource-constrained environments, the QNX headers
set conservative limits. This includes the FD_SETSIZE macro, set by default
to 32. Socket descriptors returned within the CURL library may exceed this,
resulting in memory faults/SIGSEGV crashes when passed into select(..)
calls using fd_set macros.
A good all-round solution to this is to override the default when building
libcurl, by overriding CFLAGS during configure, for example:
# configure CFLAGS='-DFD_SETSIZE=64 -g -O2'
CROSS COMPILE
=============
(This section was graciously brought to us by Jim Duey, 23-oct-2001)
Download and unpack the cURL package. Version should be 7.9.1 or later.
'cd' to the new directory. (ie. curl-7.9.1-pre4)
Set environment variables to point to the cross-compile toolchain and call
configure with any options you need. Be sure to specify the '--host' and
'--build' parameters at configuration time. The following script is an
example of cross-compiling for the IBM 405GP PowerPC processor using the
toolchain from MonteVista for Hardhat Linux.
(begin script)
#! /bin/sh
export PATH=$PATH:/opt/hardhat/devkit/ppc/405/bin
export CPPFLAGS="-I/opt/hardhat/devkit/ppc/405/target/usr/include"
export AR=ppc_405-ar
export AS=ppc_405-as
export LD=ppc_405-ld
export RANLIB=ppc_405-ranlib
export CC=ppc_405-gcc
export NM=ppc_405-nm
configure --target=powerpc-hardhat-linux \
--host=powerpc-hardhat-linux \
--build=i586-pc-linux-gnu \
--prefix=/opt/hardhat/devkit/ppc/405/target/usr/local \
--exec-prefix=/usr/local
(end script)
The '--prefix' parameter specifies where cURL will be installed. If
'configure' completes successfully, do 'make' and 'make install' as usual.
RISC OS
=======
The library can be cross-compiled using gccsdk as follows:
CC=riscos-gcc AR=riscos-ar RANLIB='riscos-ar -s' ./configure \
--host=arm-riscos-aof --without-random --disable-shared
make
where riscos-gcc and riscos-ar are links to the gccsdk tools.
You can then link your program with curl/lib/.libs/libcurl.a
AmigaOS
=======
(This section was graciously brought to us by Diego Casorran)
To build cURL/libcurl on AmigaOS just type 'make amiga' ...
What you need is: (not tested with other versions)
GeekGadgets / gcc 2.95.3 (http://www.geekgadgets.org/)
AmiTCP SDK v4.3 (http://www.aminet.net/comm/tcp/AmiTCP-SDK-4.3.lha)
Native Developer Kit (http://www.amiga.com/3.9/download/NDK3.9.lha)
As no ixemul.library is required you will be able to build it for
WarpOS/PowerPC (not tested by me), and a MorphOS version should be
possible with no problems as well.
To enable SSL support, you need a native OpenSSL version (without ixemul);
you can find a precompiled package at http://amiga.sourceforge.net/OpenSSL/
PORTS
=====
This is a probably incomplete list of known hardware and operating systems
that curl has been compiled for. If you know a system curl compiles and
runs on, that isn't listed, please let us know!
- Alpha DEC OSF 4
- Alpha Digital UNIX v3.2
- Alpha FreeBSD 4.1, 4.5
- Alpha Linux 2.2, 2.4
- Alpha NetBSD 1.5.2
- Alpha OpenBSD 3.0
- Alpha OpenVMS V7.1-1H2
- Alpha Tru64 v5.0 5.1
- HP-PA HP-UX 9.X 10.X 11.X
- HP-PA Linux
- HP3000 MPE/iX
- MIPS IRIX 6.2, 6.5
- MIPS Linux
- Pocket PC/Win CE 3.0
- Power AIX 3.2.5, 4.2, 4.3.1, 4.3.2, 5.1
- PowerPC Darwin 1.0
- PowerPC Linux
- PowerPC Mac OS 9
- PowerPC Mac OS X
- SINIX-Z v5
- Sparc Linux
- Sparc Solaris 2.4, 2.5, 2.5.1, 2.6, 7, 8
- Sparc SunOS 4.1.X
- StrongARM (and other ARM) RISC OS 3.1, 4.02
- StrongARM Linux 2.4
- StrongARM NetBSD 1.4.1
- Ultrix 4.3a
- i386 BeOS
- i386 DOS
- i386 FreeBSD
- i386 HURD
- i386 Linux 1.3, 2.0, 2.2, 2.3, 2.4
- i386 NetBSD
- i386 Novell NetWare
- i386 OS/2
- i386 OpenBSD
- i386 SCO unix
- i386 Solaris 2.7
- i386 Windows 95, 98, ME, NT, 2000
- i386 QNX 6
- i486 ncr-sysv4.3.03 (NCR MP-RAS)
- ia64 Linux 2.3.99
- m68k AmigaOS 3
- m68k Linux
- m68k OpenBSD
- m88k dg-dgux5.4R3.00
- s390 Linux
- XScale/PXA250 Linux 2.4
OpenSSL
=======
You'll find OpenSSL information at:
http://www.openssl.org
MingW32/Cygwin
==============
You'll find MingW32 and Cygwin information at:
http://www.mingw.org
OpenLDAP
========
You'll find OpenLDAP information at:
http://www.openldap.org

neo/curl/docs/INTERNALS Normal file
@ -0,0 +1,381 @@
Updated for curl 7.9.1 on November 2, 2001
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
INTERNALS
The project is split in two. The library and the client. The client part uses
the library, but the library is designed to allow other applications to use
it.
The largest amount of code and complexity is in the library part.
CVS
===
All changes to the sources are committed to the CVS repository as soon as
they're somewhat verified to work. Changes shall be committed as independently
as possible so that individual changes can be more easily spotted and tracked
afterwards.
Tagging shall be used extensively, and by the time we release new archives we
should tag the sources with a name similar to the released version number.
Windows vs Unix
===============
There are a few differences in how to program curl the unix way compared to
the Windows way. The four perhaps most notable details are:
1. Different function names for socket operations.
In curl, this is solved with defines and macros, so that the source looks
the same at all places except for the header file that defines them. The
macros in use are sclose(), sread() and swrite() (a minimal sketch of such
macros follows this list).
2. Windows requires a couple of init calls for the socket stuff.
Those must be made by the application that uses libcurl, in curl that means
src/main.c has some code #ifdef'ed to do just that.
3. The file descriptors for network communication and file operations are
not as easily interchangeable as in unix.
We avoid this by not trying any funny tricks on file descriptors.
4. When writing data to stdout, Windows makes end-of-lines the DOS way, thus
destroying binary data, although you do want that conversion if it is
text coming through... (sigh)
We therefore set stdout to binary mode under Windows.
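As a rough, hedged sketch (not the actual lib/ header; WIN32 is only used
here as the conventional predefined symbol), the socket macros from item 1
could look something like this:

#ifdef WIN32
/* winsock provides closesocket(), recv() and send() */
#define sclose(x)     closesocket(x)
#define sread(x,y,z)  recv(x, (char *)(y), (int)(z), 0)
#define swrite(x,y,z) send(x, (const char *)(y), (int)(z), 0)
#else
/* unistd.h provides close(), read() and write() */
#define sclose(x)     close(x)
#define sread(x,y,z)  read(x, y, z)
#define swrite(x,y,z) write(x, y, z)
#endif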
Inside the source code, we make an effort to avoid '#ifdef [Your OS]'. All
conditionals that deal with features *should* instead be in the format
'#ifdef HAVE_THAT_WEIRD_FUNCTION'. Since Windows can't run configure scripts,
we maintain two config-win32.h files (one in lib/ and one in src/) that are
supposed to look exactly like a config.h file would have looked on a
Windows machine!
Generally speaking: always remember that this will be compiled on dozens of
operating systems. Don't walk on the edge.
Library
=======
There are plenty of entry points to the library, namely each publicly defined
function that libcurl offers to applications. All of those functions are
rather small and easy-to-follow. All the ones prefixed with 'curl_easy' are
put in the lib/easy.c file.
curl_global_init() and curl_global_cleanup() should be called by the
application to initialize and clean up global stuff in the library. As of
today, it can handle the global SSL initing if SSL is enabled and it can init
the socket layer on windows machines. libcurl itself has no "global" scope.
All printf()-style functions use the supplied clones in lib/mprintf.c. This
makes sure we stay absolutely platform independent.
curl_easy_init() allocates an internal struct and makes some initializations.
The returned handle does not reveal internals. This is the 'SessionHandle'
struct which works as an "anchor" struct for all curl_easy functions. All
connections performed will get connect-specific data allocated that should be
used for things related to particular connections/requests.
curl_easy_setopt() takes three arguments, where the option stuff must be
passed in pairs: the parameter-ID and the parameter-value. The list of
options is documented in the man page. This function mainly sets things in
the 'SessionHandle' struct.
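To make the easy-interface sequence concrete, here is a minimal, hedged
sketch from the application's point of view (the URL and the CURLOPT_VERBOSE
option are arbitrary choices for illustration):

#include <curl/curl.h>

int main(void)
{
  CURL *handle;
  CURLcode res = CURLE_FAILED_INIT;

  curl_global_init(CURL_GLOBAL_ALL);     /* global init: SSL, winsock, ... */

  handle = curl_easy_init();             /* allocates the SessionHandle */
  if(handle) {
    curl_easy_setopt(handle, CURLOPT_URL, "http://curl.haxx.se/");
    curl_easy_setopt(handle, CURLOPT_VERBOSE, 1L);
    res = curl_easy_perform(handle);     /* connect - do - transfer - done */
    curl_easy_cleanup(handle);
  }

  curl_global_cleanup();
  return (int)res;
}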
curl_easy_perform() does a whole lot of things:
It starts off in the lib/easy.c file by calling Curl_perform() and the main
work then continues in lib/url.c. The flow continues with a call to
Curl_connect() to connect to the remote site.
o Curl_connect()
... analyzes the URL, it separates the different components and connects to
the remote host. This may involve using a proxy and/or using SSL. The
Curl_gethost() function in lib/hostip.c is used for looking up host names.
When Curl_connect is done, we are connected to the remote site. Then it is
time to tell the server to get a document/file. Curl_do() arranges this.
This function makes sure there's an allocated and initiated 'connectdata'
struct that is used for this particular connection only (although there may
be several requests performed on the same connect). A bunch of things are
inited/inherited from the SessionHandle struct.
o Curl_do()
Curl_do() makes sure the proper protocol-specific function is called. The
functions are named after the protocols they handle. Curl_ftp(),
Curl_http(), Curl_dict(), etc. They all reside in their respective files
(ftp.c, http.c and dict.c). HTTPS is handled by Curl_http() and FTPS by
Curl_ftp().
The protocol-specific functions of course deal with protocol-specific
negotiations and setup. They have access to the Curl_sendf() (from
lib/sendf.c) function to send printf-style formatted data to the remote
host, and when they're ready to make the actual file transfer they call the
Curl_Transfer() function (in lib/transfer.c) to set up the transfer and
then return.
Starting in 7.9.1, if this DO function fails and the connection is being
re-used, libcurl will then close this connection, set up a new connection
and re-issue the DO request on that. This is because there is no way to be
perfectly sure that we have discovered a dead connection before the DO
function and thus we might wrongly be re-using a connection that was closed
by the remote peer.
o Transfer()
Curl_perform() then calls Transfer() in lib/transfer.c that performs
the entire file transfer.
During transfer, the progress functions in lib/progress.c are called at a
frequent interval (or at the user's choice, a specified callback might get
called). The speedcheck functions in lib/speedcheck.c are also used to
verify that the transfer is as fast as required.
o Curl_done()
Called after a transfer is done. This function takes care of everything
that has to be done after a transfer. This function attempts to leave
matters in a state so that Curl_do() should be possible to call again on
the same connection (in a persistent connection case). The connection might
also soon be closed with Curl_disconnect().
o Curl_disconnect()
When doing normal connections and transfers, no one ever tries to close any
connections so this is not normally called when curl_easy_perform() is
used. This function is only used when we are certain that no more transfers
are going to be made on the connection. It can also be closed by force, or
it can be called to make sure that libcurl doesn't keep too many
connections alive at the same time (there's a default amount of 5 but that
can be changed with the CURLOPT_MAXCONNECTS option).
This function cleans up all resources that are associated with a single
connection.
Curl_perform() is the function that does the main "connect - do - transfer -
done" loop. It loops if there's a Location: to follow.
When completed, the curl_easy_cleanup() should be called to free up used
resources. It runs Curl_disconnect() on all open connections.
A quick roundup on internal function sequences (many of these call
protocol-specific function-pointers):
curl_connect - connects to a remote site and does initial connect fluff
This also checks for an existing connection to the requested site and uses
that one if it is possible.
curl_do - starts a transfer
curl_transfer() - transfers data
curl_done - ends a transfer
curl_disconnect - disconnects from a remote site. This is called when the
disconnect is really requested, which doesn't necessarily have to be
exactly after curl_done in case we want to keep the connection open for
a while.
HTTP(S)
HTTP offers a lot and is the protocol in curl that uses the most lines of
code. There is a special file (lib/formdata.c) that offers all the multipart
post functions.
The base64 functions for user+password stuff (and more) are in lib/base64.c
and all functions for parsing and sending cookies are found in lib/cookie.c.
HTTPS uses in almost every means the same procedure as HTTP, with only two
exceptions: the connect procedure is different and the function used to read
or write from the socket is different, although the latter fact is hidden in
the source by the use of curl_read() for reading and curl_write() for writing
data to the remote server.
http_chunks.c contains functions that understand HTTP 1.1 chunked transfer
encoding.
An interesting detail with the HTTP(S) request, is the add_buffer() series of
functions we use. They append data to one single buffer, and when the
building is done the entire request is sent off in one single write. This is
done this way to overcome problems with flawed firewalls and lame servers.
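Here is a stand-alone, hedged sketch of that approach; the struct and helper
names are purely illustrative and not the real add_buffer() interface:

#include <stdio.h>
#include <string.h>

struct reqbuf {
  char data[2048];
  size_t len;
};

/* append one piece of the request to the single buffer */
static void req_add(struct reqbuf *r, const char *piece)
{
  size_t n = strlen(piece);
  if(r->len + n < sizeof(r->data)) {
    memcpy(r->data + r->len, piece, n);
    r->len += n;
  }
}

int main(void)
{
  struct reqbuf r = { "", 0 };
  req_add(&r, "GET / HTTP/1.1\r\n");
  req_add(&r, "Host: curl.haxx.se\r\n");
  req_add(&r, "\r\n");
  /* the whole request goes out in one single write; stdout stands in for
     the socket here */
  fwrite(r.data, 1, r.len, stdout);
  return 0;
}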
FTP
The Curl_if2ip() function can be used for getting the IP number of a
specified network interface, and it resides in lib/if2ip.c.
Curl_ftpsendf() is used for sending FTP commands to the remote server. It was
made a separate function to prevent us programmers from forgetting that they
must be CRLF terminated. They must also be sent in one single write() to make
firewalls and similar happy.
Kerberos
The kerberos support is mainly in lib/krb4.c and lib/security.c.
TELNET
Telnet is implemented in lib/telnet.c.
FILE
The file:// protocol is dealt with in lib/file.c.
LDAP
Everything LDAP is in lib/ldap.c.
GENERAL
URL encoding and decoding, called escaping and unescaping in the source code,
is found in lib/escape.c.
While transferring data in Transfer() a few functions might get
used. curl_getdate() in lib/getdate.c is for HTTP date comparisons (and
more).
lib/getenv.c offers curl_getenv() which is for reading environment variables
in a neat platform independent way. That's used in the client, but also in
lib/url.c when checking the proxy environment variables. Note that contrary
to the normal unix getenv(), this returns an allocated buffer that must be
free()ed after use.
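For example (a minimal, hedged sketch; http_proxy is just one of the
variables lib/url.c looks at):

#include <stdio.h>
#include <stdlib.h>
#include <curl/curl.h>

int main(void)
{
  char *proxy = curl_getenv("http_proxy");  /* allocated copy, not a pointer
                                               into the environment */
  if(proxy) {
    printf("proxy: %s\n", proxy);
    free(proxy);                            /* must be freed after use */
  }
  return 0;
}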
lib/netrc.c holds the .netrc parser.
lib/timeval.c features replacement functions for systems that don't have
gettimeofday() and a few support functions for timeval conversions.
A function named curl_version() that returns the full curl version string is
found in lib/version.c.
If authentication is requested but no password is given, a getpass_r() clone
exists in lib/getpass.c. libcurl offers a custom callback that can be used
instead of this, but it doesn't change much for us.
Persistent Connections
======================
The persistent connection support in libcurl requires some considerations on
how to do things inside of the library.
o The 'SessionHandle' struct returned in the curl_easy_init() call must never
hold connection-oriented data. It is meant to hold the root data as well as
all the options etc that the library-user may choose.
o The 'SessionHandle' struct holds the "connection cache" (an array of
pointers to 'connectdata' structs). There's one connectdata struct
allocated for each connection that libcurl knows about.
o This also enables the 'curl handle' to be reused on subsequent transfers,
something that was illegal before libcurl 7.7.
o When we are about to perform a transfer with curl_easy_perform(), we first
check for an already existing connection in the cache that we can use,
otherwise we create a new one and add it to the cache. If the cache is full
already when we add a new connection, we close one of the present ones. We
select which one to close dependent on the close policy that may have been
previously set.
o When the transfer operation is complete, we try to leave the connection
open. Particular options may tell us not to, and protocols may signal
closure on connections, in which case we of course don't keep them open.
o When curl_easy_cleanup() is called, we close all still opened connections.
Note that the curl handle must be re-used in order for the persistent
connections to work, as the minimal sketch below illustrates.
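A minimal, hedged sketch of such re-use (the URLs are arbitrary examples):

#include <curl/curl.h>

int main(void)
{
  CURL *handle = curl_easy_init();
  if(handle) {
    curl_easy_setopt(handle, CURLOPT_URL, "http://curl.haxx.se/");
    curl_easy_perform(handle);   /* opens a connection, leaves it in the cache */

    curl_easy_setopt(handle, CURLOPT_URL, "http://curl.haxx.se/docs/");
    curl_easy_perform(handle);   /* same handle: the cached connection is re-used */

    curl_easy_cleanup(handle);   /* closes all still open connections */
  }
  return 0;
}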
Library Symbols
===============
All symbols used internally in libcurl must use a 'Curl_' prefix if they're
used in more than a single file. Single-file symbols must be made static.
Public ("exported") symbols must use a 'curl_' prefix. (There are exceptions,
but they are to be changed to follow this pattern in future versions.)
Return Codes and Informationals
===============================
I've made things simple. Almost every function in libcurl returns a CURLcode,
that must be CURLE_OK if everything is OK or otherwise a suitable error code
as the curl/curl.h include file defines. The very spot that detects an error
must use the Curl_failf() function to set the human-readable error
description.
In aiding the user to understand what's happening and to debug curl usage, we
must supply a fair amount of informational messages by using the Curl_infof()
function. Those messages are only displayed when the user explicitly asks for
them. They are best used when revealing information that isn't otherwise
obvious.
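On the application side, the text set with Curl_failf() typically surfaces
through the CURLOPT_ERRORBUFFER option; a minimal, hedged sketch (the URL is
deliberately bogus to force a failure):

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  char errbuf[CURL_ERROR_SIZE];
  CURL *handle = curl_easy_init();
  errbuf[0] = '\0';
  if(handle) {
    curl_easy_setopt(handle, CURLOPT_ERRORBUFFER, errbuf);  /* receives the
                                                               failf() text */
    curl_easy_setopt(handle, CURLOPT_URL, "http://nonexistent.invalid/");
    if(curl_easy_perform(handle) != CURLE_OK)
      fprintf(stderr, "curl failed: %s\n", errbuf);
    curl_easy_cleanup(handle);
  }
  return 0;
}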
Client
======
main() resides in src/main.c together with most of the client code.
src/hugehelp.c is automatically generated by the mkhelp.pl perl script to
display the complete "manual" and the src/urlglob.c file holds the functions
used for the URL-"globbing" support. Globbing in the sense that the {} and []
expansion stuff is there.
The client mostly messes around to setup its 'config' struct properly, then
it calls the curl_easy_*() functions of the library and when it gets back
control after the curl_easy_perform() it cleans up the library, checks status
and exits.
When the operation is done, the ourWriteOut() function in src/writeout.c may
be called to report about the operation. That function is using the
curl_easy_getinfo() function to extract useful information from the curl
session.
Recent versions may loop and do all this several times if many URLs were
specified on the command line or config file.
Memory Debugging
================
The file lib/memdebug.c contains debug versions of a few functions, such as
malloc, free, fopen and fclose, that deal with resources which might give us
problems if we "leak" them. The functions in the memdebug system do nothing
fancy: they do their normal job and then log information about what they
just did. The logged data can then be analyzed after a complete session.
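A stand-alone, hedged sketch of that idea (the function name and log format
here are illustrative, not the real lib/memdebug.c interface):

#include <stdio.h>
#include <stdlib.h>

static void *debug_malloc(size_t size, int line, const char *source)
{
  void *mem = malloc(size);                       /* the normal job...       */
  fprintf(stderr, "MEM %s:%d malloc(%lu) = %p\n", /* ...then log what it did */
          source, line, (unsigned long)size, mem);
  return mem;
}

int main(void)
{
  void *p = debug_malloc(64, __LINE__, __FILE__);
  free(p);
  return 0;
}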
memanalyze.pl is a perl script, present only in CVS (not part of the release
archives), that analyzes a log file generated by the memdebug
system. It detects if resources are allocated but never freed and other kinds
of errors related to resource management.
Use -DMALLOCDEBUG when compiling to enable memory debugging; this is also
switched on by running configure with --enable-debug.
Test Suite
==========
Since November 2000, a test suite has evolved. It is placed in its own
subdirectory directly off the root in the curl archive tree, and it contains
a bunch of scripts and a lot of test case data.
The main test script is runtests.pl that will invoke the two servers
httpserver.pl and ftpserver.pl before all the test cases are performed. The
test suite currently only runs on unix-like platforms.
You'll find a complete description of the test case data files in the
tests/README file.
The test suite automatically detects if curl was built with the memory
debugging enabled, and if it was it will detect memory leaks too.
Building Releases
=================
There's no magic to this. When you consider everything stable enough to be
released, run the 'maketgz' script (using 'make distcheck' will give you a
pretty good view on the status of the current sources). maketgz prompts for
version number of the client and the library before it creates a release
archive. maketgz uses 'make dist' for the actual archive building, which is
why you need to fill in the Makefile.am files properly for which files should
be included in the release archives.

60
neo/curl/docs/KNOWN_BUGS Normal file
View File

@ -0,0 +1,60 @@
These are problems known to exist at the time of this release. Feel free to
join in and help us correct one or more of these! Also be sure to check the
changelog of the current development status, as one or more of these problems
may have been fixed since this was written!
* NTLM authentication with passwords longer than 14 letters fail. There is
a known fix for this, planned to come in curl 7.11.2
* Doing resumed upload over HTTP does not work with '-C -', because curl
doesn't do a HEAD first to get the initial size. This needs to be done
manually for HTTP PUT resume to work, and then '-C [index]'.
* CURLOPT_USERPWD and CURLOPT_PROXYUSERPWD have no way of providing user names
that contain a colon. This can't be fixed easily in a backwards compatible
way without adding new options (and then, they should most probably allow
setting user name and password separately).
* libcurl ignores empty path parts in FTP URLs, whereas RFC1738 states that
such parts should be sent to the server as 'CWD ' (without an argument).
The only exception to this rule is that we knowingly break this if the
empty part is first in the path, as then we use the double slashes to
indicate that the user wants to reach the root dir (this exception SHALL
remain even when this bug is fixed).
* 1) libcurl does a POST
2) receives a 100-continue
3) sends away the POST
Now, if nothing else is returned from the server, libcurl MUST return
CURLE_GOT_NOTHING, but it seems it returns CURLE_OK as it seems to count
the 100-continue reply as a good enough reply.
* libcurl doesn't treat the content-length of compressed data properly, as
it seems HTTP servers send the *uncompressed* length in that header and
libcurl thinks of it as the *compressed* length. Some explanations are here:
http://curl.haxx.se/mail/lib-2003-06/0146.html
* Downloading 0 (zero) bytes files over FTP will not create a zero byte file
locally, which is because libcurl doesn't call the write callback with zero
bytes. Explained here: http://curl.haxx.se/mail/archive-2003-04/0143.html
* Using CURLOPT_FAILONERROR (-f/--fail) will make authentication stop
working if you use anything but plain Basic auth.
* IPv6 support on AIX 4.3.3 doesn't work due to a missing sockaddr_storage
struct. It has been reported to work on AIX 5.1 though.
* GOPHER transfers seem broken
* configure --disable-http is not fully supported. All other protocols seem
to be possible to disable.
* If a HTTP server responds to a HEAD request and includes a body (thus
violating RFC2616), curl won't wait to read the response but will just stop
reading and return. If a second request (let's assume a GET) is then
immediately made to the same server again, the connection will of course be
re-used fine, and the second request will be sent off, but when the
response is to be read, curl will read the previous response body instead,
and havoc is what happens.
More details on this is found in this libcurl mailing list thread:
http://curl.haxx.se/mail/lib-2002-08/0000.html

877
neo/curl/docs/MANUAL Normal file
View File

@ -0,0 +1,877 @@
LATEST VERSION
You always find news about what's going on as well as the latest versions
from the curl web pages, located at:
http://curl.haxx.se
SIMPLE USAGE
Get the main page from netscape's web-server:
curl http://www.netscape.com/
Get the README file from the user's home directory at funet's ftp-server:
curl ftp://ftp.funet.fi/README
Get a web page from a server using port 8000:
curl http://www.weirdserver.com:8000/
Get a list of a directory of an FTP site:
curl ftp://cool.haxx.se/
Get a gopher document from funet's gopher server:
curl gopher://gopher.funet.fi
Get the definition of curl from a dictionary:
curl dict://dict.org/m:curl
Fetch two documents at once:
curl ftp://cool.haxx.se/ http://www.weirdserver.com:8000/
DOWNLOAD TO A FILE
Get a web page and store in a local file:
curl -o thatpage.html http://www.netscape.com/
Get a web page and store in a local file, make the local file get the name
of the remote document (if no file name part is specified in the URL, this
will fail):
curl -O http://www.netscape.com/index.html
Fetch two files and store them with their remote names:
curl -O www.haxx.se/index.html -O curl.haxx.se/download.html
USING PASSWORDS
FTP
To ftp files using name+passwd, include them in the URL like:
curl ftp://name:passwd@machine.domain:port/full/path/to/file
or specify them with the -u flag like
curl -u name:passwd ftp://machine.domain:port/full/path/to/file
HTTP
The HTTP URL doesn't support user and password in the URL string. Curl
does support that anyway to provide an ftp-style interface and thus you can
pick a file like:
curl http://name:passwd@machine.domain/full/path/to/file
or specify user and password separately like in
curl -u name:passwd http://machine.domain/full/path/to/file
NOTE! Since HTTP URLs don't support user and password, you can't use that
style when using Curl via a proxy. You _must_ use the -u style fetch
in such circumstances.
HTTPS
Probably most commonly used with private certificates, as explained below.
GOPHER
Curl features no password support for gopher.
PROXY
Get an ftp file using a proxy named my-proxy that uses port 888:
curl -x my-proxy:888 ftp://ftp.leachsite.com/README
Get a file from a HTTP server that requires user and password, using the
same proxy as above:
curl -u user:passwd -x my-proxy:888 http://www.get.this/
Some proxies require special authentication. Specify by using -U as above:
curl -U user:passwd -x my-proxy:888 http://www.get.this/
See also the environment variables Curl supports that offer further proxy
control.
RANGES
With HTTP 1.1 byte-ranges were introduced. Using this, a client can request
to get only one or more subparts of a specified document. Curl supports
this with the -r flag.
Get the first 100 bytes of a document:
curl -r 0-99 http://www.get.this/
Get the last 500 bytes of a document:
curl -r -500 http://www.get.this/
Curl also supports simple ranges for FTP files. You can then only specify
the start and stop position.
Get the first 100 bytes of a document using FTP:
curl -r 0-99 ftp://www.get.this/README
UPLOADING
FTP
Upload all data on stdin to a specified ftp site:
curl -T - ftp://ftp.upload.com/myfile
Upload data from a specified file, login with user and password:
curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile
Upload a local file to the remote site, and use the local file name remote
too:
curl -T uploadfile -u user:passwd ftp://ftp.upload.com/
Upload a local file to get appended to the remote file using ftp:
curl -T localfile -a ftp://ftp.upload.com/remotefile
Curl also supports ftp upload through a proxy, but only if the proxy is
configured to allow that kind of tunneling. If it does, you can run curl in
a fashion similar to:
curl --proxytunnel -x proxy:port -T localfile ftp.upload.com
HTTP
Upload all data on stdin to a specified http site:
curl -T - http://www.upload.com/myfile
Note that the http server must've been configured to accept PUT before this
can be done successfully.
For other ways to do http data upload, see the POST section below.
VERBOSE / DEBUG
If curl fails where it isn't supposed to, if the servers don't let you in,
if you can't understand the responses: use the -v flag to get verbose
fetching. Curl will output lots of info and what it sends and receives in
order to let the user see all client-server interaction (but it won't show
you the actual data).
curl -v ftp://ftp.upload.com/
To get even more details and information on what curl does, try using the
--trace or --trace-ascii options with a given file name to log to, like
this:
curl --trace trace.txt www.haxx.se
DETAILED INFORMATION
Different protocols provide different ways of getting detailed information
about specific files/documents. To get curl to show detailed information
about a single file, you should use -I/--head option. It displays all
available info on a single file for HTTP and FTP. The HTTP information is a
lot more extensive.
For HTTP, you can get the header information (the same as -I would show)
shown before the data by using -i/--include. Curl understands the
-D/--dump-header option when getting files from both FTP and HTTP, and it
will then store the headers in the specified file.
Store the HTTP headers in a separate file (headers.txt in the example):
curl --dump-header headers.txt curl.haxx.se
Note that headers stored in a separate file can be very useful at a later
time if you want curl to use cookies sent by the server. More about that in
the cookies section.
POST (HTTP)
It's easy to post data using curl. This is done using the -d <data>
option. The post data must be urlencoded.
Post a simple "name" and "phone" guestbook entry:
curl -d "name=Rafael%20Sagula&phone=3320780" \
http://www.where.com/guest.cgi
How to post a form with curl, lesson #1:
Dig out all the <input> tags in the form that you want to fill in. (There's
a perl program called formfind.pl on the curl site that helps with this).
If there's a "normal" post, you use -d to post. -d takes a full "post
string", which is in the format
<variable1>=<data1>&<variable2>=<data2>&...
The 'variable' names are the names set with "name=" in the <input> tags, and
the data is the contents you want to fill in for the inputs. The data *must*
be properly URL encoded. That means you replace space with + and that you
write weird letters with %XX where XX is the hexadecimal representation of
the letter's ASCII code.
Example:
(page located at http://www.formpost.com/getthis/)
<form action="post.cgi" method="post">
<input name=user size=10>
<input name=pass type=password size=10>
<input name=id type=hidden value="blablabla">
<input name=ding value="submit">
</form>
We want to enter user 'foobar' with password '12345'.
To post to this, you enter a curl command line like:
curl -d "user=foobar&pass=12345&id=blablabla&ding=submit" (continues)
http://www.formpost.com/getthis/post.cgi
While -d uses the application/x-www-form-urlencoded mime-type, generally
understood by CGI's and similar, curl also supports the more capable
multipart/form-data type. This latter type supports things like file upload.
-F accepts parameters like -F "name=contents". If you want the contents to
be read from a file, use <@filename> as contents. When specifying a file,
you can also specify the file content type by appending ';type=<mime type>'
to the file name. You can also post the contents of several files in one
field. For example, the field name 'coolfiles' is used to send three files,
with different content types using the following syntax:
curl -F "coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html" \
http://www.post.com/postit.cgi
If the content-type is not specified, curl will try to guess from the file
extension (it only knows a few), or use the previously specified type (from
an earlier file if several files are specified in a list) or else it will
use the default type 'text/plain'.
Emulate a fill-in form with -F. Let's say you fill in three fields in a
form. One field is a file name which to post, one field is your name and one
field is a file description. We want to post the file we have written named
"cooltext.txt". To let curl do the posting of this data instead of your
favourite browser, you have to read the HTML source of the form page and
find the names of the input fields. In our example, the input field names
are 'file', 'yourname' and 'filedescription'.
curl -F "file=@cooltext.txt" -F "yourname=Daniel" \
-F "filedescription=Cool text file with cool text inside" \
http://www.post.com/postit.cgi
To send two files in one post you can do it in two ways:
1. Send multiple files in a single "field" with a single field name:
curl -F "pictures=@dog.gif,cat.gif"
2. Send two fields with two field names:
curl -F "docpicture=@dog.gif" -F "catpicture=@cat.gif"
REFERRER
A HTTP request has the option to include information about which address
referred it to the actual page. Curl allows you to specify the
referrer to be used on the command line. It is especially useful to
fool or trick stupid servers or CGI scripts that rely on that information
being available or containing certain data.
curl -e www.coolsite.com http://www.showme.com/
NOTE: The referer field is defined in the HTTP spec to be a full URL.
USER AGENT
A HTTP request has the option to include information about the browser
that generated the request. Curl allows it to be specified on the command
line. It is especially useful to fool or trick stupid servers or CGI
scripts that only accept certain browsers.
Example:
curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/
Other common strings:
'Mozilla/3.0 (Win95; I)' Netscape Version 3 for Windows 95
'Mozilla/3.04 (Win95; U)' Netscape Version 3 for Windows 95
'Mozilla/2.02 (OS/2; U)' Netscape Version 2 for OS/2
'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)' NS for AIX
'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)' NS for Linux
Note that Internet Explorer tries hard to be compatible in every way:
'Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)' MSIE for W95
Mozilla is not the only possible User-Agent name:
'Konqueror/1.0' KDE File Manager desktop client
'Lynx/2.7.1 libwww-FM/2.14' Lynx command line browser
COOKIES
Cookies are generally used by web servers to keep state information at the
client's side. The server sets cookies by sending a response line in the
headers that looks like 'Set-Cookie: <data>' where the data part then
typically contains a set of NAME=VALUE pairs (separated by semicolons ';'
like "NAME1=VALUE1; NAME2=VALUE2;"). The server can also specify for what
path the "cookie" should be used for (by specifying "path=value"), when the
cookie should expire ("expire=DATE"), for what domain to use it
("domain=NAME") and if it should be used on secure connections only
("secure").
If you've received a page from a server that contains a header like:
Set-Cookie: sessionid=boo123; path="/foo";
it means the server wants that first pair passed on when we get anything in
a path beginning with "/foo".
Example, get a page that wants my name passed in a cookie:
curl -b "name=Daniel" www.sillypage.com
Curl also has the ability to use previously received cookies in following
sessions. If you get cookies from a server and store them in a file in a
manner similar to:
curl --dump-header headers www.example.com
... you can then in a second connect to that (or another) site, use the
cookies from the 'headers' file like:
curl -b headers www.example.com
While saving headers to a file is a working way to store cookies, it is
however error-prone and not the preferred way to do this. Instead, make curl
save the incoming cookies using the well-known netscape cookie format like
this:
curl -c cookies.txt www.example.com
Note that by specifying -b you enable the "cookie awareness" and with -L
you can make curl follow a location: (which often is used in combination
with cookies). So that if a site sends cookies and a location, you can
use a non-existing file to trigger the cookie awareness like:
curl -L -b empty.txt www.example.com
The file to read cookies from must be formatted using plain HTTP headers OR
as netscape's cookie file. Curl will determine what kind it is based on the
file contents. In the above command, curl will parse the header and store
the cookies received from www.example.com. curl will send to the server the
stored cookies which match the request as it follows the location. The
file "empty.txt" may be a non-existant file.
Thus, to both read and write cookies from a netscape cookie file, you can
set both -b and -c to use the same file:
curl -b cookies.txt -c cookies.txt www.example.com
PROGRESS METER
The progress meter exists to show a user that something actually is
happening. The different fields in the output have the following meaning:
% Total % Received % Xferd Average Speed Time Curr.
Dload Upload Total Current Left Speed
0 151M 0 38608 0 0 9406 0 4:41:43 0:00:04 4:41:39 9287
From left-to-right:
% - percentage completed of the whole transfer
Total - total size of the whole expected transfer
% - percentage completed of the download
Received - currently downloaded amount of bytes
% - percentage completed of the upload
Xferd - currently uploaded amount of bytes
Average Speed
Dload - the average transfer speed of the download
Average Speed
Upload - the average transfer speed of the upload
Time Total - expected time to complete the operation
Time Current - time passed since the invocation
Time Left - expected time left to completion
Curr.Speed - the average transfer speed over the last 5 seconds (the first
5 seconds of a transfer are based on less time, of course.)
The -# option will display a totally different progress bar that doesn't
need much explanation!
SPEED LIMIT
Curl allows the user to set the transfer speed conditions that must be met
to let the transfer keep going. By using the -y and -Y switches you
can make curl abort transfers if the transfer speed is below the specified
lowest limit for a specified time.
To have curl abort the download if the speed is slower than 3000 bytes per
second for 1 minute, run:
curl -Y 3000 -y 60 www.far-away-site.com
This can very well be used in combination with the overall time limit, so
that the above operation must be completed in whole within 30 minutes:
curl -m 1800 -Y 3000 -y 60 www.far-away-site.com
Forcing curl not to transfer data faster than a given rate is also possible,
which might be useful if you're using a limited bandwidth connection and you
don't want your transfer to use all of it (sometimes referred to as
"bandwith throttle").
Make curl transfer data no faster than 10 kilobytes per second:
curl --limit-rate 10K www.far-away-site.com
or
curl --limit-rate 10240 www.far-away-site.com
Or prevent curl from uploading data faster than 1 megabyte per second:
curl -T upload --limit-rate 1M ftp://uploadshereplease.com
When using the --limit-rate option, the transfer rate is regulated on a
per-second basis, which will cause the total transfer speed to become lower
than the given number, sometimes of course substantially lower if your
transfer stalls for periods of time.
CONFIG FILE
Curl automatically tries to read the .curlrc file (or _curlrc file on win32
systems) from the user's home dir on startup.
The config file could be made up with normal command line switches, but you
can also specify the long options without the dashes to make it more
readable. You can separate the options and the parameter with spaces, or
with = or :. Comments can be used within the file. If the first letter on a
line is a '#'-letter the rest of the line is treated as a comment.
If you want the parameter to contain spaces, you must enclose the entire
parameter within double quotes ("). Within those quotes, you specify a
quote as \".
NOTE: You must specify options and their arguments on the same line.
Example, set default time out and proxy in a config file:
# We want a 30 minute timeout:
-m 1800
# ... and we use a proxy for all accesses:
proxy = proxy.our.domain.com:8080
White spaces ARE significant at the end of lines, but all white spaces
leading up to the first characters of each line are ignored.
Prevent curl from reading the default file by using -q as the first command
line parameter, like:
curl -q www.thatsite.com
Force curl to get and display a local help page in case it is invoked
without a URL by making a config file similar to:
# default url to get
url = "http://help.with.curl.com/curlhelp.html"
You can specify another config file to be read by using the -K/--config
flag. If you set config file name to "-" it'll read the config from stdin,
which can be handy if you want to hide options from being visible in process
tables etc:
echo "user = user:passwd" | curl -K - http://that.secret.site.com
EXTRA HEADERS
When using curl in your own very special programs, you may end up needing
to pass on your own custom headers when getting a web page. You can do
this by using the -H flag.
Example, send the header "X-you-and-me: yes" to the server when getting a
page:
curl -H "X-you-and-me: yes" www.love.com
This can also be useful in case you want curl to send a different text in a
header than it normally does. The -H header you specify then replaces the
header curl would normally send. If you replace an internal header with an
empty one, you prevent that header from being sent. To prevent the Host:
header from being used:
curl -H "Host:" www.server.com
FTP and PATH NAMES
Do note that when getting files with the ftp:// URL, the given path is
relative to the directory you enter. To get the file 'README' from your home
directory at your ftp site, do:
curl ftp://user:passwd@my.site.com/README
But if you want the README file from the root directory of that very same
site, you need to specify the absolute file name:
curl ftp://user:passwd@my.site.com//README
(I.e. with an extra slash in front of the file name.)
FTP and firewalls
The FTP protocol requires one of the involved parties to open a second
connection as soon as data is about to get transferred. There are two ways to
do this.
The default way for curl is to issue the PASV command which causes the
server to open another port and await another connection performed by the
client. This is good if the client is behind a firewall that doesn't allow
incoming connections.
curl ftp.download.com
If the server, for example, is behind a firewall that doesn't allow
connections on ports other than 21 (or if it just doesn't support the PASV
command), the
other way to do it is to use the PORT command and instruct the server to
connect to the client on the given (as parameters to the PORT command) IP
number and port.
The -P flag to curl supports a few different options. Your machine may have
several IP-addresses and/or network interfaces and curl allows you to select
which of them to use. Default address can also be used:
curl -P - ftp.download.com
Download with PORT but use the IP address of our 'le0' interface (this does
not work on windows):
curl -P le0 ftp.download.com
Download with PORT but use 192.168.0.10 as our IP address to use:
curl -P 192.168.0.10 ftp.download.com
NETWORK INTERFACE
Get a web page from a server using a specified port for the interface:
curl --interface eth0:1 http://www.netscape.com/
or
curl --interface 192.168.1.10 http://www.netscape.com/
HTTPS
Secure HTTP requires SSL libraries to be installed and used when curl is
built. If that is done, curl is capable of retrieving and posting documents
using the HTTPS protocol.
Example:
curl https://www.secure-site.com
Curl is also capable of using your personal certificates to get/post files
from sites that require valid certificates. The only drawback is that the
certificate needs to be in PEM-format. PEM is a standard and open format to
store certificates with, but it is not used by the most commonly used
browsers (Netscape and MSIE both use the so called PKCS#12 format). If you
want curl to use the certificates you use with your (favourite) browser, you
may need to download/compile a converter that can convert your browser's
formatted certificates to PEM formatted ones. This kind of converter is
included in recent versions of OpenSSL, and for older versions Dr Stephen
N. Henson has written a patch for SSLeay that adds this functionality. You
can get his patch (that requires an SSLeay installation) from his site at:
http://www.drh-consultancy.demon.co.uk/
Example on how to automatically retrieve a document using a certificate with
a personal password:
curl -E /path/to/cert.pem:password https://secure.site.com/
If you neglect to specify the password on the command line, you will be
prompted for the correct password before any data can be received.
Many older SSL servers have problems with SSLv3 or TLS, which newer versions
of OpenSSL etc. use; therefore it is sometimes useful to specify what
SSL version curl should use. Use -3, -2 or -1 to specify the exact SSL
version to use (for SSLv3, SSLv2 or TLSv1 respectively):
curl -2 https://secure.site.com/
Otherwise, curl will first attempt to use v3 and then v2.
To use OpenSSL to convert your favourite browser's certificate into a PEM
formatted one that curl can use, do something like this (assuming netscape,
but IE is likely to work similarly):
You start with hitting the 'security' menu button in netscape.
Select 'certificates->yours' and then pick a certificate in the list
Press the 'export' button
enter your PIN code for the certs
select a proper place to save it
Run the 'openssl' application to convert the certificate. If you cd to the
openssl installation, you can do it like:
# ./apps/openssl pkcs12 -in [file you saved] -clcerts -out [PEMfile]
RESUMING FILE TRANSFERS
To continue a file transfer where it was previously aborted, curl supports
resume on http(s) downloads as well as ftp uploads and downloads.
Continue downloading a document:
curl -C - -o file ftp://ftp.server.com/path/file
Continue uploading a document(*1):
curl -C - -T file ftp://ftp.server.com/path/file
Continue downloading a document from a web server(*2):
curl -C - -o file http://www.server.com/
(*1) = This requires that the ftp server supports the non-standard command
SIZE. If it doesn't, curl will say so.
(*2) = This requires that the web server supports at least HTTP/1.1. If it
doesn't, curl will say so.
TIME CONDITIONS
HTTP allows a client to specify a time condition for the document it
requests. It is either If-Modified-Since or If-Unmodified-Since. Curl allows you to
specify them with the -z/--time-cond flag.
For example, you can easily make a download that only gets performed if the
remote file is newer than a local copy. It would be made like:
curl -z local.html http://remote.server.com/remote.html
Or you can download a file only if the local file is newer than the remote
one. Do this by prepending the date string with a '-', as in:
curl -z -local.html http://remote.server.com/remote.html
You can specify a "free text" date as condition. Tell curl to only download
the file if it was updated since yesterday:
curl -z yesterday http://remote.server.com/remote.html
Curl will then accept a wide range of date formats. You can always make the date
check the other way around by prepending it with a dash '-'.
DICT
For fun try
curl dict://dict.org/m:curl
curl dict://dict.org/d:heisenbug:jargon
curl dict://dict.org/d:daniel:web1913
Aliases for 'm' are 'match' and 'find', and aliases for 'd' are 'define'
and 'lookup'. For example,
curl dict://dict.org/find:curl
Commands that break the URL description of the RFC (but not the DICT
protocol) are
curl dict://dict.org/show:db
curl dict://dict.org/show:strat
Authentication is still missing (but this is not required by the RFC)
LDAP
If you have installed the OpenLDAP library, curl can take advantage of it
and offer ldap:// support.
LDAP is a complex thing and writing an LDAP query is not an easy task. I do
advise you to dig up the syntax description for that elsewhere. Two places
that might suit you are:
Netscape's "Netscape Directory SDK 3.0 for C Programmer's Guide Chapter 10:
Working with LDAP URLs":
http://developer.netscape.com/docs/manuals/dirsdk/csdk30/url.htm
RFC 2255, "The LDAP URL Format" http://www.rfc-editor.org/rfc/rfc2255.txt
To show you an example, this is how I can get all people from my local LDAP
server that have a certain sub-domain in their email address:
curl -B "ldap://ldap.frontec.se/o=frontec??sub?mail=*sth.frontec.se"
If I want the same info in HTML format, I can get it by not using the -B
(enforce ASCII) flag.
ENVIRONMENT VARIABLES
Curl reads and understands the following environment variables:
http_proxy, HTTPS_PROXY, FTP_PROXY, GOPHER_PROXY
They should be set for protocol-specific proxies. General proxy should be
set with
ALL_PROXY
A comma-separated list of host names that shouldn't go through any proxy is
set in (only an asterisk, '*' matches all hosts)
NO_PROXY
If a tail substring of the domain-path for a host matches one of these
strings, transactions with that node will not be proxied.
The usage of the -x/--proxy flag overrides the environment variables.
NETRC
Unix introduced the .netrc concept a long time ago. It is a way for a user
to specify name and password for commonly visited ftp sites in a file so
that you don't have to type them in each time you visit those sites. You
realize this is a big security risk if someone else gets hold of your
passwords, so therefore most unix programs won't read this file unless it is
only readable by yourself (curl doesn't care though).
Curl supports .netrc files if told so (using the -n/--netrc and
--netrc-optional options). This is not restricted to only ftp,
but curl can use it for all protocols where authentication is used.
A very simple .netrc file could look something like:
machine curl.haxx.se login iamdaniel password mysecret
CUSTOM OUTPUT
To better allow script programmers to get to know about the progress of
curl, the -w/--write-out option was introduced. Using this, you can specify
what information from the previous transfer you want to extract.
To display the amount of bytes downloaded together with some text and an
ending newline:
curl -w 'We downloaded %{size_download} bytes\n' www.download.com
KERBEROS4 FTP TRANSFER
Curl supports kerberos4 for FTP transfers. You need the kerberos package
installed and used at curl build time for it to be used.
First, get the krb-ticket the normal way, like with the kauth tool. Then use
curl in way similar to:
curl --krb4 private ftp://krb4site.com -u username:fakepwd
There's no use for a password on the -u switch, but a blank one will make
curl ask for one, and you already entered the real password to kauth.
TELNET
The curl telnet support is basic and very easy to use. Curl passes all data
passed to it on stdin to the remote server. Connect to a remote telnet
server using a command line similar to:
curl telnet://remote.server.com
And enter the data to pass to the server on stdin. The result will be sent
to stdout or to the file you specify with -o.
You might want the -N/--no-buffer option to switch off the buffered output
for slow connections or similar.
Pass options to the telnet protocol negotiation, by using the -t option. To
tell the server we use a vt100 terminal, try something like:
curl -tTTYPE=vt100 telnet://remote.server.com
Other interesting options for -t include:
- XDISPLOC=<X display> Sets the X display location.
- NEW_ENV=<var,val> Sets an environment variable.
NOTE: the telnet protocol does not specify any way to login with a specified
user and password so curl can't do that automatically. To do that, you need
to track when the login prompt is received and send the username and
password accordingly.
PERSISTENT CONNECTIONS
Specifying multiple files on a single command line will make curl transfer
all of them, one after the other in the specified order.
libcurl will attempt to use persistent connections for the transfers so that
the second transfer to the same host can use the same connection that was
already initiated and was left open in the previous transfer. This greatly
decreases connection time for all but the first transfer and it makes a far
better use of the network.
Note that curl cannot use persistent connections for transfers that are used
in subsequent curl invocations. Try to stuff as many URLs as possible on the
same command line if they are using the same host, as that'll make the
transfers faster. If you use an http proxy for file transfers, practically
all transfers will be persistent.
MAILING LISTS
For your convenience, we have several open mailing lists to discuss curl,
its development and things relevant to this. Get all info at
http://curl.haxx.se/mail/. The lists available are:
curl-users
Users of the command line tool. How to use it, what doesn't work, new
features, related tools, questions, news, installations, compilations,
running, porting etc.
curl-library
Developers using or developing libcurl. Bugs, extensions, improvements.
curl-announce
Low-traffic. Only announcements of new public versions.
curl-and-PHP
Using the curl functions in PHP. Everything curl with a PHP angle. Or PHP
with a curl angle.
curl-commits
Receives notifications on all CVS commits done to the curl source module.
This can become quite a large amount of mails during intense development,
be aware. This is for us who like email...
curl-www-commits
Receives notifications on all CVS commits done to the curl www module
(basically the web site). This can become quite a large amount of mails
during intense changing, be aware. This is for us who like email...
Please direct curl questions, feature requests and trouble reports to one of
these mailing lists instead of mailing any individual.

49
neo/curl/docs/Makefile.am Normal file
View File

@ -0,0 +1,49 @@
#
# $Id: Makefile.am,v 1.31 2004/03/05 08:01:55 bagder Exp $
#
AUTOMAKE_OPTIONS = foreign no-dependencies
man_MANS = \
curl.1 \
curl-config.1
GENHTMLPAGES = \
curl.html \
curl-config.html
HTMLPAGES = $(GENHTMLPAGES) index.html
PDFPAGES = \
curl.pdf \
curl-config.pdf
SUBDIRS = examples libcurl
CLEANFILES = $(GENHTMLPAGES) $(PDFPAGES)
EXTRA_DIST = MANUAL BUGS CONTRIBUTE FAQ FEATURES INTERNALS SSLCERTS \
README.win32 RESOURCES TODO TheArtOfHttpScripting THANKS \
VERSIONS KNOWN_BUGS BINDINGS $(man_MANS) $(HTMLPAGES) \
HISTORY INSTALL libcurl-the-guide $(PDFPAGES)
MAN2HTML= roffit < $< >$@
SUFFIXES = .1 .html .pdf
html: $(HTMLPAGES)
cd libcurl; make html
pdf: $(PDFPAGES)
cd libcurl; make pdf
.1.html:
$(MAN2HTML)
.1.pdf:
@(foo=`echo $@ | sed -e 's/\.[0-9]$$//g'`; \
groff -Tps -man $< >$$foo.ps; \
ps2pdf $$foo.ps $@; \
rm $$foo.ps; \
echo "converted $< to $@")

588
neo/curl/docs/Makefile.in Normal file
View File

@ -0,0 +1,588 @@
# Makefile.in generated by automake 1.8.3 from Makefile.am.
# @configure_input@
# Copyright (C) 1994, 1995, 1996, 1997, 1998, 1999, 2000, 2001, 2002,
# 2003, 2004 Free Software Foundation, Inc.
# This Makefile.in is free software; the Free Software Foundation
# gives unlimited permission to copy and/or distribute it,
# with or without modifications, as long as this notice is preserved.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY, to the extent permitted by law; without
# even the implied warranty of MERCHANTABILITY or FITNESS FOR A
# PARTICULAR PURPOSE.
@SET_MAKE@
#
# $Id: Makefile.am,v 1.31 2004/03/05 08:01:55 bagder Exp $
#
srcdir = @srcdir@
top_srcdir = @top_srcdir@
VPATH = @srcdir@
pkgdatadir = $(datadir)/@PACKAGE@
pkglibdir = $(libdir)/@PACKAGE@
pkgincludedir = $(includedir)/@PACKAGE@
top_builddir = ..
am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd
INSTALL = @INSTALL@
install_sh_DATA = $(install_sh) -c -m 644
install_sh_PROGRAM = $(install_sh) -c
install_sh_SCRIPT = $(install_sh) -c
INSTALL_HEADER = $(INSTALL_DATA)
transform = $(program_transform_name)
NORMAL_INSTALL = :
PRE_INSTALL = :
POST_INSTALL = :
NORMAL_UNINSTALL = :
PRE_UNINSTALL = :
POST_UNINSTALL = :
host_triplet = @host@
subdir = docs
DIST_COMMON = $(srcdir)/Makefile.am $(srcdir)/Makefile.in INSTALL \
THANKS TODO
ACLOCAL_M4 = $(top_srcdir)/aclocal.m4
am__aclocal_m4_deps = $(top_srcdir)/acinclude.m4 \
$(top_srcdir)/configure.ac
am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \
$(ACLOCAL_M4)
mkinstalldirs = $(SHELL) $(top_srcdir)/mkinstalldirs
CONFIG_HEADER = $(top_builddir)/lib/config.h \
$(top_builddir)/src/config.h
CONFIG_CLEAN_FILES =
depcomp =
am__depfiles_maybe =
SOURCES =
DIST_SOURCES =
RECURSIVE_TARGETS = all-recursive check-recursive dvi-recursive \
html-recursive info-recursive install-data-recursive \
install-exec-recursive install-info-recursive \
install-recursive installcheck-recursive installdirs-recursive \
pdf-recursive ps-recursive uninstall-info-recursive \
uninstall-recursive
man1dir = $(mandir)/man1
am__installdirs = "$(DESTDIR)$(man1dir)"
MANS = $(man_MANS)
ETAGS = etags
CTAGS = ctags
DIST_SUBDIRS = $(SUBDIRS)
DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST)
ACLOCAL = @ACLOCAL@
AMDEP_FALSE = @AMDEP_FALSE@
AMDEP_TRUE = @AMDEP_TRUE@
AMTAR = @AMTAR@
AR = @AR@
AS = @AS@
AUTOCONF = @AUTOCONF@
AUTOHEADER = @AUTOHEADER@
AUTOMAKE = @AUTOMAKE@
AWK = @AWK@
CABUNDLE_FALSE = @CABUNDLE_FALSE@
CABUNDLE_TRUE = @CABUNDLE_TRUE@
CC = @CC@
CCDEPMODE = @CCDEPMODE@
CFLAGS = @CFLAGS@
CPP = @CPP@
CPPFLAGS = @CPPFLAGS@
CURL_CA_BUNDLE = @CURL_CA_BUNDLE@
CURL_DISABLE_DICT = @CURL_DISABLE_DICT@
CURL_DISABLE_FILE = @CURL_DISABLE_FILE@
CURL_DISABLE_FTP = @CURL_DISABLE_FTP@
CURL_DISABLE_GOPHER = @CURL_DISABLE_GOPHER@
CURL_DISABLE_HTTP = @CURL_DISABLE_HTTP@
CURL_DISABLE_LDAP = @CURL_DISABLE_LDAP@
CURL_DISABLE_TELNET = @CURL_DISABLE_TELNET@
CXX = @CXX@
CXXCPP = @CXXCPP@
CXXDEPMODE = @CXXDEPMODE@
CXXFLAGS = @CXXFLAGS@
CYGPATH_W = @CYGPATH_W@
DEFS = @DEFS@
DEPDIR = @DEPDIR@
DLLTOOL = @DLLTOOL@
ECHO = @ECHO@
ECHO_C = @ECHO_C@
ECHO_N = @ECHO_N@
ECHO_T = @ECHO_T@
EGREP = @EGREP@
EXEEXT = @EXEEXT@
F77 = @F77@
FFLAGS = @FFLAGS@
HAVE_ARES = @HAVE_ARES@
HAVE_LIBZ = @HAVE_LIBZ@
HAVE_LIBZ_FALSE = @HAVE_LIBZ_FALSE@
HAVE_LIBZ_TRUE = @HAVE_LIBZ_TRUE@
INSTALL_DATA = @INSTALL_DATA@
INSTALL_PROGRAM = @INSTALL_PROGRAM@
INSTALL_SCRIPT = @INSTALL_SCRIPT@
INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@
IPV6_ENABLED = @IPV6_ENABLED@
KRB4_ENABLED = @KRB4_ENABLED@
LDFLAGS = @LDFLAGS@
LIBOBJS = @LIBOBJS@
LIBS = @LIBS@
LIBTOOL = @LIBTOOL@
LN_S = @LN_S@
LTLIBOBJS = @LTLIBOBJS@
MAINT = @MAINT@
MAINTAINER_MODE_FALSE = @MAINTAINER_MODE_FALSE@
MAINTAINER_MODE_TRUE = @MAINTAINER_MODE_TRUE@
MAKEINFO = @MAKEINFO@
MANOPT = @MANOPT@
MIMPURE_FALSE = @MIMPURE_FALSE@
MIMPURE_TRUE = @MIMPURE_TRUE@
NO_UNDEFINED_FALSE = @NO_UNDEFINED_FALSE@
NO_UNDEFINED_TRUE = @NO_UNDEFINED_TRUE@
NROFF = @NROFF@
OBJDUMP = @OBJDUMP@
OBJEXT = @OBJEXT@
OPENSSL_ENABLED = @OPENSSL_ENABLED@
PACKAGE = @PACKAGE@
PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@
PACKAGE_NAME = @PACKAGE_NAME@
PACKAGE_STRING = @PACKAGE_STRING@
PACKAGE_TARNAME = @PACKAGE_TARNAME@
PACKAGE_VERSION = @PACKAGE_VERSION@
PATH_SEPARATOR = @PATH_SEPARATOR@
PERL = @PERL@
PKGADD_NAME = @PKGADD_NAME@
PKGADD_PKG = @PKGADD_PKG@
PKGADD_VENDOR = @PKGADD_VENDOR@
PKGCONFIG = @PKGCONFIG@
RANDOM_FILE = @RANDOM_FILE@
RANLIB = @RANLIB@
SED = @SED@
SET_MAKE = @SET_MAKE@
SHELL = @SHELL@
STRIP = @STRIP@
USE_MANUAL_FALSE = @USE_MANUAL_FALSE@
USE_MANUAL_TRUE = @USE_MANUAL_TRUE@
VERSION = @VERSION@
VERSIONNUM = @VERSIONNUM@
YACC = @YACC@
ac_ct_AR = @ac_ct_AR@
ac_ct_AS = @ac_ct_AS@
ac_ct_CC = @ac_ct_CC@
ac_ct_CXX = @ac_ct_CXX@
ac_ct_DLLTOOL = @ac_ct_DLLTOOL@
ac_ct_F77 = @ac_ct_F77@
ac_ct_OBJDUMP = @ac_ct_OBJDUMP@
ac_ct_RANLIB = @ac_ct_RANLIB@
ac_ct_STRIP = @ac_ct_STRIP@
am__fastdepCC_FALSE = @am__fastdepCC_FALSE@
am__fastdepCC_TRUE = @am__fastdepCC_TRUE@
am__fastdepCXX_FALSE = @am__fastdepCXX_FALSE@
am__fastdepCXX_TRUE = @am__fastdepCXX_TRUE@
am__include = @am__include@
am__leading_dot = @am__leading_dot@
am__quote = @am__quote@
bindir = @bindir@
build = @build@
build_alias = @build_alias@
build_cpu = @build_cpu@
build_os = @build_os@
build_vendor = @build_vendor@
datadir = @datadir@
exec_prefix = @exec_prefix@
host = @host@
host_alias = @host_alias@
host_cpu = @host_cpu@
host_os = @host_os@
host_vendor = @host_vendor@
includedir = @includedir@
infodir = @infodir@
install_sh = @install_sh@
libdir = @libdir@
libexecdir = @libexecdir@
localstatedir = @localstatedir@
mandir = @mandir@
mkdir_p = @mkdir_p@
oldincludedir = @oldincludedir@
prefix = @prefix@
program_transform_name = @program_transform_name@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
subdirs = @subdirs@
sysconfdir = @sysconfdir@
target_alias = @target_alias@
AUTOMAKE_OPTIONS = foreign no-dependencies
man_MANS = \
curl.1 \
curl-config.1
GENHTMLPAGES = \
curl.html \
curl-config.html
HTMLPAGES = $(GENHTMLPAGES) index.html
PDFPAGES = \
curl.pdf \
curl-config.pdf
SUBDIRS = examples libcurl
CLEANFILES = $(GENHTMLPAGES) $(PDFPAGES)
EXTRA_DIST = MANUAL BUGS CONTRIBUTE FAQ FEATURES INTERNALS SSLCERTS \
README.win32 RESOURCES TODO TheArtOfHttpScripting THANKS \
VERSIONS KNOWN_BUGS BINDINGS $(man_MANS) $(HTMLPAGES) \
HISTORY INSTALL libcurl-the-guide $(PDFPAGES)
MAN2HTML = roffit < $< >$@
SUFFIXES = .1 .html .pdf
all: all-recursive
.SUFFIXES:
.SUFFIXES: .1 .html .pdf
$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps)
@for dep in $?; do \
case '$(am__configure_deps)' in \
*$$dep*) \
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh \
&& exit 0; \
exit 1;; \
esac; \
done; \
echo ' cd $(top_srcdir) && $(AUTOMAKE) --foreign docs/Makefile'; \
cd $(top_srcdir) && \
$(AUTOMAKE) --foreign docs/Makefile
.PRECIOUS: Makefile
Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status
@case '$?' in \
*config.status*) \
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \
*) \
echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \
cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \
esac;
$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES)
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps)
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps)
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
mostlyclean-libtool:
-rm -f *.lo
clean-libtool:
-rm -rf .libs _libs
distclean-libtool:
-rm -f libtool
uninstall-info-am:
install-man1: $(man1_MANS) $(man_MANS)
@$(NORMAL_INSTALL)
test -z "$(man1dir)" || $(mkdir_p) "$(DESTDIR)$(man1dir)"
@list='$(man1_MANS) $(dist_man1_MANS) $(nodist_man1_MANS)'; \
l2='$(man_MANS) $(dist_man_MANS) $(nodist_man_MANS)'; \
for i in $$l2; do \
case "$$i" in \
*.1*) list="$$list $$i" ;; \
esac; \
done; \
for i in $$list; do \
if test -f $(srcdir)/$$i; then file=$(srcdir)/$$i; \
else file=$$i; fi; \
ext=`echo $$i | sed -e 's/^.*\\.//'`; \
case "$$ext" in \
1*) ;; \
*) ext='1' ;; \
esac; \
inst=`echo $$i | sed -e 's/\\.[0-9a-z]*$$//'`; \
inst=`echo $$inst | sed -e 's/^.*\///'`; \
inst=`echo $$inst | sed '$(transform)'`.$$ext; \
echo " $(INSTALL_DATA) '$$file' '$(DESTDIR)$(man1dir)/$$inst'"; \
$(INSTALL_DATA) "$$file" "$(DESTDIR)$(man1dir)/$$inst"; \
done
uninstall-man1:
@$(NORMAL_UNINSTALL)
@list='$(man1_MANS) $(dist_man1_MANS) $(nodist_man1_MANS)'; \
l2='$(man_MANS) $(dist_man_MANS) $(nodist_man_MANS)'; \
for i in $$l2; do \
case "$$i" in \
*.1*) list="$$list $$i" ;; \
esac; \
done; \
for i in $$list; do \
ext=`echo $$i | sed -e 's/^.*\\.//'`; \
case "$$ext" in \
1*) ;; \
*) ext='1' ;; \
esac; \
inst=`echo $$i | sed -e 's/\\.[0-9a-z]*$$//'`; \
inst=`echo $$inst | sed -e 's/^.*\///'`; \
inst=`echo $$inst | sed '$(transform)'`.$$ext; \
echo " rm -f '$(DESTDIR)$(man1dir)/$$inst'"; \
rm -f "$(DESTDIR)$(man1dir)/$$inst"; \
done
# This directory's subdirectories are mostly independent; you can cd
# into them and run `make' without going through this Makefile.
# To change the values of `make' variables: instead of editing Makefiles,
# (1) if the variable is set in `config.status', edit `config.status'
# (which will cause the Makefiles to be regenerated when you run `make');
# (2) otherwise, pass the desired values on the `make' command line.
$(RECURSIVE_TARGETS):
@set fnord $$MAKEFLAGS; amf=$$2; \
dot_seen=no; \
target=`echo $@ | sed s/-recursive//`; \
list='$(SUBDIRS)'; for subdir in $$list; do \
echo "Making $$target in $$subdir"; \
if test "$$subdir" = "."; then \
dot_seen=yes; \
local_target="$$target-am"; \
else \
local_target="$$target"; \
fi; \
(cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \
|| case "$$amf" in *=*) exit 1;; *k*) fail=yes;; *) exit 1;; esac; \
done; \
if test "$$dot_seen" = "no"; then \
$(MAKE) $(AM_MAKEFLAGS) "$$target-am" || exit 1; \
fi; test -z "$$fail"
mostlyclean-recursive clean-recursive distclean-recursive \
maintainer-clean-recursive:
@set fnord $$MAKEFLAGS; amf=$$2; \
dot_seen=no; \
case "$@" in \
distclean-* | maintainer-clean-*) list='$(DIST_SUBDIRS)' ;; \
*) list='$(SUBDIRS)' ;; \
esac; \
rev=''; for subdir in $$list; do \
if test "$$subdir" = "."; then :; else \
rev="$$subdir $$rev"; \
fi; \
done; \
rev="$$rev ."; \
target=`echo $@ | sed s/-recursive//`; \
for subdir in $$rev; do \
echo "Making $$target in $$subdir"; \
if test "$$subdir" = "."; then \
local_target="$$target-am"; \
else \
local_target="$$target"; \
fi; \
(cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \
|| case "$$amf" in *=*) exit 1;; *k*) fail=yes;; *) exit 1;; esac; \
done && test -z "$$fail"
tags-recursive:
list='$(SUBDIRS)'; for subdir in $$list; do \
test "$$subdir" = . || (cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) tags); \
done
ctags-recursive:
list='$(SUBDIRS)'; for subdir in $$list; do \
test "$$subdir" = . || (cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) ctags); \
done
ID: $(HEADERS) $(SOURCES) $(LISP) $(TAGS_FILES)
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) ' { files[$$0] = 1; } \
END { for (i in files) print i; }'`; \
mkid -fID $$unique
tags: TAGS
TAGS: tags-recursive $(HEADERS) $(SOURCES) $(TAGS_DEPENDENCIES) \
$(TAGS_FILES) $(LISP)
tags=; \
here=`pwd`; \
if (etags --etags-include --version) >/dev/null 2>&1; then \
include_option=--etags-include; \
else \
include_option=--include; \
fi; \
list='$(SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
test -f $$subdir/TAGS && \
tags="$$tags $$include_option=$$here/$$subdir/TAGS"; \
fi; \
done; \
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) ' { files[$$0] = 1; } \
END { for (i in files) print i; }'`; \
test -z "$(ETAGS_ARGS)$$tags$$unique" \
|| $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \
$$tags $$unique
ctags: CTAGS
CTAGS: ctags-recursive $(HEADERS) $(SOURCES) $(TAGS_DEPENDENCIES) \
$(TAGS_FILES) $(LISP)
tags=; \
here=`pwd`; \
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) ' { files[$$0] = 1; } \
END { for (i in files) print i; }'`; \
test -z "$(CTAGS_ARGS)$$tags$$unique" \
|| $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \
$$tags $$unique
GTAGS:
here=`$(am__cd) $(top_builddir) && pwd` \
&& cd $(top_srcdir) \
&& gtags -i $(GTAGS_ARGS) $$here
distclean-tags:
-rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags
distdir: $(DISTFILES)
@srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`; \
topsrcdirstrip=`echo "$(top_srcdir)" | sed 's|.|.|g'`; \
list='$(DISTFILES)'; for file in $$list; do \
case $$file in \
$(srcdir)/*) file=`echo "$$file" | sed "s|^$$srcdirstrip/||"`;; \
$(top_srcdir)/*) file=`echo "$$file" | sed "s|^$$topsrcdirstrip/|$(top_builddir)/|"`;; \
esac; \
if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \
dir=`echo "$$file" | sed -e 's,/[^/]*$$,,'`; \
if test "$$dir" != "$$file" && test "$$dir" != "."; then \
dir="/$$dir"; \
$(mkdir_p) "$(distdir)$$dir"; \
else \
dir=''; \
fi; \
if test -d $$d/$$file; then \
if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \
cp -pR $(srcdir)/$$file $(distdir)$$dir || exit 1; \
fi; \
cp -pR $$d/$$file $(distdir)$$dir || exit 1; \
else \
test -f $(distdir)/$$file \
|| cp -p $$d/$$file $(distdir)/$$file \
|| exit 1; \
fi; \
done
list='$(SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
test -d "$(distdir)/$$subdir" \
|| mkdir "$(distdir)/$$subdir" \
|| exit 1; \
(cd $$subdir && \
$(MAKE) $(AM_MAKEFLAGS) \
top_distdir="../$(top_distdir)" \
distdir="../$(distdir)/$$subdir" \
distdir) \
|| exit 1; \
fi; \
done
check-am: all-am
check: check-recursive
all-am: Makefile $(MANS)
installdirs: installdirs-recursive
installdirs-am:
for dir in "$(DESTDIR)$(man1dir)"; do \
test -z "$$dir" || $(mkdir_p) "$$dir"; \
done
install: install-recursive
install-exec: install-exec-recursive
install-data: install-data-recursive
uninstall: uninstall-recursive
install-am: all-am
@$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am
installcheck: installcheck-recursive
install-strip:
$(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
`test -z '$(STRIP)' || \
echo "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'"` install
mostlyclean-generic:
clean-generic:
-test -z "$(CLEANFILES)" || rm -f $(CLEANFILES)
distclean-generic:
-rm -f $(CONFIG_CLEAN_FILES)
maintainer-clean-generic:
@echo "This command is intended for maintainers to use"
@echo "it deletes files that may require special tools to rebuild."
clean: clean-recursive
clean-am: clean-generic clean-libtool mostlyclean-am
distclean: distclean-recursive
-rm -f Makefile
distclean-am: clean-am distclean-generic distclean-libtool \
distclean-tags
dvi: dvi-recursive
dvi-am:
info: info-recursive
info-am:
install-data-am: install-man
install-exec-am:
install-info: install-info-recursive
install-man: install-man1
installcheck-am:
maintainer-clean: maintainer-clean-recursive
-rm -f Makefile
maintainer-clean-am: distclean-am maintainer-clean-generic
mostlyclean: mostlyclean-recursive
mostlyclean-am: mostlyclean-generic mostlyclean-libtool
pdf-am:
ps: ps-recursive
ps-am:
uninstall-am: uninstall-info-am uninstall-man
uninstall-info: uninstall-info-recursive
uninstall-man: uninstall-man1
.PHONY: $(RECURSIVE_TARGETS) CTAGS GTAGS all all-am check check-am \
clean clean-generic clean-libtool clean-recursive ctags \
ctags-recursive distclean distclean-generic distclean-libtool \
distclean-recursive distclean-tags distdir dvi dvi-am html \
html-am info info-am install install-am install-data \
install-data-am install-exec install-exec-am install-info \
install-info-am install-man install-man1 install-strip \
installcheck installcheck-am installdirs installdirs-am \
maintainer-clean maintainer-clean-generic \
maintainer-clean-recursive mostlyclean mostlyclean-generic \
mostlyclean-libtool mostlyclean-recursive pdf pdf-am ps ps-am \
tags tags-recursive uninstall uninstall-am uninstall-info-am \
uninstall-man uninstall-man1
html: $(HTMLPAGES)
cd libcurl; make html
pdf: $(PDFPAGES)
cd libcurl; make pdf
.1.html:
$(MAN2HTML)
.1.pdf:
@(foo=`echo $@ | sed -e 's/\.[0-9]$$//g'`; \
groff -Tps -man $< >$$foo.ps; \
ps2pdf $$foo.ps $@; \
rm $$foo.ps; \
echo "converted $< to $@")
# Tell versions [3.59,3.63) of GNU make to not export all variables.
# Otherwise a system limit (for SysV at least) may be exceeded.
.NOEXPORT:

22
neo/curl/docs/README.win32 Normal file
View File

@ -0,0 +1,22 @@
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
README.win32
Read the README file first.
Curl has been compiled, built and run on all sorts of Windows and win32
systems. While it is not the main development target, a fair share of curl
users are win32-based.
The unix-style man pages are tricky to read on Windows, so all of those pages
are also converted to HTML as well as PDF and included in the release
archives.
The main curl.1 man page is also "built-in" in the command line tool. Use a
command line similar to this in order to extract a separate text file:
curl -M >manual.txt

72
neo/curl/docs/RESOURCES Normal file
View File

@ -0,0 +1,72 @@
_ _ ____ _
Project ___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
This document lists documents and standards used by curl.
RFC 959 - The FTP protocol
RFC 1635 - How to Use Anonymous FTP
RFC 1738 - Uniform Resource Locators
RFC 1777 - defines the LDAP protocol
RFC 1808 - Relative Uniform Resource Locators
RFC 1867 - Form-based File Upload in HTML
RFC 1950 - ZLIB Compressed Data Format Specification
RFC 1951 - DEFLATE Compressed Data Format Specification
RFC 1952 - gzip compression format
RFC 1959 - LDAP URL syntax
RFC 2045-2049 - Everything you need to know about MIME! (needed for form
based upload)
RFC 2068 - HTTP 1.1 (obsoleted by RFC 2616)
RFC 2109 - HTTP State Management Mechanism (cookie stuff)
- Also, read Netscape's specification at
http://curl.haxx.se/rfc/cookie_spec.html
RFC 2183 - The Content-Disposition Header Field
RFC 2229 - A Dictionary Server Protocol
RFC 2255 - Newer LDAP URL syntax document.
RFC 2231 - MIME Parameter Value and Encoded Word Extensions:
Character Sets, Languages, and Continuations
RFC 2388 - "Returning Values from Forms: multipart/form-data"
Use this as an addition to the RFC1867
RFC 2396 - "Uniform Resource Identifiers: Generic Syntax and Semantics" This
one obsoletes RFC 1738, but since RFC 1738 is often mentioned
I've left it in this list.
RFC 2428 - FTP Extensions for IPv6 and NATs
RFC 2577 - FTP Security Considerations
RFC 2616 - HTTP 1.1, the latest
RFC 2617 - HTTP Authentication
RFC 2718 - Guidelines for new URL Schemes
RFC 2732 - Format for Literal IPv6 Addresses in URL's
RFC 2818 - HTTP Over TLS (TLS is the successor to SSL)
RFC 2964 - Use of HTTP State Management
RFC 2965 - HTTP State Management Mechanism. Cookies. Obsoletes RFC2109

46
neo/curl/docs/SSLCERTS Normal file
View File

@ -0,0 +1,46 @@
Peer SSL Certificate Verification
=================================
Since version 7.10, libcurl performs peer SSL certificate verification by
default. This is done by installing a default CA cert bundle on 'make install'
(or similar), that CA bundle package is used by default on operations against
SSL servers.
If you communicate with HTTPS servers using certificates that are signed
by CAs present in the bundle, you will not notice any changed behavior and you
will seamlessly get a higher security level on your SSL connections since you
can be sure that the remote server really is the one it claims to be.
If the remote server uses a self-signed certificate, if you don't install
curl's CA cert bundle, if the server uses a certificate signed by a CA that
isn't included in the bundle or if the remote host is an imposter
impersonating your favourite site, and you want to transfer files from this
server, do one of the following:
1. Tell libcurl to *not* verify the peer. With libcurl you disable this with
curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, FALSE);
With the curl command tool, you disable this with -k/--insecure.
2. Get a CA certificate that can verify the remote server and use the proper
option to point out this CA cert for verification when connecting. For
libcurl hackers: curl_easy_setopt(curl, CURLOPT_CAPATH, capath);
With the curl command tool: --cacert [file] (a minimal libcurl sketch of both
approaches is shown below)
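For illustration only, here is a minimal libcurl sketch in C of both
approaches. The URL and the CA bundle path are placeholders, error handling is
left out, and CURLOPT_CAINFO is used as the file-based counterpart of
--cacert (CURLOPT_CAPATH takes a directory instead):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");

      /* Approach 1: skip peer verification entirely (not recommended) */
      /* curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 0L); */

      /* Approach 2: verify the peer against a CA bundle of your choice */
      curl_easy_setopt(curl, CURLOPT_CAINFO, "/path/to/ca-bundle.crt");

      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }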
Neglecting to use one of the above methods when dealing with a server using a
certificate that isn't signed by one of the certificates in the installed CA
cert bundle will cause SSL to report an error ("certificate verify failed")
during the handshake and SSL will then refuse further communication with that
server.
This procedure has been deemed The Right Thing even though it adds this extra
trouble for some users, since it adds security to a majority of the SSL
connections that previously weren't really secure. It turned out many people
were using previous versions of curl/libcurl without realizing the need for
the CA cert options to get truly secure SSL connections.
The default path of the CA bundle installed with the curl package is:
/usr/local/share/curl/curl-ca-bundle.crt, which can be changed by running
configure with the --with-ca-bundle option pointing out the path of your
choice.

100
neo/curl/docs/THANKS Normal file
View File

@ -0,0 +1,100 @@
This project has been alive for several years. Countless people have provided
feedback that has improved curl. Here follows an (incomplete) list of people
who have contributed non-trivial parts:
Daniel Stenberg <daniel@haxx.se>
Rafael Sagula <sagula@inf.ufrgs.br>
Sampo Kellomaki <sampo@iki.fi>
Linas Vepstas <linas@linas.org>
Bjorn Reese <breese@mail1.stofanet.dk>
Johan Anderson <johan@homemail.com>
Kjell Ericson <Kjell.Ericson@haxx.se>
Troy Engel <tengel@sonic.net>
Ryan Nelson <ryan@inch.com>
Björn Stenberg <bjorn@haxx.se>
Angus Mackay <amackay@gus.ml.org>
Eric Young <eay@cryptsoft.com>
Simon Dick <simond@totally.irrelevant.org>
Oren Tirosh <oren@monty.hishome.net>
Steven G. Johnson <stevenj@alum.mit.edu>
Gilbert Ramirez Jr. <gram@verdict.uthscsa.edu>
Andrés García <ornalux@redestb.es>
Douglas E. Wegscheid <wegscd@whirlpool.com>
Mark Butler <butlerm@xmission.com>
Eric Thelin <eric@generation-i.com>
Marc Boucher <marc@mbsi.ca>
Greg Onufer <Greg.Onufer@Eng.Sun.COM>
Doug Kaufman <dkaufman@rahul.net>
David Eriksson <david@2good.com>
Ralph Beckmann <rabe@uni-paderborn.de>
T. Yamada <tai@imasy.or.jp>
Lars J. Aas <larsa@sim.no>
Jörn Hartroth <Joern.Hartroth@computer.org>
Matthew Clarke <clamat@van.maves.ca>
Linus Nielsen Feltzing <linus@haxx.se>
Felix von Leitner <felix@convergence.de>
Dan Zitter <dzitter@zitter.net>
Jongki Suwandi <Jongki.Suwandi@eng.sun.com>
Chris Maltby <chris@aurema.com>
Ron Zapp <rzapper@yahoo.com>
Paul Marquis <pmarquis@iname.com>
Ellis Pritchard <ellis@citria.com>
Damien Adant <dams@usa.net>
Chris <cbayliss@csc.come>
Marco G. Salvagno <mgs@whiz.cjb.net>
Paul Marquis <pmarquis@iname.com>
David LeBlanc <dleblanc@qnx.com>
Rich Gray at Plus Technologies
Luong Dinh Dung <u8luong@lhsystems.hu>
Torsten Foertsch <torsten.foertsch@gmx.net>
Kristian Köhntopp <kris@koehntopp.de>
Fred Noz <FNoz@siac.com>
Caolan McNamara <caolan@csn.ul.ie>
Albert Chin-A-Young <china@thewrittenword.com>
Stephen Kick <skick@epicrealm.com>
Martin Hedenfalk <mhe@stacken.kth.se>
Richard Prescott <rip at step.polymtl.ca>
Jason S. Priebe <priebe@wral-tv.com>
T. Bharath <TBharath@responsenetworks.com>
Alexander Kourakos <awk@users.sourceforge.net>
James Griffiths <griffiths_james@yahoo.com>
Loic Dachary <loic@senga.org>
Robert Weaver <robert.weaver@sabre.com>
Ingo Ralf Blum <ingoralfblum@ingoralfblum.com>
Jun-ichiro itojun Hagino <itojun@iijlab.net>
Frederic Lepied <flepied@mandrakesoft.com>
Georg Horn <horn@koblenz-net.de>
Cris Bailiff <c.bailiff@awayweb.com>
Sterling Hughes <sterling@designmultimedia.com>
S. Moonesamy
Ingo Wilken <iw@WWW.Ecce-Terram.DE>
Pawel A. Gajda <mis@k2.net.pl>
Patrick Bihan-Faou
Nico Baggus <Nico.Baggus@mail.ing.nl>
Sergio Ballestrero
Andrew Francis <locust@familyhealth.com.au>
Tomasz Lacki <Tomasz.Lacki@primark.pl>
Georg Huettenegger <georg@ist.org>
John Lask <johnlask@hotmail.com>
Eric Lavigne <erlavigne@wanadoo.fr>
Marcus Webster <marcus.webster@phocis.com>
Götz Babin-Ebell <babin-ebell@trustcenter.de>
Andreas Damm <andreas-sourceforge@radab.org>
Jacky Lam <sylam@emsoftltd.com>
James Gallagher <jgallagher@gso.uri.edu>
Kjetil Jacobsen <kjetilja@cs.uit.no>
Markus F.X.J. Oberhumer <markus@oberhumer.com>
Miklos Nemeth <mnemeth@kfkisystems.com>
Kevin Roth <kproth@users.sourceforge.net>
Ralph Mitchell <rmitchell@eds.com>
Dan Fandrich <dan@coneharvesters.com>
Jean-Philippe Barrette-LaPierre <jpb@rrette.com>
Richard Bramante <RBramante@on.com>
Daniel Kouril <kouril@ics.muni.cz>
Dirk Manske <dm@nettraffic.de>
David Meyer <meyer@paracel.com>
Dominick Meglio <codemstr@ptd.net>
Gisle Vanem <gvanem@broadpark.no>
Giuseppe Attardi <attardi@di.unipi.it>
Tor Arntsen <tor@spacetec.no>
David Byron <DByron@everdreamcorp.com>

199
neo/curl/docs/TODO Normal file
View File

@ -0,0 +1,199 @@
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
TODO
Things to do in project cURL. Please tell us what you think, contribute and
send us patches that improve things! Also check the http://curl.haxx.se/dev
web section for various technical development notes.
All bugs documented in the KNOWN_BUGS document are subject for fixing!
LIBCURL
* Introduce an interface to libcurl that allows applications to more easily
  find out which cookies have been received. A pushing interface that calls a
  callback on each received cookie? A querying interface that asks about
  existing cookies? We probably need both. Enable applications to modify
  existing cookies as well. http://curl.haxx.se/dev/COOKIES
* Introduce another callback interface for upload/download that makes one
less copy of data and thus a faster operation.
[http://curl.haxx.se/dev/no_copy_callbacks.txt]
* More data sharing. curl_share_* functions already exist and work, and they
can be extended to share more. For example, enable sharing of the ares
channel.
* Introduce a new error code indicating authentication problems (for proxy
  CONNECT error 407 for example). This cannot be an error code, though; we must
  not return informational stuff as errors. Consider a new info returned by
  curl_easy_getinfo() instead. #845941
* Option to set the SO_KEEPALIVE socket option to make libcurl notice and
  disconnect connections that have been idle for a very long time.
LIBCURL - multi interface
* Add curl_multi_timeout() to make libcurl's ares-functionality better.
* Make sure we don't ever loop because non-blocking sockets return
  EWOULDBLOCK or similar. This affects FTP command sending, the SSL
  connection, etc.
* Treat transfers more carefully. We need a way to tell libcurl we
  have data to write, as the current system expects us to upload data each
  time the socket is writable and there is no way to say that we want to
  upload data soon, just not right now, without that aborting the upload. The
opposite situation should be possible as well, that we tell libcurl we're
ready to accept read data. Today libcurl feeds the data as soon as it is
available for reading, no matter what.
DOCUMENTATION
* More and better
FTP
* Support the most common FTP proxies, Philip Newton provided a list
allegedly from ncftp:
http://curl.haxx.se/mail/archive-2003-04/0126.html
* Make CURLOPT_FTPPORT support an additional port number on the IP/if/name,
like "blabla:[port]" or possibly even "blabla:[portfirst]-[portsecond]".
* FTP ASCII transfers do not follow RFC959. They don't convert the data
accordingly.
* Since USERPWD always override the user and password specified in URLs, we
might need another way to specify user+password for anonymous ftp logins.
HTTP
* Digest and GSS-Negotiate support for HTTP proxies. They only work on
direct-connections to the server.
* Pipelining. Sending multiple requests before the previous one(s) are done.
This could possibly be implemented using the multi interface to queue
requests and the response data.
TELNET
* Reading input (to send to the remote server) on stdin is a crappy solution
for library purposes. We need to invent a good way for the application to
be able to provide the data to send.
* Make the telnet support's network select() loop go away and merge the code
into the main transfer loop. Until this is done, the multi interface won't
work for telnet.
SSL
* If you really want to improve the SSL situation, you should probably have a
look at SSL cafile loading as well - quick traces look to me like these are
done on every request as well, when they should only be necessary once per
ssl context (or once per handle). Even better would be to support the SSL
CAdir option - instead of loading all of the root CA certs for every
request, this option allows you to only read the CA chain that is actually
required (into the cache)...
* Add an interface to libcurl that enables "session IDs" to get
exported/imported. Cris Bailiff said: "OpenSSL has functions which can
serialise the current SSL state to a buffer of your choice, and
recover/reset the state from such a buffer at a later date - this is used
by mod_ssl for apache to implement an SSL session ID cache". This whole
idea might become moot if we enable the 'data sharing' as mentioned in the
LIBCURL label above.
* OpenSSL supports a callback for customised verification of the peer
certificate, but this doesn't seem to be exposed in the libcurl APIs. Could
it be? There's so much that could be done if it were! (brought by Chris
Clark)
* Make curl's SSL layer option capable of using other free SSL libraries.
Such as the Mozilla Security Services
(http://www.mozilla.org/projects/security/pki/nss/) and GNUTLS
(http://gnutls.hellug.gr/)
LDAP
* Look over the implementation. The looping will have to "go away" from the
lib/ldap.c source file and get moved to the main network code so that the
multi interface and friends will work for LDAP as well.
CLIENT
* Add an option that prevents cURL from overwriting existing local files. When
used, and there already is an existing file with the target file name
(either -O or -o), a number should be appended (and increased if already
existing). So that index.html becomes first index.html.1 and then
index.html.2 etc. Jeff Pohlmeyer suggested.
* "curl ftp://site.com/*.txt"
* The client could be told to use maximum N simultaneous transfers and then
just make sure that happens. It should of course not make more than one
connection to the same remote host. This would require the client to use
the multi interface.
* Extending the capabilities of the multipart formposting. How about leaving
the ';type=foo' syntax as it is and adding an extra tag (headers) which
works like this: curl -F "coolfiles=@fil1.txt;headers=@fil1.hdr" where
fil1.hdr contains extra headers like
Content-Type: text/plain; charset=KOI8-R"
Content-Transfer-Encoding: base64
X-User-Comment: Please don't use browser specific HTML code
which should override the program's reasonable defaults (text/plain,
8bit...) (Idea brought to us by kromJx)
* ability to specify the classic computing suffixes on the range
specifications. For example, to download the first 500 Kilobytes of a file,
be able to specify the following for the -r option: "-r 0-500K" or for the
first 2 Megabytes of a file: "-r 0-2M". (Mark Smith suggested)
* --data-encode that URL encodes the data before posting
http://curl.haxx.se/mail/archive-2003-11/0091.html (Kevin Roth suggested)
BUILD
* Consider extending 'roffit' to produce decent ASCII output, and use that
instead of (g)nroff when building src/hugehelp.c
TEST SUITE
* Make the test servers able to serve multiple running test suites. Like if
two users run 'make test' at once.
* Make runtests.pl capable of changing port numbers for the servers. This was
the intention from the start, but in practice it is now hard.
* If perl wasn't found by the configure script, don't attempt to run the
  tests, but explain nicely why they can't be run.
* Extend the test suite to include more protocols. The telnet could just do
ftp or http operations (for which we have test servers).
* Make the test suite work on more platforms. OpenBSD and Mac OS. Remove
fork()s and it should become even more portable.
NEXT MAJOR RELEASE
* curl_easy_cleanup() returns void, but curl_multi_cleanup() returns a
CURLMcode. These should be changed to be the same.
* curl_formparse() should be removed
* remove obsolete defines from curl/curl.h
* remove the following functions from the public API:
curl_getenv
curl_mprintf (and variations)
curl_strequal
curl_strnequal
They will instead become curlx_ alternatives. That keeps the curl app
capable of building with them from source.

399
neo/curl/docs/TheArtOfHttpScripting Normal file
View File

@ -0,0 +1,399 @@
Online: http://curl.haxx.se/docs/httpscripting.shtml
Author: Daniel Stenberg <daniel@haxx.se>
Date: November 6, 2001
Version: 0.6
The Art Of Scripting HTTP Requests Using Curl
=============================================
This document will assume that you're familiar with HTML and general
networking.
The ability to write scripts is essential to a good computer system. Unix'
capability to be extended by shell scripts and various tools to run automated
commands and scripts is one reason why it has succeeded so well.
The increasing number of applications moving to the web has made "HTTP
Scripting" more frequently requested and wanted. Being able to automatically
extract information from the web, to imitate users, and to post or upload data
to web servers are all important tasks today.
Curl is a command line tool for doing all sorts of URL manipulations and
transfers, but this particular document will focus on how to use it when
doing HTTP requests for fun and profit. I'll assume that you know how to
invoke 'curl --help' or 'curl --manual' to get basic information about it.
Curl is not written to do everything for you. It makes the requests, it gets
the data, it sends data and it retrieves the information. You probably need
to glue everything together using some kind of script language or repeated
manual invocations.
1. The HTTP Protocol
HTTP is the protocol used to fetch data from web servers. It is a very simple
protocol that is built upon TCP/IP. The protocol also allows information to
get sent to the server from the client using a few different methods, as will
be shown here.
HTTP is plain ASCII text lines being sent by the client to a server to
request a particular action, and then the server replies a few text lines
before the actual requested content is sent to the client.
Using curl's option -v will display what kind of commands curl sends to the
server, as well as a few other informational texts. -v is the single most
useful option when it comes to debug or even understand the curl<->server
interaction.
2. URL
The Uniform Resource Locator format is how you specify the address of a
particular resource on the Internet. You know these, you've seen URLs like
http://curl.haxx.se or https://yourbank.com a million times.
3. GET a page
The simplest and most common request/operation made using HTTP is to get a
URL. The URL could itself refer to a web page, an image or a file. The client
issues a GET request to the server and receives the document it asked for.
If you issue the command line
curl http://curl.haxx.se
you get a web page returned in your terminal window. The entire HTML document
that that URL holds.
All HTTP replies contain a set of headers that are normally hidden; use
curl's -i option to display them as well as the rest of the document. You can
also ask the remote server for ONLY the headers by using the -I option (which
will make curl issue a HEAD request).
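For reference, the same simple GET can also be issued through libcurl from C.
This is only a rough sketch with no error checking, reusing the example URL
above:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl;
    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    if(curl) {
      /* fetch the URL; with no write callback set, the body goes to stdout */
      curl_easy_setopt(curl, CURLOPT_URL, "http://curl.haxx.se");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return 0;
  }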
4. Forms
Forms are the general way a web site can present an HTML page with fields for
the user to enter data in, and then press some kind of 'OK' or 'submit'
button to get that data sent to the server. The server then typically uses
the posted data to decide how to act: for example, to use the entered words to
search a database, to add the info to a bug tracking system, to display the
entered address on a map, or to use the info as a login prompt verifying that
the user is allowed to see what it is about to see.
Of course there has to be some kind of program in the server end to receive
the data you send. You cannot just invent something out of the air.
4.1 GET
A GET-form uses the method GET, as specified in HTML like:
<form method="GET" action="junk.cgi">
<input type=text name="birthyear">
<input type=submit name=press value="OK">
</form>
In your favorite browser, this form will appear with a text box to fill in
and a press-button labeled "OK". If you fill in '1905' and press the OK
button, your browser will then create a new URL to get for you. The URL will
get "junk.cgi?birthyear=1905&press=OK" appended to the path part of the
previous URL.
If the original form was seen on the page "www.hotmail.com/when/birth.html",
the second page you'll get will become
"www.hotmail.com/when/junk.cgi?birthyear=1905&press=OK".
Most search engines work this way.
To make curl do the GET form post for you, just enter the expected created
URL:
curl "www.hotmail.com/when/junk.cgi?birthyear=1905&press=OK"
4.2 POST
The GET method makes all input field names get displayed in the URL field of
your browser. That's generally a good thing when you want to be able to
bookmark that page with your given data, but it is an obvious disadvantage
if you entered secret information in one of the fields or if there are a
large number of fields creating a very long and unreadable URL.
The HTTP protocol then offers the POST method. This way the client sends the
data separated from the URL and thus you won't see any of it in the URL
address field.
The form would look very similar to the previous one:
<form method="POST" action="junk.cgi">
<input type=text name="birthyear">
<input type=submit name=press value=" OK ">
</form>
And to use curl to post this form with the same data filled in as before, we
could do it like:
curl -d "birthyear=1905&press=%20OK%20" www.hotmail.com/when/junk.cgi
This kind of POST will use the Content-Type
application/x-www-form-urlencoded and is the most widely used kind of POST.
The data you send to the server MUST already be properly encoded, curl will
not do that for you. For example, if you want the data to contain a space,
you need to replace that space with %20 etc. Failing to comply with this
will most likely cause your data to be received wrongly and messed up.
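As an illustration only, the same POST issued through libcurl would look
roughly like the sketch below; the URL is the one from the example above and
error handling is omitted:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL,
                       "http://www.hotmail.com/when/junk.cgi");
      /* the string must already be URL encoded, just as on the command line */
      curl_easy_setopt(curl, CURLOPT_POSTFIELDS,
                       "birthyear=1905&press=%20OK%20");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }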
4.3 FILE UPLOAD POST
Back in late 1995 a new way to post data over HTTP was defined. It was
documented in RFC 1867, which is why this method sometimes is referred to as
RFC1867-posting.
This method is mainly designed to better support file uploads. A form that
allows a user to upload a file could be written like this in HTML:
<form method="POST" enctype='multipart/form-data' action="upload.cgi">
<input type=file name=upload>
<input type=submit name=press value="OK">
</form>
This clearly shows that the Content-Type about to be sent is
multipart/form-data.
To post to a form like this with curl, you enter a command line like:
curl -F upload=@localfilename -F press=OK [URL]
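For the libcurl side, roughly the same multipart formpost can be built with
curl_formadd(). This is only a sketch: the file name and URL are placeholders
and error handling is omitted:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    struct curl_httppost *post = NULL;
    struct curl_httppost *last = NULL;
    if(curl) {
      /* build the multipart/form-data body: one file part, one text part */
      curl_formadd(&post, &last,
                   CURLFORM_COPYNAME, "upload",
                   CURLFORM_FILE, "localfilename",
                   CURLFORM_END);
      curl_formadd(&post, &last,
                   CURLFORM_COPYNAME, "press",
                   CURLFORM_COPYCONTENTS, "OK",
                   CURLFORM_END);
      curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload.cgi");
      curl_easy_setopt(curl, CURLOPT_HTTPPOST, post);
      curl_easy_perform(curl);
      curl_formfree(post);
      curl_easy_cleanup(curl);
    }
    return 0;
  }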
4.4 HIDDEN FIELDS
A very common way for HTML-based applications to pass state information
between pages is to add hidden fields to the forms. Hidden fields are
already filled in, they aren't displayed to the user and they get passed
along just like all the other fields.
A similar example form with one visible field, one hidden field and one
submit button could look like:
<form method="POST" action="foobar.cgi">
<input type=text name="birthyear">
<input type=hidden name="person" value="daniel">
<input type=submit name="press" value="OK">
</form>
To post this with curl, you won't have to think about whether the fields are
hidden or not. To curl they're all the same:
curl -d "birthyear=1905&press=OK&person=daniel" [URL]
4.5 FIGURE OUT WHAT A POST LOOKS LIKE
When you're about to fill in a form and send it to a server by using curl
instead of a browser, you're of course very interested in sending a POST
exactly the way your browser does.
An easy way to get to see this, is to save the HTML page with the form on
your local disk, modify the 'method' to a GET, and press the submit button
(you could also change the action URL if you want to).
You will then clearly see the data get appended to the URL, separated with a
'?'-letter as GET forms are supposed to.
5. PUT
Perhaps the best way to upload data to an HTTP server is to use PUT. Then
again, this of course requires that someone put a program or script on the
server end that knows how to receive an HTTP PUT stream.
Put a file to a HTTP server with curl:
curl -T uploadfile www.uploadhttp.com/receive.cgi
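A rough libcurl equivalent is sketched below. It assumes a local file named
uploadfile and reuses the example URL above; with no read callback installed,
libcurl's default behaviour is to fread() from the FILE pointer given as
CURLOPT_READDATA:

  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    FILE *src = fopen("uploadfile", "rb");
    long size = 0;
    if(curl && src) {
      /* figure out how large the file is */
      fseek(src, 0, SEEK_END);
      size = ftell(src);
      rewind(src);

      curl_easy_setopt(curl, CURLOPT_URL,
                       "http://www.uploadhttp.com/receive.cgi");
      curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);     /* PUT for HTTP */
      curl_easy_setopt(curl, CURLOPT_READDATA, src);
      curl_easy_setopt(curl, CURLOPT_INFILESIZE, size);
      curl_easy_perform(curl);
    }
    if(curl)
      curl_easy_cleanup(curl);
    if(src)
      fclose(src);
    return 0;
  }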
6. AUTHENTICATION
Authentication is the ability to tell the server your username and password
so that it can verify that you're allowed to do the request you're doing. The
Basic authentication used in HTTP (which is the type curl uses by default) is
*plain* *text* based, which means it sends username and password only
slightly obfuscated, but still fully readable by anyone that sniffs on the
network between you and the remote server.
To tell curl to use a user and password for authentication:
curl -u name:password www.secrets.com
The site might require a different authentication method (check the headers
returned by the server), and then --ntlm, --digest, --negotiate or even
--anyauth might be options that suit you.
Sometimes your HTTP access is only available through the use of an HTTP
proxy. This seems to be especially common at various companies. An HTTP proxy
may require its own user and password to allow the client to get through to
the Internet. To specify those with curl, run something like:
curl -U proxyuser:proxypassword curl.haxx.se
If your proxy requires the authentication to be done using the NTLM method,
use --proxy-ntlm.
If you use any one of these user+password options but leave out the password
part, curl will prompt for the password interactively.
Do note that when a program is run, its parameters can be seen when
listing the running processes of the system. Thus, other users may be able to
watch your passwords if you pass them as plain command line options. There
are ways to circumvent this.
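In libcurl terms, the same credentials are set with options. The sketch below
is illustration only: the host name, proxy address and credentials are
placeholders:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.secrets.com/");
      /* server credentials, like -u name:password */
      curl_easy_setopt(curl, CURLOPT_USERPWD, "name:password");
      /* proxy and proxy credentials, like -x and -U */
      curl_easy_setopt(curl, CURLOPT_PROXY, "proxy.example.com:8080");
      curl_easy_setopt(curl, CURLOPT_PROXYUSERPWD, "proxyuser:proxypassword");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }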
7. REFERER
An HTTP request may include a 'referer' field, which can be used to tell from
which URL the client got to this particular resource. Some programs/scripts
check the referer field of requests to verify that the request didn't arrive
from an external site or an unknown page. While this is a stupid way to check
something so easily forged, many scripts still do it. Using curl, you can put
anything you want in the referer field and thus more easily be able to fool
the server into serving your request.
Use curl to set the referer field with:
curl -e http://curl.haxx.se daniel.haxx.se
8. USER AGENT
Very similar to the referer field, all HTTP requests may set the User-Agent
field. It names the user agent (client) that is being used. Many
applications use this information to decide how to display pages. Silly web
programmers try to make different pages for users of different browsers to
make them look the best possible for their particular browsers. They usually
also serve different kinds of javascript, vbscript etc.
At times, you will see that getting a page with curl will not return the same
page that you see when getting the page with your browser. Then you know it
is time to set the User Agent field to fool the server into thinking you're
one of those browsers.
To make curl look like Internet Explorer on a Windows 2000 box:
curl -A "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)" [URL]
Or why not look like you're using Netscape 4.73 on a Linux (PIII) box:
curl -A "Mozilla/4.73 [en] (X11; U; Linux 2.2.15 i686)" [URL]
9. REDIRECTS
When a resource is requested from a server, the reply from the server may
include a hint about where the browser should go next to find this page, or a
new page keeping newly generated output. The header that tells the browser
to redirect is Location:.
Curl does not follow Location: headers by default, but will simply display
such pages in the same manner it displays all HTTP replies. It does however
feature an option that will make it attempt to follow the Location: pointers.
To tell curl to follow a Location:
curl -L www.sitethatredirects.com
If you use curl to POST to a site that immediately redirects you to another
page, you can safely use -L and -d/-F together. Curl will only use POST in
the first request, and then revert to GET in the following operations.
10. COOKIES
The way the web browsers do "client side state control" is by using
cookies. Cookies are just names with associated contents. The cookies are
sent to the client by the server. The server tells the client for what path
and host name it wants the cookie sent back, and it also sends an expiration
date and a few more properties.
When a client communicates with a server with a name and path as previously
specified in a received cookie, the client sends back the cookies and their
contents to the server, unless of course they are expired.
Many applications and servers use this method to connect a series of requests
into a single logical session. To be able to use curl on such occasions, we
must be able to record and send back cookies the way the web application
expects them, the same way browsers deal with them.
The simplest way to send a few cookies to the server when getting a page with
curl is to add them on the command line like:
curl -b "name=Daniel" www.cookiesite.com
Cookies are sent as common HTTP headers. This is practical as it allows curl
to record cookies simply by recording headers. Record cookies with curl by
using the -D option like:
curl -D headers_and_cookies www.cookiesite.com
(Take note that the -c option described below is a better way to store
cookies.)
Curl has a full blown cookie parsing engine built-in that comes to use if you
want to reconnect to a server and use cookies that were stored from a
previous connection (or handcrafted manually to fool the server into
believing you had a previous connection). To use previously stored cookies,
you run curl like:
curl -b stored_cookies_in_file www.cookiesite.com
Curl's "cookie engine" gets enabled when you use the -b option. If you only
want curl to understand received cookies, use -b with a file that doesn't
exist. For example, if you want to let curl understand cookies from a page and
follow a location (and thus possibly send back cookies it received), you can
invoke it like:
curl -b nada -L www.cookiesite.com
Curl has the ability to read and write cookie files that use the same file
format that Netscape and Mozilla do. It is a convenient way to share cookies
between browsers and automatic scripts. The -b switch automatically detects
if a given file is such a cookie file and parses it, and by using the
-c/--cookie-jar option you'll make curl write a new cookie file at the end of
an operation:
curl -b cookies.txt -c newcookies.txt www.cookiesite.com
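The libcurl options behind -b, -c and -L can be combined in the same way. A
minimal sketch with placeholder file names and the example site above:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.cookiesite.com/");
      /* read cookies from this file and enable the cookie engine (-b) */
      curl_easy_setopt(curl, CURLOPT_COOKIEFILE, "cookies.txt");
      /* write all known cookies here when the handle is cleaned up (-c) */
      curl_easy_setopt(curl, CURLOPT_COOKIEJAR, "newcookies.txt");
      /* follow Location: headers, like -L */
      curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }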
11. HTTPS
There are a few ways to do secure HTTP transfers. By far the most common
protocol for doing this is what is generally known as HTTPS, HTTP over
SSL. SSL encrypts all the data that is sent and received over the network and
thus makes it harder for attackers to spy on sensitive information.
SSL (or TLS as the latest version of the standard is called) offers a
truckload of advanced features to allow all those encryptions and key
infrastructure mechanisms encrypted HTTP requires.
Curl supports encrypted fetches thanks to the freely available OpenSSL
libraries. To get a page from an HTTPS server, simply run curl like:
curl https://that.secure.server.com
11.1 CERTIFICATES
In the HTTPS world, you use certificates to validate that you are the one
you claim to be, as an addition to normal passwords. Curl supports
client-side certificates. All certificates are locked with a PIN-code, which
is why you need to enter the unlock code before the certificate can be used by
curl. The PIN-code can be specified on the command line or, if not, entered
interactively when curl queries for it. Use a certificate with curl on an
HTTPS server like:
curl -E mycert.pem https://that.secure.server.com
curl also tries to verify that the server is who it claims to be, by
verifying the server's certificate against a CA cert bundle. Failing the
verification will cause curl to deny the connection. You must then use -k in
case you want to tell curl to ignore that the server can't be verified.
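A libcurl sketch of the same thing, for illustration only; the certificate
file name and URL are the placeholders from the example above, and peer
verification is left at its default:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "https://that.secure.server.com/");
      /* present a client certificate, like -E mycert.pem; if it is
         protected by a pass phrase, that must be supplied as well */
      curl_easy_setopt(curl, CURLOPT_SSLCERT, "mycert.pem");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }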
12. REFERENCES
RFC 2616 is a must to read if you want in-depth understanding of the HTTP
protocol.
RFC 2396 explains the URL syntax.
RFC 2109 defines how cookies are supposed to work.
RFC 1867 defines the HTTP post upload format.
http://www.openssl.org is the home of the OpenSSL project
http://curl.haxx.se is the home of the cURL project

64
neo/curl/docs/VERSIONS Normal file
View File

@ -0,0 +1,64 @@
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
| (__| |_| | _ <| |___
\___|\___/|_| \_\_____|
Version Numbers and Releases
Curl is not only curl. Curl is also libcurl. They're actually individually
versioned, but they mostly follow each other rather closely.
The version numbering is always built up using the same system:
X.Y[.Z][-preN]
Where
X is main version number
Y is release number
Z is patch number
N is pre-release number
One of these numbers will get bumped in each new release. The numbers to the
right of a bumped number will be reset to zero. If Z is zero, it is not
included in the version number. The pre release number is only included in
pre releases (they're never used in public, official, releases).
The main version number will get bumped when *really* big, world colliding
changes are made. The release number is bumped when big changes are
performed. The patch number is bumped when the changes are mere bugfixes and
only minor feature changes. The pre-release is a counter, to identify which
pre-release a certain release is.
When reaching the end of a pre-release period, the version without the
pre-release part will be released as a public release.
It means that after release 1.2.3, we can release 2.0 if something really big
has been made, 1.3 if not that big changes were made or 1.2.4 if mostly bugs
were fixed. Before 1.2.4 is released, we might release a 1.2.4-pre1 release
for the brave people to try before the actual release.
Bumping, as in increasing the number by 1, only ever affects one of the
numbers (except the ones to the right of it, which may be reset to zero).
1 becomes 2, 3 becomes 4, 9 becomes 10, 88 becomes 89 and 99 becomes 100.
So, after 1.2.9 comes 1.2.10. After 3.99.3, 3.100 might come.
All original curl source release archives are named according to the libcurl
version (not according to the curl client version that, as said before, might
differ).
As a service to any application that might want to support new libcurl
features while still being able to build with older versions, all releases
have the libcurl version stored in the curl/curl.h file using a static
numbering scheme that can be used for comparison. The version number is
defined as:
#define LIBCURL_VERSION_NUM 0xXXYYZZ
Where XX, YY and ZZ are the main version, release and patch numbers in
hexadecimal. All three numbers are always represented using two digits. 1.2
would appear as "0x010200" while version 9.11.7 appears as "0x090b07".
This 6-digit hexadecimal number does not show the pre-release number, and it
is always a greater number in a more recent release. It makes comparisons with
greater than and less than work.
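As an illustration only, a small C program could use this for a compile-time
check; the 7.10.0 threshold (0x070a00) is just an arbitrary example, and
LIBCURL_VERSION is the matching version string defined in curl/curl.h:

  #include <stdio.h>
  #include <curl/curl.h>

  /* require at least libcurl 7.10.0 at compile time, else fall back */
  #if LIBCURL_VERSION_NUM >= 0x070a00
  #define HAVE_MODERN_LIBCURL 1
  #else
  #define HAVE_MODERN_LIBCURL 0
  #endif

  int main(void)
  {
    printf("built against libcurl %s (0x%06x)\n",
           LIBCURL_VERSION, (unsigned int)LIBCURL_VERSION_NUM);
    return HAVE_MODERN_LIBCURL ? 0 : 1;
  }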

64
neo/curl/docs/curl-config.1 Normal file
View File

@ -0,0 +1,64 @@
.\" You can view this file with:
.\" nroff -man curl-config.1
.\" Written by Daniel Stenberg
.\"
.TH curl-config 1 "8 Oct 2002" "Curl 7.10" "curl-config manual"
.SH NAME
curl-config \- Get information about a libcurl installation
.SH SYNOPSIS
.B curl-config [options]
.SH DESCRIPTION
.B curl-config
displays information about a previous curl and libcurl installation.
.SH OPTIONS
.IP "--ca"
Displays the built-in path to the CA cert bundle this libcurl uses.
.IP "--cc"
Displays the compiler used to build libcurl.
.IP "--cflags"
Set of compiler options (CFLAGS) to use when compiling files that use
libcurl. Currently that is only the include path to the curl include files.
.IP "--feature"
Lists what particular main features the installed libcurl was built with. At
the time of writing, this list may include SSL, KRB4 or IPv6. Do not assume
any particular order. The keywords will be separated by newlines. There may be
none, one or several keywords in the list.
.IP "--help"
Displays the available options.
.IP "--libs"
Shows the complete set of libs and other linker options you will need in order
to link your application with libcurl.
.IP "--prefix"
This is the prefix used when libcurl was installed. Libcurl is then installed
in $prefix/lib and its header files are installed in $prefix/include and so
on. The prefix is set with "configure --prefix".
.IP "--version"
Outputs version information about the installed libcurl.
.IP "--vernum"
Outputs version information about the installed libcurl, in numerical mode.
This outputs the version number, in hexadecimal, with 8 bits for each part;
major, minor, patch. So that libcurl 7.7.4 would appear as 070704 and libcurl
12.13.14 would appear as 0c0d0e...
.SH "EXAMPLES"
What linker options do I need when I link with libcurl?
$ curl-config --libs
What compiler options do I need when I compile using libcurl functions?
$ curl-config --cflags
How do I know if libcurl was built with SSL support?
$ curl-config --feature | grep SSL
What's the installed libcurl version?
$ curl-config --version
How do I build a single file with a one-line command?
$ `curl-config --cc --cflags --libs` -o example example.c
.SH "SEE ALSO"
.BR curl (1)

82
neo/curl/docs/curl-config.html Normal file
View File

@ -0,0 +1,82 @@
<html><head>
<title>curl-config man page</title>
<meta name="generator" content="roffit 0.5">
<STYLE type="text/css">
P.level0 {
padding-left: 2em;
}
P.level1 {
padding-left: 4em;
}
P.level2 {
padding-left: 6em;
}
span.emphasis {
font-style: italic;
}
span.bold {
font-weight: bold;
}
span.manpage {
font-weight: bold;
}
h2.nroffsh {
background-color: #e0e0e0;
}
span.nroffip {
font-weight: bold;
font-size: 120%;
font-family: monospace;
}
p.roffit {
text-align: center;
font-size: 80%;
}
</STYLE>
</head><body>
<p class="level0"><a name="NAME"></a><h2 class="nroffsh">NAME</h2>
<p class="level0">curl-config - Get information about a libcurl installation <a name="SYNOPSIS"></a><h2 class="nroffsh">SYNOPSIS</h2>
<p class="level0"><span Class="bold">curl-config [options]</span> <a name="DESCRIPTION"></a><h2 class="nroffsh">DESCRIPTION</h2>
<p class="level0"><span Class="bold">curl-config</span> displays information about a previous curl and libcurl installation. <a name="OPTIONS"></a><h2 class="nroffsh">OPTIONS</h2>
<p class="level0">
<p class="level0"><a name="--ca"></a><span class="nroffip">--ca</span>
<p class="level1">Displays the built-in path to the CA cert bundle this libcurl uses.
<p class="level0"><a name="--cc"></a><span class="nroffip">--cc</span>
<p class="level1">Displays the compiler used to build libcurl.
<p class="level0"><a name="--cflags"></a><span class="nroffip">--cflags</span>
<p class="level1">Set of compiler options (CFLAGS) to use when compiling files that use libcurl. Currently that is only thw include path to the curl include files.
<p class="level0"><a name="--feature"></a><span class="nroffip">--feature</span>
<p class="level1">Lists what particular main features the installed libcurl was built with. At the time of writing, this list may include SSL, KRB4 or IPv6. Do not assume any particular order. The keywords will be separated by newlines. There may be none, one or several keywords in the list.
<p class="level0"><a name="--help"></a><span class="nroffip">--help</span>
<p class="level1">Displays the available options.
<p class="level0"><a name="--libs"></a><span class="nroffip">--libs</span>
<p class="level1">Shows the complete set of libs and other linker options you will need in order to link your application with libcurl.
<p class="level0"><a name="--prefix"></a><span class="nroffip">--prefix</span>
<p class="level1">This is the prefix used when libcurl was installed. Libcurl is then installed in $prefix/lib and its header files are installed in $prefix/include and so on. The prefix is set with "configure --prefix".
<p class="level0"><a name="--version"></a><span class="nroffip">--version</span>
<p class="level1">Outputs version information about the installed libcurl.
<p class="level0"><a name="--vernum"></a><span class="nroffip">--vernum</span>
<p class="level1">Outputs version information about the installed libcurl, in numerical mode. This outputs the version number, in hexadecimal, with 8 bits for each part; major, minor, patch. So that libcurl 7.7.4 would appear as 070704 and libcurl 12.13.14 would appear as 0c0d0e... <a name="EXAMPLES"></a><h2 class="nroffsh">EXAMPLES</h2>
<p class="level0">What linker options do I need when I link with libcurl?
<p class="level0">&nbsp; $ curl-config --libs
<p class="level0">What compiler options do I need when I compile using libcurl functions?
<p class="level0">&nbsp; $ curl-config --cflags
<p class="level0">How do I know if libcurl was built with SSL support?
<p class="level0">&nbsp; $ curl-config --feature | grep SSL
<p class="level0">What's the installed libcurl version?
<p class="level0">&nbsp; $ curl-config --version
<p class="level0">How do I build a single file with a one-line command?
<p class="level0">&nbsp; $ `curl-config --cc --cflags --libs` -o example example.c
<p class="level0"><a name="SEE"></a><h2 class="nroffsh">SEE ALSO</h2>
<p class="level0"><span Class="manpage">curl (1)</span> <p class="roffit">
This HTML page was made with <a href="http://daniel.haxx.se/projects/roffit/">roffit</a>.
</body></html>

Binary file not shown.

1148
neo/curl/docs/curl.1 Normal file

File diff suppressed because it is too large

599
neo/curl/docs/curl.html Normal file
View File

@ -0,0 +1,599 @@
<html><head>
<title>curl man page</title>
<meta name="generator" content="roffit 0.5">
<STYLE type="text/css">
P.level0 {
padding-left: 2em;
}
P.level1 {
padding-left: 4em;
}
P.level2 {
padding-left: 6em;
}
span.emphasis {
font-style: italic;
}
span.bold {
font-weight: bold;
}
span.manpage {
font-weight: bold;
}
h2.nroffsh {
background-color: #e0e0e0;
}
span.nroffip {
font-weight: bold;
font-size: 120%;
font-family: monospace;
}
p.roffit {
text-align: center;
font-size: 80%;
}
</STYLE>
</head><body>
<p class="level0"><a name="NAME"></a><h2 class="nroffsh">NAME</h2>
<p class="level0">curl - transfer a URL <a name="SYNOPSIS"></a><h2 class="nroffsh">SYNOPSIS</h2>
<p class="level0"><span Class="bold">curl [options]</span> <a class="emphasis" href="#URL">[URL...]</a> <a name="DESCRIPTION"></a><h2 class="nroffsh">DESCRIPTION</h2>
<p class="level0"><span Class="bold">curl</span> is a tool to transfer data from or to a server, using one of the supported protocols (HTTP, HTTPS, FTP, FTPS, GOPHER, DICT, TELNET, LDAP or FILE). The command is designed to work without user interaction.
<p class="level0">curl offers a busload of useful tricks like proxy support, user authentication, ftp upload, HTTP post, SSL (https:) connections, cookies, file transfer resume and more. As you will see below, the amount of features will make your head spin!
<p class="level0">curl is powered by libcurl for all transfer-related features. See <span Class="manpage">libcurl (3)</span> for details. <a name="URL"></a><h2 class="nroffsh">URL</h2>
<p class="level0">The URL syntax is protocol dependent. You'll find a detailed description in RFC 2396.
<p class="level0">You can specify multiple URLs or parts of URLs by writing part sets within braces as in:
<p class="level0">&nbsp;<a href="http://site">http://site</a>.{one,two,three}.com
<p class="level0">or you can get sequences of alphanumeric series by using [] as in:
<p class="level0">&nbsp;ftp://ftp.numericals.com/file[1-100].txt &nbsp;ftp://ftp.numericals.com/file[001-100].txt (with leading zeros) &nbsp;ftp://ftp.letters.com/file[a-z].txt
<p class="level0">No nesting of the sequences is supported at the moment:
<p class="level0">&nbsp;<a href="http://www.any.org/archive">http://www.any.org/archive</a>[1996-1999]/volume[1-4]part{a,b,c,index}.html
<p class="level0">You can specify any amount of URLs on the command line. They will be fetched in a sequential manner in the specified order.
<p class="level0">Curl will attempt to re-use connections for multiple file transfers, so that getting many files from the same server will not do multiple connects / handshakes. This improves speed. Of course this is only done on files specified on a single command line and cannot be used between separate curl invokes. <a name="OPTIONS"></a><h2 class="nroffsh">OPTIONS</h2>
<p class="level0">
<p class="level0"><a name="-a--append"></a><span class="nroffip">-a/--append</span>
<p class="level1">(FTP) When used in an FTP upload, this will tell curl to append to the target file instead of overwriting it. If the file doesn't exist, it will be created.
<p class="level1">If this option is used twice, the second one will disable append mode again.
<p class="level0"><a name="-A--user-agent"></a><span class="nroffip">-A/--user-agent &lt;agent string&gt;</span>
<p class="level1">(HTTP) Specify the User-Agent string to send to the HTTP server. Some badly done CGIs fail if its not set to "Mozilla/4.0". To encode blanks in the string, surround the string with single quote marks. This can also be set with the <a class="emphasis" href="#-H--header">-H/--header</a> option of course.
<p class="level1">If this option is set more than once, the last one will be the one that's used.
<p class="level0"><a name="--anyauth"></a><span class="nroffip">--anyauth</span>
<p class="level1">(HTTP) Tells curl to figure out authentication method by itself, and use the most secure one the remote site claims it supports. This is done by first doing a request and checking the response-headers, thus inducing an extra network round-trip. This is used instead of setting a specific authentication method, which you can do with <a class="emphasis" href="#--basic">--basic</a>, <a class="emphasis" href="#--digest">--digest</a>, <a class="emphasis" href="#--ntlm">--ntlm</a>, and <a class="emphasis" href="#--negotiate">--negotiate</a>. (Added in 7.10.6)
<p class="level1">If this option is used several times, the following occurrences make no difference.
<p class="level0"><a name="-b--cookie"></a><span class="nroffip">-b/--cookie &lt;name=data&gt;</span>
<p class="level1">(HTTP) Pass the data to the HTTP server as a cookie. It is supposedly the data previously received from the server in a "Set-Cookie:" line. The data should be in the format "NAME1=VALUE1; NAME2=VALUE2".
<p class="level1">If no '=' letter is used in the line, it is treated as a filename to use to read previously stored cookie lines from, which should be used in this session if they match. Using this method also activates the "cookie parser" which will make curl record incoming cookies too, which may be handy if you're using this in combination with the <a class="emphasis" href="#-L--location">-L/--location</a> option. The file format of the file to read cookies from should be plain HTTP headers or the Netscape/Mozilla cookie file format.
<p class="level1"><span Class="bold">NOTE</span> that the file specified with <a class="emphasis" href="#-b--cookie">-b/--cookie</a> is only used as input. No cookies will be stored in the file. To store cookies, use the <a class="emphasis" href="#-c--cookie-jar">-c/--cookie-jar</a> option or you could even save the HTTP headers to a file using <a class="emphasis" href="#-D--dump-header">-D/--dump-header</a>!
<p class="level1">If this option is set more than once, the last one will be the one that's used.
<p class="level0"><a name="-B--use-ascii"></a><span class="nroffip">-B/--use-ascii</span>
<p class="level1">Use ASCII transfer when getting an FTP file or LDAP info. For FTP, this can also be enforced by using an URL that ends with ";type=A". This option causes data sent to stdout to be in text mode for win32 systems.
<p class="level1">If this option is used twice, the second one will disable ASCII usage.
<p class="level0"><a name="--basic"></a><span class="nroffip">--basic</span>
<p class="level1">(HTTP) Tells curl to use HTTP Basic authentication. This is the default and this option is usually pointless, unless you use it to override a previously set option that sets a different authentication method (such as <a class="emphasis" href="#--ntlm">--ntlm</a>, <a class="emphasis" href="#--digest">--digest</a> and <a class="emphasis" href="#--negotiate">--negotiate</a>). (Added in 7.10.6)
<p class="level1">If this option is used several times, the following occurrences make no difference.
<p class="level0"><a name="--ciphers"></a><span class="nroffip">--ciphers &lt;list of ciphers&gt;</span>
<p class="level1">(SSL) Specifies which ciphers to use in the connection. The list of ciphers must be using valid ciphers. Read up on SSL cipher list details on this URL: <span Class="emphasis"><a href="http://www.openssl.org/docs/apps/ciphers.html">http://www.openssl.org/docs/apps/ciphers.html</a></span>
<p class="level1">If this option is used several times, the last one will override the others.
<p class="level0"><a name="--compressed"></a><span class="nroffip">--compressed</span>
<p class="level1">(HTTP) Request a compressed response using one of the algorithms libcurl supports, and return the uncompressed document. If this option is used and the server sends an unsupported encoding, Curl will report an error.
<p class="level1">If this option is used several times, each occurrence will toggle it on/off.
<p class="level0"><a name="--connect-timeout"></a><span class="nroffip">--connect-timeout &lt;seconds&gt;</span>
<p class="level1">Maximum time in seconds that you allow the connection to the server to take. This only limits the connection phase, once curl has connected this option is of no more use. See also the <span Class="emphasis">--max-time</span> option.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-c--cookie-jar"></a><span class="nroffip">-c/--cookie-jar &lt;file name&gt;</span>
<p class="level1">Specify to which file you want curl to write all cookies after a completed operation. Curl writes all cookies previously read from a specified file as well as all cookies received from remote server(s). If no cookies are known, no file will be written. The file will be written using the Netscape cookie file format. If you set the file name to a single dash, "-", the cookies will be written to stdout.
<p class="level1"><span Class="bold">NOTE</span> If the cookie jar can't be created or written to, the whole curl operation won't fail or even report an error clearly. Using -v will get a warning displayed, but that is the only visible feedback you get about this possibly lethal situation.
<p class="level1">If this option is used several times, the last specfied file name will be used.
<p class="level0"><a name="-C--continue-at"></a><span class="nroffip">-C/--continue-at &lt;offset&gt;</span>
<p class="level1">Continue/Resume a previous file transfer at the given offset. The given offset is the exact number of bytes that will be skipped counted from the beginning of the source file before it is transfered to the destination. If used with uploads, the ftp server command SIZE will not be used by curl.
<p class="level1">Use "-C -" to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--create-dirs"></a><span class="nroffip">--create-dirs</span>
<p class="level1">When used in conjunction with the -o option, curl will create the necessary local directory hierarchy as needed.
<p class="level0"><a name="--crlf"></a><span class="nroffip">--crlf</span>
<p class="level1">(FTP) Convert LF to CRLF in upload. Useful for MVS (OS/390).
<p class="level1">If this option is used twice, the second will again disable crlf converting.
<p class="level0"><a name="-d--data"></a><span class="nroffip">-d/--data &lt;data&gt;</span>
<p class="level1">(HTTP) Sends the specified data in a POST request to the HTTP server, in a way that can emulate as if a user has filled in a HTML form and pressed the submit button. Note that the data is sent exactly as specified with no extra processing (with all newlines cut off). The data is expected to be &zerosp;"url-encoded". This will cause curl to pass the data to the server using the content-type application/x-www-form-urlencoded. Compare to <a class="emphasis" href="#-F--form">-F/--form</a>. If this option is used more than once on the same command line, the data pieces specified will be merged together with a separating &-letter. Thus, using '-d name=daniel -d skill=lousy' would generate a post chunk that looks like &zerosp;'name=daniel&skill=lousy'.
<p class="level1">If you start the data with the letter @, the rest should be a file name to read the data from, or - if you want curl to read the data from stdin. The contents of the file must already be url-encoded. Multiple files can also be specified. Posting data from a file named 'foobar' would thus be done with <span Class="emphasis">--data</span> @foobar".
<p class="level1">To post data purely binary, you should instead use the <a class="emphasis" href="#--data-binary">--data-binary</a> option.
<p class="level1"><a class="emphasis" href="#-d--data">-d/--data</a> is the same as <a class="emphasis" href="#--data-ascii">--data-ascii</a>.
<p class="level1">If this option is used several times, the ones following the first will append data.
<p class="level0"><a name="--data-ascii"></a><span class="nroffip">--data-ascii &lt;data&gt;</span>
<p class="level1">(HTTP) This is an alias for the <a class="emphasis" href="#-d--data">-d/--data</a> option.
<p class="level1">If this option is used several times, the ones following the first will append data.
<p class="level0"><a name="--data-binary"></a><span class="nroffip">--data-binary &lt;data&gt;</span>
<p class="level1">(HTTP) This posts data in a similar manner as <a class="emphasis" href="#--data-ascii">--data-ascii</a> does, although when using this option the entire context of the posted data is kept as-is. If you want to post a binary file without the strip-newlines feature of the <a class="emphasis" href="#--data-ascii">--data-ascii</a> option, this is for you.
<p class="level1">If this option is used several times, the ones following the first will append data.
<p class="level0"><a name="--digest"></a><span class="nroffip">--digest</span>
<p class="level1">(HTTP) Enables HTTP Digest authentication. This is a authentication that prevents the password from being sent over the wire in clear text. Use this in combination with the normal <a class="emphasis" href="#-u--user">-u/--user</a> option to set user name and password. See also <a class="emphasis" href="#--ntlm">--ntlm</a>, <a class="emphasis" href="#--negotiate">--negotiate</a> and <a class="emphasis" href="#--anyauth">--anyauth</a> for related options. (Added in curl 7.10.6)
<p class="level1">If this option is used several times, the following occurrences make no difference.
<p class="level0"><a name="--disable-eprt"></a><span class="nroffip">--disable-eprt</span>
<p class="level1">(FTP) Tell curl to disable the use of the EPRT and LPRT commands when doing active FTP transfers. Curl will normally always first attempt to use EPRT, then LPRT before using PORT, but with this option, it will use PORT right away. EPRT and LPRT are extensions to the original FTP protocol, may not work on all servers but enable more functionality in a better way than the traditional PORT command. (Aded in 7.10.5)
<p class="level1">If this option is used several times, each occurrence will toggle this on/off.
<p class="level0"><a name="--disable-epsv"></a><span class="nroffip">--disable-epsv</span>
<p class="level1">(FTP) Tell curl to disable the use of the EPSV command when doing passive FTP transfers. Curl will normally always first attempt to use EPSV before PASV, but with this option, it will not try using EPSV.
<p class="level1">If this option is used several times, each occurrence will toggle this on/off.
<p class="level0"><a name="-D--dump-header"></a><span class="nroffip">-D/--dump-header &lt;file&gt;</span>
<p class="level1">Write the protocol headers to the specified file.
<p class="level1">This option is handy to use when you want to store the headers that a HTTP site sends to you. Cookies from the headers could then be read in a second curl invoke by using the <a class="emphasis" href="#-b--cookie">-b/--cookie</a> option! The <a class="emphasis" href="#-c--cookie-jar">-c/--cookie-jar</a> option is however a better way to store cookies.
<p class="level1">When used on FTP, the ftp server response lines are considered being "headers" and thus are saved there.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-e--referer"></a><span class="nroffip">-e/--referer &lt;URL&gt;</span>
<p class="level1">(HTTP) Sends the "Referer Page" information to the HTTP server. This can also be set with the <a class="emphasis" href="#-H--header">-H/--header</a> flag of course. When used with <a class="emphasis" href="#-L--location">-L/--location</a> you can append ";auto" to the referer URL to make curl automatically set the previous URL when it follows a Location: header. The ";auto" string can be used alone, even if you don't set an initial referer.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--environment"></a><span class="nroffip">--environment</span>
<p class="level1">(RISC OS ONLY) Sets a range of environment variables, using the names the -w option supports, to easier allow extraction of useful information after having run curl.
<p class="level1">If this option is used several times, each occurrence will toggle this on/off.
<p class="level0"><a name="--egd-file"></a><span class="nroffip">--egd-file &lt;file&gt;</span>
<p class="level1">(HTTPS) Specify the path name to the Entropy Gathering Daemon socket. The socket is used to seed the random engine for SSL connections. See also the <a class="emphasis" href="#--random-file">--random-file</a> option.
<p class="level0"><a name="-E--cert"></a><span class="nroffip">-E/--cert &lt;certificate[:password]&gt;</span>
<p class="level1">(HTTPS) Tells curl to use the specified certificate file when getting a file with HTTPS. The certificate must be in PEM format. If the optional password isn't specified, it will be queried for on the terminal. Note that this certificate is the private key and the private certificate concatenated!
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--cert-type"></a><span class="nroffip">--cert-type &lt;type&gt;</span>
<p class="level1">(SSL) Tells curl what certificate type the provided certificate is in. PEM, DER and ENG are recognized types.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--cacert"></a><span class="nroffip">--cacert &lt;CA certificate&gt;</span>
<p class="level1">(HTTPS) Tells curl to use the specified certificate file to verify the peer. The file may contain multiple CA certificates. The certificate(s) must be in PEM format.
<p class="level1">curl recognizes the environment variable named 'CURL_CA_BUNDLE' if that is set, and uses the given path as a path to a CA cert bundle. This option overrides that variable.
<p class="level1">The windows version of curl will automatically look for a CA certs file named &acute;curl-ca-bundle.crt&acute;, either in the same directory as curl.exe, or in the Current Working Directory, or in any folder along your PATH.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--capath"></a><span class="nroffip">--capath &lt;CA certificate directory&gt;</span>
<p class="level1">(HTTPS) Tells curl to use the specified certificate directory to verify the peer. The certificates must be in PEM format, and the directory must have been processed using the c_rehash utility supplied with openssl. Using <a class="emphasis" href="#--capath">--capath</a> can allow curl to make https connections much more efficiently than using <a class="emphasis" href="#--cacert">--cacert</a> if the <a class="emphasis" href="#--cacert">--cacert</a> file contains many CA certificates.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-f--fail"></a><span class="nroffip">-f/--fail</span>
<p class="level1">(HTTP) Fail silently (no output at all) on server errors. This is mostly done like this to better enable scripts etc to better deal with failed attempts. In normal cases when a HTTP server fails to deliver a document, it returns a HTML document stating so (which often also describes why and more). This flag will prevent curl from outputting that and fail silently instead.
<p class="level1">If this option is used twice, the second will again disable silent failure.
<p class="level0"><a name="--ftp-create-dirs"></a><span class="nroffip">--ftp-create-dirs</span>
<p class="level1">(FTP) When an FTP URL/operation uses a path that doesn't currently exist on the server, the standard behavior of curl is to fail. Using this option, curl will instead attempt to create missing directories. (Added in 7.10.7)
<p class="level1">If this option is used twice, the second will again disable silent failure.
<p class="level0"><a name="--ftp-pasv"></a><span class="nroffip">--ftp-pasv</span>
<p class="level1">(FTP) Use PASV when transfering. PASV is the internal default behavior, but using this option can be used to override a previos --ftp-port option. (Added in 7.11.0)
<p class="level1">If this option is used twice, the second will again disable silent failure.
<p class="level0"><a name="--ftp-ssl"></a><span class="nroffip">--ftp-ssl</span>
<p class="level1">(FTP) Make the FTP connection switch to use SSL/TLS. (Added in 7.11.0)
<p class="level1">If this option is used twice, the second will again disable silent failure.
<p class="level0"><a name="-F--form"></a><span class="nroffip">-F/--form &lt;name=content&gt;</span>
<p class="level1">(HTTP) This lets curl emulate a filled in form in which a user has pressed the submit button. This causes curl to POST data using the content-type multipart/form-data according to RFC1867. This enables uploading of binary files etc. To force the 'content' part to be be a file, prefix the file name with an @ sign. To just get the content part from a file, prefix the file name with the letter &lt;. The difference between @ and &lt; is then that @ makes a file get attached in the post as a file upload, while the &lt; makes a text field and just get the contents for that text field from a file.
<p class="level1">Example, to send your password file to the server, where &zerosp;'password' is the name of the form-field to which /etc/passwd will be the input:
<p class="level1"><span Class="bold">curl</span> -F password=@/etc/passwd www.mypasswords.com
<p class="level1">To read the file's content from stdin insted of a file, use - where the file name should've been. This goes for both @ and &lt; constructs.
<p class="level1">You can also tell curl what Content-Type to use for the file upload part, by using 'type=', in a manner similar to:
<p class="level1"><span Class="bold">curl</span> -F "web=@index.html;type=text/html" url.com
<p class="level1">See further examples and details in the MANUAL.
<p class="level1">This option can be used multiple times.
<p class="level0"><a name="-g--globoff"></a><span class="nroffip">-g/--globoff</span>
<p class="level1">This option switches off the "URL globbing parser". When you set this option, you can specify URLs that contain the letters {}[] without having them being interpreted by curl itself. Note that these letters are not normal legal URL contents but they should be encoded according to the URI standard.
<p class="level0"><a name="-G--get"></a><span class="nroffip">-G/--get</span>
<p class="level1">When used, this option will make all data specified with <a class="emphasis" href="#-d--data">-d/--data</a> or <a class="emphasis" href="#--data-binary">--data-binary</a> to be used in a HTTP GET request instead of the POST request that otherwise would be used. The data will be appended to the URL with a '?' separator.
<p class="level1">If used in combination with -I, the POST data will instead be appended to the URL with a HEAD request.
<p class="level1">If used multiple times, nothing special happens.
<p class="level0"><a name="-h--help"></a><span class="nroffip">-h/--help</span>
<p class="level1">Usage help.
<p class="level0"><a name="-H--header"></a><span class="nroffip">-H/--header &lt;header&gt;</span>
<p class="level1">(HTTP) Extra header to use when getting a web page. You may specify any number of extra headers. Note that if you should add a custom header that has the same name as one of the internal ones curl would use, your externally set header will be used instead of the internal one. This allows you to make even trickier stuff than curl would normally do. You should not replace internally set headers without knowing perfectly well what you're doing. Replacing an internal header with one without content on the right side of the colon will prevent that header from appearing.
<p class="level1">See also the <a class="emphasis" href="#-A--user-agent">-A/--user-agent</a> and <a class="emphasis" href="#-e--referer">-e/--referer</a> options.
<p class="level1">This option can be used multiple times to add/replace/remove multiple headers.
<p class="level0"><a name="-i--include"></a><span class="nroffip">-i/--include</span>
<p class="level1">(HTTP) Include the HTTP-header in the output. The HTTP-header includes things like server-name, date of the document, HTTP-version and more...
<p class="level1">If this option is used twice, the second will again disable header include.
<p class="level0"><a name="--interface"></a><span class="nroffip">--interface &lt;name&gt;</span>
<p class="level1">Perform an operation using a specified interface. You can enter interface name, IP address or host name. An example could look like:
<p class="level1">&nbsp;curl --interface eth0:1 <a href="http://www.netscape.com">http://www.netscape.com</a>/
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-I--head"></a><span class="nroffip">-I/--head</span>
<p class="level1">(HTTP/FTP/FILE) Fetch the HTTP-header only! HTTP-servers feature the command HEAD which this uses to get nothing but the header of a document. When used on a FTP or FILE file, curl displays the file size and last modification time only.
<p class="level1">If this option is used twice, the second will again disable header only.
<p class="level0"><a name="-j--junk-session-cookies"></a><span class="nroffip">-j/--junk-session-cookies</span>
<p class="level1">(HTTP) When curl is told to read cookies from a given file, this option will make it discard all "session cookies". This will basicly have the same effect as if a new session is started. Typical browsers always discard session cookies when they're closed down. (Added in 7.9.7)
<p class="level1">If this option is used several times, each occurrence will toggle this on/off.
<p class="level0"><a name="-k--insecure"></a><span class="nroffip">-k/--insecure</span>
<p class="level1">(SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers. Starting with curl 7.10, all SSL connections will be attempted to be made secure by using the CA certificate bundle installed by default. This makes all connections considered "insecure" to fail unless <a class="emphasis" href="#-k--insecure">-k/--insecure</a> is used.
<p class="level1">If this option is used twice, the second time will again disable it.
<p class="level0"><a name="--key"></a><span class="nroffip">--key &lt;key&gt;</span>
<p class="level1">(SSL) Private key file name. Allows you to provide your private key in this separate file.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--key-type"></a><span class="nroffip">--key-type &lt;type&gt;</span>
<p class="level1">(SSL) Private key file type. Specify which type your <a class="emphasis" href="#--key">--key</a> provided private key is. DER, PEM and ENG are supported.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--krb4"></a><span class="nroffip">--krb4 &lt;level&gt;</span>
<p class="level1">(FTP) Enable kerberos4 authentication and use. The level must be entered and should be one of 'clear', 'safe', 'confidential' or 'private'. Should you use a level that is not one of these, 'private' will instead be used.
<p class="level1">This option requiures that the library was built with kerberos4 support. This is not very common. Use <a class="emphasis" href="#-V--version">-V/--version</a> to see if your curl supports it.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-K--config"></a><span class="nroffip">-K/--config &lt;config file&gt;</span>
<p class="level1">Specify which config file to read curl arguments from. The config file is a text file in which command line arguments can be written which then will be used as if they were written on the actual command line. Options and their parameters must be specified on the same config file line. If the parameter is to contain white spaces, the parameter must be inclosed within quotes. If the first column of a config line is a '#' character, the rest of the line will be treated as a comment.
<p class="level1">Specify the filename as '-' to make curl read the file from stdin.
<p class="level1">Note that to be able to specify a URL in the config file, you need to specify it using the <a class="emphasis" href="#--url">--url</a> option, and not by simply writing the URL on its own line. So, it could look similar to this:
<p class="level1">url = "<a href="http://curl.haxx.se/docs">http://curl.haxx.se/docs</a>/"
<p class="level1">This option can be used multiple times.
<p class="level0"><a name="--limit-rate"></a><span class="nroffip">--limit-rate &lt;speed&gt;</span>
<p class="level1">Specify the maximum transfer rate you want curl to use. This feature is useful if you have a limited pipe and you'd like your transfer not use your entire bandwidth.
<p class="level1">The given speed is measured in bytes/second, unless a suffix is appended. Appending 'k' or 'K' will count the number as kilobytes, 'm' or M' makes it megabytes while 'g' or 'G' makes it gigabytes. Examples: 200K, 3m and 1G.
<p class="level1">If you are also using the <a class="emphasis" href="#-Y--speed-limit">-Y/--speed-limit</a> option, that option will take precedence and might cripple the rate-limiting slightly, to help keeping the speed-limit logic working.
<p class="level1">This option was introduced in curl 7.10.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-l--list-only"></a><span class="nroffip">-l/--list-only</span>
<p class="level1">(FTP) When listing an FTP directory, this switch forces a name-only view. Especially useful if you want to machine-parse the contents of an FTP directory since the normal directory view doesn't use a standard look or format.
<p class="level1">This option causes an FTP NLST command to be sent. Some FTP servers list only files in their response to NLST; they do not include subdirectories and symbolic links.
<p class="level1">If this option is used twice, the second will again disable list only.
<p class="level0"><a name="-L--location"></a><span class="nroffip">-L/--location</span>
<p class="level1">(HTTP/HTTPS) If the server reports that the requested page has a different location (indicated with the header line Location:) this flag will let curl attempt to reattempt the get on the new place. If used together with <a class="emphasis" href="#-i--include">-i/--include</a> or <a class="emphasis" href="#-I--head">-I/--head</a>, headers from all requested pages will be shown. If authentication is used, curl will only send its credentials to the initial host, so if a redirect takes curl to a different host, it won't intercept the user+password. See also <a class="emphasis" href="#--location-trusted">--location-trusted</a> on how to change this.
<p class="level1">If this option is used twice, the second will again disable location following.
<p class="level0"><a name="--location-trusted"></a><span class="nroffip">--location-trusted</span>
<p class="level1">(HTTP/HTTPS) Like <a class="emphasis" href="#-L--location">-L/--location</a>, but will allow sending the name + password to all hosts that the site may redirect to. This may or may not introduce a security breach if the site redirects you do a site to which you'll send your authentication info (which is plaintext in the case of HTTP Basic authentication).
<p class="level1">If this option is used twice, the second will again disable location following.
<p class="level0"><a name="--max-filesize"></a><span class="nroffip">--max-filesize &lt;bytes&gt;</span>
<p class="level1">Specify the maximum size (in bytes) of a file to download. If the file requested is larger than this value, the transfer will not start and curl will return with exit code 63.
<p class="level1">NOTE: The file size is not always known prior to download, and for such files this option has no effect even if the file transfer ends up being larger than this given limit. This concerns both FTP and HTTP transfers.
<p class="level0"><a name="-m--max-time"></a><span class="nroffip">-m/--max-time &lt;seconds&gt;</span>
<p class="level1">Maximum time in seconds that you allow the whole operation to take. This is useful for preventing your batch jobs from hanging for hours due to slow networks or links going down. This doesn't work fully in win32 systems. See also the <a class="emphasis" href="#--connect-timeout">--connect-timeout</a> option.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-M--manual"></a><span class="nroffip">-M/--manual</span>
<p class="level1">Manual. Display the huge help text.
<p class="level0"><a name="-n--netrc"></a><span class="nroffip">-n/--netrc</span>
<p class="level1">Makes curl scan the <span Class="emphasis">.netrc</span> file in the user's home directory for login name and password. This is typically used for ftp on unix. If used with http, curl will enable user authentication. See <span Class="manpage">netrc(4)</span> or <span Class="manpage">ftp(1)</span> for details on the file format. Curl will not complain if that file hasn't the right permissions (it should not be world nor group readable). The environment variable "HOME" is used to find the home directory.
<p class="level1">A quick and very simple example of how to setup a <span Class="emphasis">.netrc</span> to allow curl to ftp to the machine host.domain.com with user name &zerosp;'myself' and password 'secret' should look similar to:
<p class="level1"><span Class="bold">machine host.domain.com login myself password secret</span>
<p class="level1">If this option is used twice, the second will again disable netrc usage.
<p class="level0"><a name="--netrc-optional"></a><span class="nroffip">--netrc-optional</span>
<p class="level1">Very similar to <span Class="emphasis">--netrc</span>, but this option makes the .netrc usage <span Class="bold">optional</span> and not mandatory as the <span Class="emphasis">--netrc</span> does.
<p class="level0"><a name="--negotiate"></a><span class="nroffip">--negotiate</span>
<p class="level1">(HTTP) Enables GSS-Negotiate authentication. The GSS-Negotiate method was designed by Microsoft and is used in their web aplications. It is primarily meant as a support for Kerberos5 authentication but may be also used along with another authentication methods. For more information see IETF draft draft-brezak-spnego-http-04.txt. (Added in 7.10.6)
<p class="level1">This option requiures that the library was built with GSSAPI support. This is not very common. Use <a class="emphasis" href="#-V--version">-V/--version</a> to see if your version supports GSS-Negotiate.
<p class="level1">If this option is used several times, the following occurrences make no difference.
<p class="level0"><a name="-N--no-buffer"></a><span class="nroffip">-N/--no-buffer</span>
<p class="level1">Disables the buffering of the output stream. In normal work situations, curl will use a standard buffered output stream that will have the effect that it will output the data in chunks, not necessarily exactly when the data arrives. Using this option will disable that buffering.
<p class="level1">If this option is used twice, the second will again switch on buffering.
<p class="level0"><a name="--ntlm"></a><span class="nroffip">--ntlm</span>
<p class="level1">(HTTP) Enables NTLM authentication. The NTLM authentication method was designed by Microsoft and is used by IIS web servers. It is a proprietary protocol, reversed engineered by clever people and implemented in curl based on their efforts. This kind of behavior should not be endorsed, you should encourage everyone who uses NTLM to switch to a public and documented authentication method instead. Such as Digest. (Added in 7.10.6)
<p class="level1">If you want to enable NTLM for your proxy authentication, then use <a class="emphasis" href="#--proxy-ntlm">--proxy-ntlm</a>.
<p class="level1">This option requiures that the library was built with SSL support. Use <a class="emphasis" href="#-V--version">-V/--version</a> to see if your curl supports NTLM.
<p class="level1">If this option is used several times, the following occurrences make no difference.
<p class="level0"><a name="-o--output"></a><span class="nroffip">-o/--output &lt;file&gt;</span>
<p class="level1">Write output to &lt;file&gt; instead of stdout. If you are using {} or [] to fetch multiple documents, you can use '#' followed by a number in the &lt;file&gt; specifier. That variable will be replaced with the current string for the URL being fetched. Like in:
<p class="level1">&nbsp; curl http://{one,two}.site.com -o "file_#1.txt"
<p class="level1">or use several variables like:
<p class="level1">&nbsp; curl http://{site,host}.host[1-5].com -o "#1_#2"
<p class="level1">You may use this option as many times as you have number of URLs.
<p class="level1">See also the <a class="emphasis" href="#--create-dirs">--create-dirs</a> option to create the local directories dynamically.
<p class="level0"><a name="-O--remote-name"></a><span class="nroffip">-O/--remote-name</span>
<p class="level1">Write output to a local file named like the remote file we get. (Only the file part of the remote file is used, the path is cut off.)
<p class="level1">You may use this option as many times as you have number of URLs.
<p class="level0"><a name="--pass"></a><span class="nroffip">--pass &lt;phrase&gt;</span>
<p class="level1">(SSL) Pass phrase for the private key
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--proxy-ntlm"></a><span class="nroffip">--proxy-ntlm</span>
<p class="level1">Tells curl to use NTLM authentication when communicating with the given proxy. Use <a class="emphasis" href="#--ntlm">--ntlm</a> for enabling NTLM with a remote host.
<p class="level1">If this option is used twice, the second will again disable proxy NTLM.
<p class="level0"><a name="-p--proxytunnel"></a><span class="nroffip">-p/--proxytunnel</span>
<p class="level1">When an HTTP proxy is used (<a class="emphasis" href="#-x--proxy">-x/--proxy</a>), this option will cause non-HTTP protocols to attempt to tunnel through the proxy instead of merely using it to do HTTP-like operations. The tunnel approach is made with the HTTP proxy CONNECT request and requires that the proxy allows direct connect to the remote port number curl wants to tunnel through to.
<p class="level1">If this option is used twice, the second will again disable proxy tunnel.
<p class="level0"><a name="-P--ftp-port"></a><span class="nroffip">-P/--ftp-port &lt;address&gt;</span>
<p class="level1">(FTP) Reverses the initiator/listener roles when connecting with ftp. This switch makes Curl use the PORT command instead of PASV. In practice, PORT tells the server to connect to the client's specified address and port, while PASV asks the server for an ip address and port to connect to. &lt;address&gt; should be one of:
<p class="level2">
<p class="level1"><a name="interface"></a><span class="nroffip">interface</span>
<p class="level2">i.e "eth0" to specify which interface's IP address you want to use (Unix only)
<p class="level1"><a name="IP"></a><span class="nroffip">IP address</span>
<p class="level2">i.e "192.168.10.1" to specify exact IP number
<p class="level1"><a name="host"></a><span class="nroffip">host name</span>
<p class="level2">i.e "my.host.domain" to specify machine
<p class="level1"><a name="-"></a><span class="nroffip">-</span>
<p class="level2">(any single-letter string) to make it pick the machine's default
<p class="level1">
<p class="level1">If this option is used several times, the last one will be used. Disable the use of PORT with <a class="emphasis" href="#--ftp-pasv">--ftp-pasv</a>. Disable the attempt to use the EPRT command instead of PORT by using <a class="emphasis" href="#--disable-eprt">--disable-eprt</a>. EPRT is really PORT++.
<p class="level0"><a name="-q"></a><span class="nroffip">-q</span>
<p class="level1">If used as the first parameter on the command line, the <span Class="emphasis">$HOME/.curlrc</span> file will not be read and used as a config file.
<p class="level0"><a name="-Q--quote"></a><span class="nroffip">-Q/--quote &lt;comand&gt;</span>
<p class="level1">(FTP) Send an arbitrary command to the remote FTP server, by using the QUOTE command of the server. Not all servers support this command, and the set of QUOTE commands are server specific! Quote commands are sent BEFORE the transfer is taking place. To make commands take place after a successful transfer, prefix them with a dash '-'. You may specify any amount of commands to be run before and after the transfer. If the server returns failure for one of the commands, the entire operation will be aborted.
<p class="level1">This option can be used multiple times.
<p class="level0"><a name="--random-file"></a><span class="nroffip">--random-file &lt;file&gt;</span>
<p class="level1">(HTTPS) Specify the path name to file containing what will be considered as random data. The data is used to seed the random engine for SSL connections. See also the <a class="emphasis" href="#--egd-file">--egd-file</a> option.
<p class="level0"><a name="-r--range"></a><span class="nroffip">-r/--range &lt;range&gt;</span>
<p class="level1">(HTTP/FTP) Retrieve a byte range (i.e a partial document) from a HTTP/1.1 or FTP server. Ranges can be specified in a number of ways.
<p class="level2">
<p class="level2"><span Class="bold">0-499</span> specifies the first 500 bytes
<p class="level2"><span Class="bold">500-999</span> specifies the second 500 bytes
<p class="level2"><span Class="bold">-500</span> specifies the last 500 bytes
<p class="level2"><span Class="bold">9500</span> specifies the bytes from offset 9500 and forward
<p class="level2"><span Class="bold">0-0,-1</span> specifies the first and last byte only(*)(H)
<p class="level2"><span Class="bold">500-700,600-799</span> specifies 300 bytes from offset 500(H)
<p class="level2"><span Class="bold">100-199,500-599</span> specifies two separate 100 bytes ranges(*)(H)
<p class="level1">
<p class="level1">(*) = NOTE that this will cause the server to reply with a multipart response!
<p class="level1">You should also be aware that many HTTP/1.1 servers do not have this feature enabled, so that when you attempt to get a range, you'll instead get the whole document.
<p class="level1">FTP range downloads only support the simple syntax 'start-stop' (optionally with one of the numbers omitted). It depends on the non-RFC command SIZE.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-R--remote-time"></a><span class="nroffip">-R/--remote-time</span>
<p class="level1">When used, this will make libcurl attempt to figure out the timestamp of the remote file, and if that is available make the local file get that same timestamp.
<p class="level1">If this option is used twice, the second time disables this again.
<p class="level0"><a name="-s--silent"></a><span class="nroffip">-s/--silent</span>
<p class="level1">Silent mode. Don't show progress meter or error messages. Makes Curl mute.
<p class="level1">If this option is used twice, the second will again disable mute.
<p class="level0"><a name="-S--show-error"></a><span class="nroffip">-S/--show-error</span>
<p class="level1">When used with -s it makes curl show error message if it fails.
<p class="level1">If this option is used twice, the second will again disable show error.
<p class="level0"><a name="--socks"></a><span class="nroffip">--socks &lt;host[:port]&gt;</span>
<p class="level1">Use the specified SOCKS5 proxy. If the port number is not specified, it is assumed at port 1080. (Option added in 7.11.1)
<p class="level1">This option overrides any previous use of <a class="emphasis" href="#-x--proxy">-x/--proxy</a>, as they are mutually exclusive.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--stderr"></a><span class="nroffip">--stderr &lt;file&gt;</span>
<p class="level1">Redirect all writes to stderr to the specified file instead. If the file name is a plain '-', it is instead written to stdout. This option has no point when you're using a shell with decent redirecting capabilities.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-t--telnet-option"></a><span class="nroffip">-t/--telnet-option &lt;OPT=val&gt;</span>
<p class="level1">Pass options to the telnet protocol. Supported options are:
<p class="level1">TTYPE=&lt;term&gt; Sets the terminal type.
<p class="level1">XDISPLOC=&lt;X display&gt; Sets the X display location.
<p class="level1">NEW_ENV=&lt;var,val&gt; Sets an environment variable.
<p class="level0"><a name="-T--upload-file"></a><span class="nroffip">-T/--upload-file &lt;file&gt;</span>
<p class="level1">This transfers the specified local file to the remote URL. If there is no file part in the specified URL, Curl will append the local file name. NOTE that you must use a trailing / on the last directory to really prove to Curl that there is no file name or curl will think that your last directory name is the remote file name to use. That will most likely cause the upload operation to fail. If this is used on a http(s) server, the PUT command will be used.
<p class="level1">Use the file name "-" (a single dash) to use stdin instead of a given file.
<p class="level1">Before 7.10.8, when this option was used several times, the last one was used.
<p class="level1">In curl 7.10.8 and later, you can specify one -T for each URL on the command line. Each -T + URL pair specifies what to upload and to where. curl also supports "globbing" of the -T argument, meaning that you can upload multiple files to a single URL by using the same URL globbing style supported in the URL, like this:
<p class="level1">curl -T "{file1,file2}" <a href="http://www.uploadtothissite.com">http://www.uploadtothissite.com</a>
<p class="level1">or even
<p class="level1">curl -T "img[1-1000].png" ftp://ftp.picturemania.com/upload/
<p class="level0"><a name="--trace"></a><span class="nroffip">--trace &lt;file&gt;</span>
<p class="level1">Enables a full trace dump of all incoming and outgoing data, including descriptive information, to the given output file. Use "-" as filename to have the output sent to stdout.
<p class="level1">If this option is used several times, the last one will be used. (Added in 7.9.7)
<p class="level0"><a name="--trace-ascii"></a><span class="nroffip">--trace-ascii &lt;file&gt;</span>
<p class="level1">Enables a full trace dump of all incoming and outgoing data, including descriptive information, to the given output file. Use "-" as filename to have the output sent to stdout.
<p class="level1">This is very similar to <a class="emphasis" href="#--trace">--trace</a>, but leaves out the hex part and only shows the ASCII part of the dump. It makes smaller output that might be easier to read for untrained humans.
<p class="level1">If this option is used several times, the last one will be used. (Added in 7.9.7)
<p class="level0"><a name="-u--user"></a><span class="nroffip">-u/--user &lt;user:password&gt;</span>
<p class="level1">Specify user and password to use for server authentication.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-U--proxy-user"></a><span class="nroffip">-U/--proxy-user &lt;user:password&gt;</span>
<p class="level1">Specify user and password to use for proxy authentication.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="--url"></a><span class="nroffip">--url &lt;URL&gt;</span>
<p class="level1">Specify a URL to fetch. This option is mostly handy when you want to specify URL(s) in a config file.
<p class="level1">This option may be used any number of times. To control where this URL is written, use the <a class="emphasis" href="#-o--output">-o/--output</a> or the <a class="emphasis" href="#-O--remote-name">-O/--remote-name</a> options.
<p class="level0"><a name="-v--verbose"></a><span class="nroffip">-v/--verbose</span>
<p class="level1">Makes the fetching more verbose/talkative. Mostly usable for debugging. Lines starting with '&gt;' means data sent by curl, '&lt;' means data received by curl that is hidden in normal cases and lines starting with '*' means additional info provided by curl.
<p class="level1">Note that if you want to see HTTP headers in the output, <a class="emphasis" href="#-i--include">-i/--include</a> might be option you're looking for.
<p class="level1">If you think this option still doesn't give you enough details, consider using <a class="emphasis" href="#--trace">--trace</a> or <a class="emphasis" href="#--trace-ascii">--trace-ascii</a> instead.
<p class="level1">If this option is used twice, the second will again disable verbose.
<p class="level0"><a name="-V--version"></a><span class="nroffip">-V/--version</span>
<p class="level1">Displays information about curl and the libcurl version it uses.
<p class="level1">The first line includes the full version of curl, libcurl and other 3rd party libraries linked with the executable.
<p class="level1">The second line (starts with "Protocols:") shows all protocols that libcurl reports to support.
<p class="level1">The third line (starts with "Features:") shows specific features libcurl reports to offer. Available features include:
<p class="level2">
<p class="level1"><a name="IPv6"></a><span class="nroffip">IPv6</span>
<p class="level2">You can use IPv6 with this.
<p class="level1"><a name="krb4"></a><span class="nroffip">krb4</span>
<p class="level2">Krb4 for ftp is supported.
<p class="level1"><a name="SSL"></a><span class="nroffip">SSL</span>
<p class="level2">HTTPS and FTPS are supported.
<p class="level1"><a name="libz"></a><span class="nroffip">libz</span>
<p class="level2">Automatic decompression of compressed files over HTTP is supported.
<p class="level1"><a name="NTLM"></a><span class="nroffip">NTLM</span>
<p class="level2">NTLM authenticaion is supported.
<p class="level1"><a name="GSS-Negotiate"></a><span class="nroffip">GSS-Negotiate</span>
<p class="level2">Negotiate authenticaion is supported.
<p class="level1"><a name="Debug"></a><span class="nroffip">Debug</span>
<p class="level2">This curl uses a libcurl built with Debug. This enables more error-tracking and memory debugging etc. For curl-developers only!
<p class="level1"><a name="AsynchDNS"></a><span class="nroffip">AsynchDNS</span>
<p class="level2">This curl uses asynchronous name resolves.
<p class="level1"><a name="SPNEGO"></a><span class="nroffip">SPNEGO</span>
<p class="level2">SPNEGO Negotiate authenticaion is supported.
<p class="level1"><a name="Largefile"></a><span class="nroffip">Largefile</span>
<p class="level2">This curl supports transfers of large files, files larger than 2GB.
<p class="level1">
<p class="level0"><a name="-w--write-out"></a><span class="nroffip">-w/--write-out &lt;format&gt;</span>
<p class="level1">Defines what to display after a completed and successful operation. The format is a string that may contain plain text mixed with any number of variables. The string can be specified as "string", to get read from a particular file you specify it "@filename" and to tell curl to read the format from stdin you write "@-".
<p class="level1">The variables present in the output format will be substituted by the value or text that curl thinks fit, as described below. All variables are specified like %{variable_name} and to output a normal % you just write them like %%. You can output a newline by using n, a carriage return with r and a tab space with t.
<p class="level1"><span Class="bold">NOTE:</span> The %-letter is a special letter in the win32-environment, where all occurrences of % must be doubled when using this option.
<p class="level1">Available variables are at this point:
<p class="level2">
<p class="level2"><span Class="bold">url_effective</span> The URL that was fetched last. This is mostly meaningful if you've told curl to follow location: headers.
<p class="level2"><span Class="bold">http_code</span> The numerical code that was found in the last retrieved HTTP(S) page.
<p class="level2"><span Class="bold">time_total</span> The total time, in seconds, that the full operation lasted. The time will be displayed with millisecond resolution.
<p class="level2"><span Class="bold">time_namelookup</span> The time, in seconds, it took from the start until the name resolving was completed.
<p class="level2"><span Class="bold">time_connect</span> The time, in seconds, it took from the start until the connect to the remote host (or proxy) was completed.
<p class="level2"><span Class="bold">time_pretransfer</span> The time, in seconds, it took from the start until the file transfer is just about to begin. This includes all pre-transfer commands and negotiations that are specific to the particular protocol(s) involved.
<p class="level2"><span Class="bold">time_starttransfer</span> The time, in seconds, it took from the start until the first byte is just about to be transfered. This includes time_pretransfer and also the time the server needs to calculate the result.
<p class="level2"><span Class="bold">size_download</span> The total amount of bytes that were downloaded.
<p class="level2"><span Class="bold">size_upload</span> The total amount of bytes that were uploaded.
<p class="level2"><span Class="bold">size_header</span> The total amount of bytes of the downloaded headers.
<p class="level2"><span Class="bold">size_request</span> The total amount of bytes that were sent in the HTTP request.
<p class="level2"><span Class="bold">speed_download</span> The average download speed that curl measured for the complete download.
<p class="level2"><span Class="bold">speed_upload</span> The average upload speed that curl measured for the complete upload.
<p class="level2"><span Class="bold">content_type</span> The Content-Type of the requested document, if there was any. (Added in 7.9.5)
<p class="level1">
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-x--proxy"></a><span class="nroffip">-x/--proxy &lt;proxyhost[:port]&gt;</span>
<p class="level1">Use specified HTTP proxy. If the port number is not specified, it is assumed at port 1080.
<p class="level1">This option overrides existing environment variables that sets proxy to use. If there's an environment variable setting a proxy, you can set proxy to &zerosp;"" to override it.
<p class="level1"><span Class="bold">Note</span> that all operations that are performed over a HTTP proxy will transparantly be converted to HTTP. It means that certain protocol specific operations might not be available. This is not the case if you can tunnel through the proxy, as done with the <a class="emphasis" href="#-p--proxytunnel">-p/--proxytunnel</a> option.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-X--request"></a><span class="nroffip">-X/--request &lt;command&gt;</span>
<p class="level1">(HTTP) Specifies a custom request to use when communicating with the HTTP server. The specified request will be used instead of the standard GET. Read the HTTP 1.1 specification for details and explanations.
<p class="level1">(FTP) Specifies a custom FTP command to use instead of LIST when doing file lists with ftp.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-y--speed-time"></a><span class="nroffip">-y/--speed-time &lt;time&gt;</span>
<p class="level1">If a download is slower than speed-limit bytes per second during a speed-time period, the download gets aborted. If speed-time is used, the default speed-limit will be 1 unless set with -y.
<p class="level1">This option controls transfers and thus will not affect slow connects etc. If this is a concern for you, try the <a class="emphasis" href="#--connect-timeout">--connect-timeout</a> option.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-Y--speed-limit"></a><span class="nroffip">-Y/--speed-limit &lt;speed&gt;</span>
<p class="level1">If a download is slower than this given speed, in bytes per second, for speed-time seconds it gets aborted. speed-time is set with -Y and is 30 if not set.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-z--time-cond"></a><span class="nroffip">-z/--time-cond &lt;date expression&gt;</span>
<p class="level1">(HTTP) Request to get a file that has been modified later than the given time and date, or one that has been modified before that time. The date expression can be all sorts of date strings or if it doesn't match any internal ones, it tries to get the time from a given file name instead! See the <span Class="manpage">GNU date(1)</span> or <span Class="manpage">curl_getdate(3)</span> man pages for date expression details.
<p class="level1">Start the date expression with a dash (-) to make it request for a document that is older than the given date/time, default is a document that is newer than the specified date/time.
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-Z--max-redirs"></a><span class="nroffip">-Z/--max-redirs &lt;num&gt;</span>
<p class="level1">Set maximum number of redirection-followings allowed. If <a class="emphasis" href="#-L--location">-L/--location</a> is used, this option can be used to prevent curl from following redirections &zerosp;"in absurdum".
<p class="level1">If this option is used several times, the last one will be used.
<p class="level0"><a name="-0--http10"></a><span class="nroffip">-0/--http1.0</span>
<p class="level1">(HTTP) Forces curl to issue its requests using HTTP 1.0 instead of using its internally preferred: HTTP 1.1.
<p class="level0"><a name="-1--tlsv1"></a><span class="nroffip">-1/--tlsv1</span>
<p class="level1">(HTTPS) Forces curl to use TSL version 1 when negotiating with a remote TLS server.
<p class="level0"><a name="-2--sslv2"></a><span class="nroffip">-2/--sslv2</span>
<p class="level1">(HTTPS) Forces curl to use SSL version 2 when negotiating with a remote SSL server.
<p class="level0"><a name="-3--sslv3"></a><span class="nroffip">-3/--sslv3</span>
<p class="level1">(HTTPS) Forces curl to use SSL version 3 when negotiating with a remote SSL server.
<p class="level0"><a name="-4--ipv4"></a><span class="nroffip">-4/--ipv4</span>
<p class="level1">If libcurl is capable of resolving an address to multiple IP versions (which it is if it is ipv6-capable), this option tells libcurl to resolve names to IPv4 addresses only. (Added in 7.10.8)
<p class="level0"><a name="-6--ipv6"></a><span class="nroffip">-6/--ipv6</span>
<p class="level1">If libcurl is capable of resolving an address to multiple IP versions (which it is if it is ipv6-capable), this option tells libcurl to resolve names to IPv6 addresses only. (Added in 7.10.8)
<p class="level0"><a name="---progress-bar"></a><span class="nroffip">-#/--progress-bar</span>
<p class="level1">Make curl display progress information as a progress bar instead of the default statistics.
<p class="level1">If this option is used twice, the second will again disable the progress bar. <a name="FILES"></a><h2 class="nroffsh">FILES</h2>
<p class="level0"><span Class="emphasis">~/.curlrc</span>
<p class="level1">Default config file.
<p class="level1"><a name="ENVIRONMENT"></a><h2 class="nroffsh">ENVIRONMENT</h2>
<p class="level0">
<p class="level0"><a name="httpproxy"></a><span class="nroffip">http_proxy [protocol://]&lt;host&gt;[:port]</span>
<p class="level1">Sets proxy server to use for HTTP.
<p class="level0"><a name="HTTPSPROXY"></a><span class="nroffip">HTTPS_PROXY [protocol://]&lt;host&gt;[:port]</span>
<p class="level1">Sets proxy server to use for HTTPS.
<p class="level0"><a name="FTPPROXY"></a><span class="nroffip">FTP_PROXY [protocol://]&lt;host&gt;[:port]</span>
<p class="level1">Sets proxy server to use for FTP.
<p class="level0"><a name="GOPHERPROXY"></a><span class="nroffip">GOPHER_PROXY [protocol://]&lt;host&gt;[:port]</span>
<p class="level1">Sets proxy server to use for GOPHER.
<p class="level0"><a name="ALLPROXY"></a><span class="nroffip">ALL_PROXY [protocol://]&lt;host&gt;[:port]</span>
<p class="level1">Sets proxy server to use if no protocol-specific proxy is set.
<p class="level0"><a name="NOPROXY"></a><span class="nroffip">NO_PROXY &lt;comma-separated list of hosts&gt;</span>
<p class="level1">list of host names that shouldn't go through any proxy. If set to a asterisk '*' only, it matches all hosts. <a name="EXIT"></a><h2 class="nroffsh">EXIT CODES</h2>
<p class="level0">There exists a bunch of different error codes and their corresponding error messages that may appear during bad conditions. At the time of this writing, the exit codes are:
<p class="level0"><a name="1"></a><span class="nroffip">1</span>
<p class="level1">Unsupported protocol. This build of curl has no support for this protocol.
<p class="level0"><a name="2"></a><span class="nroffip">2</span>
<p class="level1">Failed to initialize.
<p class="level0"><a name="3"></a><span class="nroffip">3</span>
<p class="level1">URL malformat. The syntax was not correct.
<p class="level0"><a name="4"></a><span class="nroffip">4</span>
<p class="level1">URL user malformatted. The user-part of the URL syntax was not correct.
<p class="level0"><a name="5"></a><span class="nroffip">5</span>
<p class="level1">Couldn't resolve proxy. The given proxy host could not be resolved.
<p class="level0"><a name="6"></a><span class="nroffip">6</span>
<p class="level1">Couldn't resolve host. The given remote host was not resolved.
<p class="level0"><a name="7"></a><span class="nroffip">7</span>
<p class="level1">Failed to connect to host.
<p class="level0"><a name="8"></a><span class="nroffip">8</span>
<p class="level1">FTP weird server reply. The server sent data curl couldn't parse.
<p class="level0"><a name="9"></a><span class="nroffip">9</span>
<p class="level1">FTP access denied. The server denied login.
<p class="level0"><a name="10"></a><span class="nroffip">10</span>
<p class="level1">FTP user/password incorrect. Either one or both were not accepted by the server.
<p class="level0"><a name="11"></a><span class="nroffip">11</span>
<p class="level1">FTP weird PASS reply. Curl couldn't parse the reply sent to the PASS request.
<p class="level0"><a name="12"></a><span class="nroffip">12</span>
<p class="level1">FTP weird USER reply. Curl couldn't parse the reply sent to the USER request.
<p class="level0"><a name="13"></a><span class="nroffip">13</span>
<p class="level1">FTP weird PASV reply, Curl couldn't parse the reply sent to the PASV request.
<p class="level0"><a name="14"></a><span class="nroffip">14</span>
<p class="level1">FTP weird 227 format. Curl couldn't parse the 227-line the server sent.
<p class="level0"><a name="15"></a><span class="nroffip">15</span>
<p class="level1">FTP can't get host. Couldn't resolve the host IP we got in the 227-line.
<p class="level0"><a name="16"></a><span class="nroffip">16</span>
<p class="level1">FTP can't reconnect. Couldn't connect to the host we got in the 227-line.
<p class="level0"><a name="17"></a><span class="nroffip">17</span>
<p class="level1">FTP couldn't set binary. Couldn't change transfer method to binary.
<p class="level0"><a name="18"></a><span class="nroffip">18</span>
<p class="level1">Partial file. Only a part of the file was transfered.
<p class="level0"><a name="19"></a><span class="nroffip">19</span>
<p class="level1">FTP couldn't download/access the given file, the RETR (or similar) command failed.
<p class="level0"><a name="20"></a><span class="nroffip">20</span>
<p class="level1">FTP write error. The transfer was reported bad by the server.
<p class="level0"><a name="21"></a><span class="nroffip">21</span>
<p class="level1">FTP quote error. A quote command returned error from the server.
<p class="level0"><a name="22"></a><span class="nroffip">22</span>
<p class="level1">HTTP page not retrieved. The requested url was not found or returned another error with the HTTP error code being 400 or above. This return code only appears if <a class="emphasis" href="#-f--fail">-f/--fail</a> is used.
<p class="level0"><a name="23"></a><span class="nroffip">23</span>
<p class="level1">Write error. Curl couldn't write data to a local filesystem or similar.
<p class="level0"><a name="24"></a><span class="nroffip">24</span>
<p class="level1">Malformed user. User name badly specified.
<p class="level0"><a name="25"></a><span class="nroffip">25</span>
<p class="level1">FTP couldn't STOR file. The server denied the STOR operation, used for FTP uploading.
<p class="level0"><a name="26"></a><span class="nroffip">26</span>
<p class="level1">Read error. Various reading problems.
<p class="level0"><a name="27"></a><span class="nroffip">27</span>
<p class="level1">Out of memory. A memory allocation request failed.
<p class="level0"><a name="28"></a><span class="nroffip">28</span>
<p class="level1">Operation timeout. The specified time-out period was reached according to the conditions.
<p class="level0"><a name="29"></a><span class="nroffip">29</span>
<p class="level1">FTP couldn't set ASCII. The server returned an unknown reply.
<p class="level0"><a name="30"></a><span class="nroffip">30</span>
<p class="level1">FTP PORT failed. The PORT command failed. Not all FTP servers support the PORT command, try doing a transfer using PASV instead!
<p class="level0"><a name="31"></a><span class="nroffip">31</span>
<p class="level1">FTP couldn't use REST. The REST command failed. This command is used for resumed FTP transfers.
<p class="level0"><a name="32"></a><span class="nroffip">32</span>
<p class="level1">FTP couldn't use SIZE. The SIZE command failed. The command is an extension to the original FTP spec RFC 959.
<p class="level0"><a name="33"></a><span class="nroffip">33</span>
<p class="level1">HTTP range error. The range "command" didn't work.
<p class="level0"><a name="34"></a><span class="nroffip">34</span>
<p class="level1">HTTP post error. Internal post-request generation error.
<p class="level0"><a name="35"></a><span class="nroffip">35</span>
<p class="level1">SSL connect error. The SSL handshaking failed.
<p class="level0"><a name="36"></a><span class="nroffip">36</span>
<p class="level1">FTP bad download resume. Couldn't continue an earlier aborted download.
<p class="level0"><a name="37"></a><span class="nroffip">37</span>
<p class="level1">FILE couldn't read file. Failed to open the file. Permissions?
<p class="level0"><a name="38"></a><span class="nroffip">38</span>
<p class="level1">LDAP cannot bind. LDAP bind operation failed.
<p class="level0"><a name="39"></a><span class="nroffip">39</span>
<p class="level1">LDAP search failed.
<p class="level0"><a name="40"></a><span class="nroffip">40</span>
<p class="level1">Library not found. The LDAP library was not found.
<p class="level0"><a name="41"></a><span class="nroffip">41</span>
<p class="level1">Function not found. A required LDAP function was not found.
<p class="level0"><a name="42"></a><span class="nroffip">42</span>
<p class="level1">Aborted by callback. An application told curl to abort the operation.
<p class="level0"><a name="43"></a><span class="nroffip">43</span>
<p class="level1">Internal error. A function was called with a bad parameter.
<p class="level0"><a name="44"></a><span class="nroffip">44</span>
<p class="level1">Internal error. A function was called in a bad order.
<p class="level0"><a name="45"></a><span class="nroffip">45</span>
<p class="level1">Interface error. A specified outgoing interface could not be used.
<p class="level0"><a name="46"></a><span class="nroffip">46</span>
<p class="level1">Bad password entered. An error was signaled when the password was entered.
<p class="level0"><a name="47"></a><span class="nroffip">47</span>
<p class="level1">Too many redirects. When following redirects, curl hit the maximum amount.
<p class="level0"><a name="48"></a><span class="nroffip">48</span>
<p class="level1">Unknown TELNET option specified.
<p class="level0"><a name="49"></a><span class="nroffip">49</span>
<p class="level1">Malformed telnet option.
<p class="level0"><a name="51"></a><span class="nroffip">51</span>
<p class="level1">The remote peer's SSL certificate wasn't ok
<p class="level0"><a name="52"></a><span class="nroffip">52</span>
<p class="level1">The server didn't reply anything, which here is considered an error.
<p class="level0"><a name="53"></a><span class="nroffip">53</span>
<p class="level1">SSL crypto engine not found
<p class="level0"><a name="54"></a><span class="nroffip">54</span>
<p class="level1">Cannot set SSL crypto engine as default
<p class="level0"><a name="55"></a><span class="nroffip">55</span>
<p class="level1">Failed sending network data
<p class="level0"><a name="56"></a><span class="nroffip">56</span>
<p class="level1">Failure in receiving network data
<p class="level0"><a name="57"></a><span class="nroffip">57</span>
<p class="level1">Share is in use (internal error)
<p class="level0"><a name="58"></a><span class="nroffip">58</span>
<p class="level1">Problem with the local certificate
<p class="level0"><a name="59"></a><span class="nroffip">59</span>
<p class="level1">Couldn't use specified SSL cipher
<p class="level0"><a name="60"></a><span class="nroffip">60</span>
<p class="level1">Problem with the CA cert (path? permission?)
<p class="level0"><a name="61"></a><span class="nroffip">61</span>
<p class="level1">Unrecognized transfer encoding
<p class="level0"><a name="62"></a><span class="nroffip">62</span>
<p class="level1">Invalid LDAP URL
<p class="level0"><a name="63"></a><span class="nroffip">63</span>
<p class="level1">Maximum file size exceeded
<p class="level0"><a name="XX"></a><span class="nroffip">XX</span>
<p class="level1">There will appear more error codes here in future releases. The existing ones are meant to never change. <a name="AUTHORS"></a><h2 class="nroffsh">AUTHORS / CONTRIBUTORS</h2>
<p class="level0">Daniel Stenberg is the main author, but the whole list of contributors is found in the separate THANKS file. <a name="WWW"></a><h2 class="nroffsh">WWW</h2>
<p class="level0"><a href="http://curl.haxx.se">http://curl.haxx.se</a> <a name="FTP"></a><h2 class="nroffsh">FTP</h2>
<p class="level0">ftp://ftp.sunet.se/pub/www/utilities/curl/ <a name="SEE"></a><h2 class="nroffsh">SEE ALSO</h2>
<p class="level0"><span Class="manpage">ftp (1)</span> <span Class="manpage">wget (1)</span> <span Class="manpage">snarf (1)</span> <p class="roffit">
This HTML page was made with <a href="http://daniel.haxx.se/projects/roffit/">roffit</a>.
</body></html>
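The EXIT CODES listed above are most useful from a calling program's point of view. The short C sketch below is illustrative only and is not part of the curl distribution; the URL is a placeholder and a POSIX system is assumed for sys/wait.h. With -f/--fail, an HTTP error of 400 or above maps to exit code 22, matching the table above.

#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>

int main(void)
{
  /* placeholder command; -f makes HTTP errors of 400 and above map to exit code 22 */
  int status = system("curl -f -s -o /dev/null http://www.example.com/");

  if(status == -1)
    perror("system");
  else if(WIFEXITED(status))
    printf("curl exited with code %d\n", WEXITSTATUS(status));
  return 0;
}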

BIN
neo/curl/docs/curl.pdf Normal file

Binary file not shown.

View File

@ -0,0 +1,15 @@
#
# $Id: Makefile.am,v 1.25 2003/10/03 13:46:27 bagder Exp $
#
AUTOMAKE_OPTIONS = foreign no-dependencies
EXTRA_DIST = README curlgtk.c sepheaders.c simple.c postit2.c \
persistant.c ftpget.c Makefile.example \
multithread.c getinmemory.c ftpupload.c httpput.c \
simplessl.c ftpgetresp.c http-post.c post-callback.c \
multi-app.c multi-double.c multi-single.c multi-post.c \
fopen.c simplepost.c makefile.dj curlx.c
all:
@echo "done"

View File

@ -0,0 +1,42 @@
#############################################################################
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# $Id: Makefile.example,v 1.3 2002/08/14 23:01:14 bagder Exp $
#
# What to call the final executable
TARGET = example
# Which object files the executable consists of
OBJS= ftpget.o
# What compiler to use
CC = gcc
# Compiler flags, -g for debug, -c to make an object file
CFLAGS = -c -g
# This should point to a directory that holds libcurl, if it isn't
# in the system's standard lib dir
# We also set a -L to include the directory where we have the openssl
# libraries
LDFLAGS = -L/home/dast/lib -L/usr/local/ssl/lib
# We need -lcurl for the curl stuff
# We need -lsocket and -lnsl when on Solaris
# We need -lssl and -lcrypto when using libcurl with SSL support
# We need -ldl for dlopen() if that is in libdl
# We need -lpthread for the pthread example
LIBS = -lcurl -lsocket -lnsl -lssl -lcrypto -ldl
# Link the target with all objects and libraries
$(TARGET) : $(OBJS)
$(CC) -o $(TARGET) $(OBJS) $(LDFLAGS) $(LIBS)
# Compile the source files into object files
ftpget.o : ftpget.c
$(CC) $(CFLAGS) $<

View File

@ -0,0 +1,368 @@
# Makefile.in generated by automake 1.8.3 from Makefile.am.
# @configure_input@
# Copyright (C) 1994, 1995, 1996, 1997, 1998, 1999, 2000, 2001, 2002,
# 2003, 2004 Free Software Foundation, Inc.
# This Makefile.in is free software; the Free Software Foundation
# gives unlimited permission to copy and/or distribute it,
# with or without modifications, as long as this notice is preserved.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY, to the extent permitted by law; without
# even the implied warranty of MERCHANTABILITY or FITNESS FOR A
# PARTICULAR PURPOSE.
@SET_MAKE@
#
# $Id: Makefile.am,v 1.25 2003/10/03 13:46:27 bagder Exp $
#
srcdir = @srcdir@
top_srcdir = @top_srcdir@
VPATH = @srcdir@
pkgdatadir = $(datadir)/@PACKAGE@
pkglibdir = $(libdir)/@PACKAGE@
pkgincludedir = $(includedir)/@PACKAGE@
top_builddir = ../..
am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd
INSTALL = @INSTALL@
install_sh_DATA = $(install_sh) -c -m 644
install_sh_PROGRAM = $(install_sh) -c
install_sh_SCRIPT = $(install_sh) -c
INSTALL_HEADER = $(INSTALL_DATA)
transform = $(program_transform_name)
NORMAL_INSTALL = :
PRE_INSTALL = :
POST_INSTALL = :
NORMAL_UNINSTALL = :
PRE_UNINSTALL = :
POST_UNINSTALL = :
host_triplet = @host@
subdir = docs/examples
DIST_COMMON = README $(srcdir)/Makefile.am $(srcdir)/Makefile.in
ACLOCAL_M4 = $(top_srcdir)/aclocal.m4
am__aclocal_m4_deps = $(top_srcdir)/acinclude.m4 \
$(top_srcdir)/configure.ac
am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \
$(ACLOCAL_M4)
mkinstalldirs = $(SHELL) $(top_srcdir)/mkinstalldirs
CONFIG_HEADER = $(top_builddir)/lib/config.h \
$(top_builddir)/src/config.h
CONFIG_CLEAN_FILES =
depcomp =
am__depfiles_maybe =
SOURCES =
DIST_SOURCES =
DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST)
ACLOCAL = @ACLOCAL@
AMDEP_FALSE = @AMDEP_FALSE@
AMDEP_TRUE = @AMDEP_TRUE@
AMTAR = @AMTAR@
AR = @AR@
AS = @AS@
AUTOCONF = @AUTOCONF@
AUTOHEADER = @AUTOHEADER@
AUTOMAKE = @AUTOMAKE@
AWK = @AWK@
CABUNDLE_FALSE = @CABUNDLE_FALSE@
CABUNDLE_TRUE = @CABUNDLE_TRUE@
CC = @CC@
CCDEPMODE = @CCDEPMODE@
CFLAGS = @CFLAGS@
CPP = @CPP@
CPPFLAGS = @CPPFLAGS@
CURL_CA_BUNDLE = @CURL_CA_BUNDLE@
CURL_DISABLE_DICT = @CURL_DISABLE_DICT@
CURL_DISABLE_FILE = @CURL_DISABLE_FILE@
CURL_DISABLE_FTP = @CURL_DISABLE_FTP@
CURL_DISABLE_GOPHER = @CURL_DISABLE_GOPHER@
CURL_DISABLE_HTTP = @CURL_DISABLE_HTTP@
CURL_DISABLE_LDAP = @CURL_DISABLE_LDAP@
CURL_DISABLE_TELNET = @CURL_DISABLE_TELNET@
CXX = @CXX@
CXXCPP = @CXXCPP@
CXXDEPMODE = @CXXDEPMODE@
CXXFLAGS = @CXXFLAGS@
CYGPATH_W = @CYGPATH_W@
DEFS = @DEFS@
DEPDIR = @DEPDIR@
DLLTOOL = @DLLTOOL@
ECHO = @ECHO@
ECHO_C = @ECHO_C@
ECHO_N = @ECHO_N@
ECHO_T = @ECHO_T@
EGREP = @EGREP@
EXEEXT = @EXEEXT@
F77 = @F77@
FFLAGS = @FFLAGS@
HAVE_ARES = @HAVE_ARES@
HAVE_LIBZ = @HAVE_LIBZ@
HAVE_LIBZ_FALSE = @HAVE_LIBZ_FALSE@
HAVE_LIBZ_TRUE = @HAVE_LIBZ_TRUE@
INSTALL_DATA = @INSTALL_DATA@
INSTALL_PROGRAM = @INSTALL_PROGRAM@
INSTALL_SCRIPT = @INSTALL_SCRIPT@
INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@
IPV6_ENABLED = @IPV6_ENABLED@
KRB4_ENABLED = @KRB4_ENABLED@
LDFLAGS = @LDFLAGS@
LIBOBJS = @LIBOBJS@
LIBS = @LIBS@
LIBTOOL = @LIBTOOL@
LN_S = @LN_S@
LTLIBOBJS = @LTLIBOBJS@
MAINT = @MAINT@
MAINTAINER_MODE_FALSE = @MAINTAINER_MODE_FALSE@
MAINTAINER_MODE_TRUE = @MAINTAINER_MODE_TRUE@
MAKEINFO = @MAKEINFO@
MANOPT = @MANOPT@
MIMPURE_FALSE = @MIMPURE_FALSE@
MIMPURE_TRUE = @MIMPURE_TRUE@
NO_UNDEFINED_FALSE = @NO_UNDEFINED_FALSE@
NO_UNDEFINED_TRUE = @NO_UNDEFINED_TRUE@
NROFF = @NROFF@
OBJDUMP = @OBJDUMP@
OBJEXT = @OBJEXT@
OPENSSL_ENABLED = @OPENSSL_ENABLED@
PACKAGE = @PACKAGE@
PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@
PACKAGE_NAME = @PACKAGE_NAME@
PACKAGE_STRING = @PACKAGE_STRING@
PACKAGE_TARNAME = @PACKAGE_TARNAME@
PACKAGE_VERSION = @PACKAGE_VERSION@
PATH_SEPARATOR = @PATH_SEPARATOR@
PERL = @PERL@
PKGADD_NAME = @PKGADD_NAME@
PKGADD_PKG = @PKGADD_PKG@
PKGADD_VENDOR = @PKGADD_VENDOR@
PKGCONFIG = @PKGCONFIG@
RANDOM_FILE = @RANDOM_FILE@
RANLIB = @RANLIB@
SED = @SED@
SET_MAKE = @SET_MAKE@
SHELL = @SHELL@
STRIP = @STRIP@
USE_MANUAL_FALSE = @USE_MANUAL_FALSE@
USE_MANUAL_TRUE = @USE_MANUAL_TRUE@
VERSION = @VERSION@
VERSIONNUM = @VERSIONNUM@
YACC = @YACC@
ac_ct_AR = @ac_ct_AR@
ac_ct_AS = @ac_ct_AS@
ac_ct_CC = @ac_ct_CC@
ac_ct_CXX = @ac_ct_CXX@
ac_ct_DLLTOOL = @ac_ct_DLLTOOL@
ac_ct_F77 = @ac_ct_F77@
ac_ct_OBJDUMP = @ac_ct_OBJDUMP@
ac_ct_RANLIB = @ac_ct_RANLIB@
ac_ct_STRIP = @ac_ct_STRIP@
am__fastdepCC_FALSE = @am__fastdepCC_FALSE@
am__fastdepCC_TRUE = @am__fastdepCC_TRUE@
am__fastdepCXX_FALSE = @am__fastdepCXX_FALSE@
am__fastdepCXX_TRUE = @am__fastdepCXX_TRUE@
am__include = @am__include@
am__leading_dot = @am__leading_dot@
am__quote = @am__quote@
bindir = @bindir@
build = @build@
build_alias = @build_alias@
build_cpu = @build_cpu@
build_os = @build_os@
build_vendor = @build_vendor@
datadir = @datadir@
exec_prefix = @exec_prefix@
host = @host@
host_alias = @host_alias@
host_cpu = @host_cpu@
host_os = @host_os@
host_vendor = @host_vendor@
includedir = @includedir@
infodir = @infodir@
install_sh = @install_sh@
libdir = @libdir@
libexecdir = @libexecdir@
localstatedir = @localstatedir@
mandir = @mandir@
mkdir_p = @mkdir_p@
oldincludedir = @oldincludedir@
prefix = @prefix@
program_transform_name = @program_transform_name@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
subdirs = @subdirs@
sysconfdir = @sysconfdir@
target_alias = @target_alias@
AUTOMAKE_OPTIONS = foreign no-dependencies
EXTRA_DIST = README curlgtk.c sepheaders.c simple.c postit2.c \
persistant.c ftpget.c Makefile.example \
multithread.c getinmemory.c ftpupload.c httpput.c \
simplessl.c ftpgetresp.c http-post.c post-callback.c \
multi-app.c multi-double.c multi-single.c multi-post.c \
fopen.c simplepost.c makefile.dj curlx.c
all: all-am
.SUFFIXES:
$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps)
@for dep in $?; do \
case '$(am__configure_deps)' in \
*$$dep*) \
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh \
&& exit 0; \
exit 1;; \
esac; \
done; \
echo ' cd $(top_srcdir) && $(AUTOMAKE) --foreign docs/examples/Makefile'; \
cd $(top_srcdir) && \
$(AUTOMAKE) --foreign docs/examples/Makefile
.PRECIOUS: Makefile
Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status
@case '$?' in \
*config.status*) \
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \
*) \
echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \
cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \
esac;
$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES)
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps)
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps)
cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
mostlyclean-libtool:
-rm -f *.lo
clean-libtool:
-rm -rf .libs _libs
distclean-libtool:
-rm -f libtool
uninstall-info-am:
tags: TAGS
TAGS:
ctags: CTAGS
CTAGS:
distdir: $(DISTFILES)
@srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`; \
topsrcdirstrip=`echo "$(top_srcdir)" | sed 's|.|.|g'`; \
list='$(DISTFILES)'; for file in $$list; do \
case $$file in \
$(srcdir)/*) file=`echo "$$file" | sed "s|^$$srcdirstrip/||"`;; \
$(top_srcdir)/*) file=`echo "$$file" | sed "s|^$$topsrcdirstrip/|$(top_builddir)/|"`;; \
esac; \
if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \
dir=`echo "$$file" | sed -e 's,/[^/]*$$,,'`; \
if test "$$dir" != "$$file" && test "$$dir" != "."; then \
dir="/$$dir"; \
$(mkdir_p) "$(distdir)$$dir"; \
else \
dir=''; \
fi; \
if test -d $$d/$$file; then \
if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \
cp -pR $(srcdir)/$$file $(distdir)$$dir || exit 1; \
fi; \
cp -pR $$d/$$file $(distdir)$$dir || exit 1; \
else \
test -f $(distdir)/$$file \
|| cp -p $$d/$$file $(distdir)/$$file \
|| exit 1; \
fi; \
done
check-am: all-am
check: check-am
all-am: Makefile
installdirs:
install: install-am
install-exec: install-exec-am
install-data: install-data-am
uninstall: uninstall-am
install-am: all-am
@$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am
installcheck: installcheck-am
install-strip:
$(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
`test -z '$(STRIP)' || \
echo "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'"` install
mostlyclean-generic:
clean-generic:
distclean-generic:
-rm -f $(CONFIG_CLEAN_FILES)
maintainer-clean-generic:
@echo "This command is intended for maintainers to use"
@echo "it deletes files that may require special tools to rebuild."
clean: clean-am
clean-am: clean-generic clean-libtool mostlyclean-am
distclean: distclean-am
-rm -f Makefile
distclean-am: clean-am distclean-generic distclean-libtool
dvi: dvi-am
dvi-am:
html: html-am
info: info-am
info-am:
install-data-am:
install-exec-am:
install-info: install-info-am
install-man:
installcheck-am:
maintainer-clean: maintainer-clean-am
-rm -f Makefile
maintainer-clean-am: distclean-am maintainer-clean-generic
mostlyclean: mostlyclean-am
mostlyclean-am: mostlyclean-generic mostlyclean-libtool
pdf: pdf-am
pdf-am:
ps: ps-am
ps-am:
uninstall-am: uninstall-info-am
.PHONY: all all-am check check-am clean clean-generic clean-libtool \
distclean distclean-generic distclean-libtool distdir dvi \
dvi-am html html-am info info-am install install-am \
install-data install-data-am install-exec install-exec-am \
install-info install-info-am install-man install-strip \
installcheck installcheck-am installdirs maintainer-clean \
maintainer-clean-generic mostlyclean mostlyclean-generic \
mostlyclean-libtool pdf pdf-am ps ps-am uninstall uninstall-am \
uninstall-info-am
all:
@echo "done"
# Tell versions [3.59,3.63) of GNU make to not export all variables.
# Otherwise a system limit (for SysV at least) may be exceeded.
.NOEXPORT:

View File

@ -0,0 +1,25 @@
EXAMPLES
This directory is for libcurl programming examples. They are meant to show
some simple steps on how you can build your own application to take full
advantage of libcurl.
If you end up with other small but still useful example sources, please mail
them for submission in future packages and on the web site.
The Makefile.example is an example makefile that could be used to build these
examples. Just edit the file according to your system and requirements first.
Most examples should build fine using a command line like this:
$ `curl-config --cc --cflags --libs` -o example example.c
Some compilers don't like having the arguments in this order but instead
want you to reorganize them like:
$ `curl-config --cc` -o example example.c `curl-config --cflags --libs`
*PLEASE* do not use the curl.haxx.se site as a test target for your libcurl
applications/experiments. Even if the examples in this directory use that site
as an example URL at some places, it doesn't mean that the URLs work or that
we expect you to actually torture our web site with your tests! Thanks.
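For quick reference, a minimal easy-interface program that builds with the curl-config command lines above might look like the sketch below; it is not one of the files listed in this directory and the URL is only a placeholder. Save it as example.c and compile it with the curl-config invocation shown earlier.

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  CURLcode res;

  curl_global_init(CURL_GLOBAL_DEFAULT);

  curl = curl_easy_init();
  if(curl) {
    /* fetch one URL; with no write callback set, the body goes to stdout */
    curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/");
    res = curl_easy_perform(curl);
    if(res != CURLE_OK)
      fprintf(stderr, "transfer failed, libcurl code %d\n", (int)res);
    curl_easy_cleanup(curl);
  }

  curl_global_cleanup();
  return 0;
}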

View File

@ -0,0 +1,106 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: curlgtk.c,v 1.4 2004/02/09 07:12:33 bagder Exp $
*/
/* Copyright (c) 2000 David Odin (aka DindinX) for MandrakeSoft */
/* an attempt to use the curl library in concert with a gtk-threaded application */
#include <stdio.h>
#include <gtk/gtk.h>
#include <curl/curl.h>
#include <curl/types.h> /* new for v7 */
#include <curl/easy.h> /* new for v7 */
GtkWidget *Bar;
size_t my_write_func(void *ptr, size_t size, size_t nmemb, FILE *stream)
{
return fwrite(ptr, size, nmemb, stream);
}
size_t my_read_func(void *ptr, size_t size, size_t nmemb, FILE *stream)
{
return fread(ptr, size, nmemb, stream);
}
int my_progress_func(GtkWidget *Bar,
double t, /* dltotal */
double d, /* dlnow */
double ultotal,
double ulnow)
{
/* printf("%d / %d (%g %%)\n", d, t, d*100.0/t);*/
gdk_threads_enter();
gtk_progress_set_value(GTK_PROGRESS(Bar), d*100.0/t);
gdk_threads_leave();
return 0;
}
void *curl_thread(void *ptr)
{
CURL *curl;
CURLcode res;
FILE *outfile;
gchar *url = ptr;
curl = curl_easy_init();
if(curl)
{
outfile = fopen("test.curl", "w");
curl_easy_setopt(curl, CURLOPT_URL, url);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, outfile);
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, my_write_func);
curl_easy_setopt(curl, CURLOPT_READFUNCTION, my_read_func);
curl_easy_setopt(curl, CURLOPT_NOPROGRESS, FALSE);
curl_easy_setopt(curl, CURLOPT_PROGRESSFUNCTION, my_progress_func);
curl_easy_setopt(curl, CURLOPT_PROGRESSDATA, Bar);
res = curl_easy_perform(curl);
fclose(outfile);
/* always cleanup */
curl_easy_cleanup(curl);
}
return NULL;
}
int main(int argc, char **argv)
{
GtkWidget *Window, *Frame, *Frame2;
GtkAdjustment *adj;
/* Init thread */
g_thread_init(NULL);
gtk_init(&argc, &argv);
Window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
Frame = gtk_frame_new(NULL);
gtk_frame_set_shadow_type(GTK_FRAME(Frame), GTK_SHADOW_OUT);
gtk_container_add(GTK_CONTAINER(Window), Frame);
Frame2 = gtk_frame_new(NULL);
gtk_frame_set_shadow_type(GTK_FRAME(Frame2), GTK_SHADOW_IN);
gtk_container_add(GTK_CONTAINER(Frame), Frame2);
gtk_container_set_border_width(GTK_CONTAINER(Frame2), 5);
adj = (GtkAdjustment*)gtk_adjustment_new(0, 0, 100, 0, 0, 0);
Bar = gtk_progress_bar_new_with_adjustment(adj);
gtk_container_add(GTK_CONTAINER(Frame2), Bar);
gtk_widget_show_all(Window);
if (g_thread_create(&curl_thread, argv[1], FALSE, NULL) == NULL)
g_warning("can't create the thread");
gdk_threads_enter();
gtk_main();
gdk_threads_leave();
return 0;
}
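The same progress-callback machinery can be exercised without any GTK involvement. The sketch below is hypothetical rather than part of the original file; the URL and output file name are placeholders, and it uses the same older, double-based CURLOPT_PROGRESSFUNCTION interface that curlgtk.c relies on.

#include <stdio.h>
#include <curl/curl.h>

/* same prototype as my_progress_func() above, minus the GUI work */
static int plain_progress(void *clientp, double dltotal, double dlnow,
                          double ultotal, double ulnow)
{
  (void)clientp;
  (void)ultotal;
  (void)ulnow;
  if(dltotal > 0.0)
    fprintf(stderr, "downloaded %.0f of %.0f bytes\r", dlnow, dltotal);
  return 0; /* returning non-zero would abort the transfer */
}

int main(void)
{
  CURL *curl;
  FILE *out = fopen("progress-test.dat", "wb");

  if(!out)
    return 1;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/");
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);  /* default fwrite() */
    curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 0L);  /* enable the meter */
    curl_easy_setopt(curl, CURLOPT_PROGRESSFUNCTION, plain_progress);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  fclose(out);
  curl_global_cleanup();
  return 0;
}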

View File

@ -0,0 +1,480 @@
/*
curlx.c Authors: Peter Sylvester, Jean-Paul Merlin
This is a little program to demonstrate the usage of
- an ssl initialisation callback setting a user key and trustbases
coming from a pkcs12 file
- using an ssl application callback to find a URI in the
certificate presented during ssl session establishment.
*/
/*
* Copyright (c) 2003 The OpenEvidence Project. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
*
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions, the following disclaimer,
* and the original OpenSSL and SSLeay Licences below.
*
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions, the following disclaimer
* and the original OpenSSL and SSLeay Licences below in
* the documentation and/or other materials provided with the
* distribution.
*
* 3. All advertising materials mentioning features or use of this
* software must display the following acknowledgments:
* "This product includes software developed by the Openevidence Project
* for use in the OpenEvidence Toolkit. (http://www.openevidence.org/)"
* This product includes software developed by the OpenSSL Project
* for use in the OpenSSL Toolkit (http://www.openssl.org/)"
* This product includes cryptographic software written by Eric Young
* (eay@cryptsoft.com). This product includes software written by Tim
* Hudson (tjh@cryptsoft.com)."
*
* 4. The names "OpenEvidence Toolkit" and "OpenEvidence Project" must not be
* used to endorse or promote products derived from this software without
* prior written permission. For written permission, please contact
* openevidence-core@openevidence.org.
*
* 5. Products derived from this software may not be called "OpenEvidence"
* nor may "OpenEvidence" appear in their names without prior written
* permission of the OpenEvidence Project.
*
* 6. Redistributions of any form whatsoever must retain the following
* acknowledgments:
* "This product includes software developed by the OpenEvidence Project
* for use in the OpenEvidence Toolkit (http://www.openevidence.org/)
* This product includes software developed by the OpenSSL Project
* for use in the OpenSSL Toolkit (http://www.openssl.org/)"
* This product includes cryptographic software written by Eric Young
* (eay@cryptsoft.com). This product includes software written by Tim
* Hudson (tjh@cryptsoft.com)."
*
* THIS SOFTWARE IS PROVIDED BY THE OpenEvidence PROJECT ``AS IS'' AND ANY
* EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenEvidence PROJECT OR
* ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
* OF THE POSSIBILITY OF SUCH DAMAGE.
* ====================================================================
*
* This product includes software developed by the OpenSSL Project
* for use in the OpenSSL Toolkit (http://www.openssl.org/)
* This product includes cryptographic software written by Eric Young
* (eay@cryptsoft.com). This product includes software written by Tim
* Hudson (tjh@cryptsoft.com).
*
*/
#include <stdio.h>
#include <stdlib.h>
#include <curl/curl.h>
#include <openssl/x509v3.h>
#include <openssl/x509_vfy.h>
#include <openssl/crypto.h>
#include <openssl/lhash.h>
#include <openssl/objects.h>
#include <openssl/err.h>
#include <openssl/evp.h>
#include <openssl/x509.h>
#include <openssl/pkcs12.h>
#include <openssl/bio.h>
#include <openssl/ssl.h>
static char *curlx_usage[]={
"usage: curlx args\n",
" -p12 arg - tia file ",
" -envpass arg - environement variable which content the tia private key password",
" -out arg - output file (response)- default stdout",
" -in arg - input file (request)- default stdin",
" -connect arg - URL of the server for the connection ex: www.openevidenve.org",
" -mimetype arg - MIME type for data in ex : application/timestamp-query or application/dvcs -default application/timestamp-query",
" -acceptmime arg - MIME type acceptable for the response ex : application/timestamp-response or application/dvcs -default none",
" -accesstype arg - an Object identifier in an AIA/SIA method, e.g. AD_DVCS or ad_timestamping",
NULL
};
/*
./curlx -p12 psy.p12 -envpass XX -in request -verbose -accesstype AD_DVCS
-mimetype application/dvcs -acceptmime application/dvcs -out response
*/
/* This is a context that we pass to all callbacks */
typedef struct sslctxparm_st {
unsigned char * p12file ;
const char * pst ;
PKCS12 * p12 ;
EVP_PKEY * pkey ;
X509 * usercert ;
STACK_OF(X509) * ca ;
CURL * curl;
BIO * errorbio;
int accesstype ;
int verbose;
} sslctxparm;
/* some helper function. */
static char *i2s_ASN1_IA5STRING( ASN1_IA5STRING *ia5)
{
char *tmp;
if(!ia5 || !ia5->length) return NULL;
tmp = OPENSSL_malloc(ia5->length + 1);
memcpy(tmp, ia5->data, ia5->length);
tmp[ia5->length] = 0;
return tmp;
}
/* A convenience routine to get an access URI. */
static unsigned char *my_get_ext(X509 * cert, const int type, int extensiontype) {
int i;
STACK_OF(ACCESS_DESCRIPTION) * accessinfo ;
accessinfo = X509_get_ext_d2i(cert, extensiontype, NULL, NULL) ;
if (!sk_ACCESS_DESCRIPTION_num(accessinfo)) return NULL;
for (i = 0; i < sk_ACCESS_DESCRIPTION_num(accessinfo); i++) {
ACCESS_DESCRIPTION * ad = sk_ACCESS_DESCRIPTION_value(accessinfo, i);
if (OBJ_obj2nid(ad->method) == type) {
if (ad->location->type == GEN_URI) {
return i2s_ASN1_IA5STRING(ad->location->d.ia5);
}
return NULL;
}
}
return NULL;
}
/* This is an application verification callback; it does not
perform any additional verification but tries to find a URL
in the presented certificate. If found, this will become
the URL to be used in the POST.
*/
static int ssl_app_verify_callback(X509_STORE_CTX *ctx, void *arg) {
sslctxparm * p = (sslctxparm *) arg;
int ok;
if (p->verbose > 2) BIO_printf(p->errorbio,"entering ssl_app_verify_callback\n");
if ((ok= X509_verify_cert(ctx)) && ctx->cert) {
unsigned char * accessinfo ;
if (p->verbose > 1) X509_print_ex(p->errorbio,ctx->cert,0,0);
if (accessinfo = my_get_ext(ctx->cert,p->accesstype ,NID_sinfo_access)) {
if (p->verbose) BIO_printf(p->errorbio,"Setting URL from SIA to: %s\n",accessinfo);
curl_easy_setopt(p->curl, CURLOPT_URL,accessinfo);
} else if (accessinfo = my_get_ext(ctx->cert,p->accesstype ,NID_info_access)) {
if (p->verbose) BIO_printf(p->errorbio,"Setting URL from AIA to: %s\n",accessinfo);
curl_easy_setopt(p->curl, CURLOPT_URL,accessinfo);
}
}
if (p->verbose > 2) BIO_printf(p->errorbio,"leaving ssl_app_verify_callback with %d\n",ok);
return(ok);
}
/* This is an example of a curl SSL initialisation callback. The callback sets:
- a private key and certificate
- a trusted ca certificate
- a preferred cipherlist
- an application verification callback (the function above)
*/
static CURLcode sslctxfun(CURL * curl, void * sslctx, void * parm) {
sslctxparm * p = (sslctxparm *) parm;
SSL_CTX * ctx = (SSL_CTX *) sslctx ;
if (!SSL_CTX_use_certificate(ctx,p->usercert)) {
BIO_printf(p->errorbio, "SSL_CTX_use_certificate problem\n"); goto err;
}
if (!SSL_CTX_use_PrivateKey(ctx,p->pkey)) {
BIO_printf(p->errorbio, "SSL_CTX_use_PrivateKey\n"); goto err;
}
if (!SSL_CTX_check_private_key(ctx)) {
BIO_printf(p->errorbio, "SSL_CTX_check_private_key\n"); goto err;
}
SSL_CTX_set_quiet_shutdown(ctx,1);
SSL_CTX_set_cipher_list(ctx,"RC4-MD5");
SSL_CTX_set_mode(ctx, SSL_MODE_AUTO_RETRY);
X509_STORE_add_cert(ctx->cert_store,sk_X509_value(p->ca,sk_X509_num(p->ca)-1));
SSL_CTX_set_verify_depth(ctx,2);
SSL_CTX_set_verify(ctx,SSL_VERIFY_PEER,NULL);
SSL_CTX_set_cert_verify_callback(ctx, ssl_app_verify_callback, parm);
return CURLE_OK ;
err:
ERR_print_errors(p->errorbio);
return CURLE_SSL_CERTPROBLEM;
}
int main(int argc, char **argv) {
BIO* in=NULL;
BIO* out=NULL;
char * outfile = NULL;
char * infile = NULL ;
int tabLength=100;
char *binaryptr;
char* mimetype;
char* mimetypeaccept=NULL;
char* contenttype;
char** pp;
unsigned char* hostporturl = NULL;
binaryptr=(char*)malloc(tabLength);
BIO * p12bio ;
char **args = argv + 1;
unsigned char * serverurl;
sslctxparm p;
char *response;
p.verbose = 0;
CURLcode res;
struct curl_slist * headers=NULL;
p.errorbio = BIO_new_fp (stderr, BIO_NOCLOSE);
curl_global_init(CURL_GLOBAL_DEFAULT);
/* we need some more for the P12 decoding */
OpenSSL_add_all_ciphers();
OpenSSL_add_all_digests();
ERR_load_crypto_strings();
int badarg=0;
while (*args && *args[0] == '-') {
if (!strcmp (*args, "-in")) {
if (args[1]) {
infile=*(++args);
} else badarg=1;
} else if (!strcmp (*args, "-out")) {
if (args[1]) {
outfile=*(++args);
} else badarg=1;
} else if (!strcmp (*args, "-p12")) {
if (args[1]) {
p.p12file = *(++args);
} else badarg=1;
} else if (strcmp(*args,"-envpass") == 0) {
if (args[1]) {
p.pst = getenv(*(++args));
} else badarg=1;
} else if (strcmp(*args,"-connect") == 0) {
if (args[1]) {
hostporturl = *(++args);
} else badarg=1;
} else if (strcmp(*args,"-mimetype") == 0) {
if (args[1]) {
mimetype = *(++args);
} else badarg=1;
} else if (strcmp(*args,"-acceptmime") == 0) {
if (args[1]) {
mimetypeaccept = *(++args);
} else badarg=1;
} else if (strcmp(*args,"-accesstype") == 0) {
if (args[1]) {
if ((p.accesstype = OBJ_obj2nid(OBJ_txt2obj(*++args,0))) == 0) badarg=1;
} else badarg=1;
} else if (strcmp(*args,"-verbose") == 0) {
p.verbose++;
} else badarg=1;
args++;
}
if (mimetype==NULL || mimetypeaccept == NULL) badarg = 1;
if (badarg) {
for (pp=curlx_usage; (*pp != NULL); pp++)
BIO_printf(p.errorbio,"%s\n",*pp);
BIO_printf(p.errorbio,"\n");
goto err;
}
/* set input */
if ((in=BIO_new(BIO_s_file())) == NULL) {
BIO_printf(p.errorbio, "Error setting input bio\n");
goto err;
} else if (infile == NULL)
BIO_set_fp(in,stdin,BIO_NOCLOSE|BIO_FP_TEXT);
else if (BIO_read_filename(in,infile) <= 0) {
BIO_printf(p.errorbio, "Error opening input file %s\n", infile);
BIO_free(in);
goto err;
}
/* set output */
if ((out=BIO_new(BIO_s_file())) == NULL) {
BIO_printf(p.errorbio, "Error setting output bio.\n");
goto err;
} else if (outfile == NULL)
BIO_set_fp(out,stdout,BIO_NOCLOSE|BIO_FP_TEXT);
else if (BIO_write_filename(out,outfile) <= 0) {
BIO_printf(p.errorbio, "Error opening output file %s\n", outfile);
BIO_free(out);
goto err;
}
p.errorbio = BIO_new_fp (stderr, BIO_NOCLOSE);
if (!(p.curl = curl_easy_init())) {
BIO_printf(p.errorbio, "Cannot init curl lib\n");
goto err;
}
if (!(p12bio = BIO_new_file(p.p12file , "rb"))) {
BIO_printf(p.errorbio, "Error opening P12 file %s\n", p.p12file); goto err;
}
if (!(p.p12 = d2i_PKCS12_bio (p12bio, NULL))) {
BIO_printf(p.errorbio, "Cannot decode P12 structure %s\n", p.p12file); goto err;
}
p.ca= NULL;
if (!(PKCS12_parse (p.p12, p.pst, &(p.pkey), &(p.usercert), &(p.ca) ) )) {
BIO_printf(p.errorbio,"Invalid P12 structure in %s\n", p.p12file); goto err;
}
if (sk_X509_num(p.ca) <= 0) {
BIO_printf(p.errorbio,"No trustworthy CA given.%s\n", p.p12file); goto err;
}
if (p.verbose > 1) X509_print_ex(p.errorbio,p.usercert,0,0);
/* determine URL to go */
if (hostporturl) {
serverurl=(char*) malloc(9+strlen(hostporturl));
sprintf(serverurl,"https://%s",hostporturl);
} else if (p.accesstype != 0) { /* see whether we can find an AIA or SIA for a given access type */
if (!(serverurl = my_get_ext(p.usercert,p.accesstype,NID_info_access))) {
BIO_printf(p.errorbio,"no service URL in user cert cherching in others certificats\n");
int j=0;
int find=0;
for (j=0;j<sk_X509_num(p.ca);j++) {
if ((serverurl = my_get_ext(sk_X509_value(p.ca,j),p.accesstype,NID_info_access))) break;
if ((serverurl = my_get_ext(sk_X509_value(p.ca,j),p.accesstype,NID_sinfo_access))) break;
}
}
}
if (!serverurl) {
BIO_printf(p.errorbio, "no service URL in certificats, check '-accesstype (AD_DVCS | ad_timestamping)' or use '-connect'\n"); goto err;
}
if (p.verbose) BIO_printf(p.errorbio, "Service URL: <%s>\n", serverurl);
curl_easy_setopt(p.curl, CURLOPT_URL, serverurl);
/* Now specify the POST binary data */
curl_easy_setopt(p.curl, CURLOPT_POSTFIELDS, binaryptr);
curl_easy_setopt(p.curl, CURLOPT_POSTFIELDSIZE,tabLength);
/* pass our list of custom made headers */
contenttype=(char*) malloc(15+strlen(mimetype));
sprintf(contenttype,"Content-type: %s",mimetype);
headers = curl_slist_append(headers,contenttype);
curl_easy_setopt(p.curl, CURLOPT_HTTPHEADER, headers);
if (p.verbose) BIO_printf(p.errorbio, "Service URL: <%s>\n", serverurl);
{
FILE *outfp;
BIO_get_fp(out,&outfp);
curl_easy_setopt(p.curl, CURLOPT_FILE,outfp);
}
res = curl_easy_setopt(p.curl, CURLOPT_SSL_CTX_FUNCTION, sslctxfun) ;
if (res != CURLE_OK)
BIO_printf(p.errorbio,"%d %s=%d %d\n", __LINE__, "CURLOPT_SSL_CTX_FUNCTION",CURLOPT_SSL_CTX_FUNCTION,res);
curl_easy_setopt(p.curl, CURLOPT_SSL_CTX_DATA, &p);
{
int lu; int i=0;
while ((lu = BIO_read (in,&binaryptr[i],tabLength-i)) >0 ) {
i+=lu;
if (i== tabLength) {
tabLength+=100;
binaryptr=(char*)realloc(binaryptr,tabLength); /* should be more careful */
}
}
tabLength = i;
}
/* Now specify the POST binary data */
curl_easy_setopt(p.curl, CURLOPT_POSTFIELDS, binaryptr);
curl_easy_setopt(p.curl, CURLOPT_POSTFIELDSIZE,tabLength);
/* Perform the request, res will get the return code */
BIO_printf(p.errorbio,"%d %s %d\n", __LINE__, "curl_easy_perform", res = curl_easy_perform(p.curl));
{
int result =curl_easy_getinfo(p.curl,CURLINFO_CONTENT_TYPE,&response);
if( mimetypeaccept && p.verbose)
if(!strcmp(mimetypeaccept,response))
BIO_printf(p.errorbio,"the response has a correct mimetype : %s\n",response);
else
BIO_printf(p.errorbio,"the reponse doesn\'t has an acceptable mime type, it is %s instead of %s\n",response,mimetypeaccept);
}
/*** error code if the accept mime type does not match; also handle an HTTP return code != 200 ***/
/* free the header list*/
curl_slist_free_all(headers);
/* always cleanup */
curl_easy_cleanup(p.curl);
BIO_free(in);
BIO_free(out);
return (EXIT_SUCCESS);
err: BIO_printf(p.errorbio,"error");
exit(1);
}

View File

@ -0,0 +1,560 @@
/*****************************************************************************
*
* This example source code introduces a C library buffered I/O interface to
* URL reads. It supports fopen(), fread(), fgets(), feof(), fclose() and
* rewind(). The supported functions have identical prototypes to their normal
* C lib namesakes and are preceded by url_ .
*
* Using this code you can replace your program's fopen() with url_fopen()
* and fread() with url_fread(), and it becomes possible to read remote streams
* instead of (only) local files. Local files (i.e. those that can be directly
* fopened) will drop back to using the underlying C lib implementations.
*
* See the main() function at the bottom; it shows an app that retrieves from a
* specified URL using fgets() and fread() and saves the result as two output
* files.
*
* Copyright (c)2003 Simtec Electronics
*
* Re-implemented by Vincent Sanders <vince@kyllikki.org> with extensive
* reference to original curl example code
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. The name of the author may not be used to endorse or promote products
* derived from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
* IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
* OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
* IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT,
* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
* DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
* THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
* THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*
* This example requires libcurl 7.9.7 or later.
*/
#include <stdio.h>
#include <string.h>
#include <sys/time.h>
#include <stdlib.h>
#include <errno.h>
#include <curl/curl.h>
enum fcurl_type_e { CFTYPE_NONE=0, CFTYPE_FILE=1, CFTYPE_CURL=2 };
struct fcurl_data
{
enum fcurl_type_e type; /* type of handle */
union {
CURL *curl;
FILE *file;
} handle; /* handle */
char *buffer; /* buffer to store cached data*/
int buffer_len; /* currently allocated buffers length */
int buffer_pos; /* end of data in buffer*/
int still_running; /* Is background url fetch still in progress */
};
typedef struct fcurl_data URL_FILE;
/* exported functions */
URL_FILE *url_fopen(char *url,const char *operation);
int url_fclose(URL_FILE *file);
int url_feof(URL_FILE *file);
size_t url_fread(void *ptr, size_t size, size_t nmemb, URL_FILE *file);
char * url_fgets(char *ptr, int size, URL_FILE *file);
void url_rewind(URL_FILE *file);
/* we use a global one for convenience */
CURLM *multi_handle;
/* curl calls this routine to get more data */
static size_t
write_callback(char *buffer,
size_t size,
size_t nitems,
void *userp)
{
char *newbuff;
int rembuff;
URL_FILE *url = (URL_FILE *)userp;
size *= nitems;
rembuff=url->buffer_len - url->buffer_pos;//remaining space in buffer
if(size > rembuff)
{
//not enough space in buffer
newbuff=realloc(url->buffer,url->buffer_len + (size - rembuff));
if(newbuff==NULL)
{
fprintf(stderr,"callback buffer grow failed\n");
size=rembuff;
}
else
{
/* realloc succeeded, increase buffer size */
url->buffer_len+=size - rembuff;
url->buffer=newbuff;
/*printf("Callback buffer grown to %d bytes\n",url->buffer_len);*/
}
}
memcpy(&url->buffer[url->buffer_pos], buffer, size);
url->buffer_pos += size;
/*fprintf(stderr, "callback %d size bytes\n", size);*/
return size;
}
/* use to attempt to fill the read buffer up to requested number of bytes */
static int
curl_fill_buffer(URL_FILE *file,int want,int waittime)
{
fd_set fdread;
fd_set fdwrite;
fd_set fdexcep;
int maxfd;
struct timeval timeout;
int rc;
/* only attempt to fill buffer if transactions still running and buffer
* doesn't exceed the required size already
*/
if((!file->still_running) || (file->buffer_pos > want))
return 0;
/* attempt to fill buffer */
do
{
FD_ZERO(&fdread);
FD_ZERO(&fdwrite);
FD_ZERO(&fdexcep);
/* set a suitable timeout to fail on */
timeout.tv_sec = 60; /* 1 minute */
timeout.tv_usec = 0;
/* get file descriptors from the transfers */
curl_multi_fdset(multi_handle, &fdread, &fdwrite, &fdexcep, &maxfd);
rc = select(maxfd+1, &fdread, &fdwrite, &fdexcep, &timeout);
switch(rc) {
case -1:
/* select error */
break;
case 0:
break;
default:
/* timeout or readable/writable sockets */
/* note we *could* be more efficient and not wait for
* CURLM_CALL_MULTI_PERFORM to clear here and check it on re-entry
* but that gets messy */
while(curl_multi_perform(multi_handle, &file->still_running) ==
CURLM_CALL_MULTI_PERFORM);
break;
}
} while(file->still_running && (file->buffer_pos < want));
return 1;
}
/* used to remove want bytes from the front of a file's buffer */
static int
curl_use_buffer(URL_FILE *file,int want)
{
/* sort out buffer */
if((file->buffer_pos - want) <=0)
{
/* ditch buffer - write will recreate */
if(file->buffer)
free(file->buffer);
file->buffer=NULL;
file->buffer_pos=0;
file->buffer_len=0;
}
else
{
/* move rest down make it available for later */
memmove(file->buffer,
&file->buffer[want],
(file->buffer_pos - want));
file->buffer_pos -= want;
}
return 0;
}
URL_FILE *
url_fopen(char *url,const char *operation)
{
/* this code could check for URLs or types in the 'url' and
basically use the real fopen() for standard files */
URL_FILE *file;
(void)operation;
file = (URL_FILE *)malloc(sizeof(URL_FILE));
if(!file)
return NULL;
memset(file, 0, sizeof(URL_FILE));
if((file->handle.file=fopen(url,operation)))
{
file->type = CFTYPE_FILE; /* marked as a local file */
}
else
{
file->type = CFTYPE_CURL; /* marked as URL */
file->handle.curl = curl_easy_init();
curl_easy_setopt(file->handle.curl, CURLOPT_URL, url);
curl_easy_setopt(file->handle.curl, CURLOPT_WRITEDATA, file);
curl_easy_setopt(file->handle.curl, CURLOPT_VERBOSE, FALSE);
curl_easy_setopt(file->handle.curl, CURLOPT_WRITEFUNCTION, write_callback);
if(!multi_handle)
multi_handle = curl_multi_init();
curl_multi_add_handle(multi_handle, file->handle.curl);
/* lets start the fetch */
while(curl_multi_perform(multi_handle, &file->still_running) ==
CURLM_CALL_MULTI_PERFORM );
if((file->buffer_pos == 0) && (!file->still_running))
{
/* if still_running is 0 now, we should return NULL */
/* make sure the easy handle is not in the multi handle anymore */
curl_multi_remove_handle(multi_handle, file->handle.curl);
/* cleanup */
curl_easy_cleanup(file->handle.curl);
free(file);
file = NULL;
}
}
return file;
}
int
url_fclose(URL_FILE *file)
{
int ret=0;/* default is good return */
switch(file->type)
{
case CFTYPE_FILE:
ret=fclose(file->handle.file); /* passthrough */
break;
case CFTYPE_CURL:
/* make sure the easy handle is not in the multi handle anymore */
curl_multi_remove_handle(multi_handle, file->handle.curl);
/* cleanup */
curl_easy_cleanup(file->handle.curl);
break;
default: /* unknown or unsupported type - oh dear */
ret=EOF;
errno=EBADF;
break;
}
if(file->buffer)
free(file->buffer);/* free any allocated buffer space */
free(file);
return ret;
}
int
url_feof(URL_FILE *file)
{
int ret=0;
switch(file->type)
{
case CFTYPE_FILE:
ret=feof(file->handle.file);
break;
case CFTYPE_CURL:
if((file->buffer_pos == 0) && (!file->still_running))
ret = 1;
break;
default: /* unknown or unsupported type - oh dear */
ret=-1;
errno=EBADF;
break;
}
return ret;
}
size_t
url_fread(void *ptr, size_t size, size_t nmemb, URL_FILE *file)
{
size_t want;
switch(file->type)
{
case CFTYPE_FILE:
want=fread(ptr,size,nmemb,file->handle.file);
break;
case CFTYPE_CURL:
want = nmemb * size;
curl_fill_buffer(file,want,1);
/* check if there's data in the buffer - if not, curl_fill_buffer()
* either errored or EOF */
if(!file->buffer_pos)
return 0;
/* ensure only available data is considered */
if(file->buffer_pos < want)
want = file->buffer_pos;
/* xfer data to caller */
memcpy(ptr, file->buffer, want);
curl_use_buffer(file,want);
want = want / size; /* number of items - nb correct op - checked
* with glibc code*/
/*printf("(fread) return %d bytes %d left\n", want,file->buffer_pos);*/
break;
default: /* unknown or unsupported type - oh dear */
want=0;
errno=EBADF;
break;
}
return want;
}
char *
url_fgets(char *ptr, int size, URL_FILE *file)
{
int want = size - 1;/* always need to leave room for zero termination */
int loop;
switch(file->type)
{
case CFTYPE_FILE:
ptr = fgets(ptr,size,file->handle.file);
break;
case CFTYPE_CURL:
curl_fill_buffer(file,want,1);
/* check if there's data in the buffer - if not, the fill either errored or
* EOF */
if(!file->buffer_pos)
return NULL;
/* ensure only available data is considered */
if(file->buffer_pos < want)
want = file->buffer_pos;
/*buffer contains data */
/* look for newline or eof */
for(loop=0;loop < want;loop++)
{
if(file->buffer[loop] == '\n')
{
want=loop+1;/* include newline */
break;
}
}
/* xfer data to caller */
memcpy(ptr, file->buffer, want);
ptr[want]=0;/* always null terminate */
curl_use_buffer(file,want);
/*printf("(fgets) return %d bytes %d left\n", want,file->buffer_pos);*/
break;
default: /* unknown or unsupported type - oh dear */
ptr=NULL;
errno=EBADF;
break;
}
return ptr;/*success */
}
void
url_rewind(URL_FILE *file)
{
switch(file->type)
{
case CFTYPE_FILE:
rewind(file->handle.file); /* passthrough */
break;
case CFTYPE_CURL:
/* halt transaction */
curl_multi_remove_handle(multi_handle, file->handle.curl);
/* restart */
curl_multi_add_handle(multi_handle, file->handle.curl);
/* ditch buffer - write will recreate - resets stream pos*/
if(file->buffer)
free(file->buffer);
file->buffer=NULL;
file->buffer_pos=0;
file->buffer_len=0;
break;
default: /* unknown or unsupported type - oh dear */
break;
}
}
/* Small main program to retrieve from a URL using fgets and fread, saving the
* output to two test files (note the fgets method will corrupt binary files if
* they contain 0 chars) */
int
main(int argc, char *argv[])
{
URL_FILE *handle;
FILE *outf;
int nread;
char buffer[256];
char *url;
if(argc < 2)
{
url="http://192.168.7.3/testfile";/* default to testurl */
}
else
{
url=argv[1];/* use passed url */
}
/* copy from url line by line with fgets */
outf=fopen("fgets.test","w+");
if(!outf)
{
perror("couldnt open fgets output file\n");
return 1;
}
handle = url_fopen(url, "r");
if(!handle)
{
printf("couldn't url_fopen()\n");
fclose(outf);
return 2;
}
while(!url_feof(handle))
{
url_fgets(buffer,sizeof(buffer),handle);
fwrite(buffer,1,strlen(buffer),outf);
}
url_fclose(handle);
fclose(outf);
/* Copy from url with fread */
outf=fopen("fread.test","w+");
if(!outf)
{
perror("couldnt open fread output file\n");
return 1;
}
handle = url_fopen("testfile", "r");
if(!handle) {
printf("couldn't url_fopen()\n");
fclose(outf);
return 2;
}
do {
nread = url_fread(buffer, 1,sizeof(buffer), handle);
fwrite(buffer,1,nread,outf);
} while(nread);
url_fclose(handle);
fclose(outf);
/* Test rewind */
outf=fopen("rewind.test","w+");
if(!outf)
{
perror("couldnt open fread output file\n");
return 1;
}
handle = url_fopen("testfile", "r");
if(!handle) {
printf("couldn't url_fopen()\n");
fclose(outf);
return 2;
}
nread = url_fread(buffer, 1,sizeof(buffer), handle);
fwrite(buffer,1,nread,outf);
url_rewind(handle);
buffer[0]='\n';
fwrite(buffer,1,1,outf);
nread = url_fread(buffer, 1,sizeof(buffer), handle);
fwrite(buffer,1,nread,outf);
url_fclose(handle);
fclose(outf);
return 0;/* all done */
}

View File

@ -0,0 +1,83 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: ftpget.c,v 1.3 2003/12/08 14:13:19 bagder Exp $
*/
#include <stdio.h>
#include <curl/curl.h>
#include <curl/types.h>
#include <curl/easy.h>
/*
* This is an example showing how to get a single file from an FTP server.
* It delays the actual destination file creation until the first write
* callback so that it won't create an empty file in case the remote file
* doesn't exist or something else fails.
*/
struct FtpFile {
char *filename;
FILE *stream;
};
int my_fwrite(void *buffer, size_t size, size_t nmemb, void *stream)
{
struct FtpFile *out=(struct FtpFile *)stream;
if(out && !out->stream) {
/* open file for writing */
out->stream=fopen(out->filename, "wb");
if(!out->stream)
return -1; /* failure, can't open file to write */
}
return fwrite(buffer, size, nmemb, out->stream);
}
int main(void)
{
CURL *curl;
CURLcode res;
struct FtpFile ftpfile={
"curl.tar.gz", /* name to store the file as if succesful */
NULL
};
curl_global_init(CURL_GLOBAL_DEFAULT);
curl = curl_easy_init();
if(curl) {
/* Get curl 7.9.2 from sunet.se's FTP site: */
curl_easy_setopt(curl, CURLOPT_URL,
"ftp://ftp.sunet.se/pub/www/utilities/curl/curl-7.9.2.tar.gz");
/* Define our callback to get called when there's data to be written */
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, my_fwrite);
/* Set a pointer to our struct to pass to the callback */
curl_easy_setopt(curl, CURLOPT_WRITEDATA, &ftpfile);
/* Switch on full protocol/debug output */
curl_easy_setopt(curl, CURLOPT_VERBOSE, TRUE);
res = curl_easy_perform(curl);
/* always cleanup */
curl_easy_cleanup(curl);
if(CURLE_OK != res) {
/* we failed */
fprintf(stderr, "curl told us %d\n", res);
}
}
if(ftpfile.stream)
fclose(ftpfile.stream); /* close the local file */
curl_global_cleanup();
return 0;
}
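A closely related variant, not taken from the curl examples themselves, combines the same default write handling with CURLOPT_RESUME_FROM to continue a partially downloaded file. The URL and file name below are placeholders, and the sketch simply assumes a partial file may already exist locally. A resume the server cannot honour surfaces as CURLE_FTP_BAD_DOWNLOAD_RESUME, i.e. exit code 36 in the manual section above.

#include <stdio.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  struct stat st;
  long already = 0;
  FILE *out;

  /* find out how much we already have, if anything */
  if(stat("curl.tar.gz", &st) == 0)
    already = (long)st.st_size;

  out = fopen("curl.tar.gz", "ab"); /* append to the partial file */
  if(!out)
    return 1;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL,
                     "ftp://ftp.example.com/pub/curl-7.9.2.tar.gz");
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);       /* default fwrite() */
    curl_easy_setopt(curl, CURLOPT_RESUME_FROM, already); /* skip what we have */
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  fclose(out);
  curl_global_cleanup();
  return 0;
}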

View File

@ -0,0 +1,61 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: ftpgetresp.c,v 1.2 2003/12/08 14:13:19 bagder Exp $
*/
#include <stdio.h>
#include <curl/curl.h>
#include <curl/types.h>
#include <curl/easy.h>
/*
* Similar to ftpget.c but this also stores the received response-lines
* in a separate file using our own callback!
*
* This functionality was introduced in libcurl 7.9.3.
*/
size_t
write_response(void *ptr, size_t size, size_t nmemb, void *data)
{
FILE *writehere = (FILE *)data;
return fwrite(ptr, size, nmemb, writehere);
}
int main(int argc, char **argv)
{
CURL *curl;
CURLcode res;
FILE *ftpfile;
FILE *respfile;
/* local file name to store the file as */
ftpfile = fopen("ftp-list", "wb"); /* b is binary, needed on win32 */
/* local file name to store the FTP server's response lines in */
respfile = fopen("ftp-responses", "wb"); /* b is binary, needed on win32 */
curl = curl_easy_init();
if(curl) {
/* Get a file listing from sunet */
curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.sunet.se/");
curl_easy_setopt(curl, CURLOPT_WRITEDATA, ftpfile);
curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, write_response);
curl_easy_setopt(curl, CURLOPT_WRITEHEADER, respfile);
res = curl_easy_perform(curl);
/* always cleanup */
curl_easy_cleanup(curl);
}
fclose(ftpfile); /* close the local file */
fclose(respfile); /* close the response file */
return 0;
}

View File

@ -0,0 +1,91 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: ftpupload.c,v 1.4 2004/01/05 22:29:30 bagder Exp $
*/
#include <stdio.h>
#include <curl/curl.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
/*
* This example shows an FTP upload, with a rename of the file just after
* a successful upload.
*
* Example based on source code provided by Erick Nuwendam. Thanks!
*/
#define LOCAL_FILE "/tmp/uploadthis.txt"
#define UPLOAD_FILE_AS "while-uploading.txt"
#define REMOTE_URL "ftp://localhost/" UPLOAD_FILE_AS
#define RENAME_FILE_TO "renamed-and-fine.txt"
int main(int argc, char **argv)
{
CURL *curl;
CURLcode res;
FILE *ftpfile;
FILE * hd_src ;
int hd ;
struct stat file_info;
struct curl_slist *headerlist=NULL;
char buf_1 [] = "RNFR " UPLOAD_FILE_AS;
char buf_2 [] = "RNTO " RENAME_FILE_TO;
/* get the file size of the local file */
hd = open(LOCAL_FILE, O_RDONLY) ;
fstat(hd, &file_info);
close(hd) ;
/* get a FILE * of the same file, could also be made with
fdopen() from the previous descriptor, but hey this is just
an example! */
hd_src = fopen(LOCAL_FILE, "rb");
/* In windows, this will init the winsock stuff */
curl_global_init(CURL_GLOBAL_ALL);
/* get a curl handle */
curl = curl_easy_init();
if(curl) {
/* build a list of commands to pass to libcurl */
headerlist = curl_slist_append(headerlist, buf_1);
headerlist = curl_slist_append(headerlist, buf_2);
/* enable uploading */
curl_easy_setopt(curl, CURLOPT_UPLOAD, TRUE) ;
/* specify target */
curl_easy_setopt(curl,CURLOPT_URL, REMOTE_URL);
/* pass in the list of FTP commands to run after the transfer */
curl_easy_setopt(curl, CURLOPT_POSTQUOTE, headerlist);
/* now specify which file to upload */
curl_easy_setopt(curl, CURLOPT_READDATA, hd_src);
/* and give the size of the upload (optional) */
curl_easy_setopt(curl, CURLOPT_INFILESIZE_LARGE, (curl_off_t)file_info.st_size);
/* Now run off and do what you've been told! */
res = curl_easy_perform(curl);
/* clean up the FTP commands list */
curl_slist_free_all (headerlist);
/* always cleanup */
curl_easy_cleanup(curl);
}
fclose(hd_src); /* close the local file */
curl_global_cleanup();
return 0;
}

View File

@ -0,0 +1,83 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: getinmemory.c,v 1.5 2003/12/08 14:13:19 bagder Exp $
*
* Example source code to show how the callback function can be used to
* download data into a chunk of memory instead of storing it in a file.
*
* This exact source code has not been verified to work.
*/
#include <stdio.h>
#include <stdlib.h> /* for realloc() */
#include <string.h> /* for memcpy() */
#include <curl/curl.h>
#include <curl/types.h>
#include <curl/easy.h>
struct MemoryStruct {
char *memory;
size_t size;
};
size_t
WriteMemoryCallback(void *ptr, size_t size, size_t nmemb, void *data)
{
register int realsize = size * nmemb;
struct MemoryStruct *mem = (struct MemoryStruct *)data;
mem->memory = (char *)realloc(mem->memory, mem->size + realsize + 1);
if (mem->memory) {
memcpy(&(mem->memory[mem->size]), ptr, realsize);
mem->size += realsize;
mem->memory[mem->size] = 0;
}
return realsize;
}
int main(int argc, char **argv)
{
CURL *curl_handle;
struct MemoryStruct chunk;
chunk.memory=NULL; /* we expect realloc(NULL, size) to work */
chunk.size = 0; /* no data at this point */
curl_global_init(CURL_GLOBAL_ALL);
/* init the curl session */
curl_handle = curl_easy_init();
/* specify URL to get */
curl_easy_setopt(curl_handle, CURLOPT_URL, "http://cool.haxx.se/");
/* send all data to this function */
curl_easy_setopt(curl_handle, CURLOPT_WRITEFUNCTION, WriteMemoryCallback);
/* we pass our 'chunk' struct to the callback function */
curl_easy_setopt(curl_handle, CURLOPT_WRITEDATA, (void *)&chunk);
/* get it! */
curl_easy_perform(curl_handle);
/* cleanup curl stuff */
curl_easy_cleanup(curl_handle);
/*
* Now, our chunk.memory points to a memory block that is chunk.size
* bytes big and contains the remote file.
*
* Do something nice with it!
*
* You should be aware of the fact that at this point we might have an
* allocated data block, and nothing has yet deallocated that data. So when
* you're done with it, you should free() it as a nice application.
*/
return 0;
}
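
As the closing comment says, chunk.memory is still allocated when the transfer is done. A minimal sketch of the "do something nice" step; the printf output is merely illustrative, and free() relies on the stdlib.h include above:

/* sketch: inspect the downloaded block, then release it */
if(chunk.memory) {
  printf("retrieved %lu bytes\n", (unsigned long)chunk.size);
  free(chunk.memory);  /* pairs with the realloc() in WriteMemoryCallback */
}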

View File

@ -0,0 +1,35 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: http-post.c,v 1.1 2002/01/10 09:00:02 bagder Exp $
*/
#include <stdio.h>
#include <curl/curl.h>
int main(void)
{
CURL *curl;
CURLcode res;
curl = curl_easy_init();
if(curl) {
/* First set the URL that is about to receive our POST. This URL can
just as well be a https:// URL if that is what should receive the
data. */
curl_easy_setopt(curl, CURLOPT_URL, "http://postit.example.com/moo.cgi");
/* Now specify the POST data */
curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "name=daniel&project=curl");
/* Perform the request, res will get the return code */
res = curl_easy_perform(curl);
/* always cleanup */
curl_easy_cleanup(curl);
}
return 0;
}
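
The string handed to CURLOPT_POSTFIELDS above is sent verbatim, so field values containing '&' or '=' must be URL-encoded first. A hedged sketch of one way to do that; curl_easy_escape() and CURLOPT_COPYPOSTFIELDS come from later libcurl releases than this example, and the field value is made up for illustration:

/* sketch: URL-encode a field value before building the POST body */
char *encoded = curl_easy_escape(curl, "Daniel & friends", 0); /* 0 = use strlen() */
if(encoded) {
  char postdata[256];
  snprintf(postdata, sizeof(postdata), "name=%s&project=curl", encoded);
  /* COPYPOSTFIELDS copies the string, so the local buffer may go out of scope */
  curl_easy_setopt(curl, CURLOPT_COPYPOSTFIELDS, postdata);
  curl_free(encoded); /* the encoded copy was allocated by libcurl */
}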

View File

@ -0,0 +1,101 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: httpput.c,v 1.5 2004/01/05 22:29:30 bagder Exp $
*/
#include <stdio.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <curl/curl.h>
/*
* This example shows an HTTP PUT operation: it PUTs a file given as a command
* line argument to the URL that is also given on the command line.
*
* This example also uses its own read callback.
*/
size_t read_callback(void *ptr, size_t size, size_t nmemb, void *stream)
{
size_t retcode;
/* in real-world cases, this would probably get this data differently
as this fread() stuff is exactly what the library already would do
by default internally */
retcode = fread(ptr, size, nmemb, stream);
fprintf(stderr, "*** We read %lu bytes from file\n", (unsigned long)retcode);
return retcode;
}
int main(int argc, char **argv)
{
CURL *curl;
CURLcode res;
FILE *hd_src;
int hd;
struct stat file_info;
char *file;
char *url;
if(argc < 3)
return 1;
file= argv[1];
url = argv[2];
/* get the file size of the local file */
hd = open(file, O_RDONLY) ;
fstat(hd, &file_info);
close(hd) ;
/* get a FILE * of the same file, could also be made with
fdopen() from the previous descriptor, but hey this is just
an example! */
hd_src = fopen(file, "rb");
/* In windows, this will init the winsock stuff */
curl_global_init(CURL_GLOBAL_ALL);
/* get a curl handle */
curl = curl_easy_init();
if(curl) {
/* we want to use our own read function */
curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_callback);
/* enable uploading */
curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
/* HTTP PUT please */
curl_easy_setopt(curl, CURLOPT_PUT, 1L);
/* specify target URL, and note that this URL should include a file
name, not only a directory */
curl_easy_setopt(curl,CURLOPT_URL, url);
/* now specify which file to upload */
curl_easy_setopt(curl, CURLOPT_READDATA, hd_src);
/* and give the size of the upload */
curl_easy_setopt(curl, CURLOPT_INFILESIZE_LARGE, (curl_off_t)file_info.st_size);
/* Now run off and do what you've been told! */
res = curl_easy_perform(curl);
/* always cleanup */
curl_easy_cleanup(curl);
}
fclose(hd_src); /* close the local file */
curl_global_cleanup();
return 0;
}
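
The example assumes that open() and fopen() always succeed on the file named on the command line. A minimal sketch of the guard that could precede the libcurl calls, using the same variable names as above:

/* sketch: bail out early if the local file cannot be read */
hd = open(file, O_RDONLY);
if(hd == -1) {
  fprintf(stderr, "cannot open %s\n", file);
  return 1;
}
fstat(hd, &file_info);
close(hd);

hd_src = fopen(file, "rb");
if(!hd_src) {
  fprintf(stderr, "cannot fopen %s\n", file);
  return 1;
}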

View File

@ -0,0 +1,31 @@
#
# Adapted for djgpp / Watt-32 / DOS by
# Gisle Vanem <giva@bgnett.no>
#
include ../../packages/DOS/common.dj
CFLAGS += -I../../include
LIBS = ../../lib/libcurl.a
ifeq ($(USE_SSL),1)
LIBS += $(OPENSSL_ROOT)/lib/libssl.a $(OPENSSL_ROOT)/lib/libcrypt.a
endif
LIBS += $(WATT32_ROOT)/lib/libwatt.a $(ZLIB_ROOT)/libz.a
PROGRAMS = fopen.exe ftpget.exe ftpgetre.exe ftpuploa.exe getinmem.exe \
http-pos.exe httpput.exe multi-ap.exe multi-do.exe \
multi-po.exe multi-si.exe persista.exe post-cal.exe \
postit2.exe sepheade.exe simple.exe simpless.exe
all: $(PROGRAMS)
.c.exe:
$(CC) $(CFLAGS) -o $@ $^ $(LIBS)
@echo
clean:
rm -f $(PROGRAMS)

View File

@ -0,0 +1,128 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: multi-app.c,v 1.4 2003/08/28 11:21:14 bagder Exp $
*
* This is an example application source code using the multi interface.
*/
#include <stdio.h>
#include <string.h>
/* somewhat unix-specific */
#include <sys/time.h>
#include <unistd.h>
/* curl stuff */
#include <curl/curl.h>
/*
* Download an HTTP file and upload an FTP file simultaneously.
*/
#define HANDLECOUNT 2 /* Number of simultaneous transfers */
#define HTTP_HANDLE 0 /* Index for the HTTP transfer */
#define FTP_HANDLE 1 /* Index for the FTP transfer */
int main(int argc, char **argv)
{
CURL *handles[HANDLECOUNT];
CURLM *multi_handle;
int still_running; /* keep number of running handles */
int i;
CURLMsg *msg; /* for picking up messages with the transfer status */
int msgs_left; /* how many messages are left */
/* Allocate one CURL handle per transfer */
for (i=0; i<HANDLECOUNT; i++)
handles[i] = curl_easy_init();
/* set the options (I left out a few, you'll get the point anyway) */
curl_easy_setopt(handles[HTTP_HANDLE], CURLOPT_URL, "http://website.com");
curl_easy_setopt(handles[FTP_HANDLE], CURLOPT_URL, "ftp://ftpsite.com");
curl_easy_setopt(handles[FTP_HANDLE], CURLOPT_UPLOAD, 1L);
/* init a multi stack */
multi_handle = curl_multi_init();
/* add the individual transfers */
for (i=0; i<HANDLECOUNT; i++)
curl_multi_add_handle(multi_handle, handles[i]);
/* we start some action by calling perform right away */
while(CURLM_CALL_MULTI_PERFORM ==
curl_multi_perform(multi_handle, &still_running));
while(still_running) {
struct timeval timeout;
int rc; /* select() return code */
fd_set fdread;
fd_set fdwrite;
fd_set fdexcep;
int maxfd;
FD_ZERO(&fdread);
FD_ZERO(&fdwrite);
FD_ZERO(&fdexcep);
/* set a suitable timeout to play around with */
timeout.tv_sec = 1;
timeout.tv_usec = 0;
/* get file descriptors from the transfers */
curl_multi_fdset(multi_handle, &fdread, &fdwrite, &fdexcep, &maxfd);
rc = select(maxfd+1, &fdread, &fdwrite, &fdexcep, &timeout);
switch(rc) {
case -1:
/* select error */
break;
case 0:
/* timeout, do something else */
break;
default:
/* one or more of curl's file descriptors say there's data to read
or write */
while(CURLM_CALL_MULTI_PERFORM ==
curl_multi_perform(multi_handle, &still_running));
break;
}
}
/* See how the transfers went; the multi handle must stay alive for this */
while ((msg = curl_multi_info_read(multi_handle, &msgs_left))) {
if (msg->msg == CURLMSG_DONE) {
int idx, found = 0;
/* Find out which handle this message is about */
for (idx=0; idx<HANDLECOUNT; idx++) {
  found = (msg->easy_handle == handles[idx]);
  if(found)
    break;
}
switch (idx) {
case HTTP_HANDLE:
printf("HTTP transfer completed with status %d\n", msg->data.result);
break;
case FTP_HANDLE:
printf("FTP transfer completed with status %d\n", msg->data.result);
break;
}
}
}
/* remove the transfers from the multi stack and free the CURL handles */
for (i=0; i<HANDLECOUNT; i++) {
  curl_multi_remove_handle(multi_handle, handles[i]);
  curl_easy_cleanup(handles[i]);
}
curl_multi_cleanup(multi_handle);
return 0;
}
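
One caveat with the select() loop above: curl_multi_fdset() can leave maxfd at -1 when libcurl has nothing to wait on yet, and calling select() with that would spin. A sketch of the workaround suggested by the curl_multi_fdset documentation, written as a drop-in replacement for the fdset/select pair inside the loop (the 100 ms pause is an arbitrary choice):

/* sketch: if libcurl exposed no descriptors, pause briefly instead of select()ing */
curl_multi_fdset(multi_handle, &fdread, &fdwrite, &fdexcep, &maxfd);
if(maxfd == -1) {
  struct timeval wait = { 0, 100 * 1000 }; /* 100 ms */
  rc = select(0, NULL, NULL, NULL, &wait);
}
else
  rc = select(maxfd+1, &fdread, &fdwrite, &fdexcep, &timeout);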

View File

@ -0,0 +1,95 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: multi-double.c,v 1.3 2002/12/03 12:34:43 bagder Exp $
*
* This is a very simple example using the multi interface.
*/
#include <stdio.h>
#include <string.h>
/* somewhat unix-specific */
#include <sys/time.h>
#include <unistd.h>
/* curl stuff */
#include <curl/curl.h>
/*
* Simply download two HTTP files!
*/
int main(int argc, char **argv)
{
CURL *http_handle;
CURL *http_handle2;
CURLM *multi_handle;
int still_running; /* keep number of running handles */
http_handle = curl_easy_init();
http_handle2 = curl_easy_init();
/* set options */
curl_easy_setopt(http_handle, CURLOPT_URL, "http://www.haxx.se/");
/* set options */
curl_easy_setopt(http_handle2, CURLOPT_URL, "http://localhost/");
/* init a multi stack */
multi_handle = curl_multi_init();
/* add the individual transfers */
curl_multi_add_handle(multi_handle, http_handle);
curl_multi_add_handle(multi_handle, http_handle2);
/* we start some action by calling perform right away */
while(CURLM_CALL_MULTI_PERFORM ==
curl_multi_perform(multi_handle, &still_running));
while(still_running) {
struct timeval timeout;
int rc; /* select() return code */
fd_set fdread;
fd_set fdwrite;
fd_set fdexcep;
int maxfd;
FD_ZERO(&fdread);
FD_ZERO(&fdwrite);
FD_ZERO(&fdexcep);
/* set a suitable timeout to play around with */
timeout.tv_sec = 1;
timeout.tv_usec = 0;
/* get file descriptors from the transfers */
curl_multi_fdset(multi_handle, &fdread, &fdwrite, &fdexcep, &maxfd);
rc = select(maxfd+1, &fdread, &fdwrite, &fdexcep, &timeout);
switch(rc) {
case -1:
/* select error */
break;
case 0:
default:
/* timeout or readable/writable sockets */
while(CURLM_CALL_MULTI_PERFORM ==
curl_multi_perform(multi_handle, &still_running));
break;
}
}
curl_multi_cleanup(multi_handle);
curl_easy_cleanup(http_handle);
curl_easy_cleanup(http_handle2);
return 0;
}
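
The libcurl documentation recommends detaching each easy handle from the multi stack before cleaning it up. A minimal sketch of that teardown order, using the handles from the example above:

/* sketch: remove the easy handles from the multi stack, then clean up */
curl_multi_remove_handle(multi_handle, http_handle);
curl_multi_remove_handle(multi_handle, http_handle2);
curl_easy_cleanup(http_handle);
curl_easy_cleanup(http_handle2);
curl_multi_cleanup(multi_handle);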

View File

@ -0,0 +1,126 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: multi-post.c,v 1.1 2002/05/06 13:38:28 bagder Exp $
*
* This is an example application source code using the multi interface
* to do a multipart formpost without "blocking".
*/
#include <stdio.h>
#include <string.h>
#include <sys/time.h>
#include <curl/curl.h>
int main(int argc, char *argv[])
{
CURL *curl;
CURLcode res;
CURLM *multi_handle;
int still_running;
struct HttpPost *formpost=NULL;
struct HttpPost *lastptr=NULL;
struct curl_slist *headerlist=NULL;
char buf[] = "Expect:";
/* Fill in the file upload field */
curl_formadd(&formpost,
&lastptr,
CURLFORM_COPYNAME, "sendfile",
CURLFORM_FILE, "postit2.c",
CURLFORM_END);
/* Fill in the filename field */
curl_formadd(&formpost,
&lastptr,
CURLFORM_COPYNAME, "filename",
CURLFORM_COPYCONTENTS, "postit2.c",
CURLFORM_END);
/* Fill in the submit field too, even if this is rarely needed */
curl_formadd(&formpost,
&lastptr,
CURLFORM_COPYNAME, "submit",
CURLFORM_COPYCONTENTS, "send",
CURLFORM_END);
curl = curl_easy_init();
multi_handle = curl_multi_init();
/* initialize custom header list (stating that Expect: 100-continue is not
wanted) */
headerlist = curl_slist_append(headerlist, buf);
if(curl && multi_handle) {
int perform=0;
/* which URL receives this POST */
curl_easy_setopt(curl, CURLOPT_URL,
"http://www.fillinyoururl.com/upload.cgi");
curl_easy_setopt(curl, CURLOPT_VERBOSE, 1);
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);
curl_easy_setopt(curl, CURLOPT_HTTPPOST, formpost);
curl_multi_add_handle(multi_handle, curl);
while(CURLM_CALL_MULTI_PERFORM ==
curl_multi_perform(multi_handle, &still_running));
while(still_running) {
struct timeval timeout;
int rc; /* select() return code */
fd_set fdread;
fd_set fdwrite;
fd_set fdexcep;
int maxfd;
FD_ZERO(&fdread);
FD_ZERO(&fdwrite);
FD_ZERO(&fdexcep);
/* set a suitable timeout to play around with */
timeout.tv_sec = 1;
timeout.tv_usec = 0;
/* get file descriptors from the transfers */
curl_multi_fdset(multi_handle, &fdread, &fdwrite, &fdexcep, &maxfd);
rc = select(maxfd+1, &fdread, &fdwrite, &fdexcep, &timeout);
switch(rc) {
case -1:
/* select error */
break;
case 0:
printf("timeout!\n");
default:
/* timeout or readable/writable sockets */
printf("perform!\n");
while(CURLM_CALL_MULTI_PERFORM ==
curl_multi_perform(multi_handle, &still_running));
printf("running: %d!\n", still_running);
break;
}
}
curl_multi_cleanup(multi_handle);
/* always cleanup */
curl_easy_cleanup(curl);
/* then cleanup the formpost chain */
curl_formfree(formpost);
/* free slist */
curl_slist_free_all (headerlist);
}
return 0;
}
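
curl_formadd() reports failures through its CURLFORMcode return value, which the example discards. A sketch of how the file-upload part could be checked; the formrc variable name is only for illustration:

/* sketch: verify that the file part was actually added to the form */
CURLFORMcode formrc = curl_formadd(&formpost, &lastptr,
                                   CURLFORM_COPYNAME, "sendfile",
                                   CURLFORM_FILE, "postit2.c",
                                   CURLFORM_END);
if(formrc != CURL_FORMADD_OK)
  fprintf(stderr, "curl_formadd failed: %d\n", (int)formrc);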

View File

@ -0,0 +1,88 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: multi-single.c,v 1.3 2003/01/09 11:42:07 bagder Exp $
*
* This is a very simple example using the multi interface.
*/
#include <stdio.h>
#include <string.h>
/* somewhat unix-specific */
#include <sys/time.h>
#include <unistd.h>
/* curl stuff */
#include <curl/curl.h>
/*
* Simply download an HTTP file.
*/
int main(int argc, char **argv)
{
CURL *http_handle;
CURLM *multi_handle;
int still_running; /* keep number of running handles */
http_handle = curl_easy_init();
/* set the options (I left out a few, you'll get the point anyway) */
curl_easy_setopt(http_handle, CURLOPT_URL, "http://www.haxx.se/");
/* init a multi stack */
multi_handle = curl_multi_init();
/* add the individual transfers */
curl_multi_add_handle(multi_handle, http_handle);
/* we start some action by calling perform right away */
while(CURLM_CALL_MULTI_PERFORM ==
curl_multi_perform(multi_handle, &still_running));
while(still_running) {
struct timeval timeout;
int rc; /* select() return code */
fd_set fdread;
fd_set fdwrite;
fd_set fdexcep;
int maxfd;
FD_ZERO(&fdread);
FD_ZERO(&fdwrite);
FD_ZERO(&fdexcep);
/* set a suitable timeout to play around with */
timeout.tv_sec = 1;
timeout.tv_usec = 0;
/* get file descriptors from the transfers */
curl_multi_fdset(multi_handle, &fdread, &fdwrite, &fdexcep, &maxfd);
rc = select(maxfd+1, &fdread, &fdwrite, &fdexcep, &timeout);
switch(rc) {
case -1:
/* select error */
break;
case 0:
default:
/* timeout or readable/writable sockets */
while(CURLM_CALL_MULTI_PERFORM ==
curl_multi_perform(multi_handle, &still_running));
break;
}
}
curl_multi_cleanup(multi_handle);
curl_easy_cleanup(http_handle);
return 0;
}
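
Later libcurl releases added curl_multi_wait(), which hides the fdset/select bookkeeping shown above. A hedged sketch of the same download loop written with it; this only works with a libcurl newer than the one this example was written against:

/* sketch: event loop using curl_multi_wait() instead of select() */
while(still_running) {
  int numfds;
  CURLMcode mc = curl_multi_wait(multi_handle, NULL, 0, 1000, &numfds);
  if(mc != CURLM_OK)
    break;
  curl_multi_perform(multi_handle, &still_running);
}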

View File

@ -0,0 +1,70 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: multithread.c,v 1.1 2001/05/04 09:35:43 bagder Exp $
*/
/* A multi-threaded example that uses pthreads extensively to fetch
* X remote files at once */
#include <stdio.h>
#include <pthread.h>
#include <curl/curl.h>
/* silly list of test-URLs */
char *urls[]= {
"http://curl.haxx.se/",
"ftp://cool.haxx.se/",
"http://www.contactor.se/",
"www.haxx.se"
};
void *pull_one_url(void *url)
{
CURL *curl;
curl = curl_easy_init();
curl_easy_setopt(curl, CURLOPT_URL, url);
curl_easy_perform(curl);
curl_easy_cleanup(curl);
return NULL;
}
/*
int pthread_create(pthread_t *new_thread_ID,
const pthread_attr_t *attr,
void * (*start_func)(void *), void *arg);
*/
int main(int argc, char **argv)
{
pthread_t tid[4];
int i;
int error;

/* libcurl must be initialized before any thread starts using it */
curl_global_init(CURL_GLOBAL_ALL);
for(i=0; i< 4; i++) {
error = pthread_create(&tid[i],
NULL, /* default attributes please */
pull_one_url,
urls[i]);
if(0 != error)
fprintf(stderr, "Couldn't run thread number %d, errno %d\n", i, error);
else
fprintf(stderr, "Thread %d, gets %s\n", i, urls[i]);
}
/* now wait for all threads to terminate */
for(i=0; i< 4; i++) {
error = pthread_join(tid[i], NULL);
fprintf(stderr, "Thread %d terminated\n", i);
}
return 0;
}

View File

@ -0,0 +1,41 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: persistant.c,v 1.2 2003/11/19 08:20:13 bagder Exp $
*/
#include <stdio.h>
#include <unistd.h>
#include <curl/curl.h>
int main(int argc, char **argv)
{
CURL *curl;
CURLcode res;
curl_global_init(CURL_GLOBAL_ALL);
curl = curl_easy_init();
if(curl) {
curl_easy_setopt(curl, CURLOPT_VERBOSE, 1);
curl_easy_setopt(curl, CURLOPT_HEADER, 1);
/* get the first document */
curl_easy_setopt(curl, CURLOPT_URL, "http://curl.haxx.se/");
res = curl_easy_perform(curl);
/* get another document from the same server using the same
connection */
curl_easy_setopt(curl, CURLOPT_URL, "http://curl.haxx.se/docs/");
res = curl_easy_perform(curl);
/* always cleanup */
curl_easy_cleanup(curl);
}
return 0;
}
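
To see what the server answered for each of the two requests made over the reused handle, the response code can be queried after every perform. A minimal sketch, reusing res and curl from the example (CURLINFO_RESPONSE_CODE is assumed to be available in the libcurl being used):

/* sketch: report the HTTP status of the document just fetched */
if(res == CURLE_OK) {
  long http_code = 0;
  curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &http_code);
  printf("document fetched, HTTP %ld\n", http_code);
}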

View File

@ -0,0 +1,79 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: post-callback.c,v 1.3 2003/12/08 14:14:26 bagder Exp $
*
* Example source code that issues an HTTP POST and supplies the actual
* data through a read callback.
*
*/
#include <stdio.h>
#include <string.h>
#include <curl/curl.h>
char data[]="this is what we post to the silly web server";
struct WriteThis {
char *readptr;
int sizeleft;
};
size_t read_callback(void *ptr, size_t size, size_t nmemb, void *userp)
{
struct WriteThis *pooh = (struct WriteThis *)userp;
if(size*nmemb < 1)
return 0;
if(pooh->sizeleft) {
*(char *)ptr = pooh->readptr[0]; /* copy one single byte */
pooh->readptr++; /* advance pointer */
pooh->sizeleft--; /* less data left */
return 1; /* we return 1 byte at a time! */
}
return 0; /* no more data left to deliver */
}
int main(void)
{
CURL *curl;
CURLcode res;
struct WriteThis pooh;
pooh.readptr = data;
pooh.sizeleft = strlen(data);
curl = curl_easy_init();
if(curl) {
/* First set the URL that is about to receive our POST. */
curl_easy_setopt(curl, CURLOPT_URL,
"http://receivingsite.com.pooh/index.cgi");
/* Now specify we want to POST data */
curl_easy_setopt(curl, CURLOPT_POST, 1L);
/* Set the expected POST size */
curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)pooh.sizeleft);
/* we want to use our own read function */
curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_callback);
/* pointer to pass to our read function */
curl_easy_setopt(curl, CURLOPT_READDATA, &pooh);
/* get verbose debug output please */
curl_easy_setopt(curl, CURLOPT_VERBOSE, 1);
/* Perform the request, res will get the return code */
res = curl_easy_perform(curl);
/* always cleanup */
curl_easy_cleanup(curl);
}
return 0;
}
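
If the total amount of data is not known when the transfer starts, the CURLOPT_POSTFIELDSIZE call above can be dropped and the POST sent with chunked transfer-encoding instead. A hedged sketch of that variant (it requires an HTTP/1.1 server; the chunk_hdr name is only for illustration):

/* sketch: POST with chunked transfer-encoding instead of a known size */
struct curl_slist *chunk_hdr = NULL;
chunk_hdr = curl_slist_append(chunk_hdr, "Transfer-Encoding: chunked");
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, chunk_hdr);
/* ... set the read callback and perform exactly as in the example ... */
curl_slist_free_all(chunk_hdr);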

View File

@ -0,0 +1,87 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id: postit2.c,v 1.2 2003/11/19 08:21:34 bagder Exp $
*
* Example code that uploads a file named 'foo' to a remote script that accepts
* "HTML form based" (as described in RFC 1867) uploads using HTTP POST.
*
* The imaginary form we'll fill in looks like:
*
* <form method="post" enctype="multipart/form-data" action="examplepost.cgi">
* Enter file: <input type="file" name="sendfile" size="40">
* Enter file name: <input type="text" name="filename" size="30">
* <input type="submit" value="send" name="submit">
* </form>
*
* This exact source code has not been verified to work.
*/
#include <stdio.h>
#include <string.h>
#include <curl/curl.h>
#include <curl/types.h>
#include <curl/easy.h>
int main(int argc, char *argv[])
{
CURL *curl;
CURLcode res;
struct HttpPost *formpost=NULL;
struct HttpPost *lastptr=NULL;
struct curl_slist *headerlist=NULL;
char buf[] = "Expect:";
curl_global_init(CURL_GLOBAL_ALL);
/* Fill in the file upload field */
curl_formadd(&formpost,
&lastptr,
CURLFORM_COPYNAME, "sendfile",
CURLFORM_FILE, "postit2.c",
CURLFORM_END);
/* Fill in the filename field */
curl_formadd(&formpost,
&lastptr,
CURLFORM_COPYNAME, "filename",
CURLFORM_COPYCONTENTS, "postit2.c",
CURLFORM_END);
/* Fill in the submit field too, even if this is rarely needed */
curl_formadd(&formpost,
&lastptr,
CURLFORM_COPYNAME, "submit",
CURLFORM_COPYCONTENTS, "send",
CURLFORM_END);
curl = curl_easy_init();
/* initialize custom header list (stating that Expect: 100-continue is not
wanted) */
headerlist = curl_slist_append(headerlist, buf);
if(curl) {
/* which URL receives this POST */
curl_easy_setopt(curl, CURLOPT_URL, "http://curl.haxx.se/examplepost.cgi");
if ( (argc == 2) && (!strcmp(argv[1], "noexpectheader")) )
/* only disable 100-continue header if explicitly requested */
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);
curl_easy_setopt(curl, CURLOPT_HTTPPOST, formpost);
res = curl_easy_perform(curl);
/* always cleanup */
curl_easy_cleanup(curl);
/* then cleanup the formpost chain */
curl_formfree(formpost);
/* free slist */
curl_slist_free_all (headerlist);
}
return 0;
}
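
The part built with CURLFORM_FILE above reports the local name "postit2.c" as the uploaded file's name. A sketch of overriding that with CURLFORM_FILENAME; the replacement name is made up for illustration:

/* sketch: send the same file but present it under a different name */
curl_formadd(&formpost,
             &lastptr,
             CURLFORM_COPYNAME, "sendfile",
             CURLFORM_FILE, "postit2.c",
             CURLFORM_FILENAME, "renamed-upload.c",
             CURLFORM_END);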

Some files were not shown because too many files have changed in this diff.