Repository: blackrosezy/build-libcurl-windows
Branch: master
Commit: c281bb342096
Files: 7
Total size: 60.5 KB
Directory structure:
gitextract_vj9ywqfs/
├── .gitignore
├── README.md
├── bin/
│ ├── 7-zip/
│ │ └── license.txt
│ ├── unxutils/
│ │ ├── StdDisclaimer.html
│ │ └── UnxUtilsDist.html
│ └── xidel/
│ └── readme.txt
└── build.bat
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
tmp_libcurl
curl.zip
third-party
tmp_url
================================================
FILE: README.md
================================================
Auto download & compile libcurl
-----------
This batch script automatically downloads the latest libcurl source code and builds it with the Visual Studio compiler.
Supported Visual Studio versions are:
* Visual C++ 6 (requires the Windows Server 2003 Platform SDK released in February 2003)
* Visual Studio 2005
* Visual Studio 2008
* Visual Studio 2010
* Visual Studio 2012
* Visual Studio 2013 ([AppVeyor CI](https://ci.appveyor.com/project/blackrosezy/build-libcurl-windows))
* Visual Studio 2015 ([AppVeyor CI](https://ci.appveyor.com/project/blackrosezy/build-libcurl-windows-unln0))
* Visual Studio 2017 (Community)
*Note-1*: All **Visual Studio Express** editions are **unsupported**.
*Note-2*: This script uses the following third-party open-source software:
* `bin/7-zip` http://www.7-zip.org/download.html
* `bin/unxutils` http://sourceforge.net/projects/unxutils/
* `bin/xidel` http://sourceforge.net/projects/videlibri/files/Xidel/
Usage:
```
$ build.bat
```
To build using /MT rather than /MD:
```
$ build.bat -static
```
Output:
```
third-party
└───libcurl
├───include
│ └───curl
│ curl.h
│ curlbuild.h
│ curlrules.h
│ curlver.h
│ easy.h
│ mprintf.h
│ multi.h
│ stdcheaders.h
│ typecheck-gcc.h
│
└───lib
├───dll-debug-x64
│ libcurl_debug.dll
│ libcurl_debug.lib
│ libcurl_debug.pdb
│
├───dll-debug-x86
│ libcurl_debug.dll
│ libcurl_debug.lib
│ libcurl_debug.pdb
│
├───dll-release-x64
│ libcurl.dll
│ libcurl.lib
│ libcurl.pdb
│
├───dll-release-x86
│ libcurl.dll
│ libcurl.lib
│ libcurl.pdb
│
├───static-debug-x64
│ libcurl_a_debug.lib
│
├───static-debug-x86
│ libcurl_a_debug.lib
│
├───static-release-x64
│ libcurl_a.lib
│
└───static-release-x86
libcurl_a.lib
```
## FAQ
If you get a message like the one below, re-run build.bat:
```
**** Retrieving:http://curl.haxx.se/download.html ****
Downloading latest curl...
http://curl.haxx.seAn unhandled exception occurred at $004C7D39 :: Bad port number.
```
License (build.bat)
-----------
The MIT License (MIT)
Copyright (c) 2014 Mohd Rozi
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
================================================
FILE: bin/7-zip/license.txt
================================================
7-Zip Command line version
~~~~~~~~~~~~~~~~~~~~~~~~~~
License for use and distribution
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
7-Zip Copyright (C) 1999-2010 Igor Pavlov.
7za.exe is distributed under the GNU LGPL license
Notes:
You can use 7-Zip on any computer, including a computer in a commercial
organization. You don't need to register or pay for 7-Zip.
GNU LGPL information
--------------------
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You can receive a copy of the GNU Lesser General Public License from
http://www.gnu.org/
================================================
FILE: bin/unxutils/StdDisclaimer.html
================================================
Disclaimer
THIS SOFTWARE IS PROVIDED "AS IS" AND ANY EXPRESSED OR IMPLIED
WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE CONTRIBUTORS
BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
================================================
FILE: bin/unxutils/UnxUtilsDist.html
================================================
Native Win32 ports of some GNU utilities
GNU utilities for Win32
These are ports of common GNU utilities to native
Win32. In this context, native means the executables do only depend on the Microsoft
C-runtime (msvcrt.dll) and not an emulation layer like that provided by Cygwin tools.
I have started an Open source project at http://unxutils.sourceforge.net. There is a
CVS repository of all sources, even if the changes to the
original GNU code are trivial. The repository can be accessed via anonymous CVS with the
command
cvs -d:pserver:anonymous@cvs.UnxUtils.sourceforge.net:/cvsroot/unxutils login
for (de)compressing .Z files; this is actually a program
called ncompress and was found on one of the Linux mirrors
cp.exe
works only on NT, does real hardlinks on NTFS
csplit.exe
cut.exe
date.exe
dd.exe
df.exe
dummy
diff.exe
diff3.exe
dirname.exe
du.exe
echo.exe
egrep.exe
env.exe
dummy
expand.exe
expr.exe
factor.exe
fgrep.exe
find.exe
Example of command line under Cmd/Command shell:
find -name *.txt -exec cat {} ;
flex.exe
fmt.exe
fold.exe
gawk.exe
input files are opened in text mode
make.exe
From v3.77 upwards, make searches for a sh.exe on the path. If
it does not find one, it switches to win32 make mode that is it uses intermediate batch
files for command processing.
This is fine until your makefile tries to execute something like mkdir, which will invoke
the internal mkdir from cmd.exe or command.com. As the results may not be to your liking,
you may prefer to use the sh.exe provided here.
sh.exe
a renamed zsh - this is no replacement for the Cygwin bash
shar.exe
only works with -T (text) option
split.exe
stego.exe
stego -E encode.me -T words.txt Encodes file encode.me as gibberish text using words from the words.txt file to
stdout. If the -T option is omitted, stego looks for a file called "words" in
the current directory. Of course, the -D option will decode the input file; remember to
redistribute words.txt together with the encoded file.
The purpose of this program is to disguise binary files as a kind of text file or to drive
your coworkers mad.
su.exe
dummy
sum.exe
sync.exe
tac.exe
tail.exe
tar.exe
only forward slashes are accepted
the -z option (compression) does not work
no remote archives
works only on NT, does real hardlinks on NTFS
tee.exe
test.exe
touch.exe
tr.exe
uname.exe
unexpand.exe
uniq.exe
unrar.exe
This is unrar 3.00 beta 7, which seems to have been
"free". It works well enough.
unshar.exe
uudecode.exe
uuencode.exe
wc.exe
wget.exe
wget 1.8.2 builds out of the box with MS Visual C
which.exe
does not search the current directory
whoami.exe
xargs.exe
yes.exe
zcat.exe
Additional
programs
pclip.exe
put the Windows clipboard text to stdout
gclip.exe
get the Windows clipboard text from stdin
Example: run the text from the clipboard through sed and put the result back
pclip | sed "s/string1/string2/g" | gclip
gplay.exe
My minimalist console multimedia player using DirectShow. With gplay filename | URL
you should be able to play just about everything, as long as Microsoft supports it.
================================================
FILE: bin/xidel/readme.txt
================================================
================================================ Basics ================================================
The trivial usage is to extract an expression from a webpage like:
xidel http://www.example.org --extract //title
Instead of one or more urls, you can also pass file names or the xml data itself (xidel "<html>..." ...).
The --extract option can be abbreviated as -e, and there are five different kinds of extract expressions:
1 ) XPath 2 expressions, with some changes and additional functions.
2 ) XQuery 1 expressions
3 ) CSS 3 selectors.
4 ) Templates, a simplified version of the page which is pattern matched against the input
5 ) Multipage templates, i.e. a file that contains templates for several pages
The different kinds except multipage templates are usually detected automatically, but
a certain kind can be forced with the --extract-kind option,
or by using the shorter --xpath "..", --xquery "..", --css ".." options.
Especially XQuery and template expressions are easily confused by the auto detector.
(Xidel assumes templates, if the expression starts with a "<" )
See the sections below for a more detailed description of each expression kind.
The next important option is --follow (abbreviated as -f) to follow links on a page. E.g:
xidel http://www.example.org --follow //a --extract //title
This will print the titles of all pages that are linked from http://www.example.org.
--follow supports the same expressions as --extract, and it will follow the href or src attributes of the
usual elements, or the contained text if there are no such attributes.
============================== Recursion / Argument order and grouping ===============================
You can specify multiple --extract (-e) and --follow (-f) arguments to extract values from one page,
follow the links to the next pages and extract values from there as well ...
Then the order in which the arguments are given becomes important, as it determines whether
xidel extracts before following, or the other way around.
You can usually read it left-to-right like an English sentence, extracting from the current page,
or following to a new one, which will then become the next current page.
For example:
a) xidel http://site1 -e "select content 1" http://site2 -e "select content 2"
This will extract content 1 from site 1 and content 2 from site 2
b) xidel http://site1 http://site2 -e "select content 1" -e "select content 2"
This will extract content 1 and 2 from site 1 as well as from site 2
c) xidel http://site1 -e "select content 1" -f "//a (:select links:)" -e "select content 2"
This will extract the "content 1" from site1, and "content 2" from all sites the first site has links to.
d) xidel http://site1 -f "//a (:select links:)" -e "select content 1" -e "select content 2"
This will extract "content 1" and "content 2" from all sites the first site links to, and will not
extract anything from site1.
e) xidel http://site1 -e "select content 1" -e "select content 2" -f "//a (:select links:)"
This is some kind of special case. Since -f is the last option, it will repeat the previous operation, i.e.
it will extract content 1 and 2 from site1 and ALL sites that can be reached by a selected link on site1
or any other of the processed sites.
Only if there were another -e after the -f would it extract that from the first set of followed links and stop.
In some kinds of extract expressions you can create new variables. If you assign a value to a variable called
"_follow", that value will be included in the next follow expression.
If you assign an object to _follow, its properties will override the command line parameters with the same
name.
Generally an option modifier (like --extract-kind) affects all succeeding options; if there are none,
it affects the immediately preceding option.
You can always override the argument order by using [ and ] to group the options.
For example:
f) xidel http://site1 [ -f "//a (:select links:)" -e "select content 1" ] -e "select content 2"
This will extract content 1 from all sites linked by site1 and content 2 from site1 itself.
I.e. the extract of content 2 is not affected by the follow-option within the [..] brackets.
g) xidel http://site1 [ -f //a[@type1] --download type1/ ]
[ -f //a[@type2] --download type2/ ]
[ -f //a[@type3] --download type3/ ]
This will download all links of type 1 in a directory type1, all links of type2 in directory type2...
(if written on one line)
[ and ] must be surrounded by a space.
========================================== XPath 2.0 / XQuery ===========================================
XPath expressions provide an easy way to extract calculated values from x/html.
See http://en.wikipedia.org/wiki/XPath_2.0 for details.
Xidel also supports JSONiq and some custom extensions, so it deviates in a few ways from the standard.
However, you can disable these differences with the respective options (see link below or the
command line parameter listing printed by --help).
Switched to fully standard-compatible mode, the implementation passes 99.3% of the XPath 2 only tests and
97.8% of the XQuery 1 tests in the XQuery Testsuite (skipping tests for invalid input queries).
However, in the default mode, there are the following important extensions:
Syntax:
Variable assignment: $var := value
adds $var to a set of global variables, which can be created and accessed
everywhere
JSONiq literals true, false, null
true and false are evaluated as true(), false(), null becomes jn:null()
JSONiq arrays: [a,b,c]
Arrays store a list of values and can be nested with each other and
within sequences.
jn:members converts an array to a sequence.
JSONiq objects: {"name": value, ...}
An object stores a set of values as an associative map. The values can be
accessed similar to a function call, e.g.: {"name": value, ...}("name").
Xidel also has {"name": value, ..}.name as an additional syntax to
access properties.
jn:keys returns a sequence of all property names, libjn:values a sequence
of values.
Used with global variables, you can copy an object with obj2 := obj
(objects are immutable, but properties can be changed with
obj2.foo := 12, which will create a new object with the changed property)
Extended strings: x"..{..}.."
If a string is prefixed by an "x", all expressions inside {}-parentheses
are evaluated, like in the value of a direct attribute constructor.
(Warning: This was changed in Xidel 0.7. Xidel <= 0.6 used
"foo$var;bar" without prefix for this)
Semantic:
All string comparisons are case insensitive, and "clever", e.g.:
'9xy' = '9XY' < '10XY' < 'xy'
This is more useful for html (think of @class = 'foobar'), but can be
disabled by passing collation urls to the string functions.
Everything is weakly typed, e.g. 'false' = false() is true, and 1+"2" is 3.
Unknown namespace prefixes are resolved with the namespace bindings of the
input data.
Therefore //a always finds all links, independent of any xmlns-attributes.
(however, if you explicitly declare a namespace like
'declare default element namespace "..."' in XQuery, it will only find
elements in that namespace)
XML Schemas, error codes and static type checking are not supported.
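The "clever" case-insensitive comparison above can be modeled with a sort key that lowercases text and compares digit runs numerically; this is an illustrative Python sketch, not xidel's actual algorithm:

```python
import re

def clever_key(s):
    # Case-insensitive, and digit runs compare numerically, so "9" < "10",
    # while any numeric run sorts before plain letters.
    parts = re.findall(r"\d+|\D+", s.lower())
    return [(0, int(p)) if p.isdigit() else (1, p) for p in parts]
```

Under this key, '9xy' = '9XY' < '10XY' < 'xy' holds, as in the example above.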
Certain additional functions:
jn:*, libjn:* The standard JSONiq and JSONlib functions
json("str.") Parses a string as json, or downloads json from an url.(only use with trusted input)
serialize-json(value)
Converts a value to JSON
extract("string","regex"[,,[]])
This applies the regex "regex" to "string" and returns only the matching part.
If the argument is used, only the -th submatch will be returned.
(this function used to be called "filter")
css("sel") This returns the nodes below the context node matched by the specified css 3 selector.
You can use this to combine css and XPath, like in 'css("a.aclass")/@href'.
eval("xpath") This will evaluate the string "xpath" as an XPath expression
system("..") Runs a certain program and returns its stdout result as string
deep-text() This is the concatenated plain text of every tag inside the current element.
You can also pass a separator like deep-text(' ') to separate the text of different nodes.
inner-html() This is the html content of node ., like innerHTML in javascript.
outer-html() This is the same as inner-html, but includes the node itself
inner-xml() This is the xml content of node, similar to inner-html()
outer-xml() Like outer-html(), but xml-serialized
split-equal("list", "string"[, "sep" = " "])
Treats the string "list" as a list of strings separated by "sep" and tests if
"string" is contained in that list. (just like css class matching)
form(form, [overridden parameters = ()])
Converts a html form in a http request, by url encoding all inputs descendants
of the given form node. You can give a sequence of parameters to override.
e.g. form(//form[1], "foo=bar&xyz=123") returns a request for the first form,
with the foo and xyz parameters overridden by bar and 123.
You can also use a JSON object to set the override parameters, e.g.
{"foo": "bar", "xyz": 123}, in that case they are url encoded.
It returns an object with .url, .method and .post properties.
match(<template>, <node>)
Performs pattern matching between the template (see below for template documentation)
and the nodes, and returns a list or an object of matched values.
For example match(<a>{{.}}</a>, <x><a>FOO</a><a>BAR</a></x>) returns FOO, and
match(<a>*{{.}}</a>, <x><a>FOO</a><a>BAR</a></x>) returns (FOO, BAR).
It is also possible to use named variables in the template, in which case an object
is returned, e.g:
match(<a>{{first:=.}}</a><a>{{second:=.}}</a>, <x><a>FOO</a><a>BAR</a></x>)
returns an object with two properties "first" and "second", containing respectively
FOO and BAR.
The template can be a node or a string. Written as a string the above example would be
match("<a>{.}</a>", <x><a>FOO</a><a>BAR</a></x>).
All additional functions except the jn/libjn functions are in the pxp: namespace, which is also set
as default namespace.
The pasdoc documentation of my XPath 2 / XQuery library explains more details and lists more functions:
http://www.benibela.de/documentation/internettools/xquery.TXQueryEngine.html
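As a rough illustration of four of the helpers above, here are hypothetical Python re-implementations. The names and simplified signatures mirror the descriptions in this readme; they are sketches of the documented behavior, not xidel's code:

```python
import re
import xml.etree.ElementTree as ET
from urllib.parse import urlencode, parse_qsl

def extract(string, regex, match=0):
    # Return the part of `string` matched by `regex`;
    # with match > 0, the match-th submatch instead.
    m = re.search(regex, string)
    return m.group(match) if m else ""

def split_equal(lst, string, sep=" "):
    # True if `string` is one of the sep-separated items in `lst`
    # (the same rule css class matching uses).
    return string in lst.split(sep)

def deep_text(node, sep=""):
    # Concatenated plain text of everything below `node`.
    return sep.join(t for t in node.itertext() if t)

def form(action_url, method, inputs, overrides=None):
    # Url-encode the collected input values, applying overrides given
    # either as a query string or as a dict (cf. the JSON-object variant).
    merged = dict(inputs)
    merged.update(parse_qsl(overrides) if isinstance(overrides, str)
                  else (overrides or {}))
    body = urlencode(merged)
    if method.upper() == "GET":
        return {"url": action_url + "?" + body, "method": "GET", "post": ""}
    return {"url": action_url, "method": "POST", "post": body}
```

For instance, `form("http://example.org/send", "POST", {"foo": "old"}, "foo=bar")` yields an object whose .post body carries the overridden parameter, matching the form() description above.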
========================================== CSS 3.0 Selectors ==========================================
CSS 3 Selectors are fully supported, except some pseudoclasses like :hover and ::before that do not
make sense in a GUI-less, read-only application.
(It is however not much tested, since I personally only use XPath)
The easiest way to use CSS selectors with the command line is to write it like --extract "css('selector')"
(the "-quotes are necessary to escape the '-quotes.)
Alternatively you can use --extract-kind=css --extract="your selector", or --css="your selector"
============================================== Templates ==============================================
Templates are a very easy way to extract complex data from a webpage.
Each template is basically a stripped-down excerpt of the webpage, in which the relevant parts have
been annotated.
The best way to describe templates is with a real world example:
Consider the html of one of the recommended videos you can always see at the right
side of a YouTube video (the image markup is skipped for clarity).
It contains interesting values like the url/title/username/view texts, as well as irrelevant,
changing values like the session url.
If you remove the irrelevant parts and annotate the interesting values as {name:=value},
you obtain a template that can be passed directly as an extract expression and, applied to the
page of a YouTube video, will return all recommended/related videos.
More precisely, it will return four (interleaved) arrays "title", "username", "views" and "url", each
containing the relevant values.
A basic template as above consists of three different kinds of expressions:
A normal html element will be matched to the processed html page.
This means it will search the first element on the page, that has the same node name,
all the attributes with the same values, and whose children match the children of the
template element.
{..} A {} marker will execute the contained XPath expression, once the corresponding
place in the html page has been found.
The context node . will refer to the surrounding element, and you can use my extended
XPath syntax (var := value) to create a variable. (see XPath above)
Often you want to read the entire matched element in a variable with $name, which
can be written as {$name := .} or further abbreviated as {$name} .
It can also be used within attributes to read the attribute value.
(the {}-parentheses can also be replaced by <t:s>..</t:s> or <template:s>..</template:s> tags)
+ Finally the loop marker will repeat the matching of the previous element as long as
possible (a similar syntax is <t:loop>..</t:loop> or <template:loop>..</template:loop>).
This is sufficient for most basic scraping operations, but you can also use the following things in a
template:
textnodes Textnodes are matched like html element nodes.
A textnode in the webpage is considered a valid match, if it starts
with the same text as the text node in the template.
(but you can change this behavior to ends-with/exact/regex-comparisons with
the <t:meta> command)
t:if All children of a template:if-tag are ignored if the test-XPath-expression
evaluates to false()
t:switch Only one of the child elements will be used for matching
t:switch-prioritized Same as t:switch, but it will choose the earliest template child that has a match.
t:optional="true" Html nodes can be marked as optional, and they will be ignored, if no possible
match can be found
t:condition="xpath" An XPath expression that must evaluate to true for an element to match.
The context node (.) refers to a potential match.
* Like +, but it can also match zero occurrences
{min,max} or {count} Matches between [min,max] or {count}-many occurrences of the previous element
t:loop The same as above. However, t:loop will repeat all its children, while a marker
like + can only repeat the single, previous element.
? Short notation for t:optional.
(see http://www.benibela.de/documentation/internettools/extendedhtmlparser.THtmlTemplateParser.html
for more detailed explanations)
There is also a Greasemonkey script to create templates directly by just selecting the text on the
corresponding webpage.
========================================= Multipage templates ==========================================
Multipage templates collect several single page templates in an xml file.
They are basically just a list of <page>-nodes with request data and associated <template>s.
E.g.
<page url="..">
  <post name="var">unescaped post data</post>
  <post>your=escaped&post=data&...</post>
  <template><a>{alink:=.}</a>*</template>
</page>
...
...
All pages are downloaded with GET or respectively POST requests, and processed with the given template.
The page-node also accepts a "test" attribute, which gives an XPath expression that needs to be true,
if the page element should be used.
In the attributes and the text of post-nodes, everything enclosed in {..} parentheses is evaluated
as an xpath expression. (like in an extended x".." string, see above)
Since this would be cumbersome to pass directly to --extract, you can also specify the containing file
with the --template-file argument.
You can also have multiple <action>s in a multipage template (surrounded by a parent element with
name <actions>), and call the later actions with <call action=".."/> from another action.
If a template with multiple actions is passed to Xidel it will always perform the first action,
unless the --template-action parameter specifies another action to run. (in Xidel > 0.5)
There are also <variable>-elements to declare variables and <loop>-elements to repeat other elements,
see http://www.benibela.de/documentation/internettools/multipagetemplate.TMultiPageTemplate.html
for more details.
=========================================== Input formats =============================================
Xidel supports html and xml input, and the option input-format can be used to set the parsing behaviour:
auto: Automatically switch between html and xml
html: The input will be parsed as html.
Missing tags like head, body, tbody are automatically created.
(beware that this means table/tr is never valid, and either table//tr or table/tbody/tr
has to be used)
xml: The input will be parsed as xml.
However, it still uses the html parser, so it will correct missing end tags and not
support DTDs.
xml-strict: The input will be parsed as strict xml.
This uses the standard fpc, validating xml parser.
You can also use json files, by loading them explicitly with pxp:json() or jn:json-doc() within an
XPath/XQuery expression.
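The difference between the forgiving html/xml modes and xml-strict can be illustrated with Python's own strict XML parser standing in for the validating fpc parser; this is an analogy only, not xidel's parser:

```python
import xml.etree.ElementTree as ET

def parses_strictly(text):
    # A strict xml parser rejects malformed input outright,
    # where the html parser would silently repair missing end tags.
    try:
        ET.fromstring(text)
        return True
    except ET.ParseError:
        return False
```

A well-formed `<table><tbody><tr>...</tr></tbody></table>` passes, while the html-ish `<table><tr><td>1</table>` (missing end tags) is rejected by the strict parser but would be repaired by the html mode.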
=========================================== Output formats =============================================
Xidel has several different output formats, which can be chosen with the output-format option:
adhoc: A very simple format, it will just print all values (default)
xml: The output will be serialized as xml
html: The output will be serialized as html
xml-wrapped: It will print a xml-based machine readable output.
Sequences will become <seq><e>value 1</e><e>value 2</e>...</seq>
Objects will become <object><property-1>..</property-1>...</object>
(so in contrast to xml, it will keep variable names and type information intact)
json-wrapped: It will print a json-based machine readable output.
Sequences become arrays [ ... ].
Objects become objects: {"prop-1": "value 1", "prop-2": "value 2", ... }
(this was called json before Xidel 0.7)
bash: Prints a bash script that sets the internal variables as bash variables.
E.g.
eval $(xidel http://data -e 'title:=//title' -e 'links:=//a')
can be used to set the bash variable $title to the title of a page and the
variable $links to a bash array of all links on the page.
cmd: Like bash, but for Windows cmd.exe
Generally it prints a sequence of all processed pages (i.e. each page a single sequence element),
and the variables defined as global variables or read by a template become variables or
object properties.
There is a special rule for json-wrapped output, if the template assigns multiple values to the same
variable: Xidel will collect all these values in an array, i.e. (a:=1, b:=2, a:=3, c:=4)
becomes "a": [1, 3], "b": 2, "c": 4.
================================================
FILE: build.bat
================================================
@echo off
setlocal EnableDelayedExpansion
set PROGFILES=%ProgramFiles%
if not "%ProgramFiles(x86)%" == "" set PROGFILES=%ProgramFiles(x86)%
REM Check if Visual Studio 2017 is installed
set MSVCDIR="%PROGFILES%\Microsoft Visual Studio\2017"
set VCVARSALLPATH="%PROGFILES%\Microsoft Visual Studio\2017\Community\VC\Auxiliary\Build\vcvarsall.bat"
if exist %MSVCDIR% (
if exist %VCVARSALLPATH% (
set COMPILER_VER="2017"
echo Using Visual Studio 2017 Community
goto setup_env
)
)
REM Check if Visual Studio 2015 is installed
set MSVCDIR="%PROGFILES%\Microsoft Visual Studio 14.0"
set VCVARSALLPATH="%PROGFILES%\Microsoft Visual Studio 14.0\VC\vcvarsall.bat"
if exist %MSVCDIR% (
if exist %VCVARSALLPATH% (
set COMPILER_VER="2015"
echo Using Visual Studio 2015
goto setup_env
)
)
REM Check if Visual Studio 2013 is installed
set MSVCDIR="%PROGFILES%\Microsoft Visual Studio 12.0"
set VCVARSALLPATH="%PROGFILES%\Microsoft Visual Studio 12.0\VC\vcvarsall.bat"
if exist %MSVCDIR% (
if exist %VCVARSALLPATH% (
set COMPILER_VER="2013"
echo Using Visual Studio 2013
goto setup_env
)
)
REM Check if Visual Studio 2012 is installed
set MSVCDIR="%PROGFILES%\Microsoft Visual Studio 11.0"
set VCVARSALLPATH="%PROGFILES%\Microsoft Visual Studio 11.0\VC\vcvarsall.bat"
if exist %MSVCDIR% (
if exist %VCVARSALLPATH% (
set COMPILER_VER="2012"
echo Using Visual Studio 2012
goto setup_env
)
)
REM Check if Visual Studio 2010 is installed
set MSVCDIR="%PROGFILES%\Microsoft Visual Studio 10.0"
set VCVARSALLPATH="%PROGFILES%\Microsoft Visual Studio 10.0\VC\vcvarsall.bat"
if exist %MSVCDIR% (
if exist %VCVARSALLPATH% (
set COMPILER_VER="2010"
echo Using Visual Studio 2010
goto setup_env
)
)
REM Check if Visual Studio 2008 is installed
set MSVCDIR="%PROGFILES%\Microsoft Visual Studio 9.0"
set VCVARSALLPATH="%PROGFILES%\Microsoft Visual Studio 9.0\VC\vcvarsall.bat"
if exist %MSVCDIR% (
if exist %VCVARSALLPATH% (
set COMPILER_VER="2008"
echo Using Visual Studio 2008
goto setup_env
)
)
REM Check if Visual Studio 2005 is installed
set MSVCDIR="%PROGFILES%\Microsoft Visual Studio 8"
set VCVARSALLPATH="%PROGFILES%\Microsoft Visual Studio 8\VC\vcvarsall.bat"
if exist %MSVCDIR% (
if exist %VCVARSALLPATH% (
set COMPILER_VER="2005"
echo Using Visual Studio 2005
goto setup_env
)
)
REM Check if Visual Studio 6 is installed
set MSVCDIR="%PROGFILES%\Microsoft Visual Studio\VC98"
set VCVARSALLPATH="%PROGFILES%\Microsoft Visual Studio\VC98\vcvarsall.bat"
if exist %MSVCDIR% (
if exist %VCVARSALLPATH% (
set COMPILER_VER="6"
echo Using Visual Studio 6
goto setup_env
)
)
echo No compiler : Microsoft Visual Studio (6, 2005, 2008, 2010, 2012, 2013, 2015 or 2017) is not installed.
goto end
:setup_env
echo Setting up environment
if %COMPILER_VER% == "6" (
call %MSVCDIR%\Bin\VCVARS32.BAT
goto begin
)
:begin
REM Setup path to helper bin
set ROOT_DIR="%CD%"
set RM="%CD%\bin\unxutils\rm.exe"
set CP="%CD%\bin\unxutils\cp.exe"
set MKDIR="%CD%\bin\unxutils\mkdir.exe"
set SEVEN_ZIP="%CD%\bin\7-zip\7za.exe"
set XIDEL="%CD%\bin\xidel\xidel.exe"
REM Housekeeping
%RM% -rf tmp_*
%RM% -rf third-party
%RM% -rf curl.zip
%RM% -rf build_*.txt
REM Get download url .Look under