Repository: mozilla/nixpkgs-mozilla
Branch: master
Commit: 58356aa021c5
Files: 46
Total size: 156.8 KB
Directory structure:
gitextract_gw3xxd2j/
├── .gitignore
├── .travis.yml
├── CODE_OF_CONDUCT.md
├── LICENSE
├── README.rst
├── compilers-overlay.nix
├── default.nix
├── deploy_rsa.enc
├── firefox-overlay.nix
├── flake.nix
├── git-cinnabar-overlay.nix
├── lib/
│ └── parseTOML.nix
├── lib-overlay.nix
├── overlays.nix
├── package-set.nix
├── phlay-overlay.nix
├── pinned.nix
├── pkgs/
│ ├── cbindgen/
│ │ └── default.nix
│ ├── clang/
│ │ └── bug-14435.patch
│ ├── firefox-nightly-bin/
│ │ └── update.nix
│ ├── gcc-4.7/
│ │ ├── arm-eabi.patch
│ │ ├── builder.sh
│ │ ├── default.nix
│ │ ├── gfortran-driving.patch
│ │ ├── gnat-cflags.patch
│ │ ├── java-jvgenmain-link.patch
│ │ ├── libstdc++-target.patch
│ │ ├── no-sys-dirs.patch
│ │ └── parallel-bconfig-4.7.patch
│ ├── gecko/
│ │ ├── default.nix
│ │ └── source.json
│ ├── git-cinnabar/
│ │ └── default.nix
│ ├── jsdoc/
│ │ ├── default.nix
│ │ ├── node-env.nix
│ │ ├── node-packages.nix
│ │ └── package.json
│ ├── lib/
│ │ ├── default.nix
│ │ └── update.nix
│ ├── nixpkgs.json
│ ├── phlay/
│ │ └── default.nix
│ └── servo/
│ └── default.nix
├── release.nix
├── rust-overlay-install.sh
├── rust-overlay.nix
├── rust-src-overlay.nix
└── update.nix
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
/result*
================================================
FILE: .travis.yml
================================================
language: nix
addons:
ssh_known_hosts: floki.garbas.si
env:
- STDENV=clang
- STDENV=clang36
- STDENV=clang37
- STDENV=clang38
- STDENV=gcc
- STDENV=gcc49
- STDENV=gcc48
script:
- if [ "$TRAVIS_EVENT_TYPE" == "cron" ]; then
nix-shell update.nix --pure;
fi
- if [ "$TRAVIS_PULL_REQUEST" != "true" -a "$TRAVIS_BRANCH" = "master" ]; then
nix-build release.nix -A gecko."x86_64-linux"."$STDENV";
mkdir nars/;
nix-push --dest "$PWD/nars/" --force ./result;
fi
before_install:
- openssl aes-256-cbc -K $encrypted_be02022e0814_key -iv $encrypted_be02022e0814_iv -in deploy_rsa.enc -out deploy_rsa -d
before_deploy:
- eval "$(ssh-agent -s)"
- chmod 600 $TRAVIS_BUILD_DIR/deploy_rsa
- ssh-add $TRAVIS_BUILD_DIR/deploy_rsa
deploy:
provider: script
skip_cleanup: true
script: rsync -avh --ignore-existing $TRAVIS_BUILD_DIR/nars/ travis@floki.garbas.si:/var/travis/nixpkgs-mozilla/
on:
branch: master
================================================
FILE: CODE_OF_CONDUCT.md
================================================
# Community Participation Guidelines
This repository is governed by Mozilla's code of conduct and etiquette guidelines.
For more details, please read the
[Mozilla Community Participation Guidelines](https://www.mozilla.org/about/governance/policies/participation/).
## How to Report
For more information on how to report violations of the Community Participation Guidelines, please read our '[How to Report](https://www.mozilla.org/about/governance/policies/participation/reporting/)' page.
<!--
## Project Specific Etiquette
In some cases, there will be additional project etiquette i.e.: (https://bugzilla.mozilla.org/page.cgi?id=etiquette.html).
Please update for your project.
-->
================================================
FILE: LICENSE
================================================
Copyright 2017 Mozilla
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.rst
================================================
nixpkgs-mozilla
===============
Gathering nix efforts in one repository.
Current packages
----------------
- gecko (https://github.com/mozilla/gecko-dev)
- firefox-bin variants including Nightly
firefox-bin variants
--------------------
Nixpkgs already has definitions for `firefox
<https://github.com/NixOS/nixpkgs/blob/246d2848ff657d56fcf2d8596709e8869ce8616a/pkgs/applications/networking/browsers/firefox/packages.nix>`_,
which is built from source, as well as `firefox-bin
<https://github.com/NixOS/nixpkgs/blob/ba2fe3c9a626a8fb845c786383b8b23ad8355951/pkgs/applications/networking/browsers/firefox-bin/default.nix>`_,
which is the binary Firefox version built by Mozilla.
The ``firefox-overlay.nix`` in this repository adds definitions for
some other firefox-bin variants that Mozilla ships:
``firefox-nightly-bin``, ``firefox-beta-bin``, and
``firefox-esr-bin``. All are exposed under a ``latest`` attribute,
e.g. ``latest.firefox-nightly-bin``.
Unfortunately, these variants do not auto-update, and you may see some
annoying pop-ups complaining about this.
Note that all the ``-bin`` packages are "unfree" (because of the
Firefox trademark, held by Mozilla), so you will need to set
``nixpkgs.config.allowUnfree`` in order to use them. More info `here
<https://wiki.nixos.org/wiki/FAQ#How_can_I_install_a_package_from_unstable_while_remaining_on_the_stable_channel?>`_.
Rust overlay
------------
**NOTE:** Nix overlays only works on up-to-date versions of NixOS/nixpkgs, starting from 17.03.
A nixpkgs overlay is provided to contain all of the latest rust releases.
To use the rust overlay run the ``./rust-overlay-install.sh`` command. It will
link the current ``./rust-overlay.nix`` into your ``~/.config/nixpkgs/overlays`` folder.
Once this is done, use ``nix-env -iA nixpkgs.latest.rustChannels.nightly.rust`` for
example. Replace the ``nixpkgs.`` prefix with ``nixos.`` on NixOS.
Using in nix expressions
------------------------
Example of using in ```shell.nix```:
.. code:: nix
let
moz_overlay = import (builtins.fetchTarball https://github.com/mozilla/nixpkgs-mozilla/archive/master.tar.gz);
nixpkgs = import <nixpkgs> { overlays = [ moz_overlay ]; };
in
with nixpkgs;
stdenv.mkDerivation {
name = "moz_overlay_shell";
buildInputs = [
# to use the latest nightly:
nixpkgs.latest.rustChannels.nightly.rust
# to use a specific nighly:
(nixpkgs.rustChannelOf { date = "2018-04-11"; channel = "nightly"; }).rust
# to use the project's rust-toolchain file:
(nixpkgs.rustChannelOf { rustToolchain = ./rust-toolchain; }).rust
];
}
Flake usage
-----------
This repository contains a minimal flake interface for the various
overlays in this repository. To use it in your own flake, add it as
an input to your ``flake.nix``:
.. code:: nix
{
inputs.nixpkgs.url = github:NixOS/nixpkgs;
inputs.nixpkgs-mozilla.url = github:mozilla/nixpkgs-mozilla;
outputs = { self, nixpkgs, nixpkgs-mozilla }: {
devShell."x86_64-linux" = let
pkgs = import nixpkgs { system = "x86_64-linux"; overlays = [ nixpkgs-mozilla.overlay ]; };
in pkgs.mkShell {
buildInputs = [ pkgs.latest.rustChannels.nightly.rust ];
};
};
}
The available overlays are ``nixpkgs-mozilla.overlay`` for the
default overlay containing everything, and
``nixpkgs-mozilla.overlays.{lib, rust, rr, firefox, git-cinnabar}``
respectively. Depending on your use case, you might need to set the
``--impure`` flag when invoking the ``nix`` command. This is because
this repository fetches resources from non-pinned URLs
non-reproducibly.
Using Custom Version of Firefox
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To use with a custom version of Firefox, you would have multiple choices to
provide either provide the full file information, such as the url and the
checksum of the file.
.. code:: nix
{
inputs.nixpkgs.url = nixpkgs/nixos-unstable;
inputs.nixpkgs-mozilla.url = github:mozilla/nixpkgs-mozilla;
outputs = { self, nixpkgs, nixpkgs-mozilla }: {
devShell."x86_64-linux" = let
pkgs = import nixpkgs { system = "x86_64-linux"; overlays = [ nixpkgs-mozilla.overlay ]; };
firefox-nightly150 = pkgs.lib.firefoxOverlay.firefoxVersion {
info = {
url = "https://download.cdn.mozilla.net/pub/firefox/nightly/2026/03/2026-03-05-00-23-19-mozilla-central/firefox-150.0a1.en-US.linux-x86_64.tar.xz";
sha512 = "8faa93d786d618963a7e5f5bf16488523aac2faeb9a27051b1002a44f56a31c93af72db16a00af9ec97786666ff0498a7fc9e95480e967ff1aa55346e57fcef3";
verifiedByHand = true;
};
};
in pkgs.mkShell { buildInputs = [ firefox-nightly150 ];};
};
}
In this example, the checksum is taken out of the checksums file, which is in
the same directory as the compressed package.
In this particular case, as this is a nightly version, we could have set the
release attribute to `false`, with the version and the timestamp of the
directory:
.. code:: nix
firefox-nightly150 = pkgs.lib.firefoxOverlay.firefoxVersion {
release = false;
version = "150.0a1";
timestamp = "2026-03-05-00-23-19";
};
And we could have omitted the timestamp if we only cared about the latest
nightly version.
To grab a beta or release version, we only have to specify the version number
and setting the release attribute to `true`:
.. code:: nix
firefox-nightly149 = pkgs.lib.firefoxOverlay.firefoxVersion {
release = true;
version = "149.0b1";
};
Firefox Development Environment
-------------------------------
This repository provides several tools to facilitate development on
Firefox. Firefox is built on an engine called Gecko, which lends its
name to some of the files and derivations in this repo.
Checking out Firefox
~~~~~~~~~~~~~~~~~~~~
To build Firefox from source, it is best to have a local checkout of
``mozilla-central``. ``mozilla-central`` is hosted in Mercurial, but
some people prefer to access it using ``git`` and
``git-cinnabar``. The tools in this repo support either using
mercurial or git.
This repository provides a ``git-cinnabar-overlay.nix`` which defines
a ``git-cinnabar`` derivation. This overlay can be used to install
``git-cinnabar``, either using ``nix-env`` or as part of a system-wide
``configuration.nix``.
Building Firefox
~~~~~~~~~~~~~~~~
The ``firefox-overlay.nix`` provides an environment to build Firefox
from its sources, once you have finished the checkout of
``mozilla-central``. You can use ``nix-shell`` to enter this
environment to launch ``mach`` commands to build Firefox and test your
build.
Some debugging tools are available in this environment as well, but
other development tools (such as those used to submit changes for
review) are outside the scope of this environment.
The ``nix-shell`` environment is available in the
``gecko.<arch>.<cc>`` attribute of the ``release.nix`` file provided
in this repository.
The ``<arch>`` attribute is either ``x86_64-linux`` or ``i686-linux``. The first
one would create a native toolchain for compiling on x64, while the second one
would give a native toolchain for compiling on x86. Note that due to the size of
the compilation units on x86, the compilation might not be able to complete, but
some sub part of Gecko, such as SpiderMonkey would compile fine.
The ``<cc>`` attribute is either ``gcc`` or ``clang``, or any specific version
of the compiler available in the ``compiler-overlay.nix`` file which is repeated
in ``release.nix``. This compiler would only be used for compiling Gecko, and
the rest of the toolchain is compiled against the default ``stdenv`` of the
architecture.
When first entering the ``nix-shell``, the toolchain will pull and build all
the dependencies necessary to build Gecko, this includes might take some time.
This work will not be necessary the second time, unless you use a different
toolchain or architecture.
.. code:: sh
~/$ cd mozilla-central
~/mozilla-central$ nix-shell ../nixpkgs-mozilla/release.nix -A gecko.x86_64-linux.gcc --pure
... pull the rust compiler
... compile the toolchain
# First time only - initialize virtualenv
[~/mozilla-central] python ./mach create-mach-environment
... create .mozbuild/_virtualenvs/mach
[~/mozilla-central] python ./mach build
... build firefox desktop
[~/mozilla-central] python ./mach run
... run firefox
When entering the ``nix-shell``, the ``MOZCONFIG`` environment variable is set
to a local file, named ``.mozconfig.nix-shell``, created each time you enter the
``nix-shell``. You can create your own ``.mozconfig`` file which extends the
default one, with your own options.
.. code:: sh
~/mozilla-central$ nix-shell ../nixpkgs-mozilla/release.nix -A gecko.x86_64-linux.gcc --pure
[~/mozilla-central] cat .mozconfig
# Import current nix-shell config.
. .mozconfig.nix-shell
ac_add_options --enable-js-shell
ac_add_options --disable-tests
[~/mozilla-central] export MOZCONFIG="$(pwd)/.mozconfig"
[~/mozilla-central] python ./mach build
To avoid repeating yourself, you can also rely on the ``NIX_SHELL_HOOK``
environment variable, to reset the ``MOZCONFIG`` environment variable for you.
.. code:: sh
~/mozilla-central$ export NIX_SHELL_HOOK="export MOZCONFIG=$(pwd)/.mozconfig;"
~/mozilla-central$ nix-shell ../nixpkgs-mozilla/release.nix -A gecko.x86_64-linux.gcc --pure
[~/mozilla-central] python ./mach build
Submitting Firefox patches
~~~~~~~~~~~~~~~~~~~~~~~~~~
Firefox development happens in `Mozilla Phabricator
<https://phabricator.services.mozilla.com/>`_. Mozilla Phabricator
docs are `here
<https://moz-conduit.readthedocs.io/en/latest/phabricator-user.html>`_.
To get your commits into Phabricator, some options include:
- Arcanist, the upstream tool for interacting with
Phabricator. Arcanist is packaged in nixpkgs already; you can find
it in `nixos.arcanist`. Unfortunately, as of this writing, upstream
Arcanist does not support ``git-cinnabar`` (according to `the
"Setting up Arcanist"
<https://moz-conduit.readthedocs.io/en/latest/phabricator-user.html#setting-up-arcanist>`_
documentation). `Mozilla maintains a fork of Arcanist
<https://github.com/mozilla-conduit/arcanist>`_ but it isn't yet
packaged. (PRs welcome.)
- `moz-phab <https://github.com/mozilla-conduit/review>`_, an in-house
CLI for Phabricator. It's available in nix packages (unstable channel).
- `phlay <https://github.com/mystor/phlay>`_, a small Python script
that speaks to the Phabricator API directly. This repository ships a
``phlay-overlay.nix`` that you can use to make ``phlay`` available
in a nix-shell or nix-env.
Note: although the ``nix-shell`` from the previous section may have
all the tools you would normally use to do Firefox development, it
isn't recommended that you use that shell for anything besides tasks
that involve running ``mach``. Other development tasks such as
committing code and submitting patches to code review are best handled
in a separate nix-shell.
TODO
----
- setup hydra to have binary channels
- make sure pinned revisions get updated automatically (if build passes we
should update revisions in default.nix)
- pin to specific (working) nixpkgs revision (as we do for other sources)
- can we make this work on darwin as well?
- assign maintainers for our packages that will montior that it "always" builds
- hook it with vulnix report to monitor CVEs (once vulnix is ready, it must be
ready soon :P)
================================================
FILE: compilers-overlay.nix
================================================
# This overlays add a customStdenv attribute which provide an stdenv with
# different versions of the compilers. This can be used to test Gecko builds
# against different compiler settings, or different compiler versions.
#
# See release.nix "builder" function, to understand how these different stdenv
# are used.
self: super:
let
noSysDirs = (super.stdenv.system != "x86_64-darwin"
&& super.stdenv.system != "x86_64-freebsd"
&& super.stdenv.system != "i686-freebsd"
&& super.stdenv.system != "x86_64-kfreebsd-gnu");
crossSystem = null;
gcc473 = super.wrapCC (super.callPackage ./pkgs/gcc-4.7 (with self; {
inherit noSysDirs;
texinfo = texinfo4;
# I'm not sure if profiling with enableParallelBuilding helps a lot.
# We can enable it back some day. This makes the *gcc* builds faster now.
profiledCompiler = false;
# When building `gcc.crossDrv' (a "Canadian cross", with host == target
# and host != build), `cross' must be null but the cross-libc must still
# be passed.
cross = null;
libcCross = if crossSystem != null then libcCross else null;
libpthreadCross =
if crossSystem != null && crossSystem.config == "i586-pc-gnu"
then gnu.libpthreadCross
else null;
}));
# By default wrapCC keep the same header files, but NixOS is using the
# latest header files from GCC, which are not supported by clang, because
# clang implement a different set of locking primitives than GCC. This
# expression is used to wrap clang with a matching verion of the libc++.
maybeWrapClang = cc: cc;
/*
if cc ? clang
then clangWrapCC cc
else cc;
*/
clangWrapCC = llvmPackages:
let libcxx =
super.lib.overrideDerivation llvmPackages.libcxx (drv: {
# https://bugzilla.mozilla.org/show_bug.cgi?id=1277619
# https://llvm.org/bugs/show_bug.cgi?id=14435
patches = drv.patches ++ [ ./pkgs/clang/bug-14435.patch ];
});
in
super.callPackage <nixpkgs/pkgs/build-support/cc-wrapper> {
cc = llvmPackages.clang-unwrapped or llvmPackages.clang;
isClang = true;
stdenv = self.clangStdenv;
libc = self.glibc;
# cc-wrapper pulls gcc headers, which are not compatible with features
# implemented in clang. These packages are used to override that.
extraPackages = [ self.libcxx llvmPackages.libcxxabi ];
nativeTools = false;
nativeLibc = false;
};
buildWithCompiler = cc:
super.stdenvAdapters.overrideCC self.stdenv (maybeWrapClang cc);
chgCompilerSource = cc: name: src:
cc.override (conf:
if conf ? gcc then # Nixpkgs 14.12
{ gcc = super.lib.overrideDerivation conf.gcc (old: { inherit name src; }); }
else # Nixpkgs 15.05
{ cc = super.lib.overrideDerivation conf.cc (old: { inherit name src; }); }
);
compilersByName = with self; {
clang = llvmPackages.clang;
clang36 = llvmPackages_36.clang;
clang37 = llvmPackages_37.clang;
clang38 = llvmPackages_38.clang; # not working yet.
clang5 = llvmPackages_5.clang or llvmPackages.clang;
clang6 = llvmPackages_6.clang or llvmPackages.clang;
clang7 = llvmPackages_7.clang or llvmPackages.clang;
clang12 = llvmPackages_12.clang or llvmPackages.clang;
clang13 = llvmPackages_13.clang or llvmPackages.clang;
gcc = gcc;
gcc6 = gcc6;
gcc5 = gcc5;
gcc49 = gcc49;
gcc48 = gcc48;
gcc474 = chgCompilerSource gcc473 "gcc-4.7.4" (fetchurl {
url = "mirror://gnu/gcc/gcc-4.7.4/gcc-4.7.4.tar.bz2";
sha256 = "10k2k71kxgay283ylbbhhs51cl55zn2q38vj5pk4k950qdnirrlj";
});
gcc473 = gcc473;
# Version used on Linux slaves, except Linux x64 ASAN.
gcc472 = chgCompilerSource gcc473 "gcc-4.7.2" (fetchurl {
url = "mirror://gnu/gcc/gcc-4.7.2/gcc-4.7.2.tar.bz2";
sha256 = "115h03hil99ljig8lkrq4qk426awmzh0g99wrrggxf8g07bq74la";
});
};
in {
customStdenvs =
super.lib.mapAttrs (name: value: buildWithCompiler value) compilersByName;
}
================================================
FILE: default.nix
================================================
# Nixpkgs overlay which aggregates overlays for tools and products, used and
# published by Mozilla.
self: super:
with super.lib;
(foldl' (flip extends) (_: super)
(map import (import ./overlays.nix)))
self
================================================
FILE: firefox-overlay.nix
================================================
# This file provide the latest binary versions of Firefox published by Mozilla.
self: super:
let
# This URL needs to be updated about every 2 years when the subkey is rotated.
pgpKey = super.fetchurl {
url = "https://download.cdn.mozilla.net/pub/firefox/candidates/138.0b1-candidates/build1/KEY";
hash = "sha256-FOGtyDxtZpW6AbNdSj0QoK1AYkQYxHPypT8zJr2XYQk=";
};
# This file is currently maintained manually, if this Nix expression attempt
# to download the wrong version, this is likely to be the problem.
#
# Open a pull request against https://github.com/mozilla-releng/shipit to
# update the version, as done in
# https://github.com/mozilla-releng/shipit/pull/1467
firefox_versions = with builtins;
fromJSON (readFile (fetchurl "https://product-details.mozilla.org/1.0/firefox_versions.json"));
arch = if self.stdenv.system == "i686-linux"
then "linux-i686"
else "linux-x86_64";
yearOf = with super.lib; yyyymmddhhmmss:
head (splitString "-" yyyymmddhhmmss);
monthOf = with super.lib; yyyymmddhhmmss:
head (tail (splitString "-" yyyymmddhhmmss));
# Given SHA512SUMS file contents and file name, extract matching sha512sum.
extractSha512Sum = sha512sums: file:
with builtins;
# Nix 1.x do not have `builtins.split`.
# Nix 2.0 have an bug in `builtins.match` (see https://github.com/NixOS/nix/issues/2147).
# So I made separate logic for Nix 1.x and Nix 2.0.
if builtins ? split then
substring 0 128 (head
(super.lib.filter
(s: isString s && substring 128 (stringLength s) s == " ${file}")
(split "\n" sha512sums)))
else
head (match ".*[\n]([0-9a-f]*) ${file}.*" sha512sums);
# The timestamp argument is a yyyy-mm-dd-hh-mm-ss date, which corresponds to
# one specific version. This is used mostly for bisecting.
versionInfo = { name, version, release, system ? arch, timestamp ? null, info ? null, ... }: with builtins;
if (info != null) then info else
if release then
# For versions such as Beta & Release:
# https://download.cdn.mozilla.net/pub/firefox/releases/55.0b3/SHA256SUMS
let
dir = "https://download.cdn.mozilla.net/pub/firefox/releases/${version}";
# After version 134 firefox switched to using tar.xz instead of tar.bz2
majorVersion = super.lib.strings.toInt (
builtins.elemAt (super.lib.strings.splitString "." version) 0
);
extension = if majorVersion > 134 then "tar.xz" else "tar.bz2";
file = "${system}/en-US/firefox-${version}.${extension}";
sha512Of = chksum: file: extractSha512Sum (readFile (fetchurl chksum)) file;
in rec {
chksum = "${dir}/SHA512SUMS";
chksumSig = "${chksum}.asc";
chksumSha256 = hashFile "sha256" (fetchurl "${dir}/SHA512SUMS");
chksumSigSha256 = hashFile "sha256" (fetchurl "${chksum}.asc");
inherit file;
url = "${dir}/${file}";
sha512 = sha512Of chksum file;
sig = null;
sigSha512 = null;
}
else
# For Nightly versions:
# https://download.cdn.mozilla.net/pub/firefox/nightly/latest-mozilla-central/firefox-56.0a1.en-US.linux-x86_64.checksums
let
dir =
if timestamp == null then
let
buildhubJSON = with builtins;
fromJSON (readFile (fetchurl "https://download.cdn.mozilla.net/pub/firefox/nightly/latest-mozilla-central/firefox-${version}.en-US.${system}.buildhub.json"));
in builtins.replaceStrings [ "/${file}" ] [ "" ] buildhubJSON.download.url
else "https://download.cdn.mozilla.net/pub/firefox/nightly/${yearOf timestamp}/${monthOf timestamp}/${timestamp}-mozilla-central" ;
file = "firefox-${version}.en-US.${system}.tar.xz";
sha512Of = chksum: file: head (match ".*[\n]([0-9a-f]*) sha512 [0-9]* ${file}[\n].*" (readFile (fetchurl chksum)));
in rec {
chksum = "${dir}/firefox-${version}.en-US.${system}.checksums";
chksumSig = null;
# file content:
# <hash> sha512 62733881 firefox-56.0a1.en-US.linux-x86_64.tar.bz2
# <hash> sha256 62733881 firefox-56.0a1.en-US.linux-x86_64.tar.bz2
url = "${dir}/${file}";
sha512 = sha512Of chksum file;
sig = "${dir}/${file}.asc";
sigSha512 = sha512Of chksum "${file}.asc";
};
# From the version info, check the authenticity of the check sum file, such
# that we guarantee that we have
verifyFileAuthenticity = { file, sha512, chksum, chksumSig }:
assert extractSha512Sum (builtins.readFile chksum) file == sha512;
super.runCommand "check-firefox-signature" {
buildInputs = [ self.gnupg ];
FILE = chksum;
ASC = chksumSig;
} ''
set -eu
gpg --dearmor < ${pgpKey} > keyring.gpg
gpgv --keyring=./keyring.gpg $ASC $FILE
mkdir $out
'';
# From the version info, create a fetchurl derivation which will get the
# sources from the remote.
fetchVersion = info:
if info.verifiedByHand or false then
# Set info.verifiedByHand = true; when testing with tarball.
super.fetchurl {
inherit (info) url sha512;
}
else if info.chksumSig != null then
super.fetchurl {
inherit (info) url sha512;
# This is a fixed derivation, but we still add as a dependency the
# verification of the checksum. Thus, this fetch script can only be
# executed once the verifyAuthenticity script finished successfully.
postFetch = ''
: # Authenticity Check (${verifyFileAuthenticity {
inherit (info) file sha512;
chksum = builtins.fetchurl { url = info.chksum; sha256 = info.chksumSha256; };
chksumSig = builtins.fetchurl { url = info.chksumSig; sha256 = info.chksumSigSha256; };
}})
'';
}
else
super.fetchurl {
inherit (info) url sha512;
# This would download the tarball, and then verify that the content
# match the signature file. Fortunately, any failure of this code would
# prevent the output from being reused.
postFetch =
let asc = super.fetchurl { url = info.sig; sha512 = info.sigSha512; }; in ''
: # Authenticity Check
set -eu
export PATH="$PATH:${self.gnupg}/bin/"
gpg --dearmor < ${pgpKey} > keyring.gpg
gpgv --keyring=./keyring.gpg ${asc} $out
'';
};
versionWithDefaults = version:
{ name = "Firefox Twilight";
version = "0.0a1";
channel = "twilight";
wmClass = "firefox-twilight";
release = false;
# info attribute set is either null, in which case it is infered by
# versionInfo, or it should be an attribute set with either:
#
# 1. Manual verification of packages:
# url = "...";
# sha512 = "...";
# verifiedByHand = true;
#
# 2. Using a checksum file, which is itself verified using the gpg key.
# url = "...";
# file = "...";
# sha512 = "...";
# chksum = "...";
# chksumSha256 = "...";
# chksumSig = "...";
# chksumSigSha256 = "...";
#
# 3. Using the gpg key on the archive
# url = "...";
# sha512 = "...";
# sig = "...";
# sigSha512 = "...";
} // version;
firefoxVersion = version':
let
version = versionWithDefaults version';
info = versionInfo version;
pkg = ((self.firefox-bin-unwrapped.override ({
generated = {
version = version.version;
sources = { inherit (info) url sha512; };
};
} // super.lib.optionalAttrs (self.firefox-bin-unwrapped.passthru ? applicationName) {
applicationName = version.name;
})).overrideAttrs (old: {
# Add a dependency on the signature check.
src = fetchVersion info;
}));
in super.wrapFirefox pkg ({
pname = "${pkg.binaryName}-bin";
wmClass = version.wmClass;
} // super.lib.optionalAttrs (!self.firefox-bin-unwrapped.passthru ? applicationName) {
desktopName = version.name;
});
firefoxVariants = {
firefox-nightly-bin = {
name = "Firefox Nightly";
channel = "nightly";
wmClass = "firefox-nightly";
version = firefox_versions.FIREFOX_NIGHTLY;
release = false;
};
firefox-beta-bin = {
name = "Firefox Beta";
channel = "beta";
wmClass = "firefox-beta";
version = firefox_versions.LATEST_FIREFOX_DEVEL_VERSION;
release = true;
};
firefox-bin = {
name = "Firefox";
channel = "release";
wmClass = "firefox";
version = firefox_versions.LATEST_FIREFOX_VERSION;
release = true;
};
firefox-esr-bin = {
name = "Firefox ESR";
channel = "release";
wmClass = "firefox";
version = firefox_versions.FIREFOX_ESR;
release = true;
};
};
in
{
lib = super.lib // {
firefoxOverlay = {
inherit pgpKey firefoxVersion versionInfo firefox_versions firefoxVariants;
};
};
# Set of packages which are automagically updated. Do not rely on these for
# reproducible builds.
latest = (super.latest or {}) // (builtins.mapAttrs (n: v: firefoxVersion v) firefoxVariants);
# Set of packages which used to build developer environment
devEnv = (super.shell or {}) // {
gecko = super.callPackage ./pkgs/gecko {
inherit (self.python38Packages) setuptools;
pythonFull = self.python38Full;
nodejs =
if builtins.compareVersions self.nodejs.name "nodejs-8.11.3" < 0
then self.nodejs-8_x else self.nodejs;
rust-cbindgen =
if !(self ? "rust-cbindgen") then self.rust-cbindgen-latest
else if builtins.compareVersions self.rust-cbindgen.version self.rust-cbindgen-latest.version < 0
then self.rust-cbindgen-latest else self.rust-cbindgen;
# Due to std::ascii::AsciiExt changes in 1.23, Gecko does not compile, so
# use the latest Rust version before 1.23.
# rust = (super.rustChannelOf { channel = "stable"; date = "2017-11-22"; }).rust;
# rust = (super.rustChannelOf { channel = "stable"; date = "2020-03-12"; }).rust;
inherit (self.latest.rustChannels.stable) rust;
};
};
# Use rust-cbindgen imported from Nixpkgs (September 2018) unless the current
# version of Nixpkgs already packages a version of rust-cbindgen.
rust-cbindgen-latest = super.callPackage ./pkgs/cbindgen {
rustPlatform = super.makeRustPlatform {
cargo = self.latest.rustChannels.stable.rust;
rustc = self.latest.rustChannels.stable.rust;
};
};
jsdoc = super.callPackage ./pkgs/jsdoc {};
}
================================================
FILE: flake.nix
================================================
{
description = "Mozilla overlay for Nixpkgs";
outputs = { self, ... }: {
# Default overlay.
overlay = import ./default.nix;
# Inidividual overlays.
overlays = {
lib = import ./lib-overlay.nix;
rust = import ./rust-overlay.nix;
firefox = import ./firefox-overlay.nix;
git-cinnabar = import ./git-cinnabar-overlay.nix;
};
};
}
================================================
FILE: git-cinnabar-overlay.nix
================================================
self: super:
{
git-cinnabar = super.callPackage ./pkgs/git-cinnabar {
# we need urllib to recognize ssh.
# python = self.pythonFull;
python = self.mercurial.python;
};
}
================================================
FILE: lib/parseTOML.nix
================================================
with builtins;
# Tokenizer.
let
layout_pat = "[ \n]+";
layout_pat_opt = "[ \n]*";
token_pat = ''=|[[][[][a-zA-Z0-9_."*-]+[]][]]|[[][a-zA-Z0-9_."*-]+[]]|[[][^]]+[]]|[a-zA-Z0-9_-]+|"[^"]*"''; #"
tokenizer_1_11 = str:
let
tokenizer_rec = len: prevTokens: patterns: str:
let
pattern = head patterns;
layoutAndTokens = match pattern str;
matchLength = stringLength (head layoutAndTokens);
tokens = prevTokens ++ tail layoutAndTokens;
in
if layoutAndTokens == null then
# if we cannot reduce the pattern, return the list of token
if tail patterns == [] then prevTokens
# otherwise, take the next pattern, which only captures half the token.
else tokenizer_rec len prevTokens (tail patterns) str
else tokenizer_rec len tokens patterns (substring matchLength len str);
avgTokenSize = 100;
ceilLog2 = v:
let inner = n: i: if i < v then inner (n + 1) (i * 2) else n; in
inner 1 1;
# The builtins.match function match the entire string, and generate a list of all captured
# elements. This is the most efficient way to make a tokenizer, if we can make a pattern which
# capture all token of the file. Unfortunately C++ std::regex does not support captures in
# repeated patterns. As a work-around, we generate patterns which are matching tokens in multiple
# of 2, such that we can avoid iterating too many times over the content.
generatePatterns = str:
let
depth = ceilLog2 (stringLength str / avgTokenSize);
inner = depth:
if depth == 0 then [ "(${token_pat})" ]
else
let next = inner (depth - 1); in
[ "${head next}${layout_pat}${head next}" ] ++ next;
in
map (pat: "(${layout_pat_opt}${pat}).*" ) (inner depth);
in
tokenizer_rec (stringLength str) [] (generatePatterns str) str;
tokenizer_1_12 = str:
let
# Nix 1.12 has the builtins.split function which allow to tokenize the
# file quickly. by iterating with a simple regexp.
layoutTokenList = split "(${token_pat})" str;
isLayout = s: match layout_pat_opt s != null;
filterLayout = list:
filter (s:
if isString s then
if isLayout s then false
else throw "Error: Unexpected token: '${s}'"
else true) list;
removeTokenWrapper = list:
map (x: assert tail x == []; head x) list;
in
removeTokenWrapper (filterLayout layoutTokenList);
tokenizer =
if builtins ? split
then tokenizer_1_12
else tokenizer_1_11;
in
# Parse entry headers
let
unescapeString = str:
# Let's ignore any escape character for the moment.
assert match ''"[^"]*"'' str != null; #"
substring 1 (stringLength str - 2) str;
# Match the content of TOML format section names.
ident_pat = ''[a-zA-Z0-9_-]+|"[^"]*"''; #"
removeBraces = token: wrapLen:
substring wrapLen (stringLength token - 2 * wrapLen) token;
# Note, this implementation is limited to 11 identifiers.
matchPathFun_1_11 = token:
let
# match header_pat "a.b.c" == [ "a" ".b" "b" ".c" "c" ]
header_pat =
foldl' (pat: n: "(${ident_pat})([.]${pat})?")
"(${ident_pat})" (genList (n: 0) 10);
matchPath = match header_pat token;
filterDot = filter (s: substring 0 1 s != ".") matchPath;
in
filterDot;
matchPathFun_1_12 = token:
map (e: head e)
(filter (s: isList s)
(split "(${ident_pat})" token));
matchPathFun =
if builtins ? split
then matchPathFun_1_12
else matchPathFun_1_11;
headerToPath = token: wrapLen:
let
token' = removeBraces token wrapLen;
matchPath = matchPathFun token';
path =
map (s:
if substring 0 1 s != ''"'' then s #"
else unescapeString s
) matchPath;
in
assert matchPath != null;
# assert trace "Path: ${token'}; match as ${toString path}" true;
path;
in
# Reconstruct the equivalent attribute set.
let
tokenToValue = token:
if token == "true" then true
else if token == "false" then false
# TODO: convert the TOML list into a Nix list.
else if match "[[][^]]+[]]" token != null then token
else unescapeString token;
parserInitState = {
idx = 0;
path = [];
isList = false;
output = [];
elem = {};
};
# Imported from nixpkgs library.
setAttrByPath = attrPath: value:
if attrPath == [] then value
else listToAttrs
[ { name = head attrPath; value = setAttrByPath (tail attrPath) value; } ];
closeSection = state:
state // {
output = state.output ++ [ (setAttrByPath state.path (
if state.isList then [ state.elem ]
else state.elem
)) ];
};
readToken = state: token:
# assert trace "Read '${token}'" true;
if state.idx == 0 then
if substring 0 2 token == "[[" then
(closeSection state) // {
path = headerToPath token 2;
isList = true;
elem = {};
}
else if substring 0 1 token == "[" then
(closeSection state) // {
path = headerToPath token 1;
isList = false;
elem = {};
}
else
assert match "[a-zA-Z0-9_-]+" token != null;
state // { idx = 1; name = token; }
else if state.idx == 1 then
assert token == "=";
state // { idx = 2; }
else
assert state.idx == 2;
state // {
idx = 0;
elem = state.elem // {
"${state.name}" = tokenToValue token;
};
};
# aggregate each section as individual attribute sets.
parser = str:
closeSection (foldl' readToken parserInitState (tokenizer str));
fromTOML = toml:
let
sections = (parser toml).output;
# Inlined from nixpkgs library functions.
zipAttrs = sets:
listToAttrs (map (n: {
name = n;
value =
let v = catAttrs n sets; in
# assert trace "Visiting ${n}" true;
if tail v == [] then head v
else if isList (head v) then concatLists v
else if isAttrs (head v) then zipAttrs v
else throw "cannot merge sections";
}) (concatLists (map attrNames sets)));
in
zipAttrs sections;
in
{
testing = fromTOML (builtins.readFile ./channel-rust-nightly.toml);
testing_url = fromTOML (builtins.readFile (builtins.fetchurl
"https://static.rust-lang.org/dist/channel-rust-nightly.toml"));
inherit fromTOML;
}
================================================
FILE: lib-overlay.nix
================================================
self: super:
{
lib = super.lib // (import ./pkgs/lib/default.nix { pkgs = self; });
}
================================================
FILE: overlays.nix
================================================
[
./lib-overlay.nix
./rust-overlay.nix
./firefox-overlay.nix
./git-cinnabar-overlay.nix
]
================================================
FILE: package-set.nix
================================================
{ pkgs }:
with pkgs.lib;
let
self = foldl'
(prev: overlay: prev // (overlay (pkgs // self) (pkgs // prev)))
{} (map import (import ./overlays.nix));
in self
================================================
FILE: phlay-overlay.nix
================================================
self: super:
{
phlay = super.callPackage ./pkgs/phlay {};
}
================================================
FILE: pinned.nix
================================================
# This script extends nixpkgs with mozilla packages.
#
# First it imports the <nixpkgs> in the environment and depends on it
# providing fetchFromGitHub and lib.importJSON.
#
# After that it loads a pinned release of nixos-unstable and uses that as the
# base for the rest of packaging. One can pass it's own pkgsPath attribute if
# desired, probably in the context of hydra.
{ pkgsPath ? null
, overlays ? []
, system ? null
, geckoSrc ? null
}:
# Pin a specific version of Nixpkgs.
let
_pkgs = import <nixpkgs> {};
_pkgsPath =
if pkgsPath != null then pkgsPath
else _pkgs.fetchFromGitHub (_pkgs.lib.importJSON ./pkgs/nixpkgs.json);
nixpkgs = import _pkgsPath ({
overlays = import ./default.nix ++ overlays;
} // (if system != null then { inherit system; } else {}));
in
nixpkgs // {
# Do not add a name attribute attribute in an overlay !!! As this will cause
# tons of recompilations.
name = "nixpkgs";
updateScript = nixpkgs.lib.updateFromGitHub {
owner = "NixOS";
repo = "nixpkgs-channels";
branch = "nixos-unstable-small";
path = "pkgs/nixpkgs.json";
};
}
================================================
FILE: pkgs/cbindgen/default.nix
================================================
### NOTE: This file is a copy of the one from Nixpkgs repository
### (taken 2020 February) from commit 82d9ce45fe0b67e3708ab6ba47ffcb4bba09945d.
### It is used when the version of cbindgen in
### upstream nixpkgs is not up-to-date enough to compile Firefox.
{ stdenv, lib, fetchFromGitHub, rustPlatform
# , Security
}:
rustPlatform.buildRustPackage rec {
name = "rust-cbindgen-${version}";
version = "0.14.3";
src = fetchFromGitHub {
owner = "eqrion";
repo = "cbindgen";
rev = "v${version}";
sha256 = "0pw55334i10k75qkig8bgcnlsy613zw2p5j4xyz8v71s4vh1a58j";
};
cargoSha256 = "0088ijnjhqfvdb1wxy9jc7hq8c0yxgj5brlg68n9vws1mz9rilpy";
# buildInputs = lib.optional stdenv.isDarwin Security;
checkFlags = [
# https://github.com/eqrion/cbindgen/issues/338
"--skip test_expand"
];
# https://github.com/NixOS/nixpkgs/issues/61618
postConfigure = ''
mkdir .cargo
touch .cargo/.package-cache
export HOME=`pwd`
'';
meta = with lib; {
description = "A project for generating C bindings from Rust code";
homepage = "https://github.com/eqrion/cbindgen";
license = licenses.mpl20;
maintainers = with maintainers; [ jtojnar andir ];
};
}
================================================
FILE: pkgs/clang/bug-14435.patch
================================================
diff -x _inst -x _build -x .svn -ur libcxx.old/include/cstdio libcxx.new/include/cstdio
--- libcxx.old/include/cstdio 2016-07-08 12:47:12.964181871 +0000
+++ libcxx.new/include/cstdio 2016-07-08 12:47:27.540149147 +0000
@@ -109,15 +109,15 @@
#endif
#ifdef getc
-inline _LIBCPP_INLINE_VISIBILITY int __libcpp_getc(FILE* __stream) {return getc(__stream);}
+inline __attribute__ ((__always_inline__)) int __libcpp_getc(FILE* __stream) {return getc(__stream);}
#undef getc
-inline _LIBCPP_INLINE_VISIBILITY int getc(FILE* __stream) {return __libcpp_getc(__stream);}
+inline __attribute__ ((__always_inline__)) int getc(FILE* __stream) {return __libcpp_getc(__stream);}
#endif // getc
#ifdef putc
-inline _LIBCPP_INLINE_VISIBILITY int __libcpp_putc(int __c, FILE* __stream) {return putc(__c, __stream);}
+inline __attribute__ ((__always_inline__)) int __libcpp_putc(int __c, FILE* __stream) {return putc(__c, __stream);}
#undef putc
-inline _LIBCPP_INLINE_VISIBILITY int putc(int __c, FILE* __stream) {return __libcpp_putc(__c, __stream);}
+inline __attribute__ ((__always_inline__)) int putc(int __c, FILE* __stream) {return __libcpp_putc(__c, __stream);}
#endif // putc
#ifdef clearerr
diff -x _inst -x _build -x .svn -ur libcxx.old/include/utility libcxx.new/include/utility
--- libcxx.old/include/utility 2016-07-08 12:46:02.570334913 +0000
+++ libcxx.new/include/utility 2016-07-08 12:51:00.760636878 +0000
@@ -217,7 +217,7 @@
}
template<class _Tp, size_t _Np>
-inline _LIBCPP_INLINE_VISIBILITY
+inline __attribute__ ((__always_inline__))
void
swap(_Tp (&__a)[_Np], _Tp (&__b)[_Np]) _NOEXCEPT_(__is_nothrow_swappable<_Tp>::value)
{
================================================
FILE: pkgs/firefox-nightly-bin/update.nix
================================================
{ name
, writeScript
, xidel
, coreutils
, gnused
, gnugrep
, curl
, jq
}:
let
version = (builtins.parseDrvName name).version;
in writeScript "update-firefox-nightly-bin" ''
PATH=${coreutils}/bin:${gnused}/bin:${gnugrep}/bin:${xidel}/bin:${curl}/bin:${jq}/bin
#set -eux
pushd pkgs/firefox-nightly-bin
tmpfile=`mktemp`
url=https://archive.mozilla.org/pub/firefox/nightly/latest-mozilla-central/
nightly_file=`curl $url | \
xidel - --extract //a | \
grep firefox | \
grep linux-x86_64.json | \
tail -1 | \
sed -e 's/.json//'`
nightly_json=`curl --silent $url$nightly_file.json`
cat > $tmpfile <<EOF
{
version = `echo $nightly_json | jq ."moz_app_version"` + "-" + `echo $nightly_json | jq ."buildid"`;
sources = [
{ url = "$url$nightly_file.tar.xz";
locale = "`echo $nightly_file | cut -d"." -f3`";
arch = "`echo $nightly_file | cut -d"." -f4`";
sha512 = "`curl --silent $url$nightly_file.checksums | grep $nightly_file.tar.xz$ | grep sha512 | cut -d" " -f1`";
}
];
}
EOF
mv $tmpfile sources.nix
popd
''
================================================
FILE: pkgs/gcc-4.7/arm-eabi.patch
================================================
Index: gcc-4_7-branch/libstdc++-v3/configure.host
===================================================================
--- gcc-4_7-branch/libstdc++-v3/configure.host (revision 194579)
+++ gcc-4_7-branch/libstdc++-v3/configure.host (revision 194580)
@@ -340,7 +340,7 @@
fi
esac
case "${host}" in
- arm*-*-linux-*eabi)
+ arm*-*-linux-*eabi*)
port_specific_symbol_files="\$(srcdir)/../config/os/gnu-linux/arm-eabi-extra.ver"
;;
esac
Index: gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_signed/requirements/typedefs-2.cc
===================================================================
--- gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_signed/requirements/typedefs-2.cc (revision 194579)
+++ gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_signed/requirements/typedefs-2.cc (revision 194580)
@@ -1,5 +1,5 @@
// { dg-options "-std=gnu++0x -funsigned-char -fshort-enums" }
-// { dg-options "-std=gnu++0x -funsigned-char -fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } }
+// { dg-options "-std=gnu++0x -funsigned-char -fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } }
// 2007-05-03 Benjamin Kosnik <bkoz@redhat.com>
//
Index: gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_unsigned/requirements/typedefs-2.cc
===================================================================
--- gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_unsigned/requirements/typedefs-2.cc (revision 194579)
+++ gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_unsigned/requirements/typedefs-2.cc (revision 194580)
@@ -1,5 +1,5 @@
// { dg-options "-std=gnu++0x -funsigned-char -fshort-enums" }
-// { dg-options "-std=gnu++0x -funsigned-char -fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } }
+// { dg-options "-std=gnu++0x -funsigned-char -fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } }
// 2007-05-03 Benjamin Kosnik <bkoz@redhat.com>
//
Index: gcc-4_7-branch/libjava/configure.ac
===================================================================
--- gcc-4_7-branch/libjava/configure.ac (revision 194579)
+++ gcc-4_7-branch/libjava/configure.ac (revision 194580)
@@ -931,7 +931,7 @@
# on Darwin -single_module speeds up loading of the dynamic libraries.
extra_ldflags_libjava=-Wl,-single_module
;;
-arm*linux*eabi)
+arm*-*-linux*eabi*)
# Some of the ARM unwinder code is actually in libstdc++. We
# could in principle replicate it in libgcj, but it's better to
# have a dependency on libstdc++.
Index: gcc-4_7-branch/libjava/configure
===================================================================
--- gcc-4_7-branch/libjava/configure (revision 194579)
+++ gcc-4_7-branch/libjava/configure (revision 194580)
@@ -20542,7 +20542,7 @@
# on Darwin -single_module speeds up loading of the dynamic libraries.
extra_ldflags_libjava=-Wl,-single_module
;;
-arm*linux*eabi)
+arm*-*-linux*eabi*)
# Some of the ARM unwinder code is actually in libstdc++. We
# could in principle replicate it in libgcj, but it's better to
# have a dependency on libstdc++.
Index: gcc-4_7-branch/libgcc/config.host
===================================================================
--- gcc-4_7-branch/libgcc/config.host (revision 194579)
+++ gcc-4_7-branch/libgcc/config.host (revision 194580)
@@ -327,7 +327,7 @@
arm*-*-linux*) # ARM GNU/Linux with ELF
tmake_file="${tmake_file} arm/t-arm t-fixedpoint-gnu-prefix"
case ${host} in
- arm*-*-linux-*eabi)
+ arm*-*-linux-*eabi*)
tmake_file="${tmake_file} arm/t-elf arm/t-bpabi arm/t-linux-eabi t-slibgcc-libgcc"
tm_file="$tm_file arm/bpabi-lib.h"
unwind_header=config/arm/unwind-arm.h
Index: gcc-4_7-branch/gcc/doc/install.texi
===================================================================
--- gcc-4_7-branch/gcc/doc/install.texi (revision 194579)
+++ gcc-4_7-branch/gcc/doc/install.texi (revision 194580)
@@ -3222,7 +3222,7 @@
@heading @anchor{arm-x-eabi}arm-*-eabi
ARM-family processors. Subtargets that use the ELF object format
require GNU binutils 2.13 or newer. Such subtargets include:
-@code{arm-*-netbsdelf}, @code{arm-*-*linux-gnueabi}
+@code{arm-*-netbsdelf}, @code{arm-*-*linux-gnueabi*}
and @code{arm-*-rtemseabi}.
@html
Index: gcc-4_7-branch/gcc/testsuite/gcc.target/arm/synchronize.c
===================================================================
--- gcc-4_7-branch/gcc/testsuite/gcc.target/arm/synchronize.c (revision 194579)
+++ gcc-4_7-branch/gcc/testsuite/gcc.target/arm/synchronize.c (revision 194580)
@@ -1,4 +1,4 @@
-/* { dg-final { scan-assembler "__sync_synchronize|dmb|mcr" { target arm*-*-linux-*eabi } } } */
+/* { dg-final { scan-assembler "__sync_synchronize|dmb|mcr" { target arm*-*-linux-*eabi* } } } */
void *foo (void)
{
Index: gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.jason/enum6.C
===================================================================
--- gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.jason/enum6.C (revision 194579)
+++ gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.jason/enum6.C (revision 194580)
@@ -7,10 +7,10 @@
// enum-size attributes should only be emitted if there are values of
// enum type that can escape the compilation unit, gcc cannot currently
// detect this; if this facility is added then this linker option should
-// not be needed. arm-*-linux*eabi should be a good approximation to
+// not be needed. arm-*-linux*eabi* should be a good approximation to
// those platforms where the EABI supplement defines enum values to be
// 32 bits wide.
-// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } }
+// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } }
#include <limits.h>
Index: gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.other/enum4.C
===================================================================
--- gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.other/enum4.C (revision 194579)
+++ gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.other/enum4.C (revision 194580)
@@ -9,10 +9,10 @@
// enum-size attributes should only be emitted if there are values of
// enum type that can escape the compilation unit, gcc cannot currently
// detect this; if this facility is added then this linker option should
-// not be needed. arm-*-linux*eabi should be a good approximation to
+// not be needed. arm-*-linux*eabi* should be a good approximation to
// those platforms where the EABI supplement defines enum values to be
// 32 bits wide.
-// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } }
+// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } }
enum E {
a = -312
Index: gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.law/enum9.C
===================================================================
--- gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.law/enum9.C (revision 194579)
+++ gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.law/enum9.C (revision 194580)
@@ -7,10 +7,10 @@
// enum-size attributes should only be emitted if there are values of
// enum type that can escape the compilation unit, gcc cannot currently
// detect this; if this facility is added then this linker option should
-// not be needed. arm-*-linux*eabi should be a good approximation to
+// not be needed. arm-*-linux*eabi* should be a good approximation to
// those platforms where the EABI supplement defines enum values to be
// 32 bits wide.
-// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } }
+// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } }
// GROUPS passed enums
extern "C" int printf (const char *, ...);
Index: gcc-4_7-branch/gcc/testsuite/lib/target-supports.exp
===================================================================
--- gcc-4_7-branch/gcc/testsuite/lib/target-supports.exp (revision 194579)
+++ gcc-4_7-branch/gcc/testsuite/lib/target-supports.exp (revision 194580)
@@ -3818,7 +3818,7 @@
}
} ""
}]
- } elseif { [istarget arm*-*-linux-gnueabi] } {
+ } elseif { [istarget arm*-*-linux-gnueabi*] } {
return [check_runtime sync_longlong_runtime {
#include <stdlib.h>
int main ()
@@ -3860,7 +3860,7 @@
|| [istarget i?86-*-*]
|| [istarget x86_64-*-*]
|| [istarget alpha*-*-*]
- || [istarget arm*-*-linux-gnueabi]
+ || [istarget arm*-*-linux-gnueabi*]
|| [istarget bfin*-*linux*]
|| [istarget hppa*-*linux*]
|| [istarget s390*-*-*]
@@ -3890,7 +3890,7 @@
|| [istarget i?86-*-*]
|| [istarget x86_64-*-*]
|| [istarget alpha*-*-*]
- || [istarget arm*-*-linux-gnueabi]
+ || [istarget arm*-*-linux-gnueabi*]
|| [istarget hppa*-*linux*]
|| [istarget s390*-*-*]
|| [istarget powerpc*-*-*]
Index: gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_9.f90
===================================================================
--- gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_9.f90 (revision 194579)
+++ gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_9.f90 (revision 194580)
@@ -1,6 +1,6 @@
! { dg-do run }
! { dg-options "-fshort-enums" }
-! { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } }
+! { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } }
! Program to test enumerations when option -fshort-enums is given
program main
Index: gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_10.f90
===================================================================
--- gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_10.f90 (revision 194579)
+++ gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_10.f90 (revision 194580)
@@ -1,7 +1,7 @@
! { dg-do run }
! { dg-additional-sources enum_10.c }
! { dg-options "-fshort-enums -w" }
-! { dg-options "-fshort-enums -w -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } }
+! { dg-options "-fshort-enums -w -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } }
! Make sure short enums are indeed interoperable with the
! corresponding C type.
Index: gcc-4_7-branch/gcc/ada/gcc-interface/Makefile.in
===================================================================
--- gcc-4_7-branch/gcc/ada/gcc-interface/Makefile.in (revision 194579)
+++ gcc-4_7-branch/gcc/ada/gcc-interface/Makefile.in (revision 194580)
@@ -1866,7 +1866,7 @@
LIBRARY_VERSION := $(LIB_VERSION)
endif
-ifeq ($(strip $(filter-out arm% linux-gnueabi,$(arch) $(osys)-$(word 4,$(targ)))),)
+ifeq ($(strip $(filter-out arm%-linux,$(arch)-$(osys)) $(if $(findstring eabi,$(word 4,$(targ))),,$(word 4,$(targ)))),)
LIBGNAT_TARGET_PAIRS = \
a-intnam.ads<a-intnam-linux.ads \
s-inmaop.adb<s-inmaop-posix.adb \
Index: gcc-4_7-branch/gcc/config.gcc
===================================================================
--- gcc-4_7-branch/gcc/config.gcc (revision 194579)
+++ gcc-4_7-branch/gcc/config.gcc (revision 194580)
@@ -855,7 +855,7 @@
esac
tmake_file="${tmake_file} arm/t-arm"
case ${target} in
- arm*-*-linux-*eabi)
+ arm*-*-linux-*eabi*)
tm_file="$tm_file arm/bpabi.h arm/linux-eabi.h"
tmake_file="$tmake_file arm/t-arm-elf arm/t-bpabi arm/t-linux-eabi"
# Define multilib configuration for arm-linux-androideabi.
================================================
FILE: pkgs/gcc-4.7/builder.sh
================================================
source $stdenv/setup
export NIX_FIXINC_DUMMY=$NIX_BUILD_TOP/dummy
mkdir $NIX_FIXINC_DUMMY
if test "$staticCompiler" = "1"; then
EXTRA_LDFLAGS="-static"
else
EXTRA_LDFLAGS=""
fi
# GCC interprets empty paths as ".", which we don't want.
if test -z "$CPATH"; then unset CPATH; fi
if test -z "$LIBRARY_PATH"; then unset LIBRARY_PATH; fi
echo "\$CPATH is \`$CPATH'"
echo "\$LIBRARY_PATH is \`$LIBRARY_PATH'"
if test "$noSysDirs" = "1"; then
if test -e $NIX_GCC/nix-support/orig-libc; then
# Figure out what extra flags to pass to the gcc compilers
# being generated to make sure that they use our glibc.
extraFlags="$(cat $NIX_GCC/nix-support/libc-cflags)"
extraLDFlags="$(cat $NIX_GCC/nix-support/libc-ldflags) $(cat $NIX_GCC/nix-support/libc-ldflags-before)"
# Use *real* header files, otherwise a limits.h is generated
# that does not include Glibc's limits.h (notably missing
# SSIZE_MAX, which breaks the build).
export NIX_FIXINC_DUMMY=$(cat $NIX_GCC/nix-support/orig-libc)/include
# The path to the Glibc binaries such as `crti.o'.
glibc_libdir="$(cat $NIX_GCC/nix-support/orig-libc)/lib"
else
# Hack: support impure environments.
extraFlags="-isystem /usr/include"
extraLDFlags="-L/usr/lib64 -L/usr/lib"
glibc_libdir="/usr/lib"
export NIX_FIXINC_DUMMY=/usr/include
fi
extraFlags="-I$NIX_FIXINC_DUMMY $extraFlags"
extraLDFlags="-L$glibc_libdir -rpath $glibc_libdir $extraLDFlags"
# BOOT_CFLAGS defaults to `-g -O2'; since we override it below,
# make sure to explictly add them so that files compiled with the
# bootstrap compiler are optimized and (optionally) contain
# debugging information (info "(gccinstall) Building").
if test -n "$dontStrip"; then
extraFlags="-O2 -g $extraFlags"
else
# Don't pass `-g' at all; this saves space while building.
extraFlags="-O2 $extraFlags"
fi
EXTRA_FLAGS="$extraFlags"
for i in $extraLDFlags; do
EXTRA_LDFLAGS="$EXTRA_LDFLAGS -Wl,$i"
done
if test -n "$targetConfig"; then
# Cross-compiling, we need gcc not to read ./specs in order to build
# the g++ compiler (after the specs for the cross-gcc are created).
# Having LIBRARY_PATH= makes gcc read the specs from ., and the build
# breaks. Having this variable comes from the default.nix code to bring
# gcj in.
unset LIBRARY_PATH
unset CPATH
if test -z "$crossStageStatic"; then
EXTRA_TARGET_CFLAGS="-B${libcCross}/lib -idirafter ${libcCross}/include"
EXTRA_TARGET_LDFLAGS="-Wl,-L${libcCross}/lib -Wl,-rpath,${libcCross}/lib -Wl,-rpath-link,${libcCross}/lib"
fi
else
if test -z "$NIX_GCC_CROSS"; then
EXTRA_TARGET_CFLAGS="$EXTRA_FLAGS"
EXTRA_TARGET_CXXFLAGS="$EXTRA_FLAGS"
EXTRA_TARGET_LDFLAGS="$EXTRA_LDFLAGS"
else
# This the case of cross-building the gcc.
# We need special flags for the target, different than those of the build
# Assertion:
test -e $NIX_GCC_CROSS/nix-support/orig-libc
# Figure out what extra flags to pass to the gcc compilers
# being generated to make sure that they use our glibc.
extraFlags="$(cat $NIX_GCC_CROSS/nix-support/libc-cflags)"
extraLDFlags="$(cat $NIX_GCC_CROSS/nix-support/libc-ldflags) $(cat $NIX_GCC_CROSS/nix-support/libc-ldflags-before)"
# Use *real* header files, otherwise a limits.h is generated
# that does not include Glibc's limits.h (notably missing
# SSIZE_MAX, which breaks the build).
NIX_FIXINC_DUMMY_CROSS=$(cat $NIX_GCC_CROSS/nix-support/orig-libc)/include
# The path to the Glibc binaries such as `crti.o'.
glibc_dir="$(cat $NIX_GCC_CROSS/nix-support/orig-libc)"
glibc_libdir="$glibc_dir/lib"
configureFlags="$configureFlags --with-native-system-header-dir=$glibc_dir/include"
extraFlags="-I$NIX_FIXINC_DUMMY_CROSS $extraFlags"
extraLDFlags="-L$glibc_libdir -rpath $glibc_libdir $extraLDFlags"
EXTRA_TARGET_CFLAGS="$extraFlags"
for i in $extraLDFlags; do
EXTRA_TARGET_LDFLAGS="$EXTRA_TARGET_LDFLAGS -Wl,$i"
done
fi
fi
# CFLAGS_FOR_TARGET are needed for the libstdc++ configure script to find
# the startfiles.
# FLAGS_FOR_TARGET are needed for the target libraries to receive the -Bxxx
# for the startfiles.
makeFlagsArray=( \
"${makeFlagsArray[@]}" \
NATIVE_SYSTEM_HEADER_DIR="$NIX_FIXINC_DUMMY" \
SYSTEM_HEADER_DIR="$NIX_FIXINC_DUMMY" \
CFLAGS_FOR_BUILD="$EXTRA_FLAGS $EXTRA_LDFLAGS" \
CXXFLAGS_FOR_BUILD="$EXTRA_FLAGS $EXTRA_LDFLAGS" \
CFLAGS_FOR_TARGET="$EXTRA_TARGET_CFLAGS $EXTRA_TARGET_LDFLAGS" \
CXXFLAGS_FOR_TARGET="$EXTRA_TARGET_CFLAGS $EXTRA_TARGET_LDFLAGS" \
FLAGS_FOR_TARGET="$EXTRA_TARGET_CFLAGS $EXTRA_TARGET_LDFLAGS" \
LDFLAGS_FOR_BUILD="$EXTRA_FLAGS $EXTRA_LDFLAGS" \
LDFLAGS_FOR_TARGET="$EXTRA_TARGET_LDFLAGS $EXTRA_TARGET_LDFLAGS" \
)
if test -z "$targetConfig"; then
makeFlagsArray=( \
"${makeFlagsArray[@]}" \
BOOT_CFLAGS="$EXTRA_FLAGS $EXTRA_LDFLAGS" \
BOOT_LDFLAGS="$EXTRA_TARGET_CFLAGS $EXTRA_TARGET_LDFLAGS" \
)
fi
if test -n "$targetConfig" -a "$crossStageStatic" == 1; then
# We don't want the gcc build to assume there will be a libc providing
# limits.h in this stagae
makeFlagsArray=( \
"${makeFlagsArray[@]}" \
LIMITS_H_TEST=false \
)
else
makeFlagsArray=( \
"${makeFlagsArray[@]}" \
LIMITS_H_TEST=true \
)
fi
fi
if test -n "$targetConfig"; then
# The host strip will destroy some important details of the objects
dontStrip=1
fi
providedPreConfigure="$preConfigure";
preConfigure() {
if test -n "$newlibSrc"; then
tar xvf "$newlibSrc" -C ..
ln -s ../newlib-*/newlib newlib
# Patch to get armvt5el working:
sed -i -e 's/ arm)/ arm*)/' newlib/configure.host
fi
# Bug - they packaged zlib
if test -d "zlib"; then
# This breaks the build without-headers, which should build only
# the target libgcc as target libraries.
# See 'configure:5370'
rm -Rf zlib
fi
if test -f "$NIX_GCC/nix-support/orig-libc"; then
# Patch the configure script so it finds glibc headers. It's
# important for example in order not to get libssp built,
# because its functionality is in glibc already.
glibc_headers="$(cat $NIX_GCC/nix-support/orig-libc)/include"
sed -i \
-e "s,glibc_header_dir=/usr/include,glibc_header_dir=$glibc_headers", \
gcc/configure
fi
if test -n "$crossMingw" -a -n "$crossStageStatic"; then
mkdir -p ../mingw
# --with-build-sysroot expects that:
cp -R $libcCross/include ../mingw
configureFlags="$configureFlags --with-build-sysroot=`pwd`/.."
fi
# Eval the preConfigure script from nix expression.
eval $providedPreConfigure;
env;
# Perform the build in a different directory.
mkdir ../build
cd ../build
configureScript=../$sourceRoot/configure
}
postConfigure() {
# Don't store the configure flags in the resulting executables.
sed -e '/TOPLEVEL_CONFIGURE_ARGUMENTS=/d' -i Makefile
}
postInstall() {
# Remove precompiled headers for now. They are very big and
# probably not very useful yet.
find $out/include -name "*.gch" -exec rm -rf {} \; -prune
# Remove `fixincl' to prevent a retained dependency on the
# previous gcc.
rm -rf $out/libexec/gcc/*/*/install-tools
rm -rf $out/lib/gcc/*/*/install-tools
# More dependencies with the previous gcc or some libs (gccbug stores the build command line)
rm -rf $out/bin/gccbug
# Take out the bootstrap-tools from the rpath, as it's not needed at all having $out
for i in $out/libexec/gcc/*/*/*; do
if PREV_RPATH=`patchelf --print-rpath $i`; then
patchelf --set-rpath `echo $PREV_RPATH | sed 's,:[^:]*bootstrap-tools/lib,,'` $i
fi
done
# Get rid of some "fixed" header files
rm -rf $out/lib/gcc/*/*/include/root
# Replace hard links for i686-pc-linux-gnu-gcc etc. with symlinks.
for i in $out/bin/*-gcc*; do
if cmp -s $out/bin/gcc $i; then
ln -sfn gcc $i
fi
done
for i in $out/bin/c++ $out/bin/*-c++* $out/bin/*-g++*; do
if cmp -s $out/bin/g++ $i; then
ln -sfn g++ $i
fi
done
eval "$postInstallGhdl"
}
genericBuild
================================================
FILE: pkgs/gcc-4.7/default.nix
================================================
{ stdenv, lib, fetchurl, noSysDirs
, langC ? true, langCC ? true, langFortran ? false
, langJava ? false
, langAda ? false
, langVhdl ? false
, langGo ? false
, profiledCompiler ? false
, staticCompiler ? false
, enableShared ? true
, texinfo ? null
, perl ? null # optional, for texi2pod (then pod2man); required for Java
, gmp, mpfr, mpc, gettext, which
, libelf # optional, for link-time optimizations (LTO)
, ppl ? null, cloog ? null # optional, for the Graphite optimization framework.
, zlib ? null, boehmgc ? null
, zip ? null, unzip ? null, pkgconfig ? null, gtk ? null, libart_lgpl ? null
, libX11 ? null, libXt ? null, libSM ? null, libICE ? null, libXtst ? null
, libXrender ? null, xproto ? null, renderproto ? null, xextproto ? null
, libXrandr ? null, libXi ? null, inputproto ? null, randrproto ? null
, gnatboot ? null
, enableMultilib ? false
, enablePlugin ? true # whether to support user-supplied plug-ins
, name ? "gcc"
, cross ? null
, binutilsCross ? null
, libcCross ? null
, crossStageStatic ? true
, gnat ? null
, libpthread ? null, libpthreadCross ? null # required for GNU/Hurd
, stripped ? true
, gnused ? null
}:
assert langJava -> zip != null && unzip != null
&& zlib != null && boehmgc != null
&& perl != null; # for `--enable-java-home'
assert langAda -> gnatboot != null;
assert langVhdl -> gnat != null;
# LTO needs libelf and zlib.
assert libelf != null -> zlib != null;
# Make sure we get GNU sed.
assert stdenv.isDarwin -> gnused != null;
# The go frontend is written in c++
assert langGo -> langCC;
with lib;
with builtins;
let version = "4.7.3";
# Whether building a cross-compiler for GNU/Hurd.
crossGNU = cross != null && cross.config == "i586-pc-gnu";
/* gccinstall.info says that "parallel make is currently not supported since
collisions in profile collecting may occur".
Parallel make of gfortran is disabled because of an apparent race
condition concerning the generation of "bconfig.h". Please try and
re-enable parallel make for a later release of gfortran to check whether
the error has been fixed.
*/
enableParallelBuilding = !profiledCompiler && !langFortran;
patches = []
++ optional enableParallelBuilding ./parallel-bconfig-4.7.patch
++ optional stdenv.isArm [ ./arm-eabi.patch ]
++ optional (cross != null) ./libstdc++-target.patch
# ++ optional noSysDirs ./no-sys-dirs.patch
# The GNAT Makefiles did not pay attention to CFLAGS_FOR_TARGET for its
# target libraries and tools.
++ optional langAda ./gnat-cflags.patch
++ optional langFortran ./gfortran-driving.patch;
javaEcj = fetchurl {
# The `$(top_srcdir)/ecj.jar' file is automatically picked up at
# `configure' time.
# XXX: Eventually we might want to take it from upstream.
url = "ftp://sourceware.org/pub/java/ecj-4.3.jar";
sha256 = "0jz7hvc0s6iydmhgh5h2m15yza7p2rlss2vkif30vm9y77m97qcx";
};
# Antlr (optional) allows the Java `gjdoc' tool to be built. We want a
# binary distribution here to allow the whole chain to be bootstrapped.
javaAntlr = fetchurl {
url = "http://www.antlr.org/download/antlr-3.1.3.jar";
sha256 = "1f41j0y4kjydl71lqlvr73yagrs2jsg1fjymzjz66mjy7al5lh09";
};
xlibs = [
libX11 libXt libSM libICE libXtst libXrender libXrandr libXi
xproto renderproto xextproto inputproto randrproto
];
javaAwtGtk = langJava && gtk != null;
/* Platform flags */
platformFlags = let
gccArch = lib.attrByPath [ "platform" "gcc" "arch" ] null stdenv;
gccCpu = lib.attrByPath [ "platform" "gcc" "cpu" ] null stdenv;
gccAbi = lib.attrByPath [ "platform" "gcc" "abi" ] null stdenv;
gccFpu = lib.attrByPath [ "platform" "gcc" "fpu" ] null stdenv;
gccFloat = lib.attrByPath [ "platform" "gcc" "float" ] null stdenv;
gccMode = lib.attrByPath [ "platform" "gcc" "mode" ] null stdenv;
withArch = if gccArch != null then " --with-arch=${gccArch}" else "";
withCpu = if gccCpu != null then " --with-cpu=${gccCpu}" else "";
withAbi = if gccAbi != null then " --with-abi=${gccAbi}" else "";
withFpu = if gccFpu != null then " --with-fpu=${gccFpu}" else "";
withFloat = if gccFloat != null then " --with-float=${gccFloat}" else "";
withMode = if gccMode != null then " --with-mode=${gccMode}" else "";
in
(withArch +
withCpu +
withAbi +
withFpu +
withFloat +
withMode);
/* Cross-gcc settings */
crossMingw = (cross != null && cross.libc == "msvcrt");
crossConfigureFlags = let
gccArch = lib.attrByPath [ "gcc" "arch" ] null cross;
gccCpu = lib.attrByPath [ "gcc" "cpu" ] null cross;
gccAbi = lib.attrByPath [ "gcc" "abi" ] null cross;
gccFpu = lib.attrByPath [ "gcc" "fpu" ] null cross;
gccFloat = lib.attrByPath [ "gcc" "float" ] null cross;
gccMode = lib.attrByPath [ "gcc" "mode" ] null cross;
withArch = if gccArch != null then " --with-arch=${gccArch}" else "";
withCpu = if gccCpu != null then " --with-cpu=${gccCpu}" else "";
withAbi = if gccAbi != null then " --with-abi=${gccAbi}" else "";
withFpu = if gccFpu != null then " --with-fpu=${gccFpu}" else "";
withFloat = if gccFloat != null then " --with-float=${gccFloat}" else "";
withMode = if gccMode != null then " --with-mode=${gccMode}" else "";
in
"--target=${cross.config}" +
withArch +
withCpu +
withAbi +
withFpu +
withFloat +
withMode +
(if crossMingw && crossStageStatic then
" --with-headers=${libcCross}/include" +
" --with-gcc" +
" --with-gnu-as" +
" --with-gnu-ld" +
" --with-gnu-ld" +
" --disable-shared" +
" --disable-nls" +
" --disable-debug" +
" --enable-sjlj-exceptions" +
" --enable-threads=win32" +
" --disable-win32-registry"
else if crossStageStatic then
" --disable-libssp --disable-nls" +
" --without-headers" +
" --disable-threads " +
" --disable-libmudflap " +
" --disable-libgomp " +
" --disable-libquadmath" +
" --disable-shared" +
" --disable-decimal-float" # libdecnumber requires libc
else
" --with-headers=${libcCross}/include" +
" --enable-__cxa_atexit" +
" --enable-long-long" +
(if crossMingw then
" --enable-threads=win32" +
" --enable-sjlj-exceptions" +
" --enable-hash-synchronization" +
" --disable-libssp" +
" --disable-nls" +
" --with-dwarf2" +
# I think noone uses shared gcc libs in mingw, so we better do the same.
# In any case, mingw32 g++ linking is broken by default with shared libs,
# unless adding "-lsupc++" to any linking command. I don't know why.
" --disable-shared" +
(if cross.config == "x86_64-w64-mingw32" then
# To keep ABI compatibility with upstream mingw-w64
" --enable-fully-dynamic-string"
else "")
else (if cross.libc == "uclibc" then
# In uclibc cases, libgomp needs an additional '-ldl'
# and as I don't know how to pass it, I disable libgomp.
" --disable-libgomp" else "") +
" --enable-threads=posix" +
" --enable-nls" +
" --disable-decimal-float") # No final libdecnumber (it may work only in 386)
);
stageNameAddon = if crossStageStatic then "-stage-static" else
"-stage-final";
crossNameAddon = if cross != null then "-${cross.config}" + stageNameAddon else "";
bootstrap = cross == null && !stdenv.isArm && !stdenv.isMips;
in
# We need all these X libraries when building AWT with GTK+.
assert gtk != null -> (filter (x: x == null) xlibs) == [];
stdenv.mkDerivation ({
name = "${name}${if stripped then "" else "-debug"}-${version}" + crossNameAddon;
builder = ./builder.sh;
src = fetchurl {
url = "mirror://gnu/gcc/gcc-${version}/gcc-${version}.tar.bz2";
sha256 = "1hx9h64ivarlzi4hxvq42as5m9vlr5cyzaaq4gzj4i619zmkfz1g";
};
inherit patches;
postPatch =
if (stdenv.isGNU
|| (libcCross != null # e.g., building `gcc.crossDrv'
&& libcCross ? crossConfig
&& libcCross.crossConfig == "i586-pc-gnu")
|| (crossGNU && libcCross != null))
then
# On GNU/Hurd glibc refers to Hurd & Mach headers and libpthread is not
# in glibc, so add the right `-I' flags to the default spec string.
assert libcCross != null -> libpthreadCross != null;
let
libc = if libcCross != null then libcCross else stdenv.glibc;
gnu_h = "gcc/config/gnu.h";
extraCPPDeps =
libc.propagatedBuildInputs
++ lib.optional (libpthreadCross != null) libpthreadCross
++ lib.optional (libpthread != null) libpthread;
extraCPPSpec =
concatStrings (intersperse " "
(map (x: "-I${x}/include") extraCPPDeps));
extraLibSpec =
if libpthreadCross != null
then "-L${libpthreadCross}/lib ${libpthreadCross.TARGET_LDFLAGS}"
else "-L${libpthread}/lib";
in
'' echo "augmenting \`CPP_SPEC' in \`${gnu_h}' with \`${extraCPPSpec}'..."
sed -i "${gnu_h}" \
-es'|CPP_SPEC *"\(.*\)$|CPP_SPEC "${extraCPPSpec} \1|g'
echo "augmenting \`LIB_SPEC' in \`${gnu_h}' with \`${extraLibSpec}'..."
sed -i "${gnu_h}" \
-es'|LIB_SPEC *"\(.*\)$|LIB_SPEC "${extraLibSpec} \1|g'
echo "setting \`NATIVE_SYSTEM_HEADER_DIR' and \`STANDARD_INCLUDE_DIR' to \`${libc}/include'..."
sed -i "${gnu_h}" \
-es'|#define STANDARD_INCLUDE_DIR.*$|#define STANDARD_INCLUDE_DIR "${libc}/include"|g'
''
else if cross != null || stdenv.gcc.libc != null then
# On NixOS, use the right path to the dynamic linker instead of
# `/lib/ld*.so'.
let
libc = if libcCross != null then libcCross else stdenv.gcc.libc;
in
'' echo "fixing the \`GLIBC_DYNAMIC_LINKER' and \`UCLIBC_DYNAMIC_LINKER' macros..."
for header in "gcc/config/"*-gnu.h "gcc/config/"*"/"*.h
do
grep -q LIBC_DYNAMIC_LINKER "$header" || continue
echo " fixing \`$header'..."
sed -i "$header" \
-e 's|define[[:blank:]]*\([UCG]\+\)LIBC_DYNAMIC_LINKER\([0-9]*\)[[:blank:]]"\([^\"]\+\)"$|define \1LIBC_DYNAMIC_LINKER\2 "${libc}\3"|g'
done
''
else null;
inherit noSysDirs staticCompiler langJava crossStageStatic
libcCross crossMingw;
nativeBuildInputs = [ texinfo which gettext ]
++ (optional (perl != null) perl)
++ (optional javaAwtGtk pkgconfig);
buildInputs = [ gmp mpfr mpc libelf ]
++ (optional (ppl != null) ppl)
++ (optional (cloog != null) cloog)
++ (optional (zlib != null) zlib)
++ (optionals langJava [ boehmgc zip unzip ])
++ (optionals javaAwtGtk ([ gtk libart_lgpl ] ++ xlibs))
++ (optionals (cross != null) [binutilsCross])
++ (optionals langAda [gnatboot])
++ (optionals langVhdl [gnat])
# The builder relies on GNU sed (for instance, Darwin's `sed' fails with
# "-i may not be used with stdin"), and `stdenvNative' doesn't provide it.
++ (optional stdenv.isDarwin gnused)
;
NIX_LDFLAGS = lib.optionalString stdenv.isSunOS "-lm -ldl";
preConfigure = ''
configureFlagsArray=(
${lib.optionalString (ppl != null && ppl ? dontDisableStatic && ppl.dontDisableStatic)
"'--with-host-libstdcxx=-lstdc++ -lgcc_s'"}
${lib.optionalString (ppl != null && stdenv.isSunOS)
"\"--with-host-libstdcxx=-Wl,-rpath,\$prefix/lib/amd64 -lstdc++\"
\"--with-boot-ldflags=-L../prev-x86_64-pc-solaris2.11/libstdc++-v3/src/.libs\""}
);
${lib.optionalString (stdenv.isSunOS && stdenv.is64bit)
''
export NIX_LDFLAGS=`echo $NIX_LDFLAGS | sed -e s~$prefix/lib~$prefix/lib/amd64~g`
export LDFLAGS_FOR_TARGET="-Wl,-rpath,$prefix/lib/amd64 $LDFLAGS_FOR_TARGET"
export CXXFLAGS_FOR_TARGET="-Wl,-rpath,$prefix/lib/amd64 $CXXFLAGS_FOR_TARGET"
export CFLAGS_FOR_TARGET="-Wl,-rpath,$prefix/lib/amd64 $CFLAGS_FOR_TARGET"
''}
'';
# 'iant' at #go-nuts@freenode, gccgo maintainer, said that
# they have a bug in 4.7.1 if adding "--disable-static"
dontDisableStatic = langGo || staticCompiler;
configureFlags = "
${if stdenv.isSunOS then
" --enable-long-long --enable-libssp --enable-threads=posix --disable-nls --enable-__cxa_atexit " +
# On Illumos/Solaris GNU as is preferred
" --with-gnu-as --without-gnu-ld "
else ""}
--enable-lto
${if enableMultilib then "" else "--disable-multilib"}
${if enableShared then "" else "--disable-shared"}
${if enablePlugin then "--enable-plugin" else "--disable-plugin"}
${if ppl != null then "--with-ppl=${ppl} --disable-ppl-version-check" else ""}
${if cloog != null then
"--with-cloog=${cloog} --disable-cloog-version-check --enable-cloog-backend=isl"
else ""}
${if langJava then
"--with-ecj-jar=${javaEcj} " +
# Follow Sun's layout for the convenience of IcedTea/OpenJDK. See
# <http://mail.openjdk.java.net/pipermail/distro-pkg-dev/2010-April/008888.html>.
"--enable-java-home --with-java-home=\${prefix}/lib/jvm/jre "
else ""}
${if javaAwtGtk then "--enable-java-awt=gtk" else ""}
${if langJava && javaAntlr != null then "--with-antlr-jar=${javaAntlr}" else ""}
--with-gmp=${gmp}
--with-mpfr=${mpfr}
--with-mpc=${mpc}
${if libelf != null then "--with-libelf=${libelf}" else ""}
--disable-libstdcxx-pch
--without-included-gettext
--with-system-zlib
--enable-languages=${
concatStrings (intersperse ","
( optional langC "c"
++ optional langCC "c++"
++ optional langFortran "fortran"
++ optional langJava "java"
++ optional langAda "ada"
++ optional langVhdl "vhdl"
++ optional langGo "go"
)
)
}
${if (stdenv ? glibc && cross == null)
then " --with-native-system-header-dir=${stdenv.glibc}/include"
else ""}
${if langAda then " --enable-libada" else ""}
${if cross == null && stdenv.isi686 then "--with-arch=i686" else ""}
${if cross != null then crossConfigureFlags else ""}
${if !bootstrap then "--disable-bootstrap" else ""}
${if cross == null then platformFlags else ""}
";
targetConfig = if cross != null then cross.config else null;
buildFlags = if bootstrap then
(if profiledCompiler then "profiledbootstrap" else "bootstrap")
else "";
installTargets =
if stripped
then "install-strip"
else "install";
crossAttrs = let
xgccArch = lib.attrByPath [ "gcc" "arch" ] null stdenv.cross;
xgccCpu = lib.attrByPath [ "gcc" "cpu" ] null stdenv.cross;
xgccAbi = lib.attrByPath [ "gcc" "abi" ] null stdenv.cross;
xgccFpu = lib.attrByPath [ "gcc" "fpu" ] null stdenv.cross;
xgccFloat = lib.attrByPath [ "gcc" "float" ] null stdenv.cross;
xwithArch = if xgccArch != null then " --with-arch=${xgccArch}" else "";
xwithCpu = if xgccCpu != null then " --with-cpu=${xgccCpu}" else "";
xwithAbi = if xgccAbi != null then " --with-abi=${xgccAbi}" else "";
xwithFpu = if xgccFpu != null then " --with-fpu=${xgccFpu}" else "";
xwithFloat = if xgccFloat != null then " --with-float=${xgccFloat}" else "";
in {
AR = "${stdenv.cross.config}-ar";
LD = "${stdenv.cross.config}-ld";
CC = "${stdenv.cross.config}-gcc";
CXX = "${stdenv.cross.config}-gcc";
AR_FOR_TARGET = "${stdenv.cross.config}-ar";
LD_FOR_TARGET = "${stdenv.cross.config}-ld";
CC_FOR_TARGET = "${stdenv.cross.config}-gcc";
NM_FOR_TARGET = "${stdenv.cross.config}-nm";
CXX_FOR_TARGET = "${stdenv.cross.config}-g++";
# If we are making a cross compiler, cross != null
NIX_GCC_CROSS = if cross == null then "${stdenv.gccCross}" else "";
dontStrip = true;
configureFlags = ''
${if enableMultilib then "" else "--disable-multilib"}
${if enableShared then "" else "--disable-shared"}
${if ppl != null then "--with-ppl=${ppl.crossDrv}" else ""}
${if cloog != null then "--with-cloog=${cloog.crossDrv} --enable-cloog-backend=isl" else ""}
${if langJava then "--with-ecj-jar=${javaEcj.crossDrv}" else ""}
${if javaAwtGtk then "--enable-java-awt=gtk" else ""}
${if langJava && javaAntlr != null then "--with-antlr-jar=${javaAntlr.crossDrv}" else ""}
--with-gmp=${gmp.crossDrv}
--with-mpfr=${mpfr.crossDrv}
--disable-libstdcxx-pch
--without-included-gettext
--with-system-zlib
--enable-languages=${
concatStrings (intersperse ","
( optional langC "c"
++ optional langCC "c++"
++ optional langFortran "fortran"
++ optional langJava "java"
++ optional langAda "ada"
++ optional langVhdl "vhdl"
++ optional langGo "go"
)
)
}
${if langAda then " --enable-libada" else ""}
--target=${stdenv.cross.config}
${xwithArch}
${xwithCpu}
${xwithAbi}
${xwithFpu}
${xwithFloat}
'';
buildFlags = "";
};
# Needed for the cross compilation to work
AR = "ar";
LD = "ld";
# http://gcc.gnu.org/install/specific.html#x86-64-x-solaris210
CC = if stdenv.system == "x86_64-solaris" then "gcc -m64"
else "gcc";
# Setting $CPATH and $LIBRARY_PATH to make sure both `gcc' and `xgcc' find
# the library headers and binaries, regarless of the language being
# compiled.
# Note: When building the Java AWT GTK+ peer, the build system doesn't
# honor `--with-gmp' et al., e.g., when building
# `libjava/classpath/native/jni/java-math/gnu_java_math_GMP.c', so we just
# add them to $CPATH and $LIBRARY_PATH in this case.
#
# Likewise, the LTO code doesn't find zlib.
CPATH = concatStrings
(intersperse ":" (map (x: x + "/include")
(optionals (zlib != null) [ zlib ]
++ optionals langJava [ boehmgc ]
++ optionals javaAwtGtk xlibs
++ optionals javaAwtGtk [ gmp mpfr ]
++ optional (libpthread != null) libpthread
++ optional (libpthreadCross != null) libpthreadCross
# On GNU/Hurd glibc refers to Mach & Hurd
# headers.
++ optionals (libcCross != null &&
hasAttr "propagatedBuildInputs" libcCross)
libcCross.propagatedBuildInputs)));
LIBRARY_PATH = concatStrings
(intersperse ":" (map (x: x + "/lib")
(optionals (zlib != null) [ zlib ]
++ optionals langJava [ boehmgc ]
++ optionals javaAwtGtk xlibs
++ optionals javaAwtGtk [ gmp mpfr ]
++ optional (libpthread != null) libpthread)));
EXTRA_TARGET_CFLAGS =
if cross != null && libcCross != null
then "-idirafter ${libcCross}/include"
else null;
EXTRA_TARGET_LDFLAGS =
if cross != null && libcCross != null
then "-B${libcCross}/lib -Wl,-L${libcCross}/lib" +
(optionalString (libpthreadCross != null)
" -L${libpthreadCross}/lib -Wl,${libpthreadCross.TARGET_LDFLAGS}")
else null;
passthru = { inherit langC langCC langAda langFortran langVhdl
langGo enableMultilib version; };
inherit enableParallelBuilding;
meta = {
homepage = "http://gcc.gnu.org/";
license = "GPLv3+"; # runtime support libraries are typically LGPLv3+
description = "GNU Compiler Collection, version ${version}"
+ (if stripped then "" else " (with debugging info)");
longDescription = ''
The GNU Compiler Collection includes compiler front ends for C, C++,
Objective-C, Fortran, OpenMP for C/C++/Fortran, Java, and Ada, as well
as libraries for these languages (libstdc++, libgcj, libgomp,...).
GCC development is a part of the GNU Project, aiming to improve the
compiler used in the GNU system including the GNU/Linux variant.
'';
maintainers = [
lib.maintainers.ludo
lib.maintainers.viric
lib.maintainers.shlevy
];
# Volunteers needed for the {Cyg,Dar}win ports of *PPL.
# gnatboot is not available out of linux platforms, so we disable the darwin build
# for the gnat (ada compiler).
platforms = lib.platforms.linux ++ optionals (langAda == false && libelf == null) [ "i686-darwin" ];
};
}
// optionalAttrs (cross != null && cross.libc == "msvcrt" && crossStageStatic) {
makeFlags = [ "all-gcc" "all-target-libgcc" ];
installTargets = "install-gcc install-target-libgcc";
}
# Strip kills static libs of other archs (hence cross != null)
// optionalAttrs (!stripped || cross != null) { dontStrip = true; NIX_STRIP_DEBUG = 0; }
)
================================================
FILE: pkgs/gcc-4.7/gfortran-driving.patch
================================================
This patch fixes interaction with Libtool.
See <http://thread.gmane.org/gmane.comp.gcc.patches/258777>, for details.
--- a/gcc/fortran/gfortranspec.c
+++ b/gcc/fortran/gfortranspec.c
@@ -461,8 +461,15 @@ For more information about these matters, see the file named COPYING\n\n"));
{
fprintf (stderr, _("Driving:"));
for (i = 0; i < g77_newargc; i++)
+ {
+ if (g77_new_decoded_options[i].opt_index == OPT_l)
+ /* Make sure no white space is inserted after `-l'. */
+ fprintf (stderr, " -l%s",
+ g77_new_decoded_options[i].canonical_option[1]);
+ else
fprintf (stderr, " %s",
g77_new_decoded_options[i].orig_option_with_args_text);
+ }
fprintf (stderr, "\n");
}
================================================
FILE: pkgs/gcc-4.7/gnat-cflags.patch
================================================
diff --git a/libada/Makefile.in b/libada/Makefile.in
index f5057a0..337e0c6 100644
--- a/libada/Makefile.in
+++ b/libada/Makefile.in
@@ -55,7 +55,7 @@ GCC_WARN_CFLAGS = $(LOOSE_WARN)
WARN_CFLAGS = @warn_cflags@
TARGET_LIBGCC2_CFLAGS=
-GNATLIBCFLAGS= -g -O2
+GNATLIBCFLAGS= -g -O2 $(CFLAGS)
GNATLIBCFLAGS_FOR_C = $(GNATLIBCFLAGS) $(TARGET_LIBGCC2_CFLAGS) -fexceptions \
-DIN_RTS @have_getipinfo@
--- a/gcc/ada/gcc-interface/Makefile.in
+++ b/gcc/ada/gcc-interface/Makefile.in
@@ -105,7 +105,7 @@ ADAFLAGS = -W -Wall -gnatpg -gnata
SOME_ADAFLAGS =-gnata
FORCE_DEBUG_ADAFLAGS = -g
GNATLIBFLAGS = -gnatpg -nostdinc
-GNATLIBCFLAGS = -g -O2
+GNATLIBCFLAGS = -g -O2 $(CFLAGS_FOR_TARGET)
# Pretend that _Unwind_GetIPInfo is available for the target by default. This
# should be autodetected during the configuration of libada and passed down to
# here, but we need something for --disable-libada and hope for the best.
@@ -193,7 +193,7 @@ RTSDIR = rts$(subst /,_,$(MULTISUBDIR))
# Link flags used to build gnat tools. By default we prefer to statically
# link with libgcc to avoid a dependency on shared libgcc (which is tricky
# to deal with as it may conflict with the libgcc provided by the system).
-GCC_LINK_FLAGS=-static-libgcc
+GCC_LINK_FLAGS=-static-libgcc $(CFLAGS_FOR_TARGET)
# End of variables for you to override.
================================================
FILE: pkgs/gcc-4.7/java-jvgenmain-link.patch
================================================
The `jvgenmain' executable must be linked against `vec.o', among others,
since it uses its vector API.
--- gcc-4.3.3/gcc/java/Make-lang.in 2008-12-05 00:00:19.000000000 +0100
+++ gcc-4.3.3/gcc/java/Make-lang.in 2009-07-03 16:11:41.000000000 +0200
@@ -109,9 +109,9 @@ jcf-dump$(exeext): $(JCFDUMP_OBJS) $(LIB
$(CC) $(ALL_CFLAGS) $(LDFLAGS) -o $@ $(JCFDUMP_OBJS) \
$(CPPLIBS) $(ZLIB) $(LDEXP_LIB) $(LIBS)
-jvgenmain$(exeext): $(JVGENMAIN_OBJS) $(LIBDEPS)
+jvgenmain$(exeext): $(JVGENMAIN_OBJS) $(LIBDEPS) $(BUILD_RTL)
rm -f $@
- $(CC) $(ALL_CFLAGS) $(LDFLAGS) -o $@ $(JVGENMAIN_OBJS) $(LIBS)
+ $(CC) $(ALL_CFLAGS) $(LDFLAGS) -o $@ $(JVGENMAIN_OBJS) $(BUILD_RTL) $(LIBS)
#
# Build hooks:
================================================
FILE: pkgs/gcc-4.7/libstdc++-target.patch
================================================
Patch to make the target libraries 'configure' scripts find the proper CPP.
I noticed that building the mingw32 cross compiler.
Looking at the build script for mingw in archlinux, I think that only nixos
needs this patch. I don't know why.
diff --git a/Makefile.in b/Makefile.in
index 93f66b6..d691917 100644
--- a/Makefile.in
+++ b/Makefile.in
@@ -266,6 +266,7 @@ BASE_TARGET_EXPORTS = \
AR="$(AR_FOR_TARGET)"; export AR; \
AS="$(COMPILER_AS_FOR_TARGET)"; export AS; \
CC="$(CC_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CC; \
+ CPP="$(CC_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS -E"; export CC; \
CFLAGS="$(CFLAGS_FOR_TARGET)"; export CFLAGS; \
CONFIG_SHELL="$(SHELL)"; export CONFIG_SHELL; \
CPPFLAGS="$(CPPFLAGS_FOR_TARGET)"; export CPPFLAGS; \
@@ -291,11 +292,13 @@ BASE_TARGET_EXPORTS = \
RAW_CXX_TARGET_EXPORTS = \
$(BASE_TARGET_EXPORTS) \
CXX_FOR_TARGET="$(RAW_CXX_FOR_TARGET)"; export CXX_FOR_TARGET; \
- CXX="$(RAW_CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CXX;
+ CXX="$(RAW_CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CXX; \
+ CXXCPP="$(RAW_CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS -E"; export CXX;
NORMAL_TARGET_EXPORTS = \
$(BASE_TARGET_EXPORTS) \
- CXX="$(CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CXX;
+ CXX="$(CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CXX; \
+ CXXCPP="$(CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS -E"; export CXX;
# Where to find GMP
HOST_GMPLIBS = @gmplibs@
================================================
FILE: pkgs/gcc-4.7/no-sys-dirs.patch
================================================
diff -ru gcc-4.3.1-orig/gcc/cppdefault.c gcc-4.3.1/gcc/cppdefault.c
--- gcc-4.3.1-orig/gcc/cppdefault.c 2007-07-26 10:37:01.000000000 +0200
+++ gcc-4.3.1/gcc/cppdefault.c 2008-06-25 17:48:23.000000000 +0200
@@ -41,6 +41,10 @@
# undef CROSS_INCLUDE_DIR
#endif
+#undef LOCAL_INCLUDE_DIR
+#undef SYSTEM_INCLUDE_DIR
+#undef STANDARD_INCLUDE_DIR
+
const struct default_include cpp_include_defaults[]
#ifdef INCLUDE_DEFAULTS
= INCLUDE_DEFAULTS;
diff -ru gcc-4.3.1-orig/gcc/gcc.c gcc-4.3.1/gcc/gcc.c
--- gcc-4.3.1-orig/gcc/gcc.c 2008-03-02 23:55:19.000000000 +0100
+++ gcc-4.3.1/gcc/gcc.c 2008-06-25 17:52:53.000000000 +0200
@@ -1478,10 +1478,10 @@
/* Default prefixes to attach to command names. */
#ifndef STANDARD_STARTFILE_PREFIX_1
-#define STANDARD_STARTFILE_PREFIX_1 "/lib/"
+#define STANDARD_STARTFILE_PREFIX_1 ""
#endif
#ifndef STANDARD_STARTFILE_PREFIX_2
-#define STANDARD_STARTFILE_PREFIX_2 "/usr/lib/"
+#define STANDARD_STARTFILE_PREFIX_2 ""
#endif
#ifdef CROSS_DIRECTORY_STRUCTURE /* Don't use these prefixes for a cross compiler. */
--- gcc-4.3.1-orig/gcc/Makefile.in 2008-05-11 20:54:15.000000000 +0200
+++ gcc-4.3.1/gcc/Makefile.in 2008-06-25 17:48:23.000000000 +0200
@@ -3277,7 +3281,7 @@
-DGPLUSPLUS_INCLUDE_DIR=\"$(gcc_gxx_include_dir)\" \
-DGPLUSPLUS_TOOL_INCLUDE_DIR=\"$(gcc_gxx_include_dir)/$(target_noncanonical)\" \
-DGPLUSPLUS_BACKWARD_INCLUDE_DIR=\"$(gcc_gxx_include_dir)/backward\" \
- -DLOCAL_INCLUDE_DIR=\"$(local_includedir)\" \
+ -DLOCAL_INCLUDE_DIR=\"/no-such-dir\" \
-DCROSS_INCLUDE_DIR=\"$(CROSS_SYSTEM_HEADER_DIR)\" \
-DTOOL_INCLUDE_DIR=\"$(gcc_tooldir)/include\" \
-DPREFIX=\"$(prefix)/\" \
================================================
FILE: pkgs/gcc-4.7/parallel-bconfig-4.7.patch
================================================
diff --git a/gcc/Makefile.in b/gcc/Makefile.in
index 0f6735a..ba93e9b 100644
--- a/gcc/Makefile.in
+++ b/gcc/Makefile.in
@@ -3904,21 +3904,21 @@ build/genflags.o : genflags.c $(RTL_BASE_H) $(OBSTACK_H) $(BCONFIG_H) \
$(SYSTEM_H) coretypes.h $(GTM_H) errors.h $(READ_MD_H) gensupport.h
build/gengenrtl.o : gengenrtl.c $(BCONFIG_H) $(SYSTEM_H) rtl.def
gengtype-lex.o build/gengtype-lex.o : gengtype-lex.c gengtype.h $(SYSTEM_H)
-gengtype-lex.o: $(CONFIG_H)
+gengtype-lex.o: $(CONFIG_H) $(BCONFIG_H)
build/gengtype-lex.o: $(BCONFIG_H)
gengtype-parse.o build/gengtype-parse.o : gengtype-parse.c gengtype.h \
$(SYSTEM_H)
-gengtype-parse.o: $(CONFIG_H)
+gengtype-parse.o: $(CONFIG_H) $(BCONFIG_H)
build/gengtype-parse.o: $(BCONFIG_H)
gengtype-state.o build/gengtype-state.o: gengtype-state.c $(SYSTEM_H) \
gengtype.h errors.h double-int.h version.h $(HASHTAB_H) $(OBSTACK_H) \
$(XREGEX_H)
-gengtype-state.o: $(CONFIG_H)
+gengtype-state.o: $(CONFIG_H) $(BCONFIG_H)
build/gengtype-state.o: $(BCONFIG_H)
gengtype.o build/gengtype.o : gengtype.c $(SYSTEM_H) gengtype.h \
rtl.def insn-notes.def errors.h double-int.h version.h $(HASHTAB_H) \
$(OBSTACK_H) $(XREGEX_H)
-gengtype.o: $(CONFIG_H)
+gengtype.o: $(CONFIG_H) $(BCONFIG_H)
build/gengtype.o: $(BCONFIG_H)
build/genmddeps.o: genmddeps.c $(BCONFIG_H) $(SYSTEM_H) coretypes.h \
errors.h $(READ_MD_H)
================================================
FILE: pkgs/gecko/default.nix
================================================
{ geckoSrc ? null, lib
, stdenv, fetchFromGitHub, pythonFull, which, autoconf213, m4
, perl, unzip, zip, gnumake, yasm, pkgconfig, xlibs, gnome2, pango, freetype, fontconfig, cairo
, dbus, dbus_glib, alsaLib, libpulseaudio
, gtk3, glib, gobjectIntrospection, gdk_pixbuf, atk, gtk2
, git, mercurial, openssl, cmake, procps
, libnotify
, valgrind, gdb, rr
, inotify-tools
, setuptools
, rust # rust & cargo bundled. (otheriwse use pkgs.rust.{rustc,cargo})
, buildFHSUserEnv # Build a FHS environment with all Gecko dependencies.
, llvm, llvmPackages, nasm
, ccache
, zlib, xorg
, rust-cbindgen
, nodejs
, jsdoc
, fzf # needed by "mack try fuzzy"
}:
let
inherit (lib) updateFromGitHub importJSON optionals inNixShell;
gcc = if stdenv.cc.isGNU then stdenv.cc.cc else stdenv.cc.cc.stdenv.cc.cc;
# Gecko sources are huge, we do not want to import them in the nix-store when
# we use this expression for making a build environment.
src =
if inNixShell then
null
else if geckoSrc == null then
fetchFromGitHub (importJSON ./source.json)
else
geckoSrc;
version = "HEAD"; # XXX: builtins.readFile "${src}/browser/config/version.txt";
buildInputs = [
# Expected by "mach"
pythonFull setuptools which autoconf213 m4
# Expected by the configure script
perl unzip zip gnumake yasm pkgconfig
xlibs.libICE xlibs.libSM xlibs.libX11 xlibs.libXau xlibs.libxcb
xlibs.libXdmcp xlibs.libXext xlibs.libXt xlibs.libXtst
xlibs.libXcomposite
xlibs.libXfixes
xlibs.libXdamage xlibs.libXrender
] ++ (if xlibs ? xproto then [
xlibs.damageproto xlibs.printproto xlibs.kbproto
xlibs.renderproto xlibs.xextproto xlibs.xproto
xlibs.compositeproto xlibs.fixesproto
] else [
xorg.xorgproto
]) ++ [
gnome2.libart_lgpl gnome2.libbonobo gnome2.libbonoboui
gnome2.libgnome gnome2.libgnomecanvas gnome2.libgnomeui
gnome2.libIDL
pango freetype fontconfig cairo
dbus dbus_glib
alsaLib libpulseaudio
gtk3 glib gobjectIntrospection gdk_pixbuf atk
gtk2 gnome2.GConf
rust
# For building bindgen
# Building bindgen is now done with the extra options added by genMozConfig
# shellHook, do not include clang directly in order to avoid messing up with
# the choices of the compilers.
# clang
llvm
# mach mochitest
procps
# "mach vendor rust" wants to list modified files by using the vcs.
git mercurial
# needed for compiling cargo-vendor and its dependencies
openssl cmake
# Useful for getting notification at the end of the build.
libnotify
# cbindgen is used to generate C bindings for WebRender.
rust-cbindgen
# nasm is used to build libdav1d.
nasm
# NodeJS is used for tooling around JS development.
nodejs
# Used for building documentation.
# jsdoc
] ++ optionals inNixShell [
valgrind gdb ccache
(if stdenv.isAarch64 then null else rr)
fzf # needed by "mach try fuzzy"
inotify-tools # Workaround download of prebuilt binaries.
];
# bindgen.configure now has a rule to check that with-libclang-path matches CC
# or CXX. Default to the stdenv compiler if we are compiling with clang.
clang_path =
if stdenv.cc.isGNU then "${llvmPackages.clang}/bin/clang"
else "${stdenv.cc}/bin/cc";
libclang_path =
if stdenv.cc.isGNU then "${llvmPackages.clang.cc.lib}/lib"
else "${stdenv.cc.cc.lib}/lib";
genMozConfig = ''
cxxLib=$( echo -n ${gcc}/include/c++/* )
archLib=$cxxLib/$( ${gcc}/bin/gcc -dumpmachine )
cat - > $MOZCONFIG <<EOF
ac_add_options --disable-bootstrap
#ac_add_options --without-wasm-sandboxed-libraries # this may be needed
mk_add_options AUTOCONF=${autoconf213}/bin/autoconf
ac_add_options --with-libclang-path=${libclang_path}
ac_add_options --with-clang-path=${clang_path}
export BINDGEN_CFLAGS="-cxx-isystem $cxxLib -isystem $archLib"
export CC="${stdenv.cc}/bin/cc"
export CXX="${stdenv.cc}/bin/c++"
EOF
'';
shellHook = ''
export MOZCONFIG=$PWD/.mozconfig.nix-shell
export MOZBUILD_STATE_PATH=$PWD/.mozbuild
export CC="${stdenv.cc}/bin/cc";
export CXX="${stdenv.cc}/bin/c++";
# To be used when building the JS Shell.
export NIX_EXTRA_CONFIGURE_ARGS="--with-libclang-path=${libclang_path} --with-clang-path=${clang_path}"
cxxLib=$( echo -n ${gcc}/include/c++/* )
archLib=$cxxLib/$( ${gcc}/bin/gcc -dumpmachine )
export BINDGEN_CFLAGS="-cxx-isystem $cxxLib -isystem $archLib"
${genMozConfig}
${builtins.getEnv "NIX_SHELL_HOOK"}
unset AS
'';
# propagatedBuildInput should already have applied the "lib.chooseDevOutputs"
# on the propagated build inputs.
pullAllInputs = inputs:
inputs ++ lib.concatMap (i: pullAllInputs (i.propagatedNativeBuildInputs or [])) inputs;
fhs = buildFHSUserEnv rec {
name = "gecko-deps-fhs";
targetPkgs = _: pullAllInputs (lib.chooseDevOutputs (buildInputs ++ [ stdenv.cc zlib xorg.libXinerama xorg.libXxf86vm ]));
multiPkgs = null; #targetPkgs;
extraOutputsToInstall = [ "share" ];
profile = ''
# build-fhs-userenv/env.nix adds it, but causes 'ls' to SEGV.
unset LD_LIBRARY_PATH;
export LD_LIBRARY_PATH=/lib/;
export IN_NIX_SHELL=1
export PKG_CONFIG_PATH=/usr/lib/pkgconfig:/usr/share/pkgconfig
${shellHook}
'';
};
in
stdenv.mkDerivation {
name = "gecko-dev-${version}";
inherit src buildInputs shellHook;
# Useful for debugging this Nix expression.
tracePhases = true;
configurePhase = ''
unset AS; # Set to CC when configured.
export MOZBUILD_STATE_PATH=$(pwd)/.mozbuild
export MOZCONFIG=$(pwd)/.mozconfig
export builddir=$(pwd)/builddir
${genMozConfig}
mkdir -p $MOZBUILD_STATE_PATH $builddir
echo >> $MOZCONFIG "
# . $src/build/mozconfig.common
ac_add_options --enable-application=browser
mk_add_options MOZ_OBJDIR=$builddir
ac_add_options --prefix=$out
ac_add_options --enable-official-branding
"
'';
AUTOCONF = "${autoconf213}/bin/autoconf";
buildPhase = ''
cd $builddir
$src/mach build
'';
installPhase = ''
cd $builddir
$src/mach install
'';
# TODO: are there tests we would like to run? or should we package them separately?
doCheck = false;
doInstallCheck = false;
# This is for debugging purposes, go to hell damn wrapper which are removing
# all I need for debugging.
hardeningDisable = [ "all" ];
passthru.updateScript = updateFromGitHub {
owner = "mozilla";
repo = "gecko-dev";
branch = "master";
path = "pkgs/gecko/source.json";
};
passthru.fhs = fhs; # gecko.x86_64-linux.gcc.fhs.env
}
================================================
FILE: pkgs/gecko/source.json
================================================
{
"owner": "mozilla",
"repo": "gecko-dev",
"rev": "fee636af734a0ce6dc7335691cc94664bafc385d",
"sha256": "0nnkqmglbi2znkz1avnyn064i5hngvsqrmhw8ccg6g4ga9bac8fv"
}
================================================
FILE: pkgs/git-cinnabar/default.nix
================================================
{ stdenv, fetchFromGitHub, autoconf
, zlib
, python
, perl
, gettext
, git
, mercurial
, curl
}:
# NOTE: git-cinnabar depends on a specific version of git-core, thus you should
# ensure that you install a git-cinnabar version which matches your git version.
#
# NOTE: This package only provides git-cinnabar tools, as a git users might want
# to have additional commands not provided by this forked version of git-core.
stdenv.mkDerivation rec {
version = "0.5.4";
name = "git-cinnabar-${version}";
src = fetchFromGitHub {
owner = "glandium";
repo = "git-cinnabar";
inherit name;
rev = version; # tag name
fetchSubmodules = true;
sha256 = "1cjn2cc6mj4m736wxab9s6qx83p5n5ha8cr3x84s9ra6rxs8d7pi";
};
buildInputs = [ autoconf python gettext git curl ];
ZLIB_PATH = zlib;
ZLIB_DEV_PATH = zlib.dev;
PERL_PATH = "${perl}/bin/perl";
NO_TCLTK = true;
V=1;
preBuild = ''
export ZLIB_PATH;
export ZLIB_DEV_PATH;
substituteInPlace git-core/Makefile --replace \
'$(ZLIB_PATH)/include' '$(ZLIB_DEV_PATH)/include'
# Comment out calls to git to try to verify that git-core is up to date
substituteInPlace Makefile \
--replace '$(eval $(call exec,git' '# $(eval $(call exec,git'
export PERL_PATH;
export NO_TCLTK
export V;
'';
makeFlags = "prefix=\${out}";
installTargets = "git-install";
postInstall =
let mercurial-py = mercurial + "/" + mercurial.python.sitePackages; in ''
# git-cinnabar rebuild git, we do not need that.
rm -rf $out/bin/* $out/share $out/lib
for f in $out/libexec/git-core/{git-remote-hg,git-cinnabar} ; do
substituteInPlace $f --replace \
"sys.path.append(os.path.join(os.path.dirname(__file__), 'pythonlib'))" \
"sys.path.extend(['$out/libexec/git-core/pythonlib', '${mercurial-py}'])"
mv $f $out/bin
done
mv $out/libexec/git-core/git-cinnabar-helper $out/bin/git-cinnabar-helper
mv $out/libexec/git-core/pythonlib $out/pythonlib
rm -rf $out/libexec/git-core/*
mv $out/pythonlib $out/libexec/git-core/pythonlib
substituteInPlace $out/libexec/git-core/pythonlib/cinnabar/helper.py \
--replace 'Git.config('cinnabar.helper')' "Git.config('cinnabar.helper') or '$out/bin/git-cinnabar-helper'"
'';
}
================================================
FILE: pkgs/jsdoc/default.nix
================================================
# This file has been generated by node2nix 1.9.0. Do not edit!
{pkgs ? import <nixpkgs> {
inherit system;
}, system ? builtins.currentSystem, nodejs ? pkgs."nodejs-12_x"}:
let
nodeEnv = import ./node-env.nix {
inherit (pkgs) stdenv lib python2 runCommand writeTextFile;
inherit pkgs nodejs;
libtool = if pkgs.stdenv.isDarwin then pkgs.darwin.cctools else null;
};
in
import ./node-packages.nix {
inherit (pkgs) fetchurl nix-gitignore stdenv lib fetchgit;
inherit nodeEnv;
}
================================================
FILE: pkgs/jsdoc/node-env.nix
================================================
# This file originates from node2nix
{lib, stdenv, nodejs, python2, pkgs, libtool, runCommand, writeTextFile}:
let
# Workaround to cope with utillinux in Nixpkgs 20.09 and util-linux in Nixpkgs master
utillinux = if pkgs ? utillinux then pkgs.utillinux else pkgs.util-linux;
python = if nodejs ? python then nodejs.python else python2;
# Create a tar wrapper that filters all the 'Ignoring unknown extended header keyword' noise
tarWrapper = runCommand "tarWrapper" {} ''
mkdir -p $out/bin
cat > $out/bin/tar <<EOF
#! ${stdenv.shell} -e
$(type -p tar) "\$@" --warning=no-unknown-keyword --delay-directory-restore
EOF
chmod +x $out/bin/tar
'';
# Function that generates a TGZ file from a NPM project
buildNodeSourceDist =
{ name, version, src, ... }:
stdenv.mkDerivation {
name = "node-tarball-${name}-${version}";
inherit src;
buildInputs = [ nodejs ];
buildPhase = ''
export HOME=$TMPDIR
tgzFile=$(npm pack | tail -n 1) # Hooks to the pack command will add output (https://docs.npmjs.com/misc/scripts)
'';
installPhase = ''
mkdir -p $out/tarballs
mv $tgzFile $out/tarballs
mkdir -p $out/nix-support
echo "file source-dist $out/tarballs/$tgzFile" >> $out/nix-support/hydra-build-products
'';
};
includeDependencies = {dependencies}:
lib.optionalString (dependencies != [])
(lib.concatMapStrings (dependency:
''
# Bundle the dependencies of the package
mkdir -p node_modules
cd node_modules
# Only include dependencies if they don't exist. They may also be bundled in the package.
if [ ! -e "${dependency.name}" ]
then
${composePackage dependency}
fi
cd ..
''
) dependencies);
# Recursively composes the dependencies of a package
composePackage = { name, packageName, src, dependencies ? [], ... }@args:
builtins.addErrorContext "while evaluating node package '${packageName}'" ''
DIR=$(pwd)
cd $TMPDIR
unpackFile ${src}
# Make the base dir in which the target dependency resides first
mkdir -p "$(dirname "$DIR/${packageName}")"
if [ -f "${src}" ]
then
# Figure out what directory has been unpacked
packageDir="$(find . -maxdepth 1 -type d | tail -1)"
# Restore write permissions to make building work
find "$packageDir" -type d -exec chmod u+x {} \;
chmod -R u+w "$packageDir"
# Move the extracted tarball into the output folder
mv "$packageDir" "$DIR/${packageName}"
elif [ -d "${src}" ]
then
# Get a stripped name (without hash) of the source directory.
# On old nixpkgs it's already set internally.
if [ -z "$strippedName" ]
then
strippedName="$(stripHash ${src})"
fi
# Restore write permissions to make building work
chmod -R u+w "$strippedName"
# Move the extracted directory into the output folder
mv "$strippedName" "$DIR/${packageName}"
fi
# Unset the stripped name to not confuse the next unpack step
unset strippedName
# Include the dependencies of the package
cd "$DIR/${packageName}"
${includeDependencies { inherit dependencies; }}
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
'';
pinpointDependencies = {dependencies, production}:
let
pinpointDependenciesFromPackageJSON = writeTextFile {
name = "pinpointDependencies.js";
text = ''
var fs = require('fs');
var path = require('path');
function resolveDependencyVersion(location, name) {
if(location == process.env['NIX_STORE']) {
return null;
} else {
var dependencyPackageJSON = path.join(location, "node_modules", name, "package.json");
if(fs.existsSync(dependencyPackageJSON)) {
var dependencyPackageObj = JSON.parse(fs.readFileSync(dependencyPackageJSON));
if(dependencyPackageObj.name == name) {
return dependencyPackageObj.version;
}
} else {
return resolveDependencyVersion(path.resolve(location, ".."), name);
}
}
}
function replaceDependencies(dependencies) {
if(typeof dependencies == "object" && dependencies !== null) {
for(var dependency in dependencies) {
var resolvedVersion = resolveDependencyVersion(process.cwd(), dependency);
if(resolvedVersion === null) {
process.stderr.write("WARNING: cannot pinpoint dependency: "+dependency+", context: "+process.cwd()+"\n");
} else {
dependencies[dependency] = resolvedVersion;
}
}
}
}
/* Read the package.json configuration */
var packageObj = JSON.parse(fs.readFileSync('./package.json'));
/* Pinpoint all dependencies */
replaceDependencies(packageObj.dependencies);
if(process.argv[2] == "development") {
replaceDependencies(packageObj.devDependencies);
}
replaceDependencies(packageObj.optionalDependencies);
/* Write the fixed package.json file */
fs.writeFileSync("package.json", JSON.stringify(packageObj, null, 2));
'';
};
in
''
node ${pinpointDependenciesFromPackageJSON} ${if production then "production" else "development"}
${lib.optionalString (dependencies != [])
''
if [ -d node_modules ]
then
cd node_modules
${lib.concatMapStrings (dependency: pinpointDependenciesOfPackage dependency) dependencies}
cd ..
fi
''}
'';
# Recursively traverses all dependencies of a package and pinpoints all
# dependencies in the package.json file to the versions that are actually
# being used.
pinpointDependenciesOfPackage = { packageName, dependencies ? [], production ? true, ... }@args:
''
if [ -d "${packageName}" ]
then
cd "${packageName}"
${pinpointDependencies { inherit dependencies production; }}
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
fi
'';
# Extract the Node.js source code which is used to compile packages with
# native bindings
nodeSources = runCommand "node-sources" {} ''
tar --no-same-owner --no-same-permissions -xf ${nodejs.src}
mv node-* $out
'';
# Script that adds _integrity fields to all package.json files to prevent NPM from consulting the cache (that is empty)
addIntegrityFieldsScript = writeTextFile {
name = "addintegrityfields.js";
text = ''
var fs = require('fs');
var path = require('path');
function augmentDependencies(baseDir, dependencies) {
for(var dependencyName in dependencies) {
var dependency = dependencies[dependencyName];
// Open package.json and augment metadata fields
var packageJSONDir = path.join(baseDir, "node_modules", dependencyName);
var packageJSONPath = path.join(packageJSONDir, "package.json");
if(fs.existsSync(packageJSONPath)) { // Only augment packages that exist. Sometimes we may have production installs in which development dependencies can be ignored
console.log("Adding metadata fields to: "+packageJSONPath);
var packageObj = JSON.parse(fs.readFileSync(packageJSONPath));
if(dependency.integrity) {
packageObj["_integrity"] = dependency.integrity;
} else {
packageObj["_integrity"] = "sha1-000000000000000000000000000="; // When no _integrity string has been provided (e.g. by Git dependencies), add a dummy one. It does not seem to harm and it bypasses downloads.
}
if(dependency.resolved) {
packageObj["_resolved"] = dependency.resolved; // Adopt the resolved property if one has been provided
} else {
packageObj["_resolved"] = dependency.version; // Set the resolved version to the version identifier. This prevents NPM from cloning Git repositories.
}
if(dependency.from !== undefined) { // Adopt from property if one has been provided
packageObj["_from"] = dependency.from;
}
fs.writeFileSync(packageJSONPath, JSON.stringify(packageObj, null, 2));
}
// Augment transitive dependencies
if(dependency.dependencies !== undefined) {
augmentDependencies(packageJSONDir, dependency.dependencies);
}
}
}
if(fs.existsSync("./package-lock.json")) {
var packageLock = JSON.parse(fs.readFileSync("./package-lock.json"));
if(![1, 2].includes(packageLock.lockfileVersion)) {
process.stderr.write("Sorry, I only understand lock file versions 1 and 2!\n");
process.exit(1);
}
if(packageLock.dependencies !== undefined) {
augmentDependencies(".", packageLock.dependencies);
}
}
'';
};
# Reconstructs a package-lock file from the node_modules/ folder structure and package.json files with dummy sha1 hashes
reconstructPackageLock = writeTextFile {
name = "addintegrityfields.js";
text = ''
var fs = require('fs');
var path = require('path');
var packageObj = JSON.parse(fs.readFileSync("package.json"));
var lockObj = {
name: packageObj.name,
version: packageObj.version,
lockfileVersion: 1,
requires: true,
dependencies: {}
};
function augmentPackageJSON(filePath, dependencies) {
var packageJSON = path.join(filePath, "package.json");
if(fs.existsSync(packageJSON)) {
var packageObj = JSON.parse(fs.readFileSync(packageJSON));
dependencies[packageObj.name] = {
version: packageObj.version,
integrity: "sha1-000000000000000000000000000=",
dependencies: {}
};
processDependencies(path.join(filePath, "node_modules"), dependencies[packageObj.name].dependencies);
}
}
function processDependencies(dir, dependencies) {
if(fs.existsSync(dir)) {
var files = fs.readdirSync(dir);
files.forEach(function(entry) {
var filePath = path.join(dir, entry);
var stats = fs.statSync(filePath);
if(stats.isDirectory()) {
if(entry.substr(0, 1) == "@") {
// When we encounter a namespace folder, augment all packages belonging to the scope
var pkgFiles = fs.readdirSync(filePath);
pkgFiles.forEach(function(entry) {
if(stats.isDirectory()) {
var pkgFilePath = path.join(filePath, entry);
augmentPackageJSON(pkgFilePath, dependencies);
}
});
} else {
augmentPackageJSON(filePath, dependencies);
}
}
});
}
}
processDependencies("node_modules", lockObj.dependencies);
fs.writeFileSync("package-lock.json", JSON.stringify(lockObj, null, 2));
'';
};
prepareAndInvokeNPM = {packageName, bypassCache, reconstructLock, npmFlags, production}:
let
forceOfflineFlag = if bypassCache then "--offline" else "--registry http://www.example.com";
in
''
# Pinpoint the versions of all dependencies to the ones that are actually being used
echo "pinpointing versions of dependencies..."
source $pinpointDependenciesScriptPath
# Patch the shebangs of the bundled modules to prevent them from
# calling executables outside the Nix store as much as possible
patchShebangs .
# Deploy the Node.js package by running npm install. Since the
# dependencies have been provided already by ourselves, it should not
# attempt to install them again, which is good, because we want to make
# it Nix's responsibility. If it needs to install any dependencies
# anyway (e.g. because the dependency parameters are
# incomplete/incorrect), it fails.
#
# The other responsibilities of NPM are kept -- version checks, build
# steps, postprocessing etc.
export HOME=$TMPDIR
cd "${packageName}"
runHook preRebuild
${lib.optionalString bypassCache ''
${lib.optionalString reconstructLock ''
if [ -f package-lock.json ]
then
echo "WARNING: Reconstruct lock option enabled, but a lock file already exists!"
echo "This will most likely result in version mismatches! We will remove the lock file and regenerate it!"
rm package-lock.json
else
echo "No package-lock.json file found, reconstructing..."
fi
node ${reconstructPackageLock}
''}
node ${addIntegrityFieldsScript}
''}
npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${lib.optionalString production "--production"} rebuild
if [ "''${dontNpmInstall-}" != "1" ]
then
# NPM tries to download packages even when they already exist if npm-shrinkwrap is used.
rm -f npm-shrinkwrap.json
npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${lib.optionalString production "--production"} install
fi
'';
# Builds and composes an NPM package including all its dependencies
buildNodePackage =
{ name
, packageName
, version
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, preRebuild ? ""
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" "dontStrip" "dontNpmInstall" "preRebuild" "unpackPhase" "buildPhase" ];
in
stdenv.mkDerivation ({
name = "node_${name}-${version}";
buildInputs = [ tarWrapper python nodejs ]
++ lib.optional (stdenv.isLinux) utillinux
++ lib.optional (stdenv.isDarwin) libtool
++ buildInputs;
inherit nodejs;
inherit dontStrip; # Stripping may fail a build for some package deployments
inherit dontNpmInstall preRebuild unpackPhase buildPhase;
compositionScript = composePackage args;
pinpointDependenciesScript = pinpointDependenciesOfPackage args;
passAsFile = [ "compositionScript" "pinpointDependenciesScript" ];
installPhase = ''
# Create and enter a root node_modules/ folder
mkdir -p $out/lib/node_modules
cd $out/lib/node_modules
# Compose the package and all its dependencies
source $compositionScriptPath
${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }}
# Create symlink to the deployed executable folder, if applicable
if [ -d "$out/lib/node_modules/.bin" ]
then
ln -s $out/lib/node_modules/.bin $out/bin
fi
# Create symlinks to the deployed manual page folders, if applicable
if [ -d "$out/lib/node_modules/${packageName}/man" ]
then
mkdir -p $out/share
for dir in "$out/lib/node_modules/${packageName}/man/"*
do
mkdir -p $out/share/man/$(basename "$dir")
for page in "$dir"/*
do
ln -s $page $out/share/man/$(basename "$dir")
done
done
fi
# Run post install hook, if provided
runHook postInstall
'';
} // extraArgs);
# Builds a node environment (a node_modules folder and a set of binaries)
buildNodeDependencies =
{ name
, packageName
, version
, src
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" ];
in
stdenv.mkDerivation ({
name = "node-dependencies-${name}-${version}";
buildInputs = [ tarWrapper python nodejs ]
++ lib.optional (stdenv.isLinux) utillinux
++ lib.optional (stdenv.isDarwin) libtool
++ buildInputs;
inherit dontStrip; # Stripping may fail a build for some package deployments
inherit dontNpmInstall unpackPhase buildPhase;
includeScript = includeDependencies { inherit dependencies; };
pinpointDependenciesScript = pinpointDependenciesOfPackage args;
passAsFile = [ "includeScript" "pinpointDependenciesScript" ];
installPhase = ''
mkdir -p $out/${packageName}
cd $out/${packageName}
source $includeScriptPath
# Create fake package.json to make the npm commands work properly
cp ${src}/package.json .
chmod 644 package.json
${lib.optionalString bypassCache ''
if [ -f ${src}/package-lock.json ]
then
cp ${src}/package-lock.json .
fi
''}
# Go to the parent folder to make sure that all packages are pinpointed
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }}
# Expose the executables that were installed
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
mv ${packageName} lib
ln -s $out/lib/node_modules/.bin $out/bin
'';
} // extraArgs);
# Builds a development shell
buildNodeShell =
{ name
, packageName
, version
, src
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
nodeDependencies = buildNodeDependencies args;
in
stdenv.mkDerivation {
name = "node-shell-${name}-${version}";
buildInputs = [ python nodejs ] ++ lib.optional (stdenv.isLinux) utillinux ++ buildInputs;
buildCommand = ''
mkdir -p $out/bin
cat > $out/bin/shell <<EOF
#! ${stdenv.shell} -e
$shellHook
exec ${stdenv.shell}
EOF
chmod +x $out/bin/shell
'';
# Provide the dependencies in a development shell through the NODE_PATH environment variable
inherit nodeDependencies;
shellHook = lib.optionalString (dependencies != []) ''
export NODE_PATH=${nodeDependencies}/lib/node_modules
export PATH="${nodeDependencies}/bin:$PATH"
'';
};
in
{
buildNodeSourceDist = lib.makeOverridable buildNodeSourceDist;
buildNodePackage = lib.makeOverridable buildNodePackage;
buildNodeDependencies = lib.makeOverridable buildNodeDependencies;
buildNodeShell = lib.makeOverridable buildNodeShell;
}
================================================
FILE: pkgs/jsdoc/node-packages.nix
================================================
# This file has been generated by node2nix 1.9.0. Do not edit!
{nodeEnv, fetchurl, fetchgit, nix-gitignore, stdenv, lib, globalBuildInputs ? []}:
let
sources = {
"@babel/parser-7.12.15" = {
name = "_at_babel_slash_parser";
packageName = "@babel/parser";
version = "7.12.15";
src = fetchurl {
url = "https://registry.npmjs.org/@babel/parser/-/parser-7.12.15.tgz";
sha512 = "AQBOU2Z9kWwSZMd6lNjCX0GUgFonL1wAM1db8L8PMk9UDaGsRCArBkU4Sc+UCM3AE4hjbXx+h58Lb3QT4oRmrA==";
};
};
"argparse-1.0.10" = {
name = "argparse";
packageName = "argparse";
version = "1.0.10";
src = fetchurl {
url = "https://registry.npmjs.org/argparse/-/argparse-1.0.10.tgz";
sha512 = "o5Roy6tNG4SL/FOkCAN6RzjiakZS25RLYFrcMttJqbdd8BWrnA+fGz57iN5Pb06pvBGvl5gQ0B48dJlslXvoTg==";
};
};
"bluebird-3.7.2" = {
name = "bluebird";
packageName = "bluebird";
version = "3.7.2";
src = fetchurl {
url = "https://registry.npmjs.org/bluebird/-/bluebird-3.7.2.tgz";
sha512 = "XpNj6GDQzdfW+r2Wnn7xiSAd7TM3jzkxGXBGTtWKuSXv1xUV+azxAm8jdWZN06QTQk+2N2XB9jRDkvbmQmcRtg==";
};
};
"catharsis-0.8.11" = {
name = "catharsis";
packageName = "catharsis";
version = "0.8.11";
src = fetchurl {
url = "https://registry.npmjs.org/catharsis/-/catharsis-0.8.11.tgz";
sha512 = "a+xUyMV7hD1BrDQA/3iPV7oc+6W26BgVJO05PGEoatMyIuPScQKsde6i3YorWX1qs+AZjnJ18NqdKoCtKiNh1g==";
};
};
"entities-2.0.3" = {
name = "entities";
packageName = "entities";
version = "2.0.3";
src = fetchurl {
url = "https://registry.npmjs.org/entities/-/entities-2.0.3.tgz";
sha512 = "MyoZ0jgnLvB2X3Lg5HqpFmn1kybDiIfEQmKzTb5apr51Rb+T3KdmMiqa70T+bhGnyv7bQ6WMj2QMHpGMmlrUYQ==";
};
};
"escape-string-regexp-2.0.0" = {
name = "escape-string-regexp";
packageName = "escape-string-regexp";
version = "2.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-2.0.0.tgz";
sha512 = "UpzcLCXolUWcNu5HtVMHYdXJjArjsF9C0aNnquZYY4uW/Vu0miy5YoWvbV345HauVvcAUnpRuhMMcqTcGOY2+w==";
};
};
"graceful-fs-4.2.5" = {
name = "graceful-fs";
packageName = "graceful-fs";
version = "4.2.5";
src = fetchurl {
url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.5.tgz";
sha512 = "kBBSQbz2K0Nyn+31j/w36fUfxkBW9/gfwRWdUY1ULReH3iokVJgddZAFcD1D0xlgTmFxJCbUkUclAlc6/IDJkw==";
};
};
"js2xmlparser-4.0.1" = {
name = "js2xmlparser";
packageName = "js2xmlparser";
version = "4.0.1";
src = fetchurl {
url = "https://registry.npmjs.org/js2xmlparser/-/js2xmlparser-4.0.1.tgz";
sha512 = "KrPTolcw6RocpYjdC7pL7v62e55q7qOMHvLX1UCLc5AAS8qeJ6nukarEJAF2KL2PZxlbGueEbINqZR2bDe/gUw==";
};
};
"klaw-3.0.0" = {
name = "klaw";
packageName = "klaw";
version = "3.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/klaw/-/klaw-3.0.0.tgz";
sha512 = "0Fo5oir+O9jnXu5EefYbVK+mHMBeEVEy2cmctR1O1NECcCkPRreJKrS6Qt/j3KC2C148Dfo9i3pCmCMsdqGr0g==";
};
};
"linkify-it-2.2.0" = {
name = "linkify-it";
packageName = "linkify-it";
version = "2.2.0";
src = fetchurl {
url = "https://registry.npmjs.org/linkify-it/-/linkify-it-2.2.0.tgz";
sha512 = "GnAl/knGn+i1U/wjBz3akz2stz+HrHLsxMwHQGofCDfPvlf+gDKN58UtfmUquTY4/MXeE2x7k19KQmeoZi94Iw==";
};
};
"lodash-4.17.20" = {
name = "lodash";
packageName = "lodash";
version = "4.17.20";
src = fetchurl {
url = "https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz";
sha512 = "PlhdFcillOINfeV7Ni6oF1TAEayyZBoZ8bcshTHqOYJYlrqzRK5hagpagky5o4HfCzzd1TRkXPMFq6cKk9rGmA==";
};
};
"markdown-it-10.0.0" = {
name = "markdown-it";
packageName = "markdown-it";
version = "10.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/markdown-it/-/markdown-it-10.0.0.tgz";
sha512 = "YWOP1j7UbDNz+TumYP1kpwnP0aEa711cJjrAQrzd0UXlbJfc5aAq0F/PZHjiioqDC1NKgvIMX+o+9Bk7yuM2dg==";
};
};
"markdown-it-anchor-5.3.0" = {
name = "markdown-it-anchor";
packageName = "markdown-it-anchor";
version = "5.3.0";
src = fetchurl {
url = "https://registry.npmjs.org/markdown-it-anchor/-/markdown-it-anchor-5.3.0.tgz";
sha512 = "/V1MnLL/rgJ3jkMWo84UR+K+jF1cxNG1a+KwqeXqTIJ+jtA8aWSHuigx8lTzauiIjBDbwF3NcWQMotd0Dm39jA==";
};
};
"marked-0.8.2" = {
name = "marked";
packageName = "marked";
version = "0.8.2";
src = fetchurl {
url = "https://registry.npmjs.org/marked/-/marked-0.8.2.tgz";
sha512 = "EGwzEeCcLniFX51DhTpmTom+dSA/MG/OBUDjnWtHbEnjAH180VzUeAw+oE4+Zv+CoYBWyRlYOTR0N8SO9R1PVw==";
};
};
"mdurl-1.0.1" = {
name = "mdurl";
packageName = "mdurl";
version = "1.0.1";
src = fetchurl {
url = "https://registry.npmjs.org/mdurl/-/mdurl-1.0.1.tgz";
sha1 = "fe85b2ec75a59037f2adfec100fd6c601761152e";
};
};
"mkdirp-1.0.4" = {
name = "mkdirp";
packageName = "mkdirp";
version = "1.0.4";
src = fetchurl {
url = "https://registry.npmjs.org/mkdirp/-/mkdirp-1.0.4.tgz";
sha512 = "vVqVZQyf3WLx2Shd0qJ9xuvqgAyKPLAiqITEtqW0oIUjzo3PePDd6fW9iFz30ef7Ysp/oiWqbhszeGWW2T6Gzw==";
};
};
"requizzle-0.2.3" = {
name = "requizzle";
packageName = "requizzle";
version = "0.2.3";
src = fetchurl {
url = "https://registry.npmjs.org/requizzle/-/requizzle-0.2.3.tgz";
sha512 = "YanoyJjykPxGHii0fZP0uUPEXpvqfBDxWV7s6GKAiiOsiqhX6vHNyW3Qzdmqp/iq/ExbhaGbVrjB4ruEVSM4GQ==";
};
};
"sprintf-js-1.0.3" = {
name = "sprintf-js";
packageName = "sprintf-js";
version = "1.0.3";
src = fetchurl {
url = "https://registry.npmjs.org/sprintf-js/-/sprintf-js-1.0.3.tgz";
sha1 = "04e6926f662895354f3dd015203633b857297e2c";
};
};
"strip-json-comments-3.1.1" = {
name = "strip-json-comments";
packageName = "strip-json-comments";
version = "3.1.1";
src = fetchurl {
url = "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz";
sha512 = "6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==";
};
};
"taffydb-2.6.2" = {
name = "taffydb";
packageName = "taffydb";
version = "2.6.2";
src = fetchurl {
url = "https://registry.npmjs.org/taffydb/-/taffydb-2.6.2.tgz";
sha1 = "7cbcb64b5a141b6a2efc2c5d2c67b4e150b2a268";
};
};
"uc.micro-1.0.6" = {
name = "uc.micro";
packageName = "uc.micro";
version = "1.0.6";
src = fetchurl {
url = "https://registry.npmjs.org/uc.micro/-/uc.micro-1.0.6.tgz";
sha512 = "8Y75pvTYkLJW2hWQHXxoqRgV7qb9B+9vFEtidML+7koHUFapnVJAZ6cKs+Qjz5Aw3aZWHMC6u0wJE3At+nSGwA==";
};
};
"underscore-1.10.2" = {
name = "underscore";
packageName = "underscore";
version = "1.10.2";
src = fetchurl {
url = "https://registry.npmjs.org/underscore/-/underscore-1.10.2.tgz";
sha512 = "N4P+Q/BuyuEKFJ43B9gYuOj4TQUHXX+j2FqguVOpjkssLUUrnJofCcBccJSCoeturDoZU6GorDTHSvUDlSQbTg==";
};
};
"xmlcreate-2.0.3" = {
name = "xmlcreate";
packageName = "xmlcreate";
version = "2.0.3";
src = fetchurl {
url = "https://registry.npmjs.org/xmlcreate/-/xmlcreate-2.0.3.tgz";
sha512 = "HgS+X6zAztGa9zIK3Y3LXuJes33Lz9x+YyTxgrkIdabu2vqcGOWwdfCpf1hWLRrd553wd4QCDf6BBO6FfdsRiQ==";
};
};
};
in
{
jsdoc = nodeEnv.buildNodePackage {
name = "jsdoc";
packageName = "jsdoc";
version = "3.6.6";
src = fetchurl {
url = "https://registry.npmjs.org/jsdoc/-/jsdoc-3.6.6.tgz";
sha512 = "znR99e1BHeyEkSvgDDpX0sTiTu+8aQyDl9DawrkOGZTTW8hv0deIFXx87114zJ7gRaDZKVQD/4tr1ifmJp9xhQ==";
};
dependencies = [
sources."@babel/parser-7.12.15"
sources."argparse-1.0.10"
sources."bluebird-3.7.2"
sources."catharsis-0.8.11"
sources."entities-2.0.3"
sources."escape-string-regexp-2.0.0"
sources."graceful-fs-4.2.5"
sources."js2xmlparser-4.0.1"
sources."klaw-3.0.0"
sources."linkify-it-2.2.0"
sources."lodash-4.17.20"
sources."markdown-it-10.0.0"
sources."markdown-it-anchor-5.3.0"
sources."marked-0.8.2"
sources."mdurl-1.0.1"
sources."mkdirp-1.0.4"
sources."requizzle-0.2.3"
sources."sprintf-js-1.0.3"
sources."strip-json-comments-3.1.1"
sources."taffydb-2.6.2"
sources."uc.micro-1.0.6"
sources."underscore-1.10.2"
sources."xmlcreate-2.0.3"
];
buildInputs = globalBuildInputs;
meta = {
description = "An API documentation generator for JavaScript.";
homepage = "https://github.com/jsdoc/jsdoc#readme";
license = "Apache-2.0";
};
production = true;
bypassCache = true;
reconstructLock = true;
};
}
================================================
FILE: pkgs/jsdoc/package.json
================================================
["jsdoc"]
================================================
FILE: pkgs/lib/default.nix
================================================
{ pkgs }:
let
update = import ./update.nix { inherit pkgs; };
in
{ inherit update; }
// update
================================================
FILE: pkgs/lib/update.nix
================================================
{ pkgs }:
let
inherit (pkgs) cacert nix jq curl gnused gnugrep coreutils;
in {
updateFromGitHub = { owner, repo, path, branch }: ''
export SSL_CERT_FILE=${cacert}/etc/ssl/certs/ca-bundle.crt
github_rev() {
${curl.bin}/bin/curl -sSf "https://api.github.com/repos/$1/$2/branches/$3" | \
${jq}/bin/jq '.commit.sha' | \
${gnused}/bin/sed 's/"//g'
}
github_sha256() {
${nix}/bin/nix-prefetch-url \
--unpack \
--type sha256 \
"https://github.com/$1/$2/archive/$3.tar.gz" 2>&1 | \
tail -1
}
echo "=== ${owner}/${repo}@${branch} ==="
echo -n "Looking up latest revision ... "
rev=$(github_rev "${owner}" "${repo}" "${branch}");
echo "revision is \`$rev\`."
sha256=$(github_sha256 "${owner}" "${repo}" "$rev");
echo "sha256 is \`$sha256\`."
if [ "$sha256" == "" ]; then
echo "sha256 is not valid!"
exit 2
fi
source_file=${path}
echo "Content of source file (``$source_file``) written."
cat <<REPO | ${coreutils}/bin/tee "$source_file"
{
"owner": "${owner}",
"repo": "${repo}",
"rev": "$rev",
"sha256": "$sha256"
}
REPO
echo
'';
}
================================================
FILE: pkgs/nixpkgs.json
================================================
{
"owner": "NixOS",
"repo": "nixpkgs-channels",
"rev": "ed070354a9e307fdf20a94cb2af749738562385d",
"sha256": "05pqwg7s4w34v99yykb27031kc21x4n3f33szdi6wv11k4asjyfp"
}
================================================
FILE: pkgs/phlay/default.nix
================================================
{ fetchFromGitHub
, python3Packages
}:
python3Packages.buildPythonApplication rec {
name = "phlay-${version}";
version = "0.2.3";
src = fetchFromGitHub {
owner = "mystor";
repo = "phlay";
rev = "98fcbead18c785db24a4b62fad4a8a525b81f8e1";
sha256 = "1m5c7lq12pgcaab4xrifzi0axaxpx24kb9x2f017pb5ni7lbcg3s";
};
meta = {
description = "A command-line interface for Phabricator";
longDescription = ''
Phlay is an alternative to Arcanist for submitting changes to Phabricator.
You might like Phlay if you do Mozilla development using git and
a "commit series" workflow.
'';
};
# phlay is designed as a single-file Python script with no
# dependencies outside the stdlib.
format = "other";
installPhase = "mkdir -p $out/bin; cp phlay $out/bin";
}
================================================
FILE: pkgs/servo/default.nix
================================================
{ servoSrc ? null
, lib
, rustPlatform
, pkgs
}:
let
inherit (lib) updateFromGitHub;
inherit (pkgs) fetchFromGitHub curl dbus fontconfig freeglut freetype
gperf libxmi llvm mesa mesa_glu openssl pkgconfig makeWrapper writeText
xorg;
inherit (pkgs.stdenv) mkDerivation;
inherit (pkgs.lib) importJSON;
inherit (rustPlatform) buildRustPackage;
inherit (rustPlatform.rust) rustc cargo;
pythonPackages = pkgs.python3Packages;
src =
if servoSrc == null then
fetchFromGitHub (importJSON ./source.json)
else
servoSrc;
# TODO: figure out version from servoSrc
version = "latest";
# TODO: add possibility to test against wayland
xorgCompositorLibs = "${xorg.libXcursor.out}/lib:${xorg.libXi.out}/lib";
servobuild = writeText "servobuild" ''
[tools]
cache-dir = "./downloads"
cargo-home-dir = "./.downloads/clones
system-rust = true
rust-root = "${rustc}/bin/rustc"
system-cargo = true
cargo-root = "${cargo}/bin/cargo"
[build]
'';
servoRust = buildRustPackage rec {
inherit src;
name = "servo-rust-${version}";
postUnpack = ''
pwd
ls -la
exit 100
'';
sourceRoot = "servo/components/servo";
depsSha256 = "0ca0lc8mm8kczll5m03n5fwsr0540c2xbfi4nn9ksn0s4sap50yn";
doCheck = false;
};
in mkDerivation rec {
name = "servo-${version}";
src = servoSrc;
buildInputs = [
#cmake
curl
dbus
fontconfig
freeglut
freetype
gperf
libxmi
llvm
mesa
mesa_glu
openssl
pkgconfig
pythonPackages.pip
pythonPackages.virtualenv
xorg.libX11
xorg.libXmu
# nix stuff
makeWrapper
servoRust
];
preConfigure = ''
ln -s ${servobuild} .servobuild
'';
postInstall = ''
wrapProgram "$out/bin/servo" --prefix LD_LIBRARY_PATH : "${xorgCompositorLibs}"
'';
shellHook = ''
# Servo tries to switch between libX11 and wayland at runtime so we have
# to provide a path
export LD_LIBRARY_PATH=${xorgCompositorLibs}:$LD_LIBRARY_PATH
'';
passthru.updateScript = updateFromGitHub {
owner = "servo";
repo = "servo";
branch = "master";
path = "pkgs/servo/source.json";
};
}
================================================
FILE: release.nix
================================================
# To pin a specific version of nixpkgs, change the nixpkgsSrc argument.
{ nixpkgsSrc ? <nixpkgs>
, supportedSystems ? [ "x86_64-linux" "i686-linux" /* "x86_64-darwin" */
"aarch64-linux"
]
}:
let
lib = (import nixpkgsSrc {}).lib;
# Make an attribute set for each system, the builder is then specialized to
# use the selected system.
forEachSystem = systems: builder /* system -> stdenv -> pkgs */:
lib.genAttrs systems builder;
# Make an attribute set for each compiler, the builder is then be specialized
# to use the selected compiler.
forEachCompiler = compilers: builder: system:
builtins.listToAttrs (map (compiler: {
name = compiler;
value = builder compiler system;
}) compilers);
# Overide the previous derivation, with a different stdenv.
builder = path: compiler: system:
lib.getAttrFromPath path (import nixpkgsSrc {
inherit system;
overlays = [
# Add all packages from nixpkgs-mozilla.
(import ./default.nix)
# Define customStdenvs, which is a set of various compilers which can be
# used to compile the given package against.
(import ./compilers-overlay.nix)
# Use the following overlay to override the requested package from
# nixpkgs, with a custom stdenv taken from the compilers-overlay.
(self: super:
if compiler == null then {}
else lib.setAttrByPath path ((lib.getAttrFromPath path super).override {
stdenv = self.customStdenvs."${compiler}";
}))
];
});
build = path: { systems ? supportedSystems, compilers ? null }:
forEachSystem systems (
if compilers == null
then builder path null
else forEachCompiler compilers (builder path)
);
geckoCompilers = [
"clang"
"clang36"
"clang37"
"clang38"
"clang5"
"clang6"
"clang7"
"clang12"
"clang13"
"gcc"
"gcc6"
"gcc5"
"gcc49"
"gcc48"
#"gcc474"
#"gcc473"
#"gcc472"
];
jobs = {
# For each system, and each compiler, create an attribute with the name of
# the system and compiler. Use this attribute name to select which
# environment you are interested in for building firefox. These can be
# build using the following command:
#
# $ nix-build release.nix -A gecko.x86_64-linux.clang -o firefox-x64
# $ nix-build release.nix -A gecko.i686-linux.gcc48 -o firefox-x86
#
# If you are only interested in getting a build environment, the use the
# nix-shell command instead, which will skip the copy of Firefox sources,
# and pull the the dependencies needed for building firefox with this
# environment.
#
# $ nix-shell release.nix -A gecko.i686-linux.gcc --pure --command '$CC --version'
# $ nix-shell release.nix -A gecko.x86_64-linux.clang --pure
#
# As some of the test script of Gecko are checking against absolute path, a
# fake-FHS is provided for Gecko. It can be accessed by appending
# ".fhs.env" behind the previous commands:
#
# $ nix-shell release.nix -A gecko.x86_64-linux.gcc.fhs.env
#
# Which will spawn a new shell where the closure of everything used to build
# Gecko would be part of the fake-root.
gecko = build [ "devEnv" "gecko" ] { compilers = geckoCompilers; };
latest = {
"firefox-nightly-bin" = build [ "latest" "firefox-nightly-bin" ];
};
git-cinnabar = build [ "git-cinnabar" ];
};
in jobs
================================================
FILE: rust-overlay-install.sh
================================================
#!/bin/sh -e
cd "$(dirname "$0")" || exit
overlay_dir=$HOME/.config/nixpkgs/overlays
name=rust-overlay.nix
echo Installing $name as an overlay
set -x
mkdir -p "$overlay_dir"
ln -s "$PWD/$name" "$overlay_dir/$name"
================================================
FILE: rust-overlay.nix
================================================
# This file provide a Rust overlay, which provides pre-packaged bleeding edge versions of rustc
# and cargo.
self: super:
let
fromTOML =
# nix 2.1 added the fromTOML builtin
if builtins ? fromTOML
then builtins.fromTOML
else (import ./lib/parseTOML.nix).fromTOML;
parseRustToolchain = file: with builtins;
if file == null then
{ }
# Parse *.toml files as TOML
else if self.lib.strings.hasSuffix ".toml" file then
({ channel ? null, date ? null, ... }: { inherit channel date; })
(fromTOML (readFile file)).toolchain
else
# Otherwise, assume the file contains just a rust version string
let
str = readFile file;
# Match toolchain descriptions of type "nightly" or "nightly-2020-01-01"
channel_by_name = match "([a-z]+)(-([0-9]{4}-[0-9]{2}-[0-9]{2}))?.*" str;
# Match toolchain descriptions of type "1.34.0" or "1.34.0-2019-04-10"
channel_by_version = match "([0-9]+\\.[0-9]+\\.[0-9]+)(-([0-9]{4}-[0-9]{2}-[0-9]{2}))?.*" str;
in
(x: { channel = head x; date = (head (tail (tail x))); }) (
if channel_by_name != null then
channel_by_name
else
channel_by_version
);
# In NixOS 24.11, the `pkgs.rust.toRustTarget` has become deprecated in favor of the
# `.rust.rustcTarget` attribute of the platform. This function provides backwards compatibility in
# case the caller is using a nixpkgs older than NixOS 24.11.
toRustTargetCompat = platform:
if platform ? rust && platform.rust ? rustcTarget
then platform.rust.rustcTarget
else super.rust.toRustTarget platform;
# See https://github.com/rust-lang-nursery/rustup.rs/blob/master/src/dist/src/dist.rs
defaultDistRoot = "https://static.rust-lang.org";
manifest_v1_url = {
dist_root ? defaultDistRoot + "/dist",
date ? null,
staging ? false,
# A channel can be "nightly", "beta", "stable", or "\d{1}\.\d{1,3}\.\d{1,2}".
channel ? "nightly",
# A path that points to a rust-toolchain file, typically ./rust-toolchain.
rustToolchain ? null,
...
}:
let args = { inherit channel date; } // parseRustToolchain rustToolchain; in
let inherit (args) date channel; in
if date == null && staging == false
then "${dist_root}/channel-rust-${channel}"
else if date != null && staging == false
then "${dist_root}/${date}/channel-rust-${channel}"
else if date == null && staging == true
then "${dist_root}/staging/channel-rust-${channel}"
else throw "not a real-world case";
manifest_v2_url = args: (manifest_v1_url args) + ".toml";
getComponentsWithFixedPlatform = pkgs: pkgname: stdenv:
let
pkg = pkgs.${pkgname};
srcInfo = pkg.target.${toRustTargetCompat stdenv.targetPlatform} or pkg.target."*";
components = srcInfo.components or [];
componentNamesList =
builtins.map (pkg: pkg.pkg) (builtins.filter (pkg: (pkg.target != "*")) components);
in
componentNamesList;
getExtensions = pkgs: pkgname: stdenv:
let
inherit (super.lib) unique;
pkg = pkgs.${pkgname};
srcInfo = pkg.target.${toRustTargetCompat stdenv.targetPlatform} or pkg.target."*";
extensions = srcInfo.extensions or [];
extensionNamesList = unique (builtins.map (pkg: pkg.pkg) extensions);
in
extensionNamesList;
hasTarget = pkgs: pkgname: target:
pkgs ? ${pkgname}.target.${target};
getTuples = pkgs: name: targets:
builtins.map (target: { inherit name target; }) (builtins.filter (target: hasTarget pkgs name target) targets);
# In the manifest, a package might have different components which are bundled with it, as opposed as the extensions which can be added.
# By default, a package will include the components for the same architecture, and offers them as extensions for other architectures.
#
# This functions returns a list of { name, target } attribute sets, which includes the current system package, and all its components for the selected targets.
# The list contains the package for the pkgTargets as well as the packages for components for all compTargets
getTargetPkgTuples = pkgs: pkgname: pkgTargets: compTargets: stdenv:
let
inherit (builtins) elem;
inherit (super.lib) intersectLists;
components = getComponentsWithFixedPlatform pkgs pkgname stdenv;
extensions = getExtensions pkgs pkgname stdenv;
compExtIntersect = intersectLists components extensions;
tuples = (getTuples pkgs pkgname pkgTargets) ++ (builtins.map (name: getTuples pkgs name compTargets) compExtIntersect);
in
tuples;
getFetchUrl = pkgs: pkgname: target: stdenv: fetchurl:
let
pkg = pkgs.${pkgname};
srcInfo = pkg.target.${target};
in
(super.fetchurl { url = srcInfo.xz_url or srcInfo.url; sha256 = srcInfo.xz_hash or srcInfo.hash; });
checkMissingExtensions = pkgs: pkgname: stdenv: extensions:
let
inherit (builtins) head;
inherit (super.lib) concatStringsSep subtractLists;
availableExtensions = getExtensions pkgs pkgname stdenv;
missingExtensions = subtractLists availableExtensions extensions;
extensionsToInstall =
if missingExtensions == [] then extensions else throw ''
While compiling ${pkgname}: the extension ${head missingExtensions} is not available.
Select extensions from the following list:
${concatStringsSep "\n" availableExtensions}'';
in
extensionsToInstall;
getComponents = pkgs: pkgname: targets: extensions: targetExtensions: stdenv: fetchurl:
let
inherit (builtins) head map;
inherit (super.lib) flatten remove subtractLists unique;
targetExtensionsToInstall = checkMissingExtensions pkgs pkgname stdenv targetExtensions;
extensionsToInstall = checkMissingExtensions pkgs pkgname stdenv extensions;
hostTargets = [ "*" (toRustTargetCompat stdenv.hostPlatform) (toRustTargetCompat stdenv.targetPlatform) ];
pkgTuples = flatten (getTargetPkgTuples pkgs pkgname hostTargets targets stdenv);
extensionTuples = flatten (map (name: getTargetPkgTuples pkgs name hostTargets targets stdenv) extensionsToInstall);
targetExtensionTuples = flatten (map (name: getTargetPkgTuples pkgs name targets targets stdenv) targetExtensionsToInstall);
pkgsTuples = pkgTuples ++ extensionTuples ++ targetExtensionTuples;
missingTargets = subtractLists (map (tuple: tuple.target) pkgsTuples) (remove "*" targets);
pkgsTuplesToInstall =
if missingTargets == [] then pkgsTuples else throw ''
While compiling ${pkgname}: the target ${head missingTargets} is not available for any package.'';
in
map (tuple: { name = tuple.name; src = (getFetchUrl pkgs tuple.name tuple.target stdenv fetchurl); }) pkgsTuplesToInstall;
installComponents = stdenv: namesAndSrcs:
let
inherit (builtins) map;
installComponent = name: src:
stdenv.mkDerivation {
inherit name;
inherit src;
# No point copying src to a build server, then copying back the
# entire unpacked contents after just a little twiddling.
preferLocalBuild = true;
# (@nbp) TODO: Check on Windows and Mac.
# This code is inspired by patchelf/setup-hook.sh to iterate over all binaries.
installPhase = ''
patchShebangs install.sh
CFG_DISABLE_LDCONFIG=1 ./install.sh --prefix=$out --verbose
setInterpreter() {
local dir="$1"
[ -e "$dir" ] || return 0
echo "Patching interpreter of ELF executables and libraries in $dir"
local i
while IFS= read -r -d ''$'\0' i; do
if [[ "$i" =~ .build-id ]]; then continue; fi
if ! isELF "$i"; then continue; fi
echo "setting interpreter of $i"
if [[ -x "$i" ]]; then
# Handle executables
patchelf \
--set-interpreter "$(cat $NIX_CC/nix-support/dynamic-linker)" \
--set-rpath "${super.lib.makeLibraryPath [ self.zlib ]}:$out/lib" \
"$i" || true
else
# Handle libraries
patchelf \
--set-rpath "${super.lib.makeLibraryPath [ self.zlib ]}:$out/lib" \
"$i" || true
fi
done < <(find "$dir" -type f -print0)
}
setInterpreter $out
'';
postFixup = ''
# Function moves well-known files from etc/
handleEtc() {
local oldIFS="$IFS"
# Directories we are aware of, given as substitution lists
for paths in \
"etc/bash_completion.d","share/bash_completion/completions","etc/bash_completions.d","share/bash_completions/completions";
do
# Some directoties may be missing in some versions. If so we just skip them.
# See https://github.com/mozilla/nixpkgs-mozilla/issues/48 for more infomation.
if [ ! -e $paths ]; then continue; fi
IFS=","
set -- $paths
IFS="$oldIFS"
local orig_path="$1"
local wanted_path="$2"
# Rename the files
if [ -d ./"$orig_path" ]; then
mkdir -p "$(dirname ./"$wanted_path")"
fi
mv -v ./"$orig_path" ./"$wanted_path"
# Fail explicitly if etc is not empty so we can add it to the list and/or report it upstream
rmdir ./etc || {
echo Installer tries to install to /etc:
find ./etc
exit 1
}
done
}
if [ -d "$out"/etc ]; then
pushd "$out"
handleEtc
popd
fi
'';
dontStrip = true;
};
in
map (nameAndSrc: (installComponent nameAndSrc.name nameAndSrc.src)) namesAndSrcs;
# Manifest files are organized as follow:
# { date = "2017-03-03";
# pkg.cargo.version= "0.18.0-nightly (5db6d64 2017-03-03)";
# pkg.cargo.target.x86_64-unknown-linux-gnu = {
# available = true;
# hash = "abce..."; # sha256
# url = "https://static.rust-lang.org/dist/....tar.gz";
# xz_hash = "abce..."; # sha256
# xz_url = "https://static.rust-lang.org/dist/....tar.xz";
# };
# }
#
# The packages available usually are:
# cargo, rust-analysis, rust-docs, rust-src, rust-std, rustc, and
# rust, which aggregates them in one package.
#
# For each package the following options are available:
# extensions - The extensions that should be installed for the package.
# For example, install the package rust and add the extension rust-src.
# targets - The package will always be installed for the host system, but with this option
# extra targets can be specified, e.g. "mips-unknown-linux-musl". The target
# will only apply to components of the package that support being installed for
# a different architecture. For example, the rust package will install rust-std
# for the host system and the targets.
# targetExtensions - If you want to force extensions to be installed for the given targets, this is your option.
# All extensions in this list will be installed for the target architectures.
# *Attention* If you want to install an extension like rust-src, that has no fixed architecture (arch *),
# you will need to specify this extension in the extensions options or it will not be installed!
fromManifestFile = manifest: { stdenv, lib, fetchurl, patchelf }:
let
inherit (builtins) elemAt;
inherit (super) makeOverridable;
inherit (super.lib) flip mapAttrs;
pkgs = fromTOML (builtins.readFile manifest);
in
flip mapAttrs pkgs.pkg (name: pkg:
makeOverridable ({extensions, targets, targetExtensions}:
let
version' = builtins.match "([^ ]*) [(]([^ ]*) ([^ ]*)[)]" pkg.version;
version = "${elemAt version' 0}-${elemAt version' 2}-${elemAt version' 1}";
namesAndSrcs = getComponents pkgs.pkg name targets extensions targetExtensions stdenv fetchurl;
components = installComponents stdenv namesAndSrcs;
componentsOuts = builtins.map (comp: (super.lib.strings.escapeNixString (super.lib.getOutput "out" comp))) components;
in
super.pkgs.symlinkJoin {
name = name + "-" + version;
paths = components;
postBuild = ''
# If rustc or rustdoc is in the derivation, we need to copy their
# executable into the final derivation. This is required
# for making them find the correct SYSROOT.
# Similarly, we copy the python files for gdb pretty-printers since
# its auto-load-safe-path mechanism doesn't like symlinked files.
for target in $out/bin/{rustc,rustdoc} $out/lib/rustlib/etc/*.py; do
if [ -e $target ]; then
cp --remove-destination "$(realpath -e $target)" $target
# The SYSROOT is determined by using the librustc_driver-*.so.
# So, we need to point to the *.so files in our derivation.
chmod u+w $target
patchelf --set-rpath "$out/lib" $target || true
fi
done
# Here we copy the librustc_driver-*.so to our derivation.
# The SYSROOT is determined based on the path of this library.
if test "" != $out/lib/librustc_driver-*.so &> /dev/null; then
RUSTC_DRIVER_PATH=$(realpath -e $out/lib/librustc_driver-*.so)
rm $out/lib/librustc_driver-*.so
cp $RUSTC_DRIVER_PATH $out/lib/
fi
'';
# Export the manifest file as part of the nix-support files such
# that one can compute the sha256 of a manifest to freeze it for
# reproducible builds.
MANIFEST_FILE = manifest;
postInstall = ''
mkdir $out/nix-support
cp $MANIFEST_FILE $out/nix-support/manifest.toml
'';
# Add the compiler as part of the propagated build inputs in order
# to run:
#
# $ nix-shell -p rustChannels.stable.rust
#
# And get a fully working Rust compiler, with the stdenv linker.
propagatedBuildInputs = [ stdenv.cc ];
meta.platforms = lib.platforms.all;
}
) { extensions = []; targets = []; targetExtensions = []; }
);
fromManifest = sha256: manifest: { stdenv, lib, fetchurl, patchelf }:
let manifestFile = if sha256 == null then builtins.fetchurl manifest else fetchurl { url = manifest; inherit sha256; };
in fromManifestFile manifestFile { inherit stdenv lib fetchurl patchelf; };
in
rec {
lib = super.lib // {
inherit fromTOML;
rustLib = {
inherit fromManifest fromManifestFile manifest_v2_url;
};
};
rustChannelOf = { sha256 ? null, ... } @ manifest_args: fromManifest
sha256 (manifest_v2_url manifest_args)
{ inherit (super) lib;
inherit (self) stdenv fetchurl patchelf;
} ;
# Set of packages which are automagically updated. Do not rely on these for
# reproducible builds.
latest = (super.latest or {}) // {
rustChannels = {
nightly = rustChannelOf { channel = "nightly"; };
beta = rustChannelOf { channel = "beta"; };
stable = rustChannelOf { channel = "stable"; };
};
};
# Helper builder
rustChannelOfTargets = channel: date: targets:
(rustChannelOf { inherit channel date; })
.rust.override { inherit targets; };
# For backward compatibility
rustChannels = latest.rustChannels;
# For each channel:
# latest.rustChannels.nightly.cargo
# latest.rustChannels.nightly.rust # Aggregate all others. (recommended)
# latest.rustChannels.nightly.rustc
# latest.rustChannels.nightly.rust-analysis
# latest.rustChannels.nightly.rust-docs
# latest.rustChannels.nightly.rust-src
# latest.rustChannels.nightly.rust-std
# For a specific date:
# (rustChannelOf { date = "2017-06-06"; channel = "beta"; }).rust
}
================================================
FILE: rust-src-overlay.nix
================================================
# Overlay that builds on top of rust-overlay.nix.
# Adds rust-src component to all channels which is helpful for racer, intellij, ...
self: super:
let mapAttrs = super.lib.mapAttrs;
flip = super.lib.flip;
in {
# install stable rust with rust-src:
# "nix-env -i -A nixos.latest.rustChannels.stable.rust"
latest.rustChannels =
flip mapAttrs super.latest.rustChannels (name: value: value // {
rust = value.rust.override {
extensions = ["rust-src"];
};
});
}
================================================
FILE: update.nix
================================================
let
_pkgs = import <nixpkgs> {};
_nixpkgs = _pkgs.fetchFromGitHub (_pkgs.lib.importJSON ./pkgs/nixpkgs.json);
in
{ pkgs ? import _nixpkgs {}
, package ? null
, maintainer ? null
, dont_prompt ? false
}:
# TODO: add assert statements
let
pkgs-mozilla = import ./default.nix { inherit pkgs; };
dont_prompt_str = if dont_prompt then "yes" else "no";
packagesWith = cond: return: set:
pkgs.lib.flatten
(pkgs.lib.mapAttrsToList
(name: pkg:
let
result = builtins.tryEval (
if pkgs.lib.isDerivation pkg && cond name pkg
then [(return name pkg)]
else if pkg.recurseForDerivations or false || pkg.recurseForRelease or false
then packagesWith cond return pkg
else []
);
in
if result.success then result.value
else []
)
set
);
packagesWithUpdateScriptAndMaintainer = maintainer':
let
maintainer =
if ! builtins.hasAttr maintainer' pkgs.lib.maintainers then
builtins.throw "Maintainer with name `${maintainer'} does not exist in `lib/maintainers.nix`."
else
builtins.getAttr maintainer' pkgs.lib.maintainers;
in
packagesWith (name: pkg: builtins.hasAttr "updateScript" pkg &&
(if builtins.hasAttr "maintainers" pkg.meta
then (if builtins.isList pkg.meta.maintainers
then builtins.elem maintainer pkg.meta.maintainers
else maintainer == pkg.meta.maintainers
)
else false
)
)
(name: pkg: pkg)
pkgs-mozilla;
packageByName = name:
let
package = pkgs.lib.attrByPath (pkgs.lib.splitString "." name) null pkgs-mozilla;
in
if package == null then
builtins.throw "Package with an attribute name `${name}` does not exists."
else if ! builtins.hasAttr "updateScript" package then
builtins.throw "Package with an attribute name `${name}` does have an `passthru.updateScript` defined."
else
package;
packages =
if package != null then
[ (packageByName package) ]
else if maintainer != null then
packagesWithUpdateScriptAndMaintainer maintainer
else
builtins.throw "No arguments provided.\n\n${helpText}";
helpText = ''
Please run:
% nix-shell maintainers/scripts/update.nix --argstr maintainer garbas
to run all update scripts for all packages that lists \`garbas\` as a maintainer
and have \`updateScript\` defined, or:
% nix-shell maintainers/scripts/update.nix --argstr package garbas
to run update script for specific package.
'';
runUpdateScript = package: ''
echo -ne " - ${package.name}: UPDATING ..."\\r
${package.updateScript} &> ${(builtins.parseDrvName package.name).name}.log
CODE=$?
if [ "$CODE" != "0" ]; then
echo " - ${package.name}: ERROR "
echo ""
echo "--- SHOWING ERROR LOG FOR ${package.name} ----------------------"
echo ""
cat ${(builtins.parseDrvName package.name).name}.log
echo ""
echo "--- SHOWING ERROR LOG FOR ${package.name} ----------------------"
exit $CODE
else
rm ${(builtins.parseDrvName package.name).name}.log
fi
echo " - ${package.name}: DONE. "
'';
in pkgs.stdenv.mkDerivation {
name = "nixpkgs-mozilla-update-script";
buildCommand = ''
echo ""
echo "----------------------------------------------------------------"
echo ""
echo "Not possible to update packages using \`nix-build\`"
echo ""
echo "${helpText}"
echo "----------------------------------------------------------------"
exit 1
'';
shellHook = ''
echo ""
echo "Going to be running update for following packages:"
echo "${builtins.concatStringsSep "\n" (map (x: " - ${x.name}") packages)}"
echo ""
if [ "${dont_prompt_str}" = "no" ]; then
read -n1 -r -p "Press space to continue..." confirm
else
confirm=""
fi
if [ "$confirm" = "" ]; then
echo ""
echo "Running update for:"
${builtins.concatStringsSep "\n" (map runUpdateScript packages)}
echo ""
echo "Packages updated!"
exit 0
else
echo "Aborting!"
exit 1
fi
'';
}
gitextract_gw3xxd2j/ ├── .gitignore ├── .travis.yml ├── CODE_OF_CONDUCT.md ├── LICENSE ├── README.rst ├── compilers-overlay.nix ├── default.nix ├── deploy_rsa.enc ├── firefox-overlay.nix ├── flake.nix ├── git-cinnabar-overlay.nix ├── lib/ │ └── parseTOML.nix ├── lib-overlay.nix ├── overlays.nix ├── package-set.nix ├── phlay-overlay.nix ├── pinned.nix ├── pkgs/ │ ├── cbindgen/ │ │ └── default.nix │ ├── clang/ │ │ └── bug-14435.patch │ ├── firefox-nightly-bin/ │ │ └── update.nix │ ├── gcc-4.7/ │ │ ├── arm-eabi.patch │ │ ├── builder.sh │ │ ├── default.nix │ │ ├── gfortran-driving.patch │ │ ├── gnat-cflags.patch │ │ ├── java-jvgenmain-link.patch │ │ ├── libstdc++-target.patch │ │ ├── no-sys-dirs.patch │ │ └── parallel-bconfig-4.7.patch │ ├── gecko/ │ │ ├── default.nix │ │ └── source.json │ ├── git-cinnabar/ │ │ └── default.nix │ ├── jsdoc/ │ │ ├── default.nix │ │ ├── node-env.nix │ │ ├── node-packages.nix │ │ └── package.json │ ├── lib/ │ │ ├── default.nix │ │ └── update.nix │ ├── nixpkgs.json │ ├── phlay/ │ │ └── default.nix │ └── servo/ │ └── default.nix ├── release.nix ├── rust-overlay-install.sh ├── rust-overlay.nix ├── rust-src-overlay.nix └── update.nix
Condensed preview — 46 files, each showing path, character count, and a content snippet. Download the .json file or copy for the full structured content (170K chars).
[
{
"path": ".gitignore",
"chars": 9,
"preview": "/result*\n"
},
{
"path": ".travis.yml",
"chars": 922,
"preview": "language: nix\naddons:\n ssh_known_hosts: floki.garbas.si\nenv:\n- STDENV=clang\n- STDENV=clang36\n- STDENV=clang37\n- STDENV="
},
{
"path": "CODE_OF_CONDUCT.md",
"chars": 691,
"preview": "# Community Participation Guidelines\n\nThis repository is governed by Mozilla's code of conduct and etiquette guidelines."
},
{
"path": "LICENSE",
"chars": 1047,
"preview": "Copyright 2017 Mozilla\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software an"
},
{
"path": "README.rst",
"chars": 11537,
"preview": "nixpkgs-mozilla\n===============\n\nGathering nix efforts in one repository.\n\n\nCurrent packages\n----------------\n\n- gecko ("
},
{
"path": "compilers-overlay.nix",
"chars": 4033,
"preview": "# This overlays add a customStdenv attribute which provide an stdenv with\n# different versions of the compilers. This ca"
},
{
"path": "default.nix",
"chars": 213,
"preview": "# Nixpkgs overlay which aggregates overlays for tools and products, used and\n# published by Mozilla.\nself: super:\n\nwith "
},
{
"path": "firefox-overlay.nix",
"chars": 10780,
"preview": "# This file provide the latest binary versions of Firefox published by Mozilla.\nself: super:\n\nlet\n # This URL needs to "
},
{
"path": "flake.nix",
"chars": 378,
"preview": "{\n description = \"Mozilla overlay for Nixpkgs\";\n\n outputs = { self, ... }: {\n # Default overlay.\n overlay = impo"
},
{
"path": "git-cinnabar-overlay.nix",
"chars": 187,
"preview": "self: super:\n\n{\n git-cinnabar = super.callPackage ./pkgs/git-cinnabar {\n # we need urllib to recognize ssh.\n # py"
},
{
"path": "lib/parseTOML.nix",
"chars": 6641,
"preview": "with builtins;\n\n# Tokenizer.\nlet\n layout_pat = \"[ \\n]+\";\n layout_pat_opt = \"[ \\n]*\";\n token_pat = ''=|[[][[][a-zA-Z0-"
},
{
"path": "lib-overlay.nix",
"chars": 89,
"preview": "self: super:\n\n{\n lib = super.lib // (import ./pkgs/lib/default.nix { pkgs = self; });\n}\n"
},
{
"path": "overlays.nix",
"chars": 98,
"preview": "[\n ./lib-overlay.nix\n ./rust-overlay.nix\n ./firefox-overlay.nix\n ./git-cinnabar-overlay.nix\n]\n"
},
{
"path": "package-set.nix",
"chars": 168,
"preview": "{ pkgs }:\n\nwith pkgs.lib;\nlet\n self = foldl'\n (prev: overlay: prev // (overlay (pkgs // self) (pkgs // prev)))\n {"
},
{
"path": "phlay-overlay.nix",
"chars": 63,
"preview": "self: super:\n\n{\n phlay = super.callPackage ./pkgs/phlay {};\n}\n"
},
{
"path": "pinned.nix",
"chars": 1132,
"preview": "# This script extends nixpkgs with mozilla packages.\n#\n# First it imports the <nixpkgs> in the environment and depends o"
},
{
"path": "pkgs/cbindgen/default.nix",
"chars": 1205,
"preview": "### NOTE: This file is a copy of the one from Nixpkgs repository\n### (taken 2020 February) from commit 82d9ce45fe0b67e37"
},
{
"path": "pkgs/clang/bug-14435.patch",
"chars": 1661,
"preview": "diff -x _inst -x _build -x .svn -ur libcxx.old/include/cstdio libcxx.new/include/cstdio\n--- libcxx.old/include/cstdio "
},
{
"path": "pkgs/firefox-nightly-bin/update.nix",
"chars": 1166,
"preview": "{ name\n, writeScript\n, xidel\n, coreutils\n, gnused\n, gnugrep\n, curl\n, jq\n}:\n\nlet\n version = (builtins.parseDrvName name)"
},
{
"path": "pkgs/gcc-4.7/arm-eabi.patch",
"chars": 11362,
"preview": "Index: gcc-4_7-branch/libstdc++-v3/configure.host\n===================================================================\n--"
},
{
"path": "pkgs/gcc-4.7/builder.sh",
"chars": 8920,
"preview": "source $stdenv/setup\n\n\nexport NIX_FIXINC_DUMMY=$NIX_BUILD_TOP/dummy\nmkdir $NIX_FIXINC_DUMMY\n\n\nif test \"$staticCompiler\" "
},
{
"path": "pkgs/gcc-4.7/default.nix",
"chars": 21781,
"preview": "{ stdenv, lib, fetchurl, noSysDirs\n, langC ? true, langCC ? true, langFortran ? false\n, langJava ? false\n, langAda ? fal"
},
{
"path": "pkgs/gcc-4.7/gfortran-driving.patch",
"chars": 717,
"preview": "This patch fixes interaction with Libtool.\nSee <http://thread.gmane.org/gmane.comp.gcc.patches/258777>, for details.\n\n--"
},
{
"path": "pkgs/gcc-4.7/gnat-cflags.patch",
"chars": 1343,
"preview": "diff --git a/libada/Makefile.in b/libada/Makefile.in\nindex f5057a0..337e0c6 100644\n--- a/libada/Makefile.in\n+++ b/libada"
},
{
"path": "pkgs/gcc-4.7/java-jvgenmain-link.patch",
"chars": 700,
"preview": "The `jvgenmain' executable must be linked against `vec.o', among others,\nsince it uses its vector API.\n\n--- gcc-4.3.3/gc"
},
{
"path": "pkgs/gcc-4.7/libstdc++-target.patch",
"chars": 1517,
"preview": "Patch to make the target libraries 'configure' scripts find the proper CPP.\nI noticed that building the mingw32 cross co"
},
{
"path": "pkgs/gcc-4.7/no-sys-dirs.patch",
"chars": 1659,
"preview": "diff -ru gcc-4.3.1-orig/gcc/cppdefault.c gcc-4.3.1/gcc/cppdefault.c\n--- gcc-4.3.1-orig/gcc/cppdefault.c\t2007-07-26 10:37"
},
{
"path": "pkgs/gcc-4.7/parallel-bconfig-4.7.patch",
"chars": 1374,
"preview": "diff --git a/gcc/Makefile.in b/gcc/Makefile.in\nindex 0f6735a..ba93e9b 100644\n--- a/gcc/Makefile.in\n+++ b/gcc/Makefile.in"
},
{
"path": "pkgs/gecko/default.nix",
"chars": 6715,
"preview": "{ geckoSrc ? null, lib\n, stdenv, fetchFromGitHub, pythonFull, which, autoconf213, m4\n, perl, unzip, zip, gnumake, yasm, "
},
{
"path": "pkgs/gecko/source.json",
"chars": 169,
"preview": "{\n \"owner\": \"mozilla\",\n \"repo\": \"gecko-dev\",\n \"rev\": \"fee636af734a0ce6dc7335691cc94664bafc385d\",\n \"sha256\": \"0nnkqmg"
},
{
"path": "pkgs/git-cinnabar/default.nix",
"chars": 2287,
"preview": "{ stdenv, fetchFromGitHub, autoconf\n, zlib\n, python\n, perl\n, gettext\n, git\n, mercurial\n, curl\n}:\n\n# NOTE: git-cinnabar d"
},
{
"path": "pkgs/jsdoc/default.nix",
"chars": 502,
"preview": "# This file has been generated by node2nix 1.9.0. Do not edit!\n\n{pkgs ? import <nixpkgs> {\n inherit system;\n }, syst"
},
{
"path": "pkgs/jsdoc/node-env.nix",
"chars": 20335,
"preview": "# This file originates from node2nix\n\n{lib, stdenv, nodejs, python2, pkgs, libtool, runCommand, writeTextFile}:\n\nlet\n #"
},
{
"path": "pkgs/jsdoc/node-packages.nix",
"chars": 9231,
"preview": "# This file has been generated by node2nix 1.9.0. Do not edit!\n\n{nodeEnv, fetchurl, fetchgit, nix-gitignore, stdenv, lib"
},
{
"path": "pkgs/jsdoc/package.json",
"chars": 10,
"preview": "[\"jsdoc\"]\n"
},
{
"path": "pkgs/lib/default.nix",
"chars": 102,
"preview": "{ pkgs }:\n\nlet\n update = import ./update.nix { inherit pkgs; };\nin\n { inherit update; }\n // update\n"
},
{
"path": "pkgs/lib/update.nix",
"chars": 1216,
"preview": "{ pkgs }:\n\nlet\n inherit (pkgs) cacert nix jq curl gnused gnugrep coreutils;\nin {\n\n updateFromGitHub = { owner, repo, p"
},
{
"path": "pkgs/nixpkgs.json",
"chars": 174,
"preview": "{\n \"owner\": \"NixOS\",\n \"repo\": \"nixpkgs-channels\",\n \"rev\": \"ed070354a9e307fdf20a94cb2af749738562385d\",\n \"sha256\": \"05"
},
{
"path": "pkgs/phlay/default.nix",
"chars": 805,
"preview": "{ fetchFromGitHub\n, python3Packages\n}:\npython3Packages.buildPythonApplication rec {\n name = \"phlay-${version}\";\n versi"
},
{
"path": "pkgs/servo/default.nix",
"chars": 2206,
"preview": "{ servoSrc ? null\n, lib\n, rustPlatform\n, pkgs\n}:\n\nlet\n\n inherit (lib) updateFromGitHub;\n inherit (pkgs) fetchFromGitHu"
},
{
"path": "release.nix",
"chars": 3499,
"preview": "# To pin a specific version of nixpkgs, change the nixpkgsSrc argument.\n{ nixpkgsSrc ? <nixpkgs>\n, supportedSystems ? [ "
},
{
"path": "rust-overlay-install.sh",
"chars": 218,
"preview": "#!/bin/sh -e\n\ncd \"$(dirname \"$0\")\" || exit\n\noverlay_dir=$HOME/.config/nixpkgs/overlays\nname=rust-overlay.nix\n\necho Insta"
},
{
"path": "rust-overlay.nix",
"chars": 16700,
"preview": "# This file provide a Rust overlay, which provides pre-packaged bleeding edge versions of rustc\n# and cargo.\nself: super"
},
{
"path": "rust-src-overlay.nix",
"chars": 496,
"preview": "# Overlay that builds on top of rust-overlay.nix.\n# Adds rust-src component to all channels which is helpful for racer, "
},
{
"path": "update.nix",
"chars": 4545,
"preview": "let\n _pkgs = import <nixpkgs> {};\n _nixpkgs = _pkgs.fetchFromGitHub (_pkgs.lib.importJSON ./pkgs/nixpkgs.json);\nin\n\n{ "
}
]
// ... and 1 more files (download for full content)
About this extraction
This page contains the full source code of the mozilla/nixpkgs-mozilla GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 46 files (156.8 KB), approximately 46.2k tokens. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.
Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.