Repository: mozilla/nixpkgs-mozilla Branch: master Commit: 58356aa021c5 Files: 46 Total size: 156.8 KB Directory structure: gitextract_gw3xxd2j/ ├── .gitignore ├── .travis.yml ├── CODE_OF_CONDUCT.md ├── LICENSE ├── README.rst ├── compilers-overlay.nix ├── default.nix ├── deploy_rsa.enc ├── firefox-overlay.nix ├── flake.nix ├── git-cinnabar-overlay.nix ├── lib/ │ └── parseTOML.nix ├── lib-overlay.nix ├── overlays.nix ├── package-set.nix ├── phlay-overlay.nix ├── pinned.nix ├── pkgs/ │ ├── cbindgen/ │ │ └── default.nix │ ├── clang/ │ │ └── bug-14435.patch │ ├── firefox-nightly-bin/ │ │ └── update.nix │ ├── gcc-4.7/ │ │ ├── arm-eabi.patch │ │ ├── builder.sh │ │ ├── default.nix │ │ ├── gfortran-driving.patch │ │ ├── gnat-cflags.patch │ │ ├── java-jvgenmain-link.patch │ │ ├── libstdc++-target.patch │ │ ├── no-sys-dirs.patch │ │ └── parallel-bconfig-4.7.patch │ ├── gecko/ │ │ ├── default.nix │ │ └── source.json │ ├── git-cinnabar/ │ │ └── default.nix │ ├── jsdoc/ │ │ ├── default.nix │ │ ├── node-env.nix │ │ ├── node-packages.nix │ │ └── package.json │ ├── lib/ │ │ ├── default.nix │ │ └── update.nix │ ├── nixpkgs.json │ ├── phlay/ │ │ └── default.nix │ └── servo/ │ └── default.nix ├── release.nix ├── rust-overlay-install.sh ├── rust-overlay.nix ├── rust-src-overlay.nix └── update.nix ================================================ FILE CONTENTS ================================================ ================================================ FILE: .gitignore ================================================ /result* ================================================ FILE: .travis.yml ================================================ language: nix addons: ssh_known_hosts: floki.garbas.si env: - STDENV=clang - STDENV=clang36 - STDENV=clang37 - STDENV=clang38 - STDENV=gcc - STDENV=gcc49 - STDENV=gcc48 script: - if [ "$TRAVIS_EVENT_TYPE" == "cron" ]; then nix-shell update.nix --pure; fi - if [ "$TRAVIS_PULL_REQUEST" != "true" -a "$TRAVIS_BRANCH" = "master" ]; then nix-build 
release.nix -A gecko."x86_64-linux"."$STDENV"; mkdir nars/; nix-push --dest "$PWD/nars/" --force ./result; fi before_install: - openssl aes-256-cbc -K $encrypted_be02022e0814_key -iv $encrypted_be02022e0814_iv -in deploy_rsa.enc -out deploy_rsa -d before_deploy: - eval "$(ssh-agent -s)" - chmod 600 $TRAVIS_BUILD_DIR/deploy_rsa - ssh-add $TRAVIS_BUILD_DIR/deploy_rsa deploy: provider: script skip_cleanup: true script: rsync -avh --ignore-existing $TRAVIS_BUILD_DIR/nars/ travis@floki.garbas.si:/var/travis/nixpkgs-mozilla/ on: branch: master ================================================ FILE: CODE_OF_CONDUCT.md ================================================ # Community Participation Guidelines This repository is governed by Mozilla's code of conduct and etiquette guidelines. For more details, please read the [Mozilla Community Participation Guidelines](https://www.mozilla.org/about/governance/policies/participation/). ## How to Report For more information on how to report violations of the Community Participation Guidelines, please read our '[How to Report](https://www.mozilla.org/about/governance/policies/participation/reporting/)' page. ================================================ FILE: LICENSE ================================================ Copyright 2017 Mozilla Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ================================================ FILE: README.rst ================================================ nixpkgs-mozilla =============== Gathering nix efforts in one repository. Current packages ---------------- - gecko (https://github.com/mozilla/gecko-dev) - firefox-bin variants including Nightly firefox-bin variants -------------------- Nixpkgs already has definitions for `firefox `_, which is built from source, as well as `firefox-bin `_, which is the binary Firefox version built by Mozilla. The ``firefox-overlay.nix`` in this repository adds definitions for some other firefox-bin variants that Mozilla ships: ``firefox-nightly-bin``, ``firefox-beta-bin``, and ``firefox-esr-bin``. All are exposed under a ``latest`` attribute, e.g. ``latest.firefox-nightly-bin``. Unfortunately, these variants do not auto-update, and you may see some annoying pop-ups complaining about this. Note that all the ``-bin`` packages are "unfree" (because of the Firefox trademark, held by Mozilla), so you will need to set ``nixpkgs.config.allowUnfree`` in order to use them. More info `here `_. Rust overlay ------------ **NOTE:** Nix overlays only work on up-to-date versions of NixOS/nixpkgs, starting from 17.03. A nixpkgs overlay is provided which contains all of the latest Rust releases. To use the Rust overlay, run the ``./rust-overlay-install.sh`` command. It will link the current ``./rust-overlay.nix`` into your ``~/.config/nixpkgs/overlays`` folder.
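If you prefer not to run the script, its effect can be reproduced by hand. The following is a sketch, not the script itself: it assumes it is executed from a checkout of this repository, and the destination file name is an arbitrary choice (Nixpkgs picks up every file in the overlays folder).

```shell
# Manual equivalent of ./rust-overlay-install.sh (a sketch):
# Nixpkgs scans ~/.config/nixpkgs/overlays for overlay files.
mkdir -p ~/.config/nixpkgs/overlays
# Symlink the overlay from the current checkout into that folder.
ln -sf "$PWD/rust-overlay.nix" ~/.config/nixpkgs/overlays/rust-overlay.nix
```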
Once this is done, use ``nix-env -iA nixpkgs.latest.rustChannels.nightly.rust``, for example. Replace the ``nixpkgs.`` prefix with ``nixos.`` on NixOS. Using in nix expressions ------------------------ Example of use in ``shell.nix``: .. code:: nix let moz_overlay = import (builtins.fetchTarball https://github.com/mozilla/nixpkgs-mozilla/archive/master.tar.gz); nixpkgs = import <nixpkgs> { overlays = [ moz_overlay ]; }; in with nixpkgs; stdenv.mkDerivation { name = "moz_overlay_shell"; buildInputs = [ # to use the latest nightly: nixpkgs.latest.rustChannels.nightly.rust # to use a specific nightly: (nixpkgs.rustChannelOf { date = "2018-04-11"; channel = "nightly"; }).rust # to use the project's rust-toolchain file: (nixpkgs.rustChannelOf { rustToolchain = ./rust-toolchain; }).rust ]; } Flake usage ----------- This repository contains a minimal flake interface for the various overlays in this repository. To use it in your own flake, add it as an input to your ``flake.nix``: .. code:: nix { inputs.nixpkgs.url = github:NixOS/nixpkgs; inputs.nixpkgs-mozilla.url = github:mozilla/nixpkgs-mozilla; outputs = { self, nixpkgs, nixpkgs-mozilla }: { devShell."x86_64-linux" = let pkgs = import nixpkgs { system = "x86_64-linux"; overlays = [ nixpkgs-mozilla.overlay ]; }; in pkgs.mkShell { buildInputs = [ pkgs.latest.rustChannels.nightly.rust ]; }; }; } The available overlays are ``nixpkgs-mozilla.overlay``, the default overlay containing everything, and the individual overlays ``nixpkgs-mozilla.overlays.{lib, rust, firefox, git-cinnabar}``. Depending on your use case, you might need to set the ``--impure`` flag when invoking the ``nix`` command, because this repository fetches resources from non-pinned URLs non-reproducibly. Using Custom Version of Firefox ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ To use a custom version of Firefox, you have multiple choices. One is to provide the full file information, such as the URL and the checksum of the file. ..
code:: nix { inputs.nixpkgs.url = github:NixOS/nixpkgs/nixos-unstable; inputs.nixpkgs-mozilla.url = github:mozilla/nixpkgs-mozilla; outputs = { self, nixpkgs, nixpkgs-mozilla }: { devShell."x86_64-linux" = let pkgs = import nixpkgs { system = "x86_64-linux"; overlays = [ nixpkgs-mozilla.overlay ]; }; firefox-nightly150 = pkgs.lib.firefoxOverlay.firefoxVersion { info = { url = "https://download.cdn.mozilla.net/pub/firefox/nightly/2026/03/2026-03-05-00-23-19-mozilla-central/firefox-150.0a1.en-US.linux-x86_64.tar.xz"; sha512 = "8faa93d786d618963a7e5f5bf16488523aac2faeb9a27051b1002a44f56a31c93af72db16a00af9ec97786666ff0498a7fc9e95480e967ff1aa55346e57fcef3"; verifiedByHand = true; }; }; in pkgs.mkShell { buildInputs = [ firefox-nightly150 ]; }; }; } In this example, the checksum is taken from the checksums file, which is in the same directory as the compressed package. In this particular case, as this is a nightly version, we could instead have set the release attribute to ``false``, with the version and the timestamp of the directory: .. code:: nix firefox-nightly150 = pkgs.lib.firefoxOverlay.firefoxVersion { release = false; version = "150.0a1"; timestamp = "2026-03-05-00-23-19"; }; And we could have omitted the timestamp if we only cared about the latest nightly version. To grab a beta or release version, we only have to specify the version number and set the release attribute to ``true``: .. code:: nix firefox-beta149 = pkgs.lib.firefoxOverlay.firefoxVersion { release = true; version = "149.0b1"; }; Firefox Development Environment ------------------------------- This repository provides several tools to facilitate development on Firefox. Firefox is built on an engine called Gecko, which lends its name to some of the files and derivations in this repo. Checking out Firefox ~~~~~~~~~~~~~~~~~~~~ To build Firefox from source, it is best to have a local checkout of ``mozilla-central``.
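For instance, a fresh checkout can be created with Mercurial. This transcript is a sketch, not part of the repository's tooling, and the clone downloads several gigabytes:

```sh
~/$ hg clone https://hg.mozilla.org/mozilla-central
~/$ cd mozilla-central
```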
``mozilla-central`` is hosted in Mercurial, but some people prefer to access it using ``git`` and ``git-cinnabar``. The tools in this repo support using either Mercurial or Git. This repository provides a ``git-cinnabar-overlay.nix`` which defines a ``git-cinnabar`` derivation. This overlay can be used to install ``git-cinnabar``, either using ``nix-env`` or as part of a system-wide ``configuration.nix``. Building Firefox ~~~~~~~~~~~~~~~~ The ``firefox-overlay.nix`` provides an environment to build Firefox from its sources, once you have finished the checkout of ``mozilla-central``. You can use ``nix-shell`` to enter this environment and launch ``mach`` commands to build Firefox and test your build. Some debugging tools are available in this environment as well, but other development tools (such as those used to submit changes for review) are outside the scope of this environment. The ``nix-shell`` environment is available in the ``gecko.<arch>.<compiler>`` attribute of the ``release.nix`` file provided in this repository. The ``<arch>`` attribute is either ``x86_64-linux`` or ``i686-linux``. The first one creates a native toolchain for compiling on x64, while the second one gives a native toolchain for compiling on x86. Note that due to the size of the compilation units on x86, the compilation might not be able to complete, but some subparts of Gecko, such as SpiderMonkey, would compile fine. The ``<compiler>`` attribute is either ``gcc`` or ``clang``, or any specific version of the compiler available in the ``compilers-overlay.nix`` file, which is repeated in ``release.nix``. This compiler is only used for compiling Gecko; the rest of the toolchain is compiled against the default ``stdenv`` of the architecture. When first entering the ``nix-shell``, the toolchain will pull and build all the dependencies necessary to build Gecko; this might take some time. This work will not be necessary the second time, unless you use a different toolchain or architecture. ..
code:: sh ~/$ cd mozilla-central ~/mozilla-central$ nix-shell ../nixpkgs-mozilla/release.nix -A gecko.x86_64-linux.gcc --pure ... pull the rust compiler ... compile the toolchain # First time only - initialize virtualenv [~/mozilla-central] python ./mach create-mach-environment ... create .mozbuild/_virtualenvs/mach [~/mozilla-central] python ./mach build ... build firefox desktop [~/mozilla-central] python ./mach run ... run firefox When entering the ``nix-shell``, the ``MOZCONFIG`` environment variable is set to a local file, named ``.mozconfig.nix-shell``, created each time you enter the ``nix-shell``. You can create your own ``.mozconfig`` file which extends the default one, with your own options. .. code:: sh ~/mozilla-central$ nix-shell ../nixpkgs-mozilla/release.nix -A gecko.x86_64-linux.gcc --pure [~/mozilla-central] cat .mozconfig # Import current nix-shell config. . .mozconfig.nix-shell ac_add_options --enable-js-shell ac_add_options --disable-tests [~/mozilla-central] export MOZCONFIG="$(pwd)/.mozconfig" [~/mozilla-central] python ./mach build To avoid repeating yourself, you can also rely on the ``NIX_SHELL_HOOK`` environment variable, to reset the ``MOZCONFIG`` environment variable for you. .. code:: sh ~/mozilla-central$ export NIX_SHELL_HOOK="export MOZCONFIG=$(pwd)/.mozconfig;" ~/mozilla-central$ nix-shell ../nixpkgs-mozilla/release.nix -A gecko.x86_64-linux.gcc --pure [~/mozilla-central] python ./mach build Submitting Firefox patches ~~~~~~~~~~~~~~~~~~~~~~~~~~ Firefox development happens in `Mozilla Phabricator `_. Mozilla Phabricator docs are `here `_. To get your commits into Phabricator, some options include: - Arcanist, the upstream tool for interacting with Phabricator. Arcanist is packaged in nixpkgs already; you can find it in `nixos.arcanist`. Unfortunately, as of this writing, upstream Arcanist does not support ``git-cinnabar`` (according to `the "Setting up Arcanist" `_ documentation). 
`Mozilla maintains a fork of Arcanist `_ but it isn't yet packaged. (PRs welcome.) - `moz-phab `_, an in-house CLI for Phabricator. It's available in nix packages (unstable channel). - `phlay `_, a small Python script that speaks to the Phabricator API directly. This repository ships a ``phlay-overlay.nix`` that you can use to make ``phlay`` available in a nix-shell or nix-env. Note: although the ``nix-shell`` from the previous section may have all the tools you would normally use to do Firefox development, it isn't recommended that you use that shell for anything besides tasks that involve running ``mach``. Other development tasks, such as committing code and submitting patches to code review, are best handled in a separate nix-shell. TODO ---- - setup hydra to have binary channels - make sure pinned revisions get updated automatically (if build passes we should update revisions in default.nix) - pin to specific (working) nixpkgs revision (as we do for other sources) - can we make this work on darwin as well? - assign maintainers for our packages who will monitor that they "always" build - hook it with vulnix report to monitor CVEs (once vulnix is ready, it must be ready soon :P) ================================================ FILE: compilers-overlay.nix ================================================ # This overlay adds a customStdenv attribute which provides a stdenv with # different versions of the compilers. This can be used to test Gecko builds # against different compiler settings, or different compiler versions. # # See the release.nix "builder" function to understand how these different # stdenvs are used.
self: super: let noSysDirs = (super.stdenv.system != "x86_64-darwin" && super.stdenv.system != "x86_64-freebsd" && super.stdenv.system != "i686-freebsd" && super.stdenv.system != "x86_64-kfreebsd-gnu"); crossSystem = null; gcc473 = super.wrapCC (super.callPackage ./pkgs/gcc-4.7 (with self; { inherit noSysDirs; texinfo = texinfo4; # I'm not sure if profiling with enableParallelBuilding helps a lot. # We can enable it back some day. This makes the *gcc* builds faster now. profiledCompiler = false; # When building `gcc.crossDrv' (a "Canadian cross", with host == target # and host != build), `cross' must be null but the cross-libc must still # be passed. cross = null; libcCross = if crossSystem != null then libcCross else null; libpthreadCross = if crossSystem != null && crossSystem.config == "i586-pc-gnu" then gnu.libpthreadCross else null; })); # By default wrapCC keep the same header files, but NixOS is using the # latest header files from GCC, which are not supported by clang, because # clang implement a different set of locking primitives than GCC. This # expression is used to wrap clang with a matching verion of the libc++. maybeWrapClang = cc: cc; /* if cc ? clang then clangWrapCC cc else cc; */ clangWrapCC = llvmPackages: let libcxx = super.lib.overrideDerivation llvmPackages.libcxx (drv: { # https://bugzilla.mozilla.org/show_bug.cgi?id=1277619 # https://llvm.org/bugs/show_bug.cgi?id=14435 patches = drv.patches ++ [ ./pkgs/clang/bug-14435.patch ]; }); in super.callPackage { cc = llvmPackages.clang-unwrapped or llvmPackages.clang; isClang = true; stdenv = self.clangStdenv; libc = self.glibc; # cc-wrapper pulls gcc headers, which are not compatible with features # implemented in clang. These packages are used to override that. 
extraPackages = [ self.libcxx llvmPackages.libcxxabi ]; nativeTools = false; nativeLibc = false; }; buildWithCompiler = cc: super.stdenvAdapters.overrideCC self.stdenv (maybeWrapClang cc); chgCompilerSource = cc: name: src: cc.override (conf: if conf ? gcc then # Nixpkgs 14.12 { gcc = super.lib.overrideDerivation conf.gcc (old: { inherit name src; }); } else # Nixpkgs 15.05 { cc = super.lib.overrideDerivation conf.cc (old: { inherit name src; }); } ); compilersByName = with self; { clang = llvmPackages.clang; clang36 = llvmPackages_36.clang; clang37 = llvmPackages_37.clang; clang38 = llvmPackages_38.clang; # not working yet. clang5 = llvmPackages_5.clang or llvmPackages.clang; clang6 = llvmPackages_6.clang or llvmPackages.clang; clang7 = llvmPackages_7.clang or llvmPackages.clang; clang12 = llvmPackages_12.clang or llvmPackages.clang; clang13 = llvmPackages_13.clang or llvmPackages.clang; gcc = gcc; gcc6 = gcc6; gcc5 = gcc5; gcc49 = gcc49; gcc48 = gcc48; gcc474 = chgCompilerSource gcc473 "gcc-4.7.4" (fetchurl { url = "mirror://gnu/gcc/gcc-4.7.4/gcc-4.7.4.tar.bz2"; sha256 = "10k2k71kxgay283ylbbhhs51cl55zn2q38vj5pk4k950qdnirrlj"; }); gcc473 = gcc473; # Version used on Linux slaves, except Linux x64 ASAN. gcc472 = chgCompilerSource gcc473 "gcc-4.7.2" (fetchurl { url = "mirror://gnu/gcc/gcc-4.7.2/gcc-4.7.2.tar.bz2"; sha256 = "115h03hil99ljig8lkrq4qk426awmzh0g99wrrggxf8g07bq74la"; }); }; in { customStdenvs = super.lib.mapAttrs (name: value: buildWithCompiler value) compilersByName; } ================================================ FILE: default.nix ================================================ # Nixpkgs overlay which aggregates overlays for tools and products, used and # published by Mozilla. 
self: super: with super.lib; (foldl' (flip extends) (_: super) (map import (import ./overlays.nix))) self ================================================ FILE: firefox-overlay.nix ================================================ # This file provide the latest binary versions of Firefox published by Mozilla. self: super: let # This URL needs to be updated about every 2 years when the subkey is rotated. pgpKey = super.fetchurl { url = "https://download.cdn.mozilla.net/pub/firefox/candidates/138.0b1-candidates/build1/KEY"; hash = "sha256-FOGtyDxtZpW6AbNdSj0QoK1AYkQYxHPypT8zJr2XYQk="; }; # This file is currently maintained manually, if this Nix expression attempt # to download the wrong version, this is likely to be the problem. # # Open a pull request against https://github.com/mozilla-releng/shipit to # update the version, as done in # https://github.com/mozilla-releng/shipit/pull/1467 firefox_versions = with builtins; fromJSON (readFile (fetchurl "https://product-details.mozilla.org/1.0/firefox_versions.json")); arch = if self.stdenv.system == "i686-linux" then "linux-i686" else "linux-x86_64"; yearOf = with super.lib; yyyymmddhhmmss: head (splitString "-" yyyymmddhhmmss); monthOf = with super.lib; yyyymmddhhmmss: head (tail (splitString "-" yyyymmddhhmmss)); # Given SHA512SUMS file contents and file name, extract matching sha512sum. extractSha512Sum = sha512sums: file: with builtins; # Nix 1.x do not have `builtins.split`. # Nix 2.0 have an bug in `builtins.match` (see https://github.com/NixOS/nix/issues/2147). # So I made separate logic for Nix 1.x and Nix 2.0. if builtins ? split then substring 0 128 (head (super.lib.filter (s: isString s && substring 128 (stringLength s) s == " ${file}") (split "\n" sha512sums))) else head (match ".*[\n]([0-9a-f]*) ${file}.*" sha512sums); # The timestamp argument is a yyyy-mm-dd-hh-mm-ss date, which corresponds to # one specific version. This is used mostly for bisecting. versionInfo = { name, version, release, system ? 
arch, timestamp ? null, info ? null, ... }: with builtins; if (info != null) then info else if release then # For versions such as Beta & Release: # https://download.cdn.mozilla.net/pub/firefox/releases/55.0b3/SHA256SUMS let dir = "https://download.cdn.mozilla.net/pub/firefox/releases/${version}"; # After version 134 firefox switched to using tar.xz instead of tar.bz2 majorVersion = super.lib.strings.toInt ( builtins.elemAt (super.lib.strings.splitString "." version) 0 ); extension = if majorVersion > 134 then "tar.xz" else "tar.bz2"; file = "${system}/en-US/firefox-${version}.${extension}"; sha512Of = chksum: file: extractSha512Sum (readFile (fetchurl chksum)) file; in rec { chksum = "${dir}/SHA512SUMS"; chksumSig = "${chksum}.asc"; chksumSha256 = hashFile "sha256" (fetchurl "${dir}/SHA512SUMS"); chksumSigSha256 = hashFile "sha256" (fetchurl "${chksum}.asc"); inherit file; url = "${dir}/${file}"; sha512 = sha512Of chksum file; sig = null; sigSha512 = null; } else # For Nightly versions: # https://download.cdn.mozilla.net/pub/firefox/nightly/latest-mozilla-central/firefox-56.0a1.en-US.linux-x86_64.checksums let dir = if timestamp == null then let buildhubJSON = with builtins; fromJSON (readFile (fetchurl "https://download.cdn.mozilla.net/pub/firefox/nightly/latest-mozilla-central/firefox-${version}.en-US.${system}.buildhub.json")); in builtins.replaceStrings [ "/${file}" ] [ "" ] buildhubJSON.download.url else "https://download.cdn.mozilla.net/pub/firefox/nightly/${yearOf timestamp}/${monthOf timestamp}/${timestamp}-mozilla-central" ; file = "firefox-${version}.en-US.${system}.tar.xz"; sha512Of = chksum: file: head (match ".*[\n]([0-9a-f]*) sha512 [0-9]* ${file}[\n].*" (readFile (fetchurl chksum))); in rec { chksum = "${dir}/firefox-${version}.en-US.${system}.checksums"; chksumSig = null; # file content: # sha512 62733881 firefox-56.0a1.en-US.linux-x86_64.tar.bz2 # sha256 62733881 firefox-56.0a1.en-US.linux-x86_64.tar.bz2 url = "${dir}/${file}"; sha512 = sha512Of 
chksum file; sig = "${dir}/${file}.asc"; sigSha512 = sha512Of chksum "${file}.asc"; }; # From the version info, check the authenticity of the check sum file, such # that we guarantee that we have verifyFileAuthenticity = { file, sha512, chksum, chksumSig }: assert extractSha512Sum (builtins.readFile chksum) file == sha512; super.runCommand "check-firefox-signature" { buildInputs = [ self.gnupg ]; FILE = chksum; ASC = chksumSig; } '' set -eu gpg --dearmor < ${pgpKey} > keyring.gpg gpgv --keyring=./keyring.gpg $ASC $FILE mkdir $out ''; # From the version info, create a fetchurl derivation which will get the # sources from the remote. fetchVersion = info: if info.verifiedByHand or false then # Set info.verifiedByHand = true; when testing with tarball. super.fetchurl { inherit (info) url sha512; } else if info.chksumSig != null then super.fetchurl { inherit (info) url sha512; # This is a fixed derivation, but we still add as a dependency the # verification of the checksum. Thus, this fetch script can only be # executed once the verifyAuthenticity script finished successfully. postFetch = '' : # Authenticity Check (${verifyFileAuthenticity { inherit (info) file sha512; chksum = builtins.fetchurl { url = info.chksum; sha256 = info.chksumSha256; }; chksumSig = builtins.fetchurl { url = info.chksumSig; sha256 = info.chksumSigSha256; }; }}) ''; } else super.fetchurl { inherit (info) url sha512; # This would download the tarball, and then verify that the content # match the signature file. Fortunately, any failure of this code would # prevent the output from being reused. 
postFetch = let asc = super.fetchurl { url = info.sig; sha512 = info.sigSha512; }; in '' : # Authenticity Check set -eu export PATH="$PATH:${self.gnupg}/bin/" gpg --dearmor < ${pgpKey} > keyring.gpg gpgv --keyring=./keyring.gpg ${asc} $out ''; }; versionWithDefaults = version: { name = "Firefox Twilight"; version = "0.0a1"; channel = "twilight"; wmClass = "firefox-twilight"; release = false; # info attribute set is either null, in which case it is infered by # versionInfo, or it should be an attribute set with either: # # 1. Manual verification of packages: # url = "..."; # sha512 = "..."; # verifiedByHand = true; # # 2. Using a checksum file, which is itself verified using the gpg key. # url = "..."; # file = "..."; # sha512 = "..."; # chksum = "..."; # chksumSha256 = "..."; # chksumSig = "..."; # chksumSigSha256 = "..."; # # 3. Using the gpg key on the archive # url = "..."; # sha512 = "..."; # sig = "..."; # sigSha512 = "..."; } // version; firefoxVersion = version': let version = versionWithDefaults version'; info = versionInfo version; pkg = ((self.firefox-bin-unwrapped.override ({ generated = { version = version.version; sources = { inherit (info) url sha512; }; }; } // super.lib.optionalAttrs (self.firefox-bin-unwrapped.passthru ? applicationName) { applicationName = version.name; })).overrideAttrs (old: { # Add a dependency on the signature check. src = fetchVersion info; })); in super.wrapFirefox pkg ({ pname = "${pkg.binaryName}-bin"; wmClass = version.wmClass; } // super.lib.optionalAttrs (!self.firefox-bin-unwrapped.passthru ? 
applicationName) { desktopName = version.name; }); firefoxVariants = { firefox-nightly-bin = { name = "Firefox Nightly"; channel = "nightly"; wmClass = "firefox-nightly"; version = firefox_versions.FIREFOX_NIGHTLY; release = false; }; firefox-beta-bin = { name = "Firefox Beta"; channel = "beta"; wmClass = "firefox-beta"; version = firefox_versions.LATEST_FIREFOX_DEVEL_VERSION; release = true; }; firefox-bin = { name = "Firefox"; channel = "release"; wmClass = "firefox"; version = firefox_versions.LATEST_FIREFOX_VERSION; release = true; }; firefox-esr-bin = { name = "Firefox ESR"; channel = "release"; wmClass = "firefox"; version = firefox_versions.FIREFOX_ESR; release = true; }; }; in { lib = super.lib // { firefoxOverlay = { inherit pgpKey firefoxVersion versionInfo firefox_versions firefoxVariants; }; }; # Set of packages which are automagically updated. Do not rely on these for # reproducible builds. latest = (super.latest or {}) // (builtins.mapAttrs (n: v: firefoxVersion v) firefoxVariants); # Set of packages which used to build developer environment devEnv = (super.shell or {}) // { gecko = super.callPackage ./pkgs/gecko { inherit (self.python38Packages) setuptools; pythonFull = self.python38Full; nodejs = if builtins.compareVersions self.nodejs.name "nodejs-8.11.3" < 0 then self.nodejs-8_x else self.nodejs; rust-cbindgen = if !(self ? "rust-cbindgen") then self.rust-cbindgen-latest else if builtins.compareVersions self.rust-cbindgen.version self.rust-cbindgen-latest.version < 0 then self.rust-cbindgen-latest else self.rust-cbindgen; # Due to std::ascii::AsciiExt changes in 1.23, Gecko does not compile, so # use the latest Rust version before 1.23. 
# rust = (super.rustChannelOf { channel = "stable"; date = "2017-11-22"; }).rust; # rust = (super.rustChannelOf { channel = "stable"; date = "2020-03-12"; }).rust; inherit (self.latest.rustChannels.stable) rust; }; }; # Use rust-cbindgen imported from Nixpkgs (September 2018) unless the current # version of Nixpkgs already packages a version of rust-cbindgen. rust-cbindgen-latest = super.callPackage ./pkgs/cbindgen { rustPlatform = super.makeRustPlatform { cargo = self.latest.rustChannels.stable.rust; rustc = self.latest.rustChannels.stable.rust; }; }; jsdoc = super.callPackage ./pkgs/jsdoc {}; } ================================================ FILE: flake.nix ================================================ { description = "Mozilla overlay for Nixpkgs"; outputs = { self, ... }: { # Default overlay. overlay = import ./default.nix; # Individual overlays. overlays = { lib = import ./lib-overlay.nix; rust = import ./rust-overlay.nix; firefox = import ./firefox-overlay.nix; git-cinnabar = import ./git-cinnabar-overlay.nix; }; }; } ================================================ FILE: git-cinnabar-overlay.nix ================================================ self: super: { git-cinnabar = super.callPackage ./pkgs/git-cinnabar { # we need urllib to recognize ssh. # python = self.pythonFull; python = self.mercurial.python; }; } ================================================ FILE: lib/parseTOML.nix ================================================ with builtins; # Tokenizer.
let layout_pat = "[ \n]+"; layout_pat_opt = "[ \n]*"; token_pat = ''=|[[][[][a-zA-Z0-9_."*-]+[]][]]|[[][a-zA-Z0-9_."*-]+[]]|[[][^]]+[]]|[a-zA-Z0-9_-]+|"[^"]*"''; #" tokenizer_1_11 = str: let tokenizer_rec = len: prevTokens: patterns: str: let pattern = head patterns; layoutAndTokens = match pattern str; matchLength = stringLength (head layoutAndTokens); tokens = prevTokens ++ tail layoutAndTokens; in if layoutAndTokens == null then # if we cannot reduce the pattern, return the list of token if tail patterns == [] then prevTokens # otherwise, take the next pattern, which only captures half the token. else tokenizer_rec len prevTokens (tail patterns) str else tokenizer_rec len tokens patterns (substring matchLength len str); avgTokenSize = 100; ceilLog2 = v: let inner = n: i: if i < v then inner (n + 1) (i * 2) else n; in inner 1 1; # The builtins.match function match the entire string, and generate a list of all captured # elements. This is the most efficient way to make a tokenizer, if we can make a pattern which # capture all token of the file. Unfortunately C++ std::regex does not support captures in # repeated patterns. As a work-around, we generate patterns which are matching tokens in multiple # of 2, such that we can avoid iterating too many times over the content. generatePatterns = str: let depth = ceilLog2 (stringLength str / avgTokenSize); inner = depth: if depth == 0 then [ "(${token_pat})" ] else let next = inner (depth - 1); in [ "${head next}${layout_pat}${head next}" ] ++ next; in map (pat: "(${layout_pat_opt}${pat}).*" ) (inner depth); in tokenizer_rec (stringLength str) [] (generatePatterns str) str; tokenizer_1_12 = str: let # Nix 1.12 has the builtins.split function which allow to tokenize the # file quickly. by iterating with a simple regexp. 
      layoutTokenList = split "(${token_pat})" str;
      isLayout = s: match layout_pat_opt s != null;
      filterLayout = list:
        filter (s:
          if isString s then
            if isLayout s then false
            else throw "Error: Unexpected token: '${s}'"
          else true
        ) list;
      removeTokenWrapper = list:
        map (x: assert tail x == []; head x) list;
    in
      removeTokenWrapper (filterLayout layoutTokenList);

  tokenizer =
    if builtins ? split
    then tokenizer_1_12
    else tokenizer_1_11;

in

# Parse entry headers
let
  unescapeString = str:
    # Let's ignore any escape character for the moment.
    assert match ''"[^"]*"'' str != null; #"
    substring 1 (stringLength str - 2) str;

  # Match the content of TOML format section names.
  ident_pat = ''[a-zA-Z0-9_-]+|"[^"]*"''; #"

  removeBraces = token: wrapLen:
    substring wrapLen (stringLength token - 2 * wrapLen) token;

  # Note, this implementation is limited to 11 identifiers.
  matchPathFun_1_11 = token:
    let
      # match header_pat "a.b.c" == [ "a" ".b" "b" ".c" "c" ]
      header_pat =
        foldl' (pat: n: "(${ident_pat})([.]${pat})?") "(${ident_pat})" (genList (n: 0) 10);
      matchPath = match header_pat token;
      filterDot = filter (s: substring 0 1 s != ".") matchPath;
    in
      filterDot;

  matchPathFun_1_12 = token:
    map (e: head e)
      (filter (s: isList s)
        (split "(${ident_pat})" token));

  matchPathFun =
    if builtins ? split
    then matchPathFun_1_12
    else matchPathFun_1_11;

  headerToPath = token: wrapLen:
    let
      token' = removeBraces token wrapLen;
      matchPath = matchPathFun token';
      path = map (s:
        if substring 0 1 s != ''"'' then s #"
        else unescapeString s
      ) matchPath;
    in
      assert matchPath != null;
      # assert trace "Path: ${token'}; match as ${toString path}" true;
      path;

in

# Reconstruct the equivalent attribute set.
let
  tokenToValue = token:
    if token == "true" then true
    else if token == "false" then false
    # TODO: convert the TOML list into a Nix list.
    else if match "[[][^]]+[]]" token != null then token
    else unescapeString token;

  parserInitState = {
    idx = 0;
    path = [];
    isList = false;
    output = [];
    elem = {};
  };

  # Imported from nixpkgs library.
  setAttrByPath = attrPath: value:
    if attrPath == [] then value
    else listToAttrs [ {
      name = head attrPath;
      value = setAttrByPath (tail attrPath) value;
    } ];

  closeSection = state:
    state // {
      output = state.output ++ [
        (setAttrByPath state.path (
          if state.isList
          then [ state.elem ]
          else state.elem
        ))
      ];
    };

  readToken = state: token:
    # assert trace "Read '${token}'" true;
    if state.idx == 0 then
      if substring 0 2 token == "[[" then
        (closeSection state) // {
          path = headerToPath token 2;
          isList = true;
          elem = {};
        }
      else if substring 0 1 token == "[" then
        (closeSection state) // {
          path = headerToPath token 1;
          isList = false;
          elem = {};
        }
      else
        assert match "[a-zA-Z0-9_-]+" token != null;
        state // { idx = 1; name = token; }
    else if state.idx == 1 then
      assert token == "=";
      state // { idx = 2; }
    else
      assert state.idx == 2;
      state // {
        idx = 0;
        elem = state.elem // { "${state.name}" = tokenToValue token; };
      };

  # Aggregate each section as an individual attribute set.
  parser = str:
    closeSection (foldl' readToken parserInitState (tokenizer str));

  fromTOML = toml:
    let
      sections = (parser toml).output;
      # Inlined from nixpkgs library functions.
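The header-path matching above (`matchPathFun_1_12` splitting a section name on `ident_pat`, then `headerToPath` unquoting the pieces) can be sketched outside Nix. This is a minimal Python analogue for illustration only, not code from the repository; the function name `header_to_path` is invented here, and escape sequences inside quoted identifiers are ignored, just as in the Nix implementation:

```python
import re

# Analogue of ident_pat in parseTOML.nix: a bare identifier or a
# quoted segment (escape sequences are ignored, as in the Nix code).
IDENT_PAT = re.compile(r'[a-zA-Z0-9_-]+|"[^"]*"')

def header_to_path(token):
    # Split a TOML section name such as 'a.b."c.d"' into its parts,
    # stripping the surrounding quotes from quoted identifiers
    # (the role of unescapeString in the Nix code).
    parts = IDENT_PAT.findall(token)
    return [p[1:-1] if p.startswith('"') else p for p in parts]
```

For example, `header_to_path('dependencies."foo.bar"')` keeps the quoted segment intact as a single path component, which is why the quoted alternative in the pattern must come before any splitting on dots.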
      zipAttrs = sets:
        listToAttrs (map (n: {
          name = n;
          value =
            let v = catAttrs n sets; in
            # assert trace "Visiting ${n}" true;
            if tail v == [] then head v
            else if isList (head v) then concatLists v
            else if isAttrs (head v) then zipAttrs v
            else throw "cannot merge sections";
        }) (concatLists (map attrNames sets)));
    in
      zipAttrs sections;

in

{
  testing = fromTOML (builtins.readFile ./channel-rust-nightly.toml);
  testing_url = fromTOML (builtins.readFile (builtins.fetchurl "https://static.rust-lang.org/dist/channel-rust-nightly.toml"));
  inherit fromTOML;
}

================================================
FILE: lib-overlay.nix
================================================
self: super: {
  lib = super.lib // (import ./pkgs/lib/default.nix { pkgs = self; });
}

================================================
FILE: overlays.nix
================================================
[
  ./lib-overlay.nix
  ./rust-overlay.nix
  ./firefox-overlay.nix
  ./git-cinnabar-overlay.nix
]

================================================
FILE: package-set.nix
================================================
{ pkgs }:

with pkgs.lib;

let
  self = foldl' (prev: overlay: prev // (overlay (pkgs // self) (pkgs // prev))) {}
    (map import (import ./overlays.nix));
in self

================================================
FILE: phlay-overlay.nix
================================================
self: super: {
  phlay = super.callPackage ./pkgs/phlay {};
}

================================================
FILE: pinned.nix
================================================
# This script extends nixpkgs with mozilla packages.
#
# First it imports the <nixpkgs> in the environment and depends on it
# providing fetchFromGitHub and lib.importJSON.
#
# After that it loads a pinned release of nixos-unstable and uses that as the
# base for the rest of packaging. One can pass its own pkgsPath attribute if
# desired, probably in the context of hydra.

{ pkgsPath ? null
, overlays ? []
, system ? null
, geckoSrc ?
  null
}:

# Pin a specific version of Nixpkgs.
let
  _pkgs = import <nixpkgs> {};
  _pkgsPath =
    if pkgsPath != null then pkgsPath
    else _pkgs.fetchFromGitHub (_pkgs.lib.importJSON ./pkgs/nixpkgs.json);
  nixpkgs = import _pkgsPath ({
    overlays = import ./default.nix ++ overlays;
  } // (if system != null then { inherit system; } else {}));
in

nixpkgs // {
  # Do not add a name attribute in an overlay!!! It will cause
  # tons of recompilations.
  name = "nixpkgs";
  updateScript = nixpkgs.lib.updateFromGitHub {
    owner = "NixOS";
    repo = "nixpkgs-channels";
    branch = "nixos-unstable-small";
    path = "pkgs/nixpkgs.json";
  };
}

================================================
FILE: pkgs/cbindgen/default.nix
================================================
### NOTE: This file is a copy of the one from the Nixpkgs repository
### (taken February 2020) from commit 82d9ce45fe0b67e3708ab6ba47ffcb4bba09945d.
### It is used when the version of cbindgen in
### upstream nixpkgs is not up-to-date enough to compile Firefox.
{ stdenv, lib, fetchFromGitHub, rustPlatform
# , Security
}:

rustPlatform.buildRustPackage rec {
  name = "rust-cbindgen-${version}";
  version = "0.14.3";

  src = fetchFromGitHub {
    owner = "eqrion";
    repo = "cbindgen";
    rev = "v${version}";
    sha256 = "0pw55334i10k75qkig8bgcnlsy613zw2p5j4xyz8v71s4vh1a58j";
  };

  cargoSha256 = "0088ijnjhqfvdb1wxy9jc7hq8c0yxgj5brlg68n9vws1mz9rilpy";

  # buildInputs = lib.optional stdenv.isDarwin Security;

  checkFlags = [
    # https://github.com/eqrion/cbindgen/issues/338
    "--skip test_expand"
  ];

  # https://github.com/NixOS/nixpkgs/issues/61618
  postConfigure = ''
    mkdir .cargo
    touch .cargo/.package-cache
    export HOME=`pwd`
  '';

  meta = with lib; {
    description = "A project for generating C bindings from Rust code";
    homepage = "https://github.com/eqrion/cbindgen";
    license = licenses.mpl20;
    maintainers = with maintainers; [ jtojnar andir ];
  };
}

================================================
FILE: pkgs/clang/bug-14435.patch
================================================
diff
-x _inst -x _build -x .svn -ur libcxx.old/include/cstdio libcxx.new/include/cstdio --- libcxx.old/include/cstdio 2016-07-08 12:47:12.964181871 +0000 +++ libcxx.new/include/cstdio 2016-07-08 12:47:27.540149147 +0000 @@ -109,15 +109,15 @@ #endif #ifdef getc -inline _LIBCPP_INLINE_VISIBILITY int __libcpp_getc(FILE* __stream) {return getc(__stream);} +inline __attribute__ ((__always_inline__)) int __libcpp_getc(FILE* __stream) {return getc(__stream);} #undef getc -inline _LIBCPP_INLINE_VISIBILITY int getc(FILE* __stream) {return __libcpp_getc(__stream);} +inline __attribute__ ((__always_inline__)) int getc(FILE* __stream) {return __libcpp_getc(__stream);} #endif // getc #ifdef putc -inline _LIBCPP_INLINE_VISIBILITY int __libcpp_putc(int __c, FILE* __stream) {return putc(__c, __stream);} +inline __attribute__ ((__always_inline__)) int __libcpp_putc(int __c, FILE* __stream) {return putc(__c, __stream);} #undef putc -inline _LIBCPP_INLINE_VISIBILITY int putc(int __c, FILE* __stream) {return __libcpp_putc(__c, __stream);} +inline __attribute__ ((__always_inline__)) int putc(int __c, FILE* __stream) {return __libcpp_putc(__c, __stream);} #endif // putc #ifdef clearerr diff -x _inst -x _build -x .svn -ur libcxx.old/include/utility libcxx.new/include/utility --- libcxx.old/include/utility 2016-07-08 12:46:02.570334913 +0000 +++ libcxx.new/include/utility 2016-07-08 12:51:00.760636878 +0000 @@ -217,7 +217,7 @@ } template -inline _LIBCPP_INLINE_VISIBILITY +inline __attribute__ ((__always_inline__)) void swap(_Tp (&__a)[_Np], _Tp (&__b)[_Np]) _NOEXCEPT_(__is_nothrow_swappable<_Tp>::value) { ================================================ FILE: pkgs/firefox-nightly-bin/update.nix ================================================ { name , writeScript , xidel , coreutils , gnused , gnugrep , curl , jq }: let version = (builtins.parseDrvName name).version; in writeScript "update-firefox-nightly-bin" '' 
PATH=${coreutils}/bin:${gnused}/bin:${gnugrep}/bin:${xidel}/bin:${curl}/bin:${jq}/bin #set -eux pushd pkgs/firefox-nightly-bin tmpfile=`mktemp` url=https://archive.mozilla.org/pub/firefox/nightly/latest-mozilla-central/ nightly_file=`curl $url | \ xidel - --extract //a | \ grep firefox | \ grep linux-x86_64.json | \ tail -1 | \ sed -e 's/.json//'` nightly_json=`curl --silent $url$nightly_file.json` cat > $tmpfile < // Index: gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_unsigned/requirements/typedefs-2.cc =================================================================== --- gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_unsigned/requirements/typedefs-2.cc (revision 194579) +++ gcc-4_7-branch/libstdc++-v3/testsuite/20_util/make_unsigned/requirements/typedefs-2.cc (revision 194580) @@ -1,5 +1,5 @@ // { dg-options "-std=gnu++0x -funsigned-char -fshort-enums" } -// { dg-options "-std=gnu++0x -funsigned-char -fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } } +// { dg-options "-std=gnu++0x -funsigned-char -fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } } // 2007-05-03 Benjamin Kosnik // Index: gcc-4_7-branch/libjava/configure.ac =================================================================== --- gcc-4_7-branch/libjava/configure.ac (revision 194579) +++ gcc-4_7-branch/libjava/configure.ac (revision 194580) @@ -931,7 +931,7 @@ # on Darwin -single_module speeds up loading of the dynamic libraries. extra_ldflags_libjava=-Wl,-single_module ;; -arm*linux*eabi) +arm*-*-linux*eabi*) # Some of the ARM unwinder code is actually in libstdc++. We # could in principle replicate it in libgcj, but it's better to # have a dependency on libstdc++. 
Index: gcc-4_7-branch/libjava/configure =================================================================== --- gcc-4_7-branch/libjava/configure (revision 194579) +++ gcc-4_7-branch/libjava/configure (revision 194580) @@ -20542,7 +20542,7 @@ # on Darwin -single_module speeds up loading of the dynamic libraries. extra_ldflags_libjava=-Wl,-single_module ;; -arm*linux*eabi) +arm*-*-linux*eabi*) # Some of the ARM unwinder code is actually in libstdc++. We # could in principle replicate it in libgcj, but it's better to # have a dependency on libstdc++. Index: gcc-4_7-branch/libgcc/config.host =================================================================== --- gcc-4_7-branch/libgcc/config.host (revision 194579) +++ gcc-4_7-branch/libgcc/config.host (revision 194580) @@ -327,7 +327,7 @@ arm*-*-linux*) # ARM GNU/Linux with ELF tmake_file="${tmake_file} arm/t-arm t-fixedpoint-gnu-prefix" case ${host} in - arm*-*-linux-*eabi) + arm*-*-linux-*eabi*) tmake_file="${tmake_file} arm/t-elf arm/t-bpabi arm/t-linux-eabi t-slibgcc-libgcc" tm_file="$tm_file arm/bpabi-lib.h" unwind_header=config/arm/unwind-arm.h Index: gcc-4_7-branch/gcc/doc/install.texi =================================================================== --- gcc-4_7-branch/gcc/doc/install.texi (revision 194579) +++ gcc-4_7-branch/gcc/doc/install.texi (revision 194580) @@ -3222,7 +3222,7 @@ @heading @anchor{arm-x-eabi}arm-*-eabi ARM-family processors. Subtargets that use the ELF object format require GNU binutils 2.13 or newer. Such subtargets include: -@code{arm-*-netbsdelf}, @code{arm-*-*linux-gnueabi} +@code{arm-*-netbsdelf}, @code{arm-*-*linux-gnueabi*} and @code{arm-*-rtemseabi}. 
@html Index: gcc-4_7-branch/gcc/testsuite/gcc.target/arm/synchronize.c =================================================================== --- gcc-4_7-branch/gcc/testsuite/gcc.target/arm/synchronize.c (revision 194579) +++ gcc-4_7-branch/gcc/testsuite/gcc.target/arm/synchronize.c (revision 194580) @@ -1,4 +1,4 @@ -/* { dg-final { scan-assembler "__sync_synchronize|dmb|mcr" { target arm*-*-linux-*eabi } } } */ +/* { dg-final { scan-assembler "__sync_synchronize|dmb|mcr" { target arm*-*-linux-*eabi* } } } */ void *foo (void) { Index: gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.jason/enum6.C =================================================================== --- gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.jason/enum6.C (revision 194579) +++ gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.jason/enum6.C (revision 194580) @@ -7,10 +7,10 @@ // enum-size attributes should only be emitted if there are values of // enum type that can escape the compilation unit, gcc cannot currently // detect this; if this facility is added then this linker option should -// not be needed. arm-*-linux*eabi should be a good approximation to +// not be needed. arm-*-linux*eabi* should be a good approximation to // those platforms where the EABI supplement defines enum values to be // 32 bits wide. 
-// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } } +// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } } #include Index: gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.other/enum4.C =================================================================== --- gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.other/enum4.C (revision 194579) +++ gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.other/enum4.C (revision 194580) @@ -9,10 +9,10 @@ // enum-size attributes should only be emitted if there are values of // enum type that can escape the compilation unit, gcc cannot currently // detect this; if this facility is added then this linker option should -// not be needed. arm-*-linux*eabi should be a good approximation to +// not be needed. arm-*-linux*eabi* should be a good approximation to // those platforms where the EABI supplement defines enum values to be // 32 bits wide. -// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } } +// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } } enum E { a = -312 Index: gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.law/enum9.C =================================================================== --- gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.law/enum9.C (revision 194579) +++ gcc-4_7-branch/gcc/testsuite/g++.old-deja/g++.law/enum9.C (revision 194580) @@ -7,10 +7,10 @@ // enum-size attributes should only be emitted if there are values of // enum type that can escape the compilation unit, gcc cannot currently // detect this; if this facility is added then this linker option should -// not be needed. arm-*-linux*eabi should be a good approximation to +// not be needed. arm-*-linux*eabi* should be a good approximation to // those platforms where the EABI supplement defines enum values to be // 32 bits wide. 
-// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } } +// { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } } // GROUPS passed enums extern "C" int printf (const char *, ...); Index: gcc-4_7-branch/gcc/testsuite/lib/target-supports.exp =================================================================== --- gcc-4_7-branch/gcc/testsuite/lib/target-supports.exp (revision 194579) +++ gcc-4_7-branch/gcc/testsuite/lib/target-supports.exp (revision 194580) @@ -3818,7 +3818,7 @@ } } "" }] - } elseif { [istarget arm*-*-linux-gnueabi] } { + } elseif { [istarget arm*-*-linux-gnueabi*] } { return [check_runtime sync_longlong_runtime { #include int main () @@ -3860,7 +3860,7 @@ || [istarget i?86-*-*] || [istarget x86_64-*-*] || [istarget alpha*-*-*] - || [istarget arm*-*-linux-gnueabi] + || [istarget arm*-*-linux-gnueabi*] || [istarget bfin*-*linux*] || [istarget hppa*-*linux*] || [istarget s390*-*-*] @@ -3890,7 +3890,7 @@ || [istarget i?86-*-*] || [istarget x86_64-*-*] || [istarget alpha*-*-*] - || [istarget arm*-*-linux-gnueabi] + || [istarget arm*-*-linux-gnueabi*] || [istarget hppa*-*linux*] || [istarget s390*-*-*] || [istarget powerpc*-*-*] Index: gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_9.f90 =================================================================== --- gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_9.f90 (revision 194579) +++ gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_9.f90 (revision 194580) @@ -1,6 +1,6 @@ ! { dg-do run } ! { dg-options "-fshort-enums" } -! { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } } +! { dg-options "-fshort-enums -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } } ! 
Program to test enumerations when option -fshort-enums is given program main Index: gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_10.f90 =================================================================== --- gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_10.f90 (revision 194579) +++ gcc-4_7-branch/gcc/testsuite/gfortran.dg/enum_10.f90 (revision 194580) @@ -1,7 +1,7 @@ ! { dg-do run } ! { dg-additional-sources enum_10.c } ! { dg-options "-fshort-enums -w" } -! { dg-options "-fshort-enums -w -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi } } +! { dg-options "-fshort-enums -w -Wl,--no-enum-size-warning" { target arm*-*-linux*eabi* } } ! Make sure short enums are indeed interoperable with the ! corresponding C type. Index: gcc-4_7-branch/gcc/ada/gcc-interface/Makefile.in =================================================================== --- gcc-4_7-branch/gcc/ada/gcc-interface/Makefile.in (revision 194579) +++ gcc-4_7-branch/gcc/ada/gcc-interface/Makefile.in (revision 194580) @@ -1866,7 +1866,7 @@ LIBRARY_VERSION := $(LIB_VERSION) endif -ifeq ($(strip $(filter-out arm% linux-gnueabi,$(arch) $(osys)-$(word 4,$(targ)))),) +ifeq ($(strip $(filter-out arm%-linux,$(arch)-$(osys)) $(if $(findstring eabi,$(word 4,$(targ))),,$(word 4,$(targ)))),) LIBGNAT_TARGET_PAIRS = \ a-intnam.ads zip != null && unzip != null && zlib != null && boehmgc != null && perl != null; # for `--enable-java-home' assert langAda -> gnatboot != null; assert langVhdl -> gnat != null; # LTO needs libelf and zlib. assert libelf != null -> zlib != null; # Make sure we get GNU sed. assert stdenv.isDarwin -> gnused != null; # The go frontend is written in c++ assert langGo -> langCC; with lib; with builtins; let version = "4.7.3"; # Whether building a cross-compiler for GNU/Hurd. crossGNU = cross != null && cross.config == "i586-pc-gnu"; /* gccinstall.info says that "parallel make is currently not supported since collisions in profile collecting may occur". 
     Parallel make of gfortran is disabled because of an apparent race
     condition concerning the generation of "bconfig.h".  Please try and
     re-enable parallel make for a later release of gfortran to check whether
     the error has been fixed. */
  enableParallelBuilding = !profiledCompiler && !langFortran;

  patches = []
    ++ optional enableParallelBuilding ./parallel-bconfig-4.7.patch
    ++ optionals stdenv.isArm [ ./arm-eabi.patch ]
    ++ optional (cross != null) ./libstdc++-target.patch
    # ++ optional noSysDirs ./no-sys-dirs.patch
    # The GNAT Makefiles did not pay attention to CFLAGS_FOR_TARGET for its
    # target libraries and tools.
    ++ optional langAda ./gnat-cflags.patch
    ++ optional langFortran ./gfortran-driving.patch;

  javaEcj = fetchurl {
    # The `$(top_srcdir)/ecj.jar' file is automatically picked up at
    # `configure' time.
    # XXX: Eventually we might want to take it from upstream.
    url = "ftp://sourceware.org/pub/java/ecj-4.3.jar";
    sha256 = "0jz7hvc0s6iydmhgh5h2m15yza7p2rlss2vkif30vm9y77m97qcx";
  };

  # Antlr (optional) allows the Java `gjdoc' tool to be built.  We want a
  # binary distribution here to allow the whole chain to be bootstrapped.
javaAntlr = fetchurl { url = "http://www.antlr.org/download/antlr-3.1.3.jar"; sha256 = "1f41j0y4kjydl71lqlvr73yagrs2jsg1fjymzjz66mjy7al5lh09"; }; xlibs = [ libX11 libXt libSM libICE libXtst libXrender libXrandr libXi xproto renderproto xextproto inputproto randrproto ]; javaAwtGtk = langJava && gtk != null; /* Platform flags */ platformFlags = let gccArch = lib.attrByPath [ "platform" "gcc" "arch" ] null stdenv; gccCpu = lib.attrByPath [ "platform" "gcc" "cpu" ] null stdenv; gccAbi = lib.attrByPath [ "platform" "gcc" "abi" ] null stdenv; gccFpu = lib.attrByPath [ "platform" "gcc" "fpu" ] null stdenv; gccFloat = lib.attrByPath [ "platform" "gcc" "float" ] null stdenv; gccMode = lib.attrByPath [ "platform" "gcc" "mode" ] null stdenv; withArch = if gccArch != null then " --with-arch=${gccArch}" else ""; withCpu = if gccCpu != null then " --with-cpu=${gccCpu}" else ""; withAbi = if gccAbi != null then " --with-abi=${gccAbi}" else ""; withFpu = if gccFpu != null then " --with-fpu=${gccFpu}" else ""; withFloat = if gccFloat != null then " --with-float=${gccFloat}" else ""; withMode = if gccMode != null then " --with-mode=${gccMode}" else ""; in (withArch + withCpu + withAbi + withFpu + withFloat + withMode); /* Cross-gcc settings */ crossMingw = (cross != null && cross.libc == "msvcrt"); crossConfigureFlags = let gccArch = lib.attrByPath [ "gcc" "arch" ] null cross; gccCpu = lib.attrByPath [ "gcc" "cpu" ] null cross; gccAbi = lib.attrByPath [ "gcc" "abi" ] null cross; gccFpu = lib.attrByPath [ "gcc" "fpu" ] null cross; gccFloat = lib.attrByPath [ "gcc" "float" ] null cross; gccMode = lib.attrByPath [ "gcc" "mode" ] null cross; withArch = if gccArch != null then " --with-arch=${gccArch}" else ""; withCpu = if gccCpu != null then " --with-cpu=${gccCpu}" else ""; withAbi = if gccAbi != null then " --with-abi=${gccAbi}" else ""; withFpu = if gccFpu != null then " --with-fpu=${gccFpu}" else ""; withFloat = if gccFloat != null then " --with-float=${gccFloat}" else ""; withMode 
= if gccMode != null then " --with-mode=${gccMode}" else ""; in "--target=${cross.config}" + withArch + withCpu + withAbi + withFpu + withFloat + withMode + (if crossMingw && crossStageStatic then " --with-headers=${libcCross}/include" + " --with-gcc" + " --with-gnu-as" + " --with-gnu-ld" + " --with-gnu-ld" + " --disable-shared" + " --disable-nls" + " --disable-debug" + " --enable-sjlj-exceptions" + " --enable-threads=win32" + " --disable-win32-registry" else if crossStageStatic then " --disable-libssp --disable-nls" + " --without-headers" + " --disable-threads " + " --disable-libmudflap " + " --disable-libgomp " + " --disable-libquadmath" + " --disable-shared" + " --disable-decimal-float" # libdecnumber requires libc else " --with-headers=${libcCross}/include" + " --enable-__cxa_atexit" + " --enable-long-long" + (if crossMingw then " --enable-threads=win32" + " --enable-sjlj-exceptions" + " --enable-hash-synchronization" + " --disable-libssp" + " --disable-nls" + " --with-dwarf2" + # I think noone uses shared gcc libs in mingw, so we better do the same. # In any case, mingw32 g++ linking is broken by default with shared libs, # unless adding "-lsupc++" to any linking command. I don't know why. " --disable-shared" + (if cross.config == "x86_64-w64-mingw32" then # To keep ABI compatibility with upstream mingw-w64 " --enable-fully-dynamic-string" else "") else (if cross.libc == "uclibc" then # In uclibc cases, libgomp needs an additional '-ldl' # and as I don't know how to pass it, I disable libgomp. " --disable-libgomp" else "") + " --enable-threads=posix" + " --enable-nls" + " --disable-decimal-float") # No final libdecnumber (it may work only in 386) ); stageNameAddon = if crossStageStatic then "-stage-static" else "-stage-final"; crossNameAddon = if cross != null then "-${cross.config}" + stageNameAddon else ""; bootstrap = cross == null && !stdenv.isArm && !stdenv.isMips; in # We need all these X libraries when building AWT with GTK+. 
assert gtk != null -> (filter (x: x == null) xlibs) == []; stdenv.mkDerivation ({ name = "${name}${if stripped then "" else "-debug"}-${version}" + crossNameAddon; builder = ./builder.sh; src = fetchurl { url = "mirror://gnu/gcc/gcc-${version}/gcc-${version}.tar.bz2"; sha256 = "1hx9h64ivarlzi4hxvq42as5m9vlr5cyzaaq4gzj4i619zmkfz1g"; }; inherit patches; postPatch = if (stdenv.isGNU || (libcCross != null # e.g., building `gcc.crossDrv' && libcCross ? crossConfig && libcCross.crossConfig == "i586-pc-gnu") || (crossGNU && libcCross != null)) then # On GNU/Hurd glibc refers to Hurd & Mach headers and libpthread is not # in glibc, so add the right `-I' flags to the default spec string. assert libcCross != null -> libpthreadCross != null; let libc = if libcCross != null then libcCross else stdenv.glibc; gnu_h = "gcc/config/gnu.h"; extraCPPDeps = libc.propagatedBuildInputs ++ lib.optional (libpthreadCross != null) libpthreadCross ++ lib.optional (libpthread != null) libpthread; extraCPPSpec = concatStrings (intersperse " " (map (x: "-I${x}/include") extraCPPDeps)); extraLibSpec = if libpthreadCross != null then "-L${libpthreadCross}/lib ${libpthreadCross.TARGET_LDFLAGS}" else "-L${libpthread}/lib"; in '' echo "augmenting \`CPP_SPEC' in \`${gnu_h}' with \`${extraCPPSpec}'..." sed -i "${gnu_h}" \ -es'|CPP_SPEC *"\(.*\)$|CPP_SPEC "${extraCPPSpec} \1|g' echo "augmenting \`LIB_SPEC' in \`${gnu_h}' with \`${extraLibSpec}'..." sed -i "${gnu_h}" \ -es'|LIB_SPEC *"\(.*\)$|LIB_SPEC "${extraLibSpec} \1|g' echo "setting \`NATIVE_SYSTEM_HEADER_DIR' and \`STANDARD_INCLUDE_DIR' to \`${libc}/include'..." sed -i "${gnu_h}" \ -es'|#define STANDARD_INCLUDE_DIR.*$|#define STANDARD_INCLUDE_DIR "${libc}/include"|g' '' else if cross != null || stdenv.gcc.libc != null then # On NixOS, use the right path to the dynamic linker instead of # `/lib/ld*.so'. 
let libc = if libcCross != null then libcCross else stdenv.gcc.libc; in '' echo "fixing the \`GLIBC_DYNAMIC_LINKER' and \`UCLIBC_DYNAMIC_LINKER' macros..." for header in "gcc/config/"*-gnu.h "gcc/config/"*"/"*.h do grep -q LIBC_DYNAMIC_LINKER "$header" || continue echo " fixing \`$header'..." sed -i "$header" \ -e 's|define[[:blank:]]*\([UCG]\+\)LIBC_DYNAMIC_LINKER\([0-9]*\)[[:blank:]]"\([^\"]\+\)"$|define \1LIBC_DYNAMIC_LINKER\2 "${libc}\3"|g' done '' else null; inherit noSysDirs staticCompiler langJava crossStageStatic libcCross crossMingw; nativeBuildInputs = [ texinfo which gettext ] ++ (optional (perl != null) perl) ++ (optional javaAwtGtk pkgconfig); buildInputs = [ gmp mpfr mpc libelf ] ++ (optional (ppl != null) ppl) ++ (optional (cloog != null) cloog) ++ (optional (zlib != null) zlib) ++ (optionals langJava [ boehmgc zip unzip ]) ++ (optionals javaAwtGtk ([ gtk libart_lgpl ] ++ xlibs)) ++ (optionals (cross != null) [binutilsCross]) ++ (optionals langAda [gnatboot]) ++ (optionals langVhdl [gnat]) # The builder relies on GNU sed (for instance, Darwin's `sed' fails with # "-i may not be used with stdin"), and `stdenvNative' doesn't provide it. ++ (optional stdenv.isDarwin gnused) ; NIX_LDFLAGS = lib.optionalString stdenv.isSunOS "-lm -ldl"; preConfigure = '' configureFlagsArray=( ${lib.optionalString (ppl != null && ppl ? 
dontDisableStatic && ppl.dontDisableStatic) "'--with-host-libstdcxx=-lstdc++ -lgcc_s'"} ${lib.optionalString (ppl != null && stdenv.isSunOS) "\"--with-host-libstdcxx=-Wl,-rpath,\$prefix/lib/amd64 -lstdc++\" \"--with-boot-ldflags=-L../prev-x86_64-pc-solaris2.11/libstdc++-v3/src/.libs\""} ); ${lib.optionalString (stdenv.isSunOS && stdenv.is64bit) '' export NIX_LDFLAGS=`echo $NIX_LDFLAGS | sed -e s~$prefix/lib~$prefix/lib/amd64~g` export LDFLAGS_FOR_TARGET="-Wl,-rpath,$prefix/lib/amd64 $LDFLAGS_FOR_TARGET" export CXXFLAGS_FOR_TARGET="-Wl,-rpath,$prefix/lib/amd64 $CXXFLAGS_FOR_TARGET" export CFLAGS_FOR_TARGET="-Wl,-rpath,$prefix/lib/amd64 $CFLAGS_FOR_TARGET" ''} ''; # 'iant' at #go-nuts@freenode, gccgo maintainer, said that # they have a bug in 4.7.1 if adding "--disable-static" dontDisableStatic = langGo || staticCompiler; configureFlags = " ${if stdenv.isSunOS then " --enable-long-long --enable-libssp --enable-threads=posix --disable-nls --enable-__cxa_atexit " + # On Illumos/Solaris GNU as is preferred " --with-gnu-as --without-gnu-ld " else ""} --enable-lto ${if enableMultilib then "" else "--disable-multilib"} ${if enableShared then "" else "--disable-shared"} ${if enablePlugin then "--enable-plugin" else "--disable-plugin"} ${if ppl != null then "--with-ppl=${ppl} --disable-ppl-version-check" else ""} ${if cloog != null then "--with-cloog=${cloog} --disable-cloog-version-check --enable-cloog-backend=isl" else ""} ${if langJava then "--with-ecj-jar=${javaEcj} " + # Follow Sun's layout for the convenience of IcedTea/OpenJDK. See # . 
"--enable-java-home --with-java-home=\${prefix}/lib/jvm/jre " else ""} ${if javaAwtGtk then "--enable-java-awt=gtk" else ""} ${if langJava && javaAntlr != null then "--with-antlr-jar=${javaAntlr}" else ""} --with-gmp=${gmp} --with-mpfr=${mpfr} --with-mpc=${mpc} ${if libelf != null then "--with-libelf=${libelf}" else ""} --disable-libstdcxx-pch --without-included-gettext --with-system-zlib --enable-languages=${ concatStrings (intersperse "," ( optional langC "c" ++ optional langCC "c++" ++ optional langFortran "fortran" ++ optional langJava "java" ++ optional langAda "ada" ++ optional langVhdl "vhdl" ++ optional langGo "go" ) ) } ${if (stdenv ? glibc && cross == null) then " --with-native-system-header-dir=${stdenv.glibc}/include" else ""} ${if langAda then " --enable-libada" else ""} ${if cross == null && stdenv.isi686 then "--with-arch=i686" else ""} ${if cross != null then crossConfigureFlags else ""} ${if !bootstrap then "--disable-bootstrap" else ""} ${if cross == null then platformFlags else ""} "; targetConfig = if cross != null then cross.config else null; buildFlags = if bootstrap then (if profiledCompiler then "profiledbootstrap" else "bootstrap") else ""; installTargets = if stripped then "install-strip" else "install"; crossAttrs = let xgccArch = lib.attrByPath [ "gcc" "arch" ] null stdenv.cross; xgccCpu = lib.attrByPath [ "gcc" "cpu" ] null stdenv.cross; xgccAbi = lib.attrByPath [ "gcc" "abi" ] null stdenv.cross; xgccFpu = lib.attrByPath [ "gcc" "fpu" ] null stdenv.cross; xgccFloat = lib.attrByPath [ "gcc" "float" ] null stdenv.cross; xwithArch = if xgccArch != null then " --with-arch=${xgccArch}" else ""; xwithCpu = if xgccCpu != null then " --with-cpu=${xgccCpu}" else ""; xwithAbi = if xgccAbi != null then " --with-abi=${xgccAbi}" else ""; xwithFpu = if xgccFpu != null then " --with-fpu=${xgccFpu}" else ""; xwithFloat = if xgccFloat != null then " --with-float=${xgccFloat}" else ""; in { AR = "${stdenv.cross.config}-ar"; LD = 
"${stdenv.cross.config}-ld"; CC = "${stdenv.cross.config}-gcc"; CXX = "${stdenv.cross.config}-gcc"; AR_FOR_TARGET = "${stdenv.cross.config}-ar"; LD_FOR_TARGET = "${stdenv.cross.config}-ld"; CC_FOR_TARGET = "${stdenv.cross.config}-gcc"; NM_FOR_TARGET = "${stdenv.cross.config}-nm"; CXX_FOR_TARGET = "${stdenv.cross.config}-g++"; # If we are making a cross compiler, cross != null NIX_GCC_CROSS = if cross == null then "${stdenv.gccCross}" else ""; dontStrip = true; configureFlags = '' ${if enableMultilib then "" else "--disable-multilib"} ${if enableShared then "" else "--disable-shared"} ${if ppl != null then "--with-ppl=${ppl.crossDrv}" else ""} ${if cloog != null then "--with-cloog=${cloog.crossDrv} --enable-cloog-backend=isl" else ""} ${if langJava then "--with-ecj-jar=${javaEcj.crossDrv}" else ""} ${if javaAwtGtk then "--enable-java-awt=gtk" else ""} ${if langJava && javaAntlr != null then "--with-antlr-jar=${javaAntlr.crossDrv}" else ""} --with-gmp=${gmp.crossDrv} --with-mpfr=${mpfr.crossDrv} --disable-libstdcxx-pch --without-included-gettext --with-system-zlib --enable-languages=${ concatStrings (intersperse "," ( optional langC "c" ++ optional langCC "c++" ++ optional langFortran "fortran" ++ optional langJava "java" ++ optional langAda "ada" ++ optional langVhdl "vhdl" ++ optional langGo "go" ) ) } ${if langAda then " --enable-libada" else ""} --target=${stdenv.cross.config} ${xwithArch} ${xwithCpu} ${xwithAbi} ${xwithFpu} ${xwithFloat} ''; buildFlags = ""; }; # Needed for the cross compilation to work AR = "ar"; LD = "ld"; # http://gcc.gnu.org/install/specific.html#x86-64-x-solaris210 CC = if stdenv.system == "x86_64-solaris" then "gcc -m64" else "gcc"; # Setting $CPATH and $LIBRARY_PATH to make sure both `gcc' and `xgcc' find # the library headers and binaries, regarless of the language being # compiled. 
# Note: When building the Java AWT GTK+ peer, the build system doesn't # honor `--with-gmp' et al., e.g., when building # `libjava/classpath/native/jni/java-math/gnu_java_math_GMP.c', so we just # add them to $CPATH and $LIBRARY_PATH in this case. # # Likewise, the LTO code doesn't find zlib. CPATH = concatStrings (intersperse ":" (map (x: x + "/include") (optionals (zlib != null) [ zlib ] ++ optionals langJava [ boehmgc ] ++ optionals javaAwtGtk xlibs ++ optionals javaAwtGtk [ gmp mpfr ] ++ optional (libpthread != null) libpthread ++ optional (libpthreadCross != null) libpthreadCross # On GNU/Hurd glibc refers to Mach & Hurd # headers. ++ optionals (libcCross != null && hasAttr "propagatedBuildInputs" libcCross) libcCross.propagatedBuildInputs))); LIBRARY_PATH = concatStrings (intersperse ":" (map (x: x + "/lib") (optionals (zlib != null) [ zlib ] ++ optionals langJava [ boehmgc ] ++ optionals javaAwtGtk xlibs ++ optionals javaAwtGtk [ gmp mpfr ] ++ optional (libpthread != null) libpthread))); EXTRA_TARGET_CFLAGS = if cross != null && libcCross != null then "-idirafter ${libcCross}/include" else null; EXTRA_TARGET_LDFLAGS = if cross != null && libcCross != null then "-B${libcCross}/lib -Wl,-L${libcCross}/lib" + (optionalString (libpthreadCross != null) " -L${libpthreadCross}/lib -Wl,${libpthreadCross.TARGET_LDFLAGS}") else null; passthru = { inherit langC langCC langAda langFortran langVhdl langGo enableMultilib version; }; inherit enableParallelBuilding; meta = { homepage = "http://gcc.gnu.org/"; license = "GPLv3+"; # runtime support libraries are typically LGPLv3+ description = "GNU Compiler Collection, version ${version}" + (if stripped then "" else " (with debugging info)"); longDescription = '' The GNU Compiler Collection includes compiler front ends for C, C++, Objective-C, Fortran, OpenMP for C/C++/Fortran, Java, and Ada, as well as libraries for these languages (libstdc++, libgcj, libgomp,...). 
GCC development is a part of the GNU Project, aiming to improve the compiler used in the GNU system including the GNU/Linux variant. ''; maintainers = [ lib.maintainers.ludo lib.maintainers.viric lib.maintainers.shlevy ]; # Volunteers needed for the {Cyg,Dar}win ports of *PPL. # gnatboot is not available outside of linux platforms, so we disable the darwin build # for the gnat (ada compiler). platforms = lib.platforms.linux ++ optionals (langAda == false && libelf == null) [ "i686-darwin" ]; }; } // optionalAttrs (cross != null && cross.libc == "msvcrt" && crossStageStatic) { makeFlags = [ "all-gcc" "all-target-libgcc" ]; installTargets = "install-gcc install-target-libgcc"; } # Strip kills static libs of other archs (hence cross != null) // optionalAttrs (!stripped || cross != null) { dontStrip = true; NIX_STRIP_DEBUG = 0; } ) ================================================ FILE: pkgs/gcc-4.7/gfortran-driving.patch ================================================ This patch fixes interaction with Libtool. See , for details. --- a/gcc/fortran/gfortranspec.c +++ b/gcc/fortran/gfortranspec.c @@ -461,8 +461,15 @@ For more information about these matters, see the file named COPYING\n\n")); { fprintf (stderr, _("Driving:")); for (i = 0; i < g77_newargc; i++) + { + if (g77_new_decoded_options[i].opt_index == OPT_l) + /* Make sure no white space is inserted after `-l'.
*/ + fprintf (stderr, " -l%s", + g77_new_decoded_options[i].canonical_option[1]); + else fprintf (stderr, " %s", g77_new_decoded_options[i].orig_option_with_args_text); + } fprintf (stderr, "\n"); } ================================================ FILE: pkgs/gcc-4.7/gnat-cflags.patch ================================================ diff --git a/libada/Makefile.in b/libada/Makefile.in index f5057a0..337e0c6 100644 --- a/libada/Makefile.in +++ b/libada/Makefile.in @@ -55,7 +55,7 @@ GCC_WARN_CFLAGS = $(LOOSE_WARN) WARN_CFLAGS = @warn_cflags@ TARGET_LIBGCC2_CFLAGS= -GNATLIBCFLAGS= -g -O2 +GNATLIBCFLAGS= -g -O2 $(CFLAGS) GNATLIBCFLAGS_FOR_C = $(GNATLIBCFLAGS) $(TARGET_LIBGCC2_CFLAGS) -fexceptions \ -DIN_RTS @have_getipinfo@ --- a/gcc/ada/gcc-interface/Makefile.in +++ b/gcc/ada/gcc-interface/Makefile.in @@ -105,7 +105,7 @@ ADAFLAGS = -W -Wall -gnatpg -gnata SOME_ADAFLAGS =-gnata FORCE_DEBUG_ADAFLAGS = -g GNATLIBFLAGS = -gnatpg -nostdinc -GNATLIBCFLAGS = -g -O2 +GNATLIBCFLAGS = -g -O2 $(CFLAGS_FOR_TARGET) # Pretend that _Unwind_GetIPInfo is available for the target by default. This # should be autodetected during the configuration of libada and passed down to # here, but we need something for --disable-libada and hope for the best. @@ -193,7 +193,7 @@ RTSDIR = rts$(subst /,_,$(MULTISUBDIR)) # Link flags used to build gnat tools. By default we prefer to statically # link with libgcc to avoid a dependency on shared libgcc (which is tricky # to deal with as it may conflict with the libgcc provided by the system). -GCC_LINK_FLAGS=-static-libgcc +GCC_LINK_FLAGS=-static-libgcc $(CFLAGS_FOR_TARGET) # End of variables for you to override. ================================================ FILE: pkgs/gcc-4.7/java-jvgenmain-link.patch ================================================ The `jvgenmain' executable must be linked against `vec.o', among others, since it uses its vector API. 
--- gcc-4.3.3/gcc/java/Make-lang.in 2008-12-05 00:00:19.000000000 +0100 +++ gcc-4.3.3/gcc/java/Make-lang.in 2009-07-03 16:11:41.000000000 +0200 @@ -109,9 +109,9 @@ jcf-dump$(exeext): $(JCFDUMP_OBJS) $(LIB $(CC) $(ALL_CFLAGS) $(LDFLAGS) -o $@ $(JCFDUMP_OBJS) \ $(CPPLIBS) $(ZLIB) $(LDEXP_LIB) $(LIBS) -jvgenmain$(exeext): $(JVGENMAIN_OBJS) $(LIBDEPS) +jvgenmain$(exeext): $(JVGENMAIN_OBJS) $(LIBDEPS) $(BUILD_RTL) rm -f $@ - $(CC) $(ALL_CFLAGS) $(LDFLAGS) -o $@ $(JVGENMAIN_OBJS) $(LIBS) + $(CC) $(ALL_CFLAGS) $(LDFLAGS) -o $@ $(JVGENMAIN_OBJS) $(BUILD_RTL) $(LIBS) # # Build hooks: ================================================ FILE: pkgs/gcc-4.7/libstdc++-target.patch ================================================ Patch to make the target libraries 'configure' scripts find the proper CPP. I noticed that building the mingw32 cross compiler. Looking at the build script for mingw in archlinux, I think that only nixos needs this patch. I don't know why. diff --git a/Makefile.in b/Makefile.in index 93f66b6..d691917 100644 --- a/Makefile.in +++ b/Makefile.in @@ -266,6 +266,7 @@ BASE_TARGET_EXPORTS = \ AR="$(AR_FOR_TARGET)"; export AR; \ AS="$(COMPILER_AS_FOR_TARGET)"; export AS; \ CC="$(CC_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CC; \ + CPP="$(CC_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS -E"; export CC; \ CFLAGS="$(CFLAGS_FOR_TARGET)"; export CFLAGS; \ CONFIG_SHELL="$(SHELL)"; export CONFIG_SHELL; \ CPPFLAGS="$(CPPFLAGS_FOR_TARGET)"; export CPPFLAGS; \ @@ -291,11 +292,13 @@ BASE_TARGET_EXPORTS = \ RAW_CXX_TARGET_EXPORTS = \ $(BASE_TARGET_EXPORTS) \ CXX_FOR_TARGET="$(RAW_CXX_FOR_TARGET)"; export CXX_FOR_TARGET; \ - CXX="$(RAW_CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CXX; + CXX="$(RAW_CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CXX; \ + CXXCPP="$(RAW_CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS -E"; export CXX; NORMAL_TARGET_EXPORTS = \ $(BASE_TARGET_EXPORTS) \ - CXX="$(CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) 
$$TFLAGS"; export CXX; + CXX="$(CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS"; export CXX; \ + CXXCPP="$(CXX_FOR_TARGET) $(XGCC_FLAGS_FOR_TARGET) $$TFLAGS -E"; export CXX; # Where to find GMP HOST_GMPLIBS = @gmplibs@ ================================================ FILE: pkgs/gcc-4.7/no-sys-dirs.patch ================================================ diff -ru gcc-4.3.1-orig/gcc/cppdefault.c gcc-4.3.1/gcc/cppdefault.c --- gcc-4.3.1-orig/gcc/cppdefault.c 2007-07-26 10:37:01.000000000 +0200 +++ gcc-4.3.1/gcc/cppdefault.c 2008-06-25 17:48:23.000000000 +0200 @@ -41,6 +41,10 @@ # undef CROSS_INCLUDE_DIR #endif +#undef LOCAL_INCLUDE_DIR +#undef SYSTEM_INCLUDE_DIR +#undef STANDARD_INCLUDE_DIR + const struct default_include cpp_include_defaults[] #ifdef INCLUDE_DEFAULTS = INCLUDE_DEFAULTS; diff -ru gcc-4.3.1-orig/gcc/gcc.c gcc-4.3.1/gcc/gcc.c --- gcc-4.3.1-orig/gcc/gcc.c 2008-03-02 23:55:19.000000000 +0100 +++ gcc-4.3.1/gcc/gcc.c 2008-06-25 17:52:53.000000000 +0200 @@ -1478,10 +1478,10 @@ /* Default prefixes to attach to command names. */ #ifndef STANDARD_STARTFILE_PREFIX_1 -#define STANDARD_STARTFILE_PREFIX_1 "/lib/" +#define STANDARD_STARTFILE_PREFIX_1 "" #endif #ifndef STANDARD_STARTFILE_PREFIX_2 -#define STANDARD_STARTFILE_PREFIX_2 "/usr/lib/" +#define STANDARD_STARTFILE_PREFIX_2 "" #endif #ifdef CROSS_DIRECTORY_STRUCTURE /* Don't use these prefixes for a cross compiler. 
*/ --- gcc-4.3.1-orig/gcc/Makefile.in 2008-05-11 20:54:15.000000000 +0200 +++ gcc-4.3.1/gcc/Makefile.in 2008-06-25 17:48:23.000000000 +0200 @@ -3277,7 +3281,7 @@ -DGPLUSPLUS_INCLUDE_DIR=\"$(gcc_gxx_include_dir)\" \ -DGPLUSPLUS_TOOL_INCLUDE_DIR=\"$(gcc_gxx_include_dir)/$(target_noncanonical)\" \ -DGPLUSPLUS_BACKWARD_INCLUDE_DIR=\"$(gcc_gxx_include_dir)/backward\" \ - -DLOCAL_INCLUDE_DIR=\"$(local_includedir)\" \ + -DLOCAL_INCLUDE_DIR=\"/no-such-dir\" \ -DCROSS_INCLUDE_DIR=\"$(CROSS_SYSTEM_HEADER_DIR)\" \ -DTOOL_INCLUDE_DIR=\"$(gcc_tooldir)/include\" \ -DPREFIX=\"$(prefix)/\" \ ================================================ FILE: pkgs/gcc-4.7/parallel-bconfig-4.7.patch ================================================ diff --git a/gcc/Makefile.in b/gcc/Makefile.in index 0f6735a..ba93e9b 100644 --- a/gcc/Makefile.in +++ b/gcc/Makefile.in @@ -3904,21 +3904,21 @@ build/genflags.o : genflags.c $(RTL_BASE_H) $(OBSTACK_H) $(BCONFIG_H) \ $(SYSTEM_H) coretypes.h $(GTM_H) errors.h $(READ_MD_H) gensupport.h build/gengenrtl.o : gengenrtl.c $(BCONFIG_H) $(SYSTEM_H) rtl.def gengtype-lex.o build/gengtype-lex.o : gengtype-lex.c gengtype.h $(SYSTEM_H) -gengtype-lex.o: $(CONFIG_H) +gengtype-lex.o: $(CONFIG_H) $(BCONFIG_H) build/gengtype-lex.o: $(BCONFIG_H) gengtype-parse.o build/gengtype-parse.o : gengtype-parse.c gengtype.h \ $(SYSTEM_H) -gengtype-parse.o: $(CONFIG_H) +gengtype-parse.o: $(CONFIG_H) $(BCONFIG_H) build/gengtype-parse.o: $(BCONFIG_H) gengtype-state.o build/gengtype-state.o: gengtype-state.c $(SYSTEM_H) \ gengtype.h errors.h double-int.h version.h $(HASHTAB_H) $(OBSTACK_H) \ $(XREGEX_H) -gengtype-state.o: $(CONFIG_H) +gengtype-state.o: $(CONFIG_H) $(BCONFIG_H) build/gengtype-state.o: $(BCONFIG_H) gengtype.o build/gengtype.o : gengtype.c $(SYSTEM_H) gengtype.h \ rtl.def insn-notes.def errors.h double-int.h version.h $(HASHTAB_H) \ $(OBSTACK_H) $(XREGEX_H) -gengtype.o: $(CONFIG_H) +gengtype.o: $(CONFIG_H) $(BCONFIG_H) build/gengtype.o: $(BCONFIG_H) build/genmddeps.o: 
genmddeps.c $(BCONFIG_H) $(SYSTEM_H) coretypes.h \ errors.h $(READ_MD_H) ================================================ FILE: pkgs/gecko/default.nix ================================================ { geckoSrc ? null, lib , stdenv, fetchFromGitHub, pythonFull, which, autoconf213, m4 , perl, unzip, zip, gnumake, yasm, pkgconfig, xlibs, gnome2, pango, freetype, fontconfig, cairo , dbus, dbus_glib, alsaLib, libpulseaudio , gtk3, glib, gobjectIntrospection, gdk_pixbuf, atk, gtk2 , git, mercurial, openssl, cmake, procps , libnotify , valgrind, gdb, rr , inotify-tools , setuptools , rust # rust & cargo bundled. (otherwise use pkgs.rust.{rustc,cargo}) , buildFHSUserEnv # Build a FHS environment with all Gecko dependencies. , llvm, llvmPackages, nasm , ccache , zlib, xorg , rust-cbindgen , nodejs , jsdoc , fzf # needed by "mach try fuzzy" }: let inherit (lib) updateFromGitHub importJSON optionals inNixShell; gcc = if stdenv.cc.isGNU then stdenv.cc.cc else stdenv.cc.cc.stdenv.cc.cc; # Gecko sources are huge, we do not want to import them in the nix-store when # we use this expression for making a build environment. src = if inNixShell then null else if geckoSrc == null then fetchFromGitHub (importJSON ./source.json) else geckoSrc; version = "HEAD"; # XXX: builtins.readFile "${src}/browser/config/version.txt"; buildInputs = [ # Expected by "mach" pythonFull setuptools which autoconf213 m4 # Expected by the configure script perl unzip zip gnumake yasm pkgconfig xlibs.libICE xlibs.libSM xlibs.libX11 xlibs.libXau xlibs.libxcb xlibs.libXdmcp xlibs.libXext xlibs.libXt xlibs.libXtst xlibs.libXcomposite xlibs.libXfixes xlibs.libXdamage xlibs.libXrender ] ++ (if xlibs ?
xproto then [ xlibs.damageproto xlibs.printproto xlibs.kbproto xlibs.renderproto xlibs.xextproto xlibs.xproto xlibs.compositeproto xlibs.fixesproto ] else [ xorg.xorgproto ]) ++ [ gnome2.libart_lgpl gnome2.libbonobo gnome2.libbonoboui gnome2.libgnome gnome2.libgnomecanvas gnome2.libgnomeui gnome2.libIDL pango freetype fontconfig cairo dbus dbus_glib alsaLib libpulseaudio gtk3 glib gobjectIntrospection gdk_pixbuf atk gtk2 gnome2.GConf rust # For building bindgen # Building bindgen is now done with the extra options added by genMozConfig # shellHook, do not include clang directly in order to avoid messing up with # the choices of the compilers. # clang llvm # mach mochitest procps # "mach vendor rust" wants to list modified files by using the vcs. git mercurial # needed for compiling cargo-vendor and its dependencies openssl cmake # Useful for getting notification at the end of the build. libnotify # cbindgen is used to generate C bindings for WebRender. rust-cbindgen # nasm is used to build libdav1d. nasm # NodeJS is used for tooling around JS development. nodejs # Used for building documentation. # jsdoc ] ++ optionals inNixShell [ valgrind gdb ccache (if stdenv.isAarch64 then null else rr) fzf # needed by "mach try fuzzy" inotify-tools # Workaround download of prebuilt binaries. ]; # bindgen.configure now has a rule to check that with-libclang-path matches CC # or CXX. Default to the stdenv compiler if we are compiling with clang. clang_path = if stdenv.cc.isGNU then "${llvmPackages.clang}/bin/clang" else "${stdenv.cc}/bin/cc"; libclang_path = if stdenv.cc.isGNU then "${llvmPackages.clang.cc.lib}/lib" else "${stdenv.cc.cc.lib}/lib"; genMozConfig = '' cxxLib=$( echo -n ${gcc}/include/c++/* ) archLib=$cxxLib/$( ${gcc}/bin/gcc -dumpmachine ) cat - > $MOZCONFIG <> $MOZCONFIG " # . 
$src/build/mozconfig.common ac_add_options --enable-application=browser mk_add_options MOZ_OBJDIR=$builddir ac_add_options --prefix=$out ac_add_options --enable-official-branding " ''; AUTOCONF = "${autoconf213}/bin/autoconf"; buildPhase = '' cd $builddir $src/mach build ''; installPhase = '' cd $builddir $src/mach install ''; # TODO: are there tests we would like to run? or should we package them separately? doCheck = false; doInstallCheck = false; # This is for debugging purposes; the hardening wrappers remove # everything needed for debugging. hardeningDisable = [ "all" ]; passthru.updateScript = updateFromGitHub { owner = "mozilla"; repo = "gecko-dev"; branch = "master"; path = "pkgs/gecko/source.json"; }; passthru.fhs = fhs; # gecko.x86_64-linux.gcc.fhs.env } ================================================ FILE: pkgs/gecko/source.json ================================================ { "owner": "mozilla", "repo": "gecko-dev", "rev": "fee636af734a0ce6dc7335691cc94664bafc385d", "sha256": "0nnkqmglbi2znkz1avnyn064i5hngvsqrmhw8ccg6g4ga9bac8fv" } ================================================ FILE: pkgs/git-cinnabar/default.nix ================================================ { stdenv, fetchFromGitHub, autoconf , zlib , python , perl , gettext , git , mercurial , curl }: # NOTE: git-cinnabar depends on a specific version of git-core, thus you should # ensure that you install a git-cinnabar version which matches your git version. # # NOTE: This package only provides the git-cinnabar tools, as git users might want # to have additional commands not provided by this forked version of git-core.
stdenv.mkDerivation rec { version = "0.5.4"; name = "git-cinnabar-${version}"; src = fetchFromGitHub { owner = "glandium"; repo = "git-cinnabar"; inherit name; rev = version; # tag name fetchSubmodules = true; sha256 = "1cjn2cc6mj4m736wxab9s6qx83p5n5ha8cr3x84s9ra6rxs8d7pi"; }; buildInputs = [ autoconf python gettext git curl ]; ZLIB_PATH = zlib; ZLIB_DEV_PATH = zlib.dev; PERL_PATH = "${perl}/bin/perl"; NO_TCLTK = true; V=1; preBuild = '' export ZLIB_PATH; export ZLIB_DEV_PATH; substituteInPlace git-core/Makefile --replace \ '$(ZLIB_PATH)/include' '$(ZLIB_DEV_PATH)/include' # Comment out calls to git to try to verify that git-core is up to date substituteInPlace Makefile \ --replace '$(eval $(call exec,git' '# $(eval $(call exec,git' export PERL_PATH; export NO_TCLTK export V; ''; makeFlags = "prefix=\${out}"; installTargets = "git-install"; postInstall = let mercurial-py = mercurial + "/" + mercurial.python.sitePackages; in '' # git-cinnabar rebuild git, we do not need that. rm -rf $out/bin/* $out/share $out/lib for f in $out/libexec/git-core/{git-remote-hg,git-cinnabar} ; do substituteInPlace $f --replace \ "sys.path.append(os.path.join(os.path.dirname(__file__), 'pythonlib'))" \ "sys.path.extend(['$out/libexec/git-core/pythonlib', '${mercurial-py}'])" mv $f $out/bin done mv $out/libexec/git-core/git-cinnabar-helper $out/bin/git-cinnabar-helper mv $out/libexec/git-core/pythonlib $out/pythonlib rm -rf $out/libexec/git-core/* mv $out/pythonlib $out/libexec/git-core/pythonlib substituteInPlace $out/libexec/git-core/pythonlib/cinnabar/helper.py \ --replace 'Git.config('cinnabar.helper')' "Git.config('cinnabar.helper') or '$out/bin/git-cinnabar-helper'" ''; } ================================================ FILE: pkgs/jsdoc/default.nix ================================================ # This file has been generated by node2nix 1.9.0. Do not edit! {pkgs ? import { inherit system; }, system ? builtins.currentSystem, nodejs ? 
pkgs."nodejs-12_x"}: let nodeEnv = import ./node-env.nix { inherit (pkgs) stdenv lib python2 runCommand writeTextFile; inherit pkgs nodejs; libtool = if pkgs.stdenv.isDarwin then pkgs.darwin.cctools else null; }; in import ./node-packages.nix { inherit (pkgs) fetchurl nix-gitignore stdenv lib fetchgit; inherit nodeEnv; } ================================================ FILE: pkgs/jsdoc/node-env.nix ================================================ # This file originates from node2nix {lib, stdenv, nodejs, python2, pkgs, libtool, runCommand, writeTextFile}: let # Workaround to cope with utillinux in Nixpkgs 20.09 and util-linux in Nixpkgs master utillinux = if pkgs ? utillinux then pkgs.utillinux else pkgs.util-linux; python = if nodejs ? python then nodejs.python else python2; # Create a tar wrapper that filters all the 'Ignoring unknown extended header keyword' noise tarWrapper = runCommand "tarWrapper" {} '' mkdir -p $out/bin cat > $out/bin/tar <> $out/nix-support/hydra-build-products ''; }; includeDependencies = {dependencies}: lib.optionalString (dependencies != []) (lib.concatMapStrings (dependency: '' # Bundle the dependencies of the package mkdir -p node_modules cd node_modules # Only include dependencies if they don't exist. They may also be bundled in the package. if [ ! -e "${dependency.name}" ] then ${composePackage dependency} fi cd .. '' ) dependencies); # Recursively composes the dependencies of a package composePackage = { name, packageName, src, dependencies ? [], ... }@args: builtins.addErrorContext "while evaluating node package '${packageName}'" '' DIR=$(pwd) cd $TMPDIR unpackFile ${src} # Make the base dir in which the target dependency resides first mkdir -p "$(dirname "$DIR/${packageName}")" if [ -f "${src}" ] then # Figure out what directory has been unpacked packageDir="$(find . 
-maxdepth 1 -type d | tail -1)" # Restore write permissions to make building work find "$packageDir" -type d -exec chmod u+x {} \; chmod -R u+w "$packageDir" # Move the extracted tarball into the output folder mv "$packageDir" "$DIR/${packageName}" elif [ -d "${src}" ] then # Get a stripped name (without hash) of the source directory. # On old nixpkgs it's already set internally. if [ -z "$strippedName" ] then strippedName="$(stripHash ${src})" fi # Restore write permissions to make building work chmod -R u+w "$strippedName" # Move the extracted directory into the output folder mv "$strippedName" "$DIR/${packageName}" fi # Unset the stripped name to not confuse the next unpack step unset strippedName # Include the dependencies of the package cd "$DIR/${packageName}" ${includeDependencies { inherit dependencies; }} cd .. ${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."} ''; pinpointDependencies = {dependencies, production}: let pinpointDependenciesFromPackageJSON = writeTextFile { name = "pinpointDependencies.js"; text = '' var fs = require('fs'); var path = require('path'); function resolveDependencyVersion(location, name) { if(location == process.env['NIX_STORE']) { return null; } else { var dependencyPackageJSON = path.join(location, "node_modules", name, "package.json"); if(fs.existsSync(dependencyPackageJSON)) { var dependencyPackageObj = JSON.parse(fs.readFileSync(dependencyPackageJSON)); if(dependencyPackageObj.name == name) { return dependencyPackageObj.version; } } else { return resolveDependencyVersion(path.resolve(location, ".."), name); } } } function replaceDependencies(dependencies) { if(typeof dependencies == "object" && dependencies !== null) { for(var dependency in dependencies) { var resolvedVersion = resolveDependencyVersion(process.cwd(), dependency); if(resolvedVersion === null) { process.stderr.write("WARNING: cannot pinpoint dependency: "+dependency+", context: "+process.cwd()+"\n"); } else { dependencies[dependency] = 
resolvedVersion; } } } } /* Read the package.json configuration */ var packageObj = JSON.parse(fs.readFileSync('./package.json')); /* Pinpoint all dependencies */ replaceDependencies(packageObj.dependencies); if(process.argv[2] == "development") { replaceDependencies(packageObj.devDependencies); } replaceDependencies(packageObj.optionalDependencies); /* Write the fixed package.json file */ fs.writeFileSync("package.json", JSON.stringify(packageObj, null, 2)); ''; }; in '' node ${pinpointDependenciesFromPackageJSON} ${if production then "production" else "development"} ${lib.optionalString (dependencies != []) '' if [ -d node_modules ] then cd node_modules ${lib.concatMapStrings (dependency: pinpointDependenciesOfPackage dependency) dependencies} cd .. fi ''} ''; # Recursively traverses all dependencies of a package and pinpoints all # dependencies in the package.json file to the versions that are actually # being used. pinpointDependenciesOfPackage = { packageName, dependencies ? [], production ? true, ... }@args: '' if [ -d "${packageName}" ] then cd "${packageName}" ${pinpointDependencies { inherit dependencies production; }} cd .. 
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."} fi ''; # Extract the Node.js source code which is used to compile packages with # native bindings nodeSources = runCommand "node-sources" {} '' tar --no-same-owner --no-same-permissions -xf ${nodejs.src} mv node-* $out ''; # Script that adds _integrity fields to all package.json files to prevent NPM from consulting the cache (that is empty) addIntegrityFieldsScript = writeTextFile { name = "addintegrityfields.js"; text = '' var fs = require('fs'); var path = require('path'); function augmentDependencies(baseDir, dependencies) { for(var dependencyName in dependencies) { var dependency = dependencies[dependencyName]; // Open package.json and augment metadata fields var packageJSONDir = path.join(baseDir, "node_modules", dependencyName); var packageJSONPath = path.join(packageJSONDir, "package.json"); if(fs.existsSync(packageJSONPath)) { // Only augment packages that exist. Sometimes we may have production installs in which development dependencies can be ignored console.log("Adding metadata fields to: "+packageJSONPath); var packageObj = JSON.parse(fs.readFileSync(packageJSONPath)); if(dependency.integrity) { packageObj["_integrity"] = dependency.integrity; } else { packageObj["_integrity"] = "sha1-000000000000000000000000000="; // When no _integrity string has been provided (e.g. by Git dependencies), add a dummy one. It does not seem to harm and it bypasses downloads. } if(dependency.resolved) { packageObj["_resolved"] = dependency.resolved; // Adopt the resolved property if one has been provided } else { packageObj["_resolved"] = dependency.version; // Set the resolved version to the version identifier. This prevents NPM from cloning Git repositories. 
} if(dependency.from !== undefined) { // Adopt from property if one has been provided packageObj["_from"] = dependency.from; } fs.writeFileSync(packageJSONPath, JSON.stringify(packageObj, null, 2)); } // Augment transitive dependencies if(dependency.dependencies !== undefined) { augmentDependencies(packageJSONDir, dependency.dependencies); } } } if(fs.existsSync("./package-lock.json")) { var packageLock = JSON.parse(fs.readFileSync("./package-lock.json")); if(![1, 2].includes(packageLock.lockfileVersion)) { process.stderr.write("Sorry, I only understand lock file versions 1 and 2!\n"); process.exit(1); } if(packageLock.dependencies !== undefined) { augmentDependencies(".", packageLock.dependencies); } } ''; }; # Reconstructs a package-lock file from the node_modules/ folder structure and package.json files with dummy sha1 hashes reconstructPackageLock = writeTextFile { name = "addintegrityfields.js"; text = '' var fs = require('fs'); var path = require('path'); var packageObj = JSON.parse(fs.readFileSync("package.json")); var lockObj = { name: packageObj.name, version: packageObj.version, lockfileVersion: 1, requires: true, dependencies: {} }; function augmentPackageJSON(filePath, dependencies) { var packageJSON = path.join(filePath, "package.json"); if(fs.existsSync(packageJSON)) { var packageObj = JSON.parse(fs.readFileSync(packageJSON)); dependencies[packageObj.name] = { version: packageObj.version, integrity: "sha1-000000000000000000000000000=", dependencies: {} }; processDependencies(path.join(filePath, "node_modules"), dependencies[packageObj.name].dependencies); } } function processDependencies(dir, dependencies) { if(fs.existsSync(dir)) { var files = fs.readdirSync(dir); files.forEach(function(entry) { var filePath = path.join(dir, entry); var stats = fs.statSync(filePath); if(stats.isDirectory()) { if(entry.substr(0, 1) == "@") { // When we encounter a namespace folder, augment all packages belonging to the scope var pkgFiles = fs.readdirSync(filePath); 
pkgFiles.forEach(function(entry) { if(stats.isDirectory()) { var pkgFilePath = path.join(filePath, entry); augmentPackageJSON(pkgFilePath, dependencies); } }); } else { augmentPackageJSON(filePath, dependencies); } } }); } } processDependencies("node_modules", lockObj.dependencies); fs.writeFileSync("package-lock.json", JSON.stringify(lockObj, null, 2)); ''; }; prepareAndInvokeNPM = {packageName, bypassCache, reconstructLock, npmFlags, production}: let forceOfflineFlag = if bypassCache then "--offline" else "--registry http://www.example.com"; in '' # Pinpoint the versions of all dependencies to the ones that are actually being used echo "pinpointing versions of dependencies..." source $pinpointDependenciesScriptPath # Patch the shebangs of the bundled modules to prevent them from # calling executables outside the Nix store as much as possible patchShebangs . # Deploy the Node.js package by running npm install. Since the # dependencies have been provided already by ourselves, it should not # attempt to install them again, which is good, because we want to make # it Nix's responsibility. If it needs to install any dependencies # anyway (e.g. because the dependency parameters are # incomplete/incorrect), it fails. # # The other responsibilities of NPM are kept -- version checks, build # steps, postprocessing etc. export HOME=$TMPDIR cd "${packageName}" runHook preRebuild ${lib.optionalString bypassCache '' ${lib.optionalString reconstructLock '' if [ -f package-lock.json ] then echo "WARNING: Reconstruct lock option enabled, but a lock file already exists!" echo "This will most likely result in version mismatches! We will remove the lock file and regenerate it!" rm package-lock.json else echo "No package-lock.json file found, reconstructing..." 
fi node ${reconstructPackageLock} ''} node ${addIntegrityFieldsScript} ''} npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${lib.optionalString production "--production"} rebuild if [ "''${dontNpmInstall-}" != "1" ] then # NPM tries to download packages even when they already exist if npm-shrinkwrap is used. rm -f npm-shrinkwrap.json npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${lib.optionalString production "--production"} install fi ''; # Builds and composes an NPM package including all its dependencies buildNodePackage = { name , packageName , version , dependencies ? [] , buildInputs ? [] , production ? true , npmFlags ? "" , dontNpmInstall ? false , bypassCache ? false , reconstructLock ? false , preRebuild ? "" , dontStrip ? true , unpackPhase ? "true" , buildPhase ? "true" , ... }@args: let extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" "dontStrip" "dontNpmInstall" "preRebuild" "unpackPhase" "buildPhase" ]; in stdenv.mkDerivation ({ name = "node_${name}-${version}"; buildInputs = [ tarWrapper python nodejs ] ++ lib.optional (stdenv.isLinux) utillinux ++ lib.optional (stdenv.isDarwin) libtool ++ buildInputs; inherit nodejs; inherit dontStrip; # Stripping may fail a build for some package deployments inherit dontNpmInstall preRebuild unpackPhase buildPhase; compositionScript = composePackage args; pinpointDependenciesScript = pinpointDependenciesOfPackage args; passAsFile = [ "compositionScript" "pinpointDependenciesScript" ]; installPhase = '' # Create and enter a root node_modules/ folder mkdir -p $out/lib/node_modules cd $out/lib/node_modules # Compose the package and all its dependencies source $compositionScriptPath ${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }} # Create symlink to the deployed executable folder, if applicable if [ -d "$out/lib/node_modules/.bin" ] then ln -s $out/lib/node_modules/.bin $out/bin fi # Create symlinks to the deployed manual 
page folders, if applicable if [ -d "$out/lib/node_modules/${packageName}/man" ] then mkdir -p $out/share for dir in "$out/lib/node_modules/${packageName}/man/"* do mkdir -p $out/share/man/$(basename "$dir") for page in "$dir"/* do ln -s $page $out/share/man/$(basename "$dir") done done fi # Run post install hook, if provided runHook postInstall ''; } // extraArgs); # Builds a node environment (a node_modules folder and a set of binaries) buildNodeDependencies = { name , packageName , version , src , dependencies ? [] , buildInputs ? [] , production ? true , npmFlags ? "" , dontNpmInstall ? false , bypassCache ? false , reconstructLock ? false , dontStrip ? true , unpackPhase ? "true" , buildPhase ? "true" , ... }@args: let extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" ]; in stdenv.mkDerivation ({ name = "node-dependencies-${name}-${version}"; buildInputs = [ tarWrapper python nodejs ] ++ lib.optional (stdenv.isLinux) utillinux ++ lib.optional (stdenv.isDarwin) libtool ++ buildInputs; inherit dontStrip; # Stripping may fail a build for some package deployments inherit dontNpmInstall unpackPhase buildPhase; includeScript = includeDependencies { inherit dependencies; }; pinpointDependenciesScript = pinpointDependenciesOfPackage args; passAsFile = [ "includeScript" "pinpointDependenciesScript" ]; installPhase = '' mkdir -p $out/${packageName} cd $out/${packageName} source $includeScriptPath # Create fake package.json to make the npm commands work properly cp ${src}/package.json . chmod 644 package.json ${lib.optionalString bypassCache '' if [ -f ${src}/package-lock.json ] then cp ${src}/package-lock.json . fi ''} # Go to the parent folder to make sure that all packages are pinpointed cd .. ${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."} ${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }} # Expose the executables that were installed cd .. 
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."} mv ${packageName} lib ln -s $out/lib/node_modules/.bin $out/bin ''; } // extraArgs); # Builds a development shell buildNodeShell = { name , packageName , version , src , dependencies ? [] , buildInputs ? [] , production ? true , npmFlags ? "" , dontNpmInstall ? false , bypassCache ? false , reconstructLock ? false , dontStrip ? true , unpackPhase ? "true" , buildPhase ? "true" , ... }@args: let nodeDependencies = buildNodeDependencies args; in stdenv.mkDerivation { name = "node-shell-${name}-${version}"; buildInputs = [ python nodejs ] ++ lib.optional (stdenv.isLinux) utillinux ++ buildInputs; buildCommand = '' mkdir -p $out/bin cat > $out/bin/shell <&1 | \ tail -1 } echo "=== ${owner}/${repo}@${branch} ===" echo -n "Looking up latest revision ... " rev=$(github_rev "${owner}" "${repo}" "${branch}"); echo "revision is \`$rev\`." sha256=$(github_sha256 "${owner}" "${repo}" "$rev"); echo "sha256 is \`$sha256\`." if [ "$sha256" == "" ]; then echo "sha256 is not valid!" exit 2 fi source_file=${path} echo "Content of source file (``$source_file``) written." cat < , supportedSystems ? [ "x86_64-linux" "i686-linux" /* "x86_64-darwin" */ "aarch64-linux" ] }: let lib = (import nixpkgsSrc {}).lib; # Make an attribute set for each system, the builder is then specialized to # use the selected system. forEachSystem = systems: builder /* system -> stdenv -> pkgs */: lib.genAttrs systems builder; # Make an attribute set for each compiler, the builder is then specialized # to use the selected compiler. forEachCompiler = compilers: builder: system: builtins.listToAttrs (map (compiler: { name = compiler; value = builder compiler system; }) compilers); # Override the previous derivation with a different stdenv. builder = path: compiler: system: lib.getAttrFromPath path (import nixpkgsSrc { inherit system; overlays = [ # Add all packages from nixpkgs-mozilla.
(import ./default.nix) # Define customStdenvs, which is a set of various compilers that can be # used to compile the given package. (import ./compilers-overlay.nix) # Use the following overlay to override the requested package from # nixpkgs, with a custom stdenv taken from the compilers-overlay. (self: super: if compiler == null then {} else lib.setAttrByPath path ((lib.getAttrFromPath path super).override { stdenv = self.customStdenvs."${compiler}"; })) ]; }); build = path: { systems ? supportedSystems, compilers ? null }: forEachSystem systems ( if compilers == null then builder path null else forEachCompiler compilers (builder path) ); geckoCompilers = [ "clang" "clang36" "clang37" "clang38" "clang5" "clang6" "clang7" "clang12" "clang13" "gcc" "gcc6" "gcc5" "gcc49" "gcc48" #"gcc474" #"gcc473" #"gcc472" ]; jobs = { # For each system, and each compiler, create an attribute with the name of # the system and compiler. Use this attribute name to select which # environment you are interested in for building firefox. These can be # built using the following commands: # # $ nix-build release.nix -A gecko.x86_64-linux.clang -o firefox-x64 # $ nix-build release.nix -A gecko.i686-linux.gcc48 -o firefox-x86 # # If you are only interested in getting a build environment, then use the # nix-shell command instead, which skips the copy of the Firefox sources # and pulls the dependencies needed for building firefox with this # environment. # # $ nix-shell release.nix -A gecko.i686-linux.gcc --pure --command '$CC --version' # $ nix-shell release.nix -A gecko.x86_64-linux.clang --pure # # As some of the test scripts of Gecko check against absolute paths, a # fake FHS is provided for Gecko. It can be accessed by appending # ".fhs.env" to the previous commands: # # $ nix-shell release.nix -A gecko.x86_64-linux.gcc.fhs.env # # This spawns a new shell where the closure of everything used to build # Gecko is part of the fake root.
gecko = build [ "devEnv" "gecko" ] { compilers = geckoCompilers; }; latest = { "firefox-nightly-bin" = build [ "latest" "firefox-nightly-bin" ]; }; git-cinnabar = build [ "git-cinnabar" ]; }; in jobs ================================================ FILE: rust-overlay-install.sh ================================================ #!/bin/sh -e cd "$(dirname "$0")" || exit overlay_dir=$HOME/.config/nixpkgs/overlays name=rust-overlay.nix echo Installing $name as an overlay set -x mkdir -p "$overlay_dir" ln -s "$PWD/$name" "$overlay_dir/$name" ================================================ FILE: rust-overlay.nix ================================================ # This file provides a Rust overlay, which offers pre-packaged bleeding-edge versions of rustc # and cargo. self: super: let fromTOML = # nix 2.1 added the fromTOML builtin if builtins ? fromTOML then builtins.fromTOML else (import ./lib/parseTOML.nix).fromTOML; parseRustToolchain = file: with builtins; if file == null then { } # Parse *.toml files as TOML else if self.lib.strings.hasSuffix ".toml" file then ({ channel ? null, date ? null, ... }: { inherit channel date; }) (fromTOML (readFile file)).toolchain else # Otherwise, assume the file contains just a rust version string let str = readFile file; # Match toolchain descriptions of type "nightly" or "nightly-2020-01-01" channel_by_name = match "([a-z]+)(-([0-9]{4}-[0-9]{2}-[0-9]{2}))?.*" str; # Match toolchain descriptions of type "1.34.0" or "1.34.0-2019-04-10" channel_by_version = match "([0-9]+\\.[0-9]+\\.[0-9]+)(-([0-9]{4}-[0-9]{2}-[0-9]{2}))?.*" str; in (x: { channel = head x; date = (head (tail (tail x))); }) ( if channel_by_name != null then channel_by_name else channel_by_version ); # In NixOS 24.11, `pkgs.rust.toRustTarget` became deprecated in favor of the # `.rust.rustcTarget` attribute of the platform. This function provides backwards compatibility in # case the caller is using a nixpkgs older than NixOS 24.11.
toRustTargetCompat = platform: if platform ? rust && platform.rust ? rustcTarget then platform.rust.rustcTarget else super.rust.toRustTarget platform; # See https://github.com/rust-lang-nursery/rustup.rs/blob/master/src/dist/src/dist.rs defaultDistRoot = "https://static.rust-lang.org"; manifest_v1_url = { dist_root ? defaultDistRoot + "/dist", date ? null, staging ? false, # A channel can be "nightly", "beta", "stable", or "\d{1}\.\d{1,3}\.\d{1,2}". channel ? "nightly", # A path that points to a rust-toolchain file, typically ./rust-toolchain. rustToolchain ? null, ... }: let args = { inherit channel date; } // parseRustToolchain rustToolchain; in let inherit (args) date channel; in if date == null && staging == false then "${dist_root}/channel-rust-${channel}" else if date != null && staging == false then "${dist_root}/${date}/channel-rust-${channel}" else if date == null && staging == true then "${dist_root}/staging/channel-rust-${channel}" else throw "not a real-world case"; manifest_v2_url = args: (manifest_v1_url args) + ".toml"; getComponentsWithFixedPlatform = pkgs: pkgname: stdenv: let pkg = pkgs.${pkgname}; srcInfo = pkg.target.${toRustTargetCompat stdenv.targetPlatform} or pkg.target."*"; components = srcInfo.components or []; componentNamesList = builtins.map (pkg: pkg.pkg) (builtins.filter (pkg: (pkg.target != "*")) components); in componentNamesList; getExtensions = pkgs: pkgname: stdenv: let inherit (super.lib) unique; pkg = pkgs.${pkgname}; srcInfo = pkg.target.${toRustTargetCompat stdenv.targetPlatform} or pkg.target."*"; extensions = srcInfo.extensions or []; extensionNamesList = unique (builtins.map (pkg: pkg.pkg) extensions); in extensionNamesList; hasTarget = pkgs: pkgname: target: pkgs ? 
${pkgname}.target.${target}; getTuples = pkgs: name: targets: builtins.map (target: { inherit name target; }) (builtins.filter (target: hasTarget pkgs name target) targets); # In the manifest, a package might have different components which are bundled with it, as opposed to the extensions which can be added. # By default, a package includes the components for the same architecture, and offers them as extensions for other architectures. # # This function returns a list of { name, target } attribute sets, which includes the current system package, and all its components for the selected targets. # The list contains the package for the pkgTargets as well as the packages for components for all compTargets. getTargetPkgTuples = pkgs: pkgname: pkgTargets: compTargets: stdenv: let inherit (builtins) elem; inherit (super.lib) intersectLists; components = getComponentsWithFixedPlatform pkgs pkgname stdenv; extensions = getExtensions pkgs pkgname stdenv; compExtIntersect = intersectLists components extensions; tuples = (getTuples pkgs pkgname pkgTargets) ++ (builtins.map (name: getTuples pkgs name compTargets) compExtIntersect); in tuples; getFetchUrl = pkgs: pkgname: target: stdenv: fetchurl: let pkg = pkgs.${pkgname}; srcInfo = pkg.target.${target}; in (super.fetchurl { url = srcInfo.xz_url or srcInfo.url; sha256 = srcInfo.xz_hash or srcInfo.hash; }); checkMissingExtensions = pkgs: pkgname: stdenv: extensions: let inherit (builtins) head; inherit (super.lib) concatStringsSep subtractLists; availableExtensions = getExtensions pkgs pkgname stdenv; missingExtensions = subtractLists availableExtensions extensions; extensionsToInstall = if missingExtensions == [] then extensions else throw '' While compiling ${pkgname}: the extension ${head missingExtensions} is not available.
Select extensions from the following list: ${concatStringsSep "\n" availableExtensions}''; in extensionsToInstall; getComponents = pkgs: pkgname: targets: extensions: targetExtensions: stdenv: fetchurl: let inherit (builtins) head map; inherit (super.lib) flatten remove subtractLists unique; targetExtensionsToInstall = checkMissingExtensions pkgs pkgname stdenv targetExtensions; extensionsToInstall = checkMissingExtensions pkgs pkgname stdenv extensions; hostTargets = [ "*" (toRustTargetCompat stdenv.hostPlatform) (toRustTargetCompat stdenv.targetPlatform) ]; pkgTuples = flatten (getTargetPkgTuples pkgs pkgname hostTargets targets stdenv); extensionTuples = flatten (map (name: getTargetPkgTuples pkgs name hostTargets targets stdenv) extensionsToInstall); targetExtensionTuples = flatten (map (name: getTargetPkgTuples pkgs name targets targets stdenv) targetExtensionsToInstall); pkgsTuples = pkgTuples ++ extensionTuples ++ targetExtensionTuples; missingTargets = subtractLists (map (tuple: tuple.target) pkgsTuples) (remove "*" targets); pkgsTuplesToInstall = if missingTargets == [] then pkgsTuples else throw '' While compiling ${pkgname}: the target ${head missingTargets} is not available for any package.''; in map (tuple: { name = tuple.name; src = (getFetchUrl pkgs tuple.name tuple.target stdenv fetchurl); }) pkgsTuplesToInstall; installComponents = stdenv: namesAndSrcs: let inherit (builtins) map; installComponent = name: src: stdenv.mkDerivation { inherit name; inherit src; # No point copying src to a build server, then copying back the # entire unpacked contents after just a little twiddling. preferLocalBuild = true; # (@nbp) TODO: Check on Windows and Mac. # This code is inspired by patchelf/setup-hook.sh to iterate over all binaries. 
installPhase = '' patchShebangs install.sh CFG_DISABLE_LDCONFIG=1 ./install.sh --prefix=$out --verbose setInterpreter() { local dir="$1" [ -e "$dir" ] || return 0 echo "Patching interpreter of ELF executables and libraries in $dir" local i while IFS= read -r -d ''$'\0' i; do if [[ "$i" =~ .build-id ]]; then continue; fi if ! isELF "$i"; then continue; fi echo "setting interpreter of $i" if [[ -x "$i" ]]; then # Handle executables patchelf \ --set-interpreter "$(cat $NIX_CC/nix-support/dynamic-linker)" \ --set-rpath "${super.lib.makeLibraryPath [ self.zlib ]}:$out/lib" \ "$i" || true else # Handle libraries patchelf \ --set-rpath "${super.lib.makeLibraryPath [ self.zlib ]}:$out/lib" \ "$i" || true fi done < <(find "$dir" -type f -print0) } setInterpreter $out ''; postFixup = '' # Function moves well-known files from etc/ handleEtc() { local oldIFS="$IFS" # Directories we are aware of, given as substitution lists for paths in \ "etc/bash_completion.d","share/bash_completion/completions","etc/bash_completions.d","share/bash_completions/completions"; do # Some directories may be missing in some versions. If so we just skip them. # See https://github.com/mozilla/nixpkgs-mozilla/issues/48 for more information. if [ !
-e $paths ]; then continue; fi IFS="," set -- $paths IFS="$oldIFS" local orig_path="$1" local wanted_path="$2" # Rename the files if [ -d ./"$orig_path" ]; then mkdir -p "$(dirname ./"$wanted_path")" fi mv -v ./"$orig_path" ./"$wanted_path" # Fail explicitly if etc is not empty so we can add it to the list and/or report it upstream rmdir ./etc || { echo Installer tries to install to /etc: find ./etc exit 1 } done } if [ -d "$out"/etc ]; then pushd "$out" handleEtc popd fi ''; dontStrip = true; }; in map (nameAndSrc: (installComponent nameAndSrc.name nameAndSrc.src)) namesAndSrcs; # Manifest files are organized as follows: # { date = "2017-03-03"; # pkg.cargo.version = "0.18.0-nightly (5db6d64 2017-03-03)"; # pkg.cargo.target.x86_64-unknown-linux-gnu = { # available = true; # hash = "abce..."; # sha256 # url = "https://static.rust-lang.org/dist/....tar.gz"; # xz_hash = "abce..."; # sha256 # xz_url = "https://static.rust-lang.org/dist/....tar.xz"; # }; # } # # The packages available usually are: # cargo, rust-analysis, rust-docs, rust-src, rust-std, rustc, and # rust, which aggregates them in one package. # # For each package the following options are available: # extensions - The extensions that should be installed for the package. # For example, install the package rust and add the extension rust-src. # targets - The package will always be installed for the host system, but with this option # extra targets can be specified, e.g. "mips-unknown-linux-musl". The target # will only apply to components of the package that support being installed for # a different architecture. For example, the rust package will install rust-std # for the host system and the targets. # targetExtensions - If you want to force extensions to be installed for the given targets, this is your option. # All extensions in this list will be installed for the target architectures.
# *Attention* If you want to install an extension like rust-src, which has no fixed architecture (arch *), # you will need to specify this extension in the extensions option or it will not be installed! fromManifestFile = manifest: { stdenv, lib, fetchurl, patchelf }: let inherit (builtins) elemAt; inherit (super) makeOverridable; inherit (super.lib) flip mapAttrs; pkgs = fromTOML (builtins.readFile manifest); in flip mapAttrs pkgs.pkg (name: pkg: makeOverridable ({extensions, targets, targetExtensions}: let version' = builtins.match "([^ ]*) [(]([^ ]*) ([^ ]*)[)]" pkg.version; version = "${elemAt version' 0}-${elemAt version' 2}-${elemAt version' 1}"; namesAndSrcs = getComponents pkgs.pkg name targets extensions targetExtensions stdenv fetchurl; components = installComponents stdenv namesAndSrcs; componentsOuts = builtins.map (comp: (super.lib.strings.escapeNixString (super.lib.getOutput "out" comp))) components; in super.pkgs.symlinkJoin { name = name + "-" + version; paths = components; postBuild = '' # If rustc or rustdoc is in the derivation, we need to copy their # executables into the final derivation. This is required # for making them find the correct SYSROOT. # Similarly, we copy the python files for gdb pretty-printers since # its auto-load-safe-path mechanism doesn't like symlinked files. for target in $out/bin/{rustc,rustdoc} $out/lib/rustlib/etc/*.py; do if [ -e $target ]; then cp --remove-destination "$(realpath -e $target)" $target # The SYSROOT is determined by using the librustc_driver-*.so. # So, we need to point to the *.so files in our derivation. chmod u+w $target patchelf --set-rpath "$out/lib" $target || true fi done # Here we copy the librustc_driver-*.so to our derivation. # The SYSROOT is determined based on the path of this library.
if test "" != $out/lib/librustc_driver-*.so &> /dev/null; then RUSTC_DRIVER_PATH=$(realpath -e $out/lib/librustc_driver-*.so) rm $out/lib/librustc_driver-*.so cp $RUSTC_DRIVER_PATH $out/lib/ fi ''; # Export the manifest file as part of the nix-support files such # that one can compute the sha256 of a manifest to freeze it for # reproducible builds. MANIFEST_FILE = manifest; postInstall = '' mkdir $out/nix-support cp $MANIFEST_FILE $out/nix-support/manifest.toml ''; # Add the compiler as part of the propagated build inputs in order # to run: # # $ nix-shell -p rustChannels.stable.rust # # And get a fully working Rust compiler, with the stdenv linker. propagatedBuildInputs = [ stdenv.cc ]; meta.platforms = lib.platforms.all; } ) { extensions = []; targets = []; targetExtensions = []; } ); fromManifest = sha256: manifest: { stdenv, lib, fetchurl, patchelf }: let manifestFile = if sha256 == null then builtins.fetchurl manifest else fetchurl { url = manifest; inherit sha256; }; in fromManifestFile manifestFile { inherit stdenv lib fetchurl patchelf; }; in rec { lib = super.lib // { inherit fromTOML; rustLib = { inherit fromManifest fromManifestFile manifest_v2_url; }; }; rustChannelOf = { sha256 ? null, ... } @ manifest_args: fromManifest sha256 (manifest_v2_url manifest_args) { inherit (super) lib; inherit (self) stdenv fetchurl patchelf; } ; # Set of packages which are automagically updated. Do not rely on these for # reproducible builds. latest = (super.latest or {}) // { rustChannels = { nightly = rustChannelOf { channel = "nightly"; }; beta = rustChannelOf { channel = "beta"; }; stable = rustChannelOf { channel = "stable"; }; }; }; # Helper builder rustChannelOfTargets = channel: date: targets: (rustChannelOf { inherit channel date; }) .rust.override { inherit targets; }; # For backward compatibility rustChannels = latest.rustChannels; # For each channel: # latest.rustChannels.nightly.cargo # latest.rustChannels.nightly.rust # Aggregate all others. 
(recommended) # latest.rustChannels.nightly.rustc # latest.rustChannels.nightly.rust-analysis # latest.rustChannels.nightly.rust-docs # latest.rustChannels.nightly.rust-src # latest.rustChannels.nightly.rust-std # For a specific date: # (rustChannelOf { date = "2017-06-06"; channel = "beta"; }).rust } ================================================ FILE: rust-src-overlay.nix ================================================ # Overlay that builds on top of rust-overlay.nix. # Adds the rust-src component to all channels, which is helpful for racer, intellij, ... self: super: let mapAttrs = super.lib.mapAttrs; flip = super.lib.flip; in { # install stable rust with rust-src: # "nix-env -i -A nixos.latest.rustChannels.stable.rust" latest.rustChannels = flip mapAttrs super.latest.rustChannels (name: value: value // { rust = value.rust.override { extensions = ["rust-src"]; }; }); } ================================================ FILE: update.nix ================================================ let _pkgs = import <nixpkgs> {}; _nixpkgs = _pkgs.fetchFromGitHub (_pkgs.lib.importJSON ./pkgs/nixpkgs.json); in { pkgs ? import _nixpkgs {} , package ? null , maintainer ? null , dont_prompt ? false }: # TODO: add assert statements let pkgs-mozilla = import ./default.nix { inherit pkgs; }; dont_prompt_str = if dont_prompt then "yes" else "no"; packagesWith = cond: return: set: pkgs.lib.flatten (pkgs.lib.mapAttrsToList (name: pkg: let result = builtins.tryEval ( if pkgs.lib.isDerivation pkg && cond name pkg then [(return name pkg)] else if pkg.recurseForDerivations or false || pkg.recurseForRelease or false then packagesWith cond return pkg else [] ); in if result.success then result.value else [] ) set ); packagesWithUpdateScriptAndMaintainer = maintainer': let maintainer = if ! builtins.hasAttr maintainer' pkgs.lib.maintainers then builtins.throw "Maintainer with name `${maintainer'}` does not exist in `lib/maintainers.nix`."
else builtins.getAttr maintainer' pkgs.lib.maintainers; in packagesWith (name: pkg: builtins.hasAttr "updateScript" pkg && (if builtins.hasAttr "maintainers" pkg.meta then (if builtins.isList pkg.meta.maintainers then builtins.elem maintainer pkg.meta.maintainers else maintainer == pkg.meta.maintainers ) else false ) ) (name: pkg: pkg) pkgs-mozilla; packageByName = name: let package = pkgs.lib.attrByPath (pkgs.lib.splitString "." name) null pkgs-mozilla; in if package == null then builtins.throw "Package with an attribute name `${name}` does not exist." else if ! builtins.hasAttr "updateScript" package then builtins.throw "Package with an attribute name `${name}` does not have a `passthru.updateScript` defined." else package; packages = if package != null then [ (packageByName package) ] else if maintainer != null then packagesWithUpdateScriptAndMaintainer maintainer else builtins.throw "No arguments provided.\n\n${helpText}"; helpText = '' Please run: % nix-shell update.nix --argstr maintainer garbas to run all update scripts for all packages that list \`garbas\` as a maintainer and have \`updateScript\` defined, or: % nix-shell update.nix --argstr package firefox-nightly-bin to run the update script for a specific package. ''; runUpdateScript = package: '' echo -ne " - ${package.name}: UPDATING ..."\\r ${package.updateScript} &> ${(builtins.parseDrvName package.name).name}.log CODE=$? if [ "$CODE" != "0" ]; then echo " - ${package.name}: ERROR " echo "" echo "--- SHOWING ERROR LOG FOR ${package.name} ----------------------" echo "" cat ${(builtins.parseDrvName package.name).name}.log echo "" echo "--- SHOWING ERROR LOG FOR ${package.name} ----------------------" exit $CODE else rm ${(builtins.parseDrvName package.name).name}.log fi echo " - ${package.name}: DONE.
" ''; in pkgs.stdenv.mkDerivation { name = "nixpkgs-mozilla-update-script"; buildCommand = '' echo "" echo "----------------------------------------------------------------" echo "" echo "Not possible to update packages using \`nix-build\`" echo "" echo "${helpText}" echo "----------------------------------------------------------------" exit 1 ''; shellHook = '' echo "" echo "Going to be running update for following packages:" echo "${builtins.concatStringsSep "\n" (map (x: " - ${x.name}") packages)}" echo "" if [ "${dont_prompt_str}" = "no" ]; then read -n1 -r -p "Press space to continue..." confirm else confirm="" fi if [ "$confirm" = "" ]; then echo "" echo "Running update for:" ${builtins.concatStringsSep "\n" (map runUpdateScript packages)} echo "" echo "Packages updated!" exit 0 else echo "Aborting!" exit 1 fi ''; }